View Full Version : Delay Required After setInputChannel Call

August 16, 2005, 02:47:22
I'm using Imaging Control 2.1 with Microsoft Visual C++ (as packaged with Visual Studio .NET 2003). My DFG/1394 Video-To-Firewire converter has a single analog camera connected to COMP1.

After Grabber::setInputChannel has returned, there appears to be a slight delay before the set operation fully completes.

See the following code snippet. Toggling back and forth between setting channelIndex to 0 (the index of the channel with the analog camera connected) and to 1 (channel with no camera) produces inconsistent results if I remove the Sleep statement.

if ( ! mGrabber->setInputChannel( DWORD(channelIndex) ) )
return FAILURE;


if ( ( mGrabber->isSignalDetectedAvailable() ) && ( ! ( mGrabber->getSignalDetected() ) ) )
return NO_CAMERA;

return SUCCESS;
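To make the workaround concrete, here is a rough, self-contained sketch of the pattern (FakeGrabber is only a hypothetical stand-in for the real IC Imaging Control Grabber class, and std::this_thread::sleep_for replaces the Win32 Sleep call; the 200 ms figure is my original delay):

```cpp
#include <chrono>
#include <thread>

// Hypothetical stand-in for the IC Imaging Control Grabber class,
// purely to illustrate the call pattern. It pretends that only
// channel 0 has a camera attached.
struct FakeGrabber {
    int  channel = 0;
    bool setInputChannel(int idx)        { channel = idx; return true; }
    bool isSignalDetectedAvailable() const { return true; }
    bool getSignalDetected() const       { return channel == 0; }
};

enum Result { SUCCESS, FAILURE, NO_CAMERA };

// Switch the input channel, wait out the chip's synchronization
// period, then test for a video signal.
Result selectChannel(FakeGrabber& grabber, int channelIndex)
{
    if (!grabber.setInputChannel(channelIndex))
        return FAILURE;

    // Give the DFG/1394 time to lock onto the new input.
    std::this_thread::sleep_for(std::chrono::milliseconds(200));

    if (grabber.isSignalDetectedAvailable() && !grabber.getSignalDetected())
        return NO_CAMERA;

    return SUCCESS;
}
```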

EDIT: Just found a related posting. It appears that the DFG/1394 chip needs time to synchronize to the new input channel.


Stefan Geissler
August 16, 2005, 08:27:08
Hello Barry,

if you change the input channel while live video is running, the grabber chip inside the DFG/1394-1 needs a few frames to synchronize to the new video signal. This can take up to three frames. This is the slight delay you noted. It is normal behaviour. It might be reduced if the cameras are synchronized, but I never tested this.
You may use following to detect the video signal as fast as possible:

int iTry = 4;

while( !(mGrabber->getSignalDetected()) && iTry > 0 )
{
    Sleep(40); // Frame rate duration, a little bit longer.
    iTry--;
}

if( iTry <= 0 ) return NO_CAMERA;

August 16, 2005, 18:42:18
Hi Stefan,

Thanks for the quick response!

The code that you provided fails if I'm switching from a channel with a camera connected (channel 0) to a channel without a camera (channel 1). On the first pass of the while loop, getSignalDetected returns TRUE because the DFG/1394-1 is still locked onto channel 0.

It appears that I'm required to insert a Sleep statement for the duration of the synchronization period after calling setInputChannel. You stated that the synchronization period can be up to three frames, so I'm changing my original fixed delay of 200 ms to a function of the current frame rate (which I've limited to a range of 1-15 FPS):

delayInMilliSec = (1 sec / currentFPS) * (1000 ms/sec) * 4 frames = (1000 / currentFPS) * 4
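In code, that calculation would look roughly like this (syncDelayMs is my own hypothetical helper name; the 1-15 FPS clamp and the four-frame margin are the values from my post above):

```cpp
#include <algorithm>

// Compute the post-setInputChannel delay in milliseconds, assuming the
// DFG/1394 needs up to three frames to synchronize plus one frame of
// margin (four frames total).
int syncDelayMs(int currentFPS)
{
    // Clamp the frame rate to the 1-15 FPS range I am using.
    currentFPS = std::max(1, std::min(currentFPS, 15));

    // One frame lasts 1000 / currentFPS milliseconds; wait four frames.
    return (1000 / currentFPS) * 4;
}
```

At 15 FPS this yields 264 ms (1000 / 15 truncates to 66 ms per frame), which is in the same range as my original fixed 200 ms delay; at 1 FPS it yields 4000 ms.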

Let me know if I'm off-track. I appreciate the support.


Stefan Geissler
August 17, 2005, 08:24:05
Hello Barry,

I did not test the source code; it was only meant to give you an idea. If your approach works, then it is fine.