View Full Version : Strange problem when start more than 1 video stream

November 9, 2004, 23:52:19

I have run into a strange problem while writing a program to capture synchronized video from several Sony DFW-X700 cameras using the IC Imaging Control 1.41 library.

The driver is the trial version downloaded from The Imaging Source. I based my program on the MemBufferCollection example. It sets up all the video capture devices (three Sony cameras) with a single video format and allocates a set of buffers to store the frames. The cameras are set to external trigger mode, and the program sends the trigger signal through the parallel port.

The problem shows up when I try to capture video in the 1024x768 format: startLive() for the second device fails. It seems to call a DirectShow filter, CalibFilter.ax (which comes with OpenCV), and generates an access violation. This only happens at 1024x768 with more than one camera; 640x480 with 3 cameras or 1024x768 with 1 camera both work fine.

The code looks like this:

printf( "Initializing cameras\n" );

Grabber grabber;

Grabber::tVidCapDevListPtr pVidCapDevList = grabber.getAvailableVideoCaptureDevices();
if ( pVidCapDevList == 0 || pVidCapDevList->empty() )
	return false;

grabber.openDev( pVidCapDevList->at( 0 ) );

// Get the list of all available video formats.
printf( "\n\nAvailable video formats: \n" );
Grabber::tVidFmtListPtr pVidFmtList = grabber.getAvailableVideoFormats();
if ( pVidFmtList == 0 )
{
	if ( grabber.getLastError() )
		fprintf( stderr, "Error: %s\n", grabber.getLastError().c_str() );
	return false;
}

// Iterate the list of available video formats and print the name of each format.
int i = 0;
for ( Grabber::tVidFmtListPtr::value_type::iterator fmt_it = pVidFmtList->begin();
	  fmt_it != pVidFmtList->end();
	  ++fmt_it )
{
	printf( "[%i] %s\n", i++, fmt_it->c_str() );
}

int input = 0;
int choice;

// Prompt the user to select a video format.
printf( "Your choice: " );
input = scanf( "%i", &choice );
if ( input == 0 || choice < 0 || choice >= (int) pVidFmtList->size() )
	return -1;

// Set the selected video format.
grabber.setVideoFormat( pVidFmtList->at( choice ) );
if ( grabber.getLastError() )
{
	fprintf( stderr, "Error: %s\n", grabber.getLastError().c_str() );
	return -1;
}

// Do the same thing for the rest of the cameras.
int camNum = pVidCapDevList->size();

Grabber **pGrabArray = new Grabber*[camNum];
//pGrabArray[0] = &grabber;

for ( i = 0; i < camNum; i++ )
{
	pGrabArray[i] = new Grabber();
	//pVidCapDevList = pGrabArray[i]->getAvailableVideoCaptureDevices();
	pGrabArray[i]->openDev( pVidCapDevList->at( i ) );

	// The video norm is not set here; what is it for?
	//pVidFmtList = pGrabArray[i]->getAvailableVideoFormats();
	pGrabArray[i]->setVideoFormat( pVidFmtList->at( choice ) );
}

#define FRAME_NUM 25

BYTE ***pBufArray = new BYTE**[camNum];

int size_buffer = grabber.getAcqSizeMaxX() * grabber.getAcqSizeMaxY() * 3; // 3 bytes per pixel
Grabber::tMemBufferCollectionPtr *pMemBufferCollection = new Grabber::tMemBufferCollectionPtr[camNum];

// Set the external trigger.
for ( i = 0; i < camNum; i++ )
{
	if ( !pGrabArray[i]->setExternalTrigger( true ) )
	{
		printf( "Error - pGrabArray[%d]'s external trigger\n", i );
		return -1;
	}

	pGrabArray[i]->setSinkType( FrameGrabberSink( FrameGrabberSink::eGRAB ) );

	pBufArray[i] = new BYTE*[FRAME_NUM];
	for ( int j = 0; j < FRAME_NUM; j++ )
		pBufArray[i][j] = new BYTE[size_buffer];

	// Create a new MemBufferCollection that uses our own image buffers.
	pMemBufferCollection[i] = pGrabArray[i]->newMemBufferCollection( size_buffer, pBufArray[i], FRAME_NUM );

	// Make the collection the active one.
	pGrabArray[i]->setActiveMemBufferCollection( pMemBufferCollection[i] );

	if ( pGrabArray[i]->getLastError() )
	{
		printf( "%s\n", pGrabArray[i]->getLastError().c_str() );
		return -1;
	}
}

std::cout << "Press any key to start grabbing" << std::endl;

for ( i = 0; i < camNum; i++ )
{
	if ( !pGrabArray[i]->startLive( false ) )
	{
		/************ Problem here !!!! **************
		startLive() for the second camera calls CalibFilter.ax from OpenCV and
		generates an access violation. This is really strange: it only happens
		when the video format is set to 1024x768 (640x480 works fine for the
		whole program). I have tried uninstalling OpenCV (and unregistering
		CalibFilter.ax), but then the program complains about
		"CFilter->getInputPin()" in DShowLib, saying "no pin is found".
		Please help!
		*********************************************/
		printf( "Error - start grabber streaming\n" );
		return -1;
	}
}

Stefan Geissler
November 10, 2004, 12:10:10
Hello Tianli,

First of all, I cannot say whether third-party filters are error free, so I would suggest unregistering the CalibFilter.ax.

The X710 camera needs a lot of bandwidth on the FireWire bus, so I suggest using more than one FireWire board. You can connect two of these cameras in parallel to one FireWire board. Please see here: http://www.1394imaging.com/resources/backgnd/1394/video_bandwidth

November 10, 2004, 16:26:44
Thanks for the reply. The problem is that when I unregister the filter CalibFilter.ax with the command "regsvr32 /u CalibFilter.ax", I get an exception when calling startLive(false) for the second and third cameras.

The detailed error message is like this:

Exception DEBUG: in C:\CSource\core\TISUDSHL\Grabber.cpp at line 982:

CFilter::getInputPin( unsigned int i) const : pin not found
In file: C:\CSource\core\DShowLib\Filter.cpp at line: 359

Continue? (say "no" to rethrow catched exception or "yes" to continue without doing anything)

This happens for the second and third cameras, and only when 1024x768 is chosen. I suspect that DShowLib calls the wrong filter. Please give me some idea of how to fix this. Thanks!


Stefan Geissler
November 11, 2004, 08:54:07
Tianli,

The bandwidth allocation on the FireWire bus is over 100%. As I wrote in my last post, check the bandwidth allocation. You cannot use two or more of these cameras at 1024x768 resolution on one FireWire board; you would need one FireWire board for each camera.

Please follow the link I posted in my last post.

November 13, 2004, 03:21:43
But that table shows that the bandwidth also depends on the frame rate. From it, I can see that I should at least be able to use 1024x768 at 7.5 fps for two cameras. Please tell me how to set the correct frame rate; the choice in the VideoFormat only lets me select the frame size.

Also, since I am using an external trigger signal, the frame rate is completely controlled by my program, so I don't think bandwidth is the problem here. I cannot even capture a single set of synchronized frames.


Stefan Geissler
November 15, 2004, 08:14:57
Hello Tianli,

It makes no difference whether you use an external trigger or not. The bandwidth on the FireWire bus is allocated when you set the video format, set the frame rate, and call startLive(). This is determined by the FireWire and DCam specifications, and it guarantees that all images can be delivered over the FireWire bus.

You use a Sony DFW-X710. By default this camera runs at 15 fps, which means a single camera already needs 77% of the bandwidth, unless you slow down the frame rate.

Have a look at the User's Guide: http://www.imagingcontrol.com/ic/docs/html/class/meth_descGrabber_setFrameRate.htm
You may set your video format. Then you call

grabber.setFrameRate(285); to set the slowest available frame rate. This should be 3.5 fps. With this, you should be able to start all three cameras on one bus. (You may want to retrieve a list of all available frame rates and select the slowest of them; this is the correct method anyway, especially if you later use other cameras without changing your source code.)
The following code sample for IC Imaging Control 2.0 sets the video format and the slowest available frame rate:

grabber->setVideoFormat( "UYVY (1024x768)" );
Grabber::tFrameRateListPtr FrameRateList = grabber->getAvailableFrameRates();
if ( !( FrameRateList == NULL || FrameRateList->empty() ) )
{
	// The list is sorted ascending, so the first entry is the slowest rate.
	grabber->setFrameRate( FrameRateList->at( 0 ) );
}

The code assumes that the frame rates are sorted in ascending order; this is guaranteed by our drivers.
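If you would rather not rely on the sort order at all, the slowest rate can also be selected explicitly. A minimal sketch, using a plain std::vector<double> of fps values as a stand-in for the container behind Grabber::tFrameRateListPtr (an assumption; check the actual list type in your headers):

```cpp
#include <algorithm>
#include <vector>

// Return the slowest rate (in fps) from a non-empty list of frame
// rates, without assuming any particular sort order.
double slowestFrameRate( const std::vector<double> &rates )
{
	return *std::min_element( rates.begin(), rates.end() );
}
```

You would then call something like grabber->setFrameRate( slowestFrameRate( *FrameRateList ) ); after checking that the list is not empty.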

You may also install IC Capture. It has a toolbar that makes it easy to set the video format and the frame rate, and it can display all of your cameras simultaneously.

November 16, 2004, 19:48:04
Thanks, Stefan!
Setting the frame rate to 7.5 fps solved the problem.