View Full Version : TIS USB Cameras and YUY2



jdbethun
April 22, 2008, 01:25:21
Hi Stefan,

We're testing out some of your USB cameras (actually we asked for the FireWire equivalents and somehow received the USB versions instead, so I thought I'd give them a go while we have them here). At any rate, I'm noticing the format is showing up as YUY2. The spec sheets for the DFK21AU04 and the DFK31AU03 say it's supposed to be UYVY. I installed this on two different computers using your newest 3.06 software, and got the same thing. My problem is your tColorformatEnum only has UYVY, which is not the same macropixel ordering as YUY2. So what should I use for it?

Cheers,

Jeff

Stefan Geissler
April 22, 2008, 10:08:05
Jeff,

do you need the YUY2 format in memory? The tColorformatEnum contains only the pixel formats available for the converted images in memory. There is no conversion from YUY2 to UYVY available, in case you need to work with that format.

I also apologize for the wrong documentation.

jdbethun
April 22, 2008, 20:23:28
Yeah, we need them in memory unfortunately. We lock, unlock, and save images in the ring buffer, so unless I'm missing a way to go about it, I don't think we'll be able to use these guys.

Stefan Geissler
April 23, 2008, 09:27:45
Jeff,

You can create a sink with MEDIASUBTYPE_YUY2:



tFrameHandlerSinkPtr pSink = FrameHandlerSink::create( MEDIASUBTYPE_YUY2, 3 );


Related topics in the documentation are:
FrameHandlerSink::create
http://www.imagingcontrol.com/support/documentation/class/meth_descFrameHandlerSink_create.htm

FrameTypeInfo:
http://www.imagingcontrol.com/support/documentation/class/meth_descFrameTypeInfo_FrameTypeInfo.htm

Standard Mediasubtype GUIDs
http://www.imagingcontrol.com/support/documentation/class/enum_descconstants_Standard_Mediasubtype_GUIDs.htm
(Unfortunately, MEDIASUBTYPE_YUY2 is not listed there, but it exists at least in IC 3.0.6.)

jdbethun
April 24, 2008, 21:39:59
Thanks Stefan... worked like a charm. I'm able to start our two cameras live simultaneously:

640x480 YUY2 15 fps
1024x768 YUY2 15 fps

Which is a requirement for us, so that's great. BUT, if I plug in a third camera (just plug it in, without connecting to it in software or starting it live), then I can no longer do this. This happens regardless of whether I'm using a powered hub, and in different combinations (two on the hub, one in the laptop; two in the laptop, one on the hub, etc.). I realize I can always switch to BY8, but I'm a bit reluctant to do so as it has an impact on processing that I haven't quantified yet for our product.

At any rate, thanks for your help as usual, and if you have any insight on the "3 camera" problem that'd be great.

Cheers,
jeff

Stefan Geissler
April 25, 2008, 10:54:50
Hi Jeff,

I think it is a bandwidth limitation problem, as we have with FireWire too:

640x480 YUY2 15 fps: approx. 32%
1024x768 YUY2 15 fps: approx. 77%

Thus a third camera cannot be started.
Using BY8 cuts the bandwidth allocation in half, because only one byte per pixel is transferred; YUY2 transfers two bytes per pixel.

jdbethun
April 25, 2008, 20:15:16
...back to firewire for a moment as we received the firewire cameras now:

This is what confuses me about those numbers (77% and 32%), which imply that I'm using 109% of the bandwidth, which shouldn't be possible. Usually in your software, when I try to open two cameras that overrun the bandwidth, the second camera will error out and actually say there's not enough bandwidth. But in this configuration it'll let me open them fine, and then you start to see artifacts in the image stream (chopping, etc.).

Two Sony cameras (unfortunately the ones that are discontinued now) work fine in that configuration (no artifacts), as do two Unibrain cameras (their docs put those numbers at 68.3% and 28.4% respectively).

So which one of you is right? I know how they're calculating it:
i.e. 1024x768: 3072 bytes/packet / 4500 bytes/packet max = 68.3%.

Whether or not that's correct I'm not sure, as I've seen 4096 thrown around as the maximum per-packet allowance too. That would make it 75%, not 77%.

Anyways, I'm just curious how you arrive at those numbers and what the calculation is. I realize that only 80% of the bus is allowed for isochronous transfer (393.216 Mbps * 0.8 = 314.573 Mbps, or about 39 MB/s max), so I'm guessing you shouldn't be allowed to go over that. I also believe the FireWire header/CRC overhead for isochronous mode totals about 12 bytes (header, header CRC, and data CRC, 4 bytes each), so that's also going to take up some of the bandwidth (but not much).

Sorry for my ramblings here, but this issue has bothered me for some time as to why the Sony's and other cameras seem to work fine in our configuration.

Cheers,
Jeff

Stefan Geissler
April 28, 2008, 12:11:12
Jeff,

The values we have published are reference values. They are calculated from the direct values plus some headroom that is necessary if the devices are daisy-chained instead of connected to one hub. This may be the reason why your cameras work fine.

I hope I repeated correctly what my engineers told me.