
View Full Version : Some questions about ImagingControl SDK



ofr
November 24, 2006, 16:49:47
I have a few questions about the IC Imaging Control SDK.
I'm using VS2005/C++ on Win2000, and IC 3 with the VS2005 patch:

I want to grab frames from a device as fast as possible and
pass the frames to my own processing functions which run in an
extra thread. If the device produces color frames, I want them in
a 3x8 bit format, and if it is mono then in a 1x8 bit format.

This is what I do now:
- create grabber
- setOverlayBitmapPathPosition( ePP_NONE )
- openDevByDisplayName(selected device)
- setVideoFormat(selected format)
- create FrameHandlerSink(DShowLib::eRGB8,0) (also tried other formats)
- setSnapMode(false)
- setSinkType(the sink)
- setDeviceFrameFilters(my frame interceptor instance)
- startLive(false)
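For reference, the steps above map roughly onto the following IC 3 C++ calls. This is only a sketch assembled from the call names listed above, not compilable as-is: error handling is omitted, the device/format variables are placeholders, and the exact signatures should be checked against the SDK documentation.

```cpp
// Sketch of the setup sequence described above (IC Imaging Control 3, C++).
// selectedDeviceName, selectedVideoFormat and myFrameInterceptor are placeholders.
DShowLib::InitLibrary();

DShowLib::Grabber grabber;
grabber.setOverlayBitmapPathPosition( DShowLib::ePP_NONE );   // no overlay
grabber.openDevByDisplayName( selectedDeviceName );
grabber.setVideoFormat( selectedVideoFormat );

// Sink that delivers frames in a fixed color format.
DShowLib::tFrameHandlerSinkPtr pSink =
    DShowLib::FrameHandlerSink::create( DShowLib::eRGB8, 0 );
pSink->setSnapMode( false );                // continuous grabbing, not snapshots
grabber.setSinkType( pSink );

grabber.setDeviceFrameFilters( &myFrameInterceptor );  // my FrameFilter instance
grabber.startLive( false );                 // no live display window

// ... and on shutdown:
grabber.stopLive();
DShowLib::ExitLibrary();
```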

The filter's transform() function copies the frame data to an
internal double buffer and signals the processing thread.
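For the double buffer itself, a latest-frame-wins hand-off keeps latency bounded: the capture side overwrites the pending frame instead of queueing it, so the processing thread always wakes up on the newest image and stale frames are silently dropped. A minimal sketch in modern C++ (VS2005 would need Win32 events and critical sections instead of `std::mutex`/`std::condition_variable`):

```cpp
#include <cassert>
#include <condition_variable>
#include <cstdint>
#include <mutex>
#include <vector>

// Latest-frame-wins hand-off buffer: the capture callback overwrites the
// pending frame, so a slow processing thread always receives the most
// recent image; older unprocessed frames are dropped, never queued.
class LatestFrameBuffer {
public:
    // Called from the capture/filter thread: overwrite the pending frame.
    void push(const std::vector<uint8_t>& frame) {
        std::lock_guard<std::mutex> lock(m_mutex);
        m_pending = frame;      // an older unprocessed frame is simply replaced
        m_hasFrame = true;
        m_cond.notify_one();
    }

    // Called from the processing thread: block until a frame is pending,
    // then take it, leaving the buffer empty.
    std::vector<uint8_t> pop() {
        std::unique_lock<std::mutex> lock(m_mutex);
        m_cond.wait(lock, [this] { return m_hasFrame; });
        m_hasFrame = false;
        return std::move(m_pending);
    }

    bool hasFrame() {
        std::lock_guard<std::mutex> lock(m_mutex);
        return m_hasFrame;
    }

private:
    std::mutex m_mutex;
    std::condition_variable m_cond;
    std::vector<uint8_t> m_pending;
    bool m_hasFrame = false;
};
```

If the processing thread falls behind, `push()` simply overwrites the frame it never collected, which matches the "drop frames rather than delay them" requirement.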

This works, but with high CPU load the frames come with a delay.


The manual recommends a few things under "Concepts and Components"
to achieve higher performance, but I'm not sure that I have
understood them.
"prevent automatic colorspace conversion."
Since devices do not always produce RGB24 or Y8 directly, the frames
must be converted at least once to the desired format, right?
If so, how do I do that most efficiently?

"do not use the ring buffer".
OK, but how? Maybe I'm already doing it right, but at the moment, when
processing of the frames produces a high CPU load, the frames
come in delayed, sometimes by 1-2 seconds (though they are never lost).
This must mean there is a ring of memory buffers involved, right?
How do I turn it off?
Since low latency is more important than processing every frame,
I would prefer to drop frames rather than have a delay.

Is there a color format that encodes the channels as separate planes?

How can the program detect whether a camera is color or mono, so that
it can set a matching sink color type?

Should I intercept frames with a FrameFilter or a GrabberListener?
I'm not sure that I have understood the difference between the two
concepts.

Another problem: The debugger shows that some threads exit with code 2,
some with -2147024637 (0x80070103) and there are also memory leaks.


Oliver

Stefan Geissler
November 27, 2006, 09:58:58
Hi Oliver,

In general you are doing everything right. But first of all, I would use a GrabberListener instead of a FrameFilter, because one image copy is saved: frame filters have to copy the frames.


Since devices do not always produce RGB24 or Y8 directly, the frames
must be converted at least once to the desired format, right?
If so, how do I do that most efficiently?

Conversion from RGB24 to Y8 can be expensive. I guess the Windows Color Space Transform filter is used for this, and this filter is slow. You can start your application and run graphedit.exe (from the DirectX SDK) in parallel. With GraphEdit you can connect to the "Remote Graph" and check which filters are used by your application. If the Color Space Transform filter is in use, you have found one source of your CPU load. If your WDM driver provides UYVY or Y800 video formats, then please use these formats: the YUV transform filter that comes with IC Imaging Control is much faster than the Color Space Transform filter.
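To illustrate why that conversion costs CPU time: when the sink wants 8-bit gray but the device delivers RGB24, some filter has to compute a weighted sum for every single pixel of every frame. A minimal sketch using an integer BT.601 luma approximation (the actual filter's coefficients and pixel order may differ):

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <vector>

// Per-pixel work of an RGB24 -> Y8 conversion: one weighted sum per pixel
// (integer approximation of the BT.601 luma formula, 77+150+29 = 256).
inline uint8_t rgb_to_y8(uint8_t r, uint8_t g, uint8_t b) {
    return static_cast<uint8_t>((77 * r + 150 * g + 29 * b) >> 8);
}

// Convert a packed BGR buffer (RGB24 in DIB byte order: B, G, R) to 8-bit gray.
std::vector<uint8_t> rgb24_to_y8(const std::vector<uint8_t>& bgr) {
    std::vector<uint8_t> y8(bgr.size() / 3);
    for (std::size_t i = 0; i < y8.size(); ++i)
        y8[i] = rgb_to_y8(bgr[3 * i + 2], bgr[3 * i + 1], bgr[3 * i]);
    return y8;
}
```

Done in software for, say, 640x480 at 30 fps, this is roughly nine million multiply-adds per second before your own processing even starts, which is why delivering Y800 straight from the driver is preferable.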


"do not use the ring buffer".
OK, but how? Maybe I'm already doing it right, but at the moment, when
processing of the frames produces a high CPU load, the frames
come in delayed, sometimes by 1-2 seconds (though they are never lost).
This must mean there is a ring of memory buffers involved, right?
How do I turn it off?
Since low latency is more important than processing every frame,
I would prefer to drop frames rather than have a delay.

Please use the ring buffer; there is no other way to capture the incoming frames.



Is there a color format, that encodes the channels as separated planes?
There is no such format supported directly by IC Imaging Control. The UYVY video format separates brightness from color information and might be interesting for you: the "Y" values represent the brightness of a pixel, and two pixels share one "U" and one "V" value for the color. You can use UYVY as the sink format if the video format of the video capture device is UYVY, too.
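If truly planar data is needed downstream, the packed UYVY buffer can be split into planes by hand. A minimal sketch, assuming the usual UYVY byte order of U0 Y0 V0 Y1 per pixel pair (4:2:2 subsampling, as described above):

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <vector>

// Three separate planes: Y holds one sample per pixel, U and V hold one
// sample per pixel pair (4:2:2 chroma subsampling).
struct YuvPlanes {
    std::vector<uint8_t> y, u, v;
};

// Split a packed UYVY buffer (byte order U0 Y0 V0 Y1 for each pixel pair)
// into separate Y, U and V planes.
YuvPlanes uyvy_to_planes(const std::vector<uint8_t>& uyvy) {
    YuvPlanes p;
    for (std::size_t i = 0; i + 3 < uyvy.size(); i += 4) {
        p.u.push_back(uyvy[i]);
        p.y.push_back(uyvy[i + 1]);
        p.v.push_back(uyvy[i + 2]);
        p.y.push_back(uyvy[i + 3]);
    }
    return p;
}
```

Since this is a simple byte rearrangement (no arithmetic per pixel), it is far cheaper than a full colorspace conversion.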


How can the program detect whether a camera is color or mono, so that
it can set a matching sink color type?
Please have a look at the "callback" sample. It shows how to set the sink color format to match the selected video format of the video capture device. You only select the video format of the video capture device; DirectShow will then try to convert it into the format that is needed in the sink or in the display.
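One SDK-independent approach is to inspect the name of the selected video format. The sketch below is only an illustrative heuristic, not part of the IC API: the format-string shapes (e.g. "Y800 (640x480)") and the string return values are assumptions; in real code you would map the decision onto the sink's colorformat enum instead.

```cpp
#include <cassert>
#include <string>

// Illustrative heuristic (not an IC API call): treat video formats whose
// name starts with "Y800" as monochrome, everything else as color.
bool is_mono_format(const std::string& videoFormatName) {
    return videoFormatName.compare(0, 4, "Y800") == 0;
}

// Map the decision onto a sink pixel format name. These are placeholder
// strings; a real program would pass the corresponding enum values
// (gray vs. RGB) to FrameHandlerSink::create instead.
std::string sink_format_for(const std::string& videoFormatName) {
    return is_mono_format(videoFormatName) ? "Y800" : "RGB24";
}
```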



Should I intercept frames with a FrameFilter or a GrabberListener?
I'm not sure that I have understood the difference between the two
concepts.

As far as I understand your application, you should use the GrabberListener, as shown in the "callback" sample. The "callback" sample demonstrates a "worst case" situation: with the "Sleep(250)" command, a very long image processing step is simulated. Thus the "frameReady" method is only called after a previous call to "frameReady" has completed and the method has returned. This can be used to throw away frames that are not processed due to overly long image processing times. Nevertheless, even while "frameReady" is at work, incoming frames are saved into the ring buffer. If you use the "callback" sample, please remove the "Sleep(250)" line :-).


Another problem: The debugger shows that some threads exit with code 2,
some with -2147024637 (0x80070103) and there are also memory leaks.

This sounds like a missing "ExitLibrary()" call at the end of the application.