
View Full Version : More control over cameras



voreno
January 28, 2014, 16:17:37
Hi Everyone,
I have recently ordered a pair of DFK 23UP021 cameras for a research project to develop a stereo camera system (cameras in parallel next to each other for 3D reconstruction purposes).

I have been using the IC Imaging Control SDK to control the cameras, but I am finding it difficult to get full control over them: the SDK doesn't seem to let the user handle data capture and download freely. To ensure the success of the project, both cameras need to be properly synchronized. I have gone through the documentation and the SDK reference and cannot find ways to control the cameras more directly. For example, to get a live video stream in C# you have to call the LiveStart method on an ICImagingControl object. However, that method doesn't let me get accurate timestamps of when the individual frames are grabbed, nor of when they are taken on the camera side. I can imagine this functionality might be hidden somewhere, but I haven't been able to find it.

My questions are the following:
1. is there an open-source alternative SDK that can be used to control the DFK camera series?
2. is there an IC Control SDK version that has more of its underlying private methods turned public such that the user can have more control over the hardware?
3. Alternatively, could someone advise me how I can exert more control over the cameras? Ideally I would be happy to write low-level C/C++ or even assembly routines to get the kind of access I need.

Please let me know if I haven't been clear enough!
All the best!

Stefan Geissler
January 28, 2014, 16:35:43
Hello


1. is there an open-source alternative SDK that can be used to control the DFK camera series?

Which camera models do you use? The USB and FireWire models have no internal clock and therefore do not send time stamps. That means even another SDK won't help. You can write your own application using DirectShow directly, so you do not need IC.

The time stamps of the image buffers are created by the driver when it is informed by the operating system about a new frame. This is all the information we can currently add to the frame.
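Since these cameras have no internal clock, the best available timestamp is the host-side arrival time. A minimal, SDK-agnostic C# sketch of stamping frames on arrival (the FrameStamper class and OnFrameArrived method are illustrative, not part of IC Imaging Control; you would call OnFrameArrived from the SDK's image-available callback):

```csharp
using System;
using System.Collections.Concurrent;
using System.Diagnostics;

class FrameStamper
{
    // Stopwatch is monotonic, unlike DateTime.Now, so intervals stay reliable.
    private readonly Stopwatch clock = Stopwatch.StartNew();

    public readonly ConcurrentQueue<(int FrameIndex, TimeSpan HostTime)> Stamps =
        new ConcurrentQueue<(int, TimeSpan)>();

    // Call this from the SDK's image-available callback for each new frame.
    public void OnFrameArrived(int frameIndex)
    {
        Stamps.Enqueue((frameIndex, clock.Elapsed));
    }
}
```

Note that such a stamp includes exposure, USB transfer and driver latency, so it marks delivery to the host, not the moment of exposure.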


2. is there an IC Control SDK version that has more of its underlying private methods turned public such that the user can have more control over the hardware?
No, because you cannot get more control over the hardware than there already is.


3. Alternatively, could someone advise me how I can exert more control over the cameras? Ideally I would be happy to write low-level C/C++ or even assembly routines to get the kind of access I need.

Maybe there is a misunderstanding.

I am just wondering how you synchronize the cameras, and which frame rate you need for image delivery.


I can imagine this functionality might be hidden somewhere, but I haven't been able to find it.
If such functionality existed, we would store it in the image buffer timestamps.

voreno
January 28, 2014, 16:51:25
Hi Stefan,
Thank you for your reply!

The camera model I'm using is a USB3: DFK 23UP021


I just wonder, how you synchronize the camera, which frame rate you need for image delivery.
I am controlling both cameras via a single USB 3.0 hub which is connected to my computer through one USB 3.0 port. As such, there will be a delay between requesting a frame from both cameras and then receiving the separate image feeds through the one port. I haven't fully implemented this part yet; I'm currently working on it. I need a frame rate of 5 fps, which amounts to about 100 MB/s with both cameras connected. USB 3.0's nominal 5 Gbit/s works out to roughly 400 MB/s of usable throughput, assuming no delays.
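As a sanity check, the required bandwidth can be estimated from the sensor format. A small sketch (the 1280x960 resolution and 8-bit raw Bayer format here are assumptions for illustration only; substitute the actual values from the camera's data sheet):

```csharp
using System;

static class BandwidthEstimate
{
    // Uncompressed bandwidth in MB/s for a given frame format.
    public static double MegabytesPerSecond(int width, int height,
                                            int bytesPerPixel, int fps, int cameras)
        => (double)width * height * bytesPerPixel * fps * cameras / (1024 * 1024);

    static void Main()
    {
        // Example: 1280x960, 8-bit Bayer, 5 fps, two cameras.
        Console.WriteLine($"{MegabytesPerSecond(1280, 960, 1, 5, 2):F1} MB/s"); // ~11.7 MB/s
    }
}
```

If the measured requirement is much higher than such an estimate, the stream is probably using a wider pixel format (e.g. converted RGB) than expected.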

I'll try to use the image buffers directly to get more control over the request/retrieval of data from the cameras.


You can write your own using DirectShow directly, so you do not need IC
I have done a quick web search on DirectShow. Is there any chance you could recommend some resources/guides with examples of controlling The Imaging Source cameras via DirectShow? That would be very helpful!

Best regards!

Stefan Geissler
January 28, 2014, 17:49:06
IC Imaging Control encapsulates DirectShow, so access is easier for you. IC also gives access to the camera parameters that are not exposed through DirectShow. Therefore, if you program against DirectShow on your own, you won't gain anything.

Do you run the cameras untriggered and hope to receive images at the same point in time? That won't work.

If you want to receive synchronized pairs of images, you should run the cameras in trigger mode and trigger them at the same point in time. Usually this is done with a hardware trigger, but you can also use a software trigger. Fire the software triggers for both cameras in separate threads, because the software trigger Push() command needs some milliseconds to return.

Also use a callback to wait for the images, one callback per camera. After you have received images in both callbacks, you can be sure to have images taken at nearly the same point in time.

Here is some sample code showing how to use the software trigger in C# (VB will be similar):


private TIS.Imaging.VCDButtonProperty SW_Trigger;

private void QuerySoftwareTrigger()
{
    TIS.Imaging.VCDPropertyItem Trigger;
    TIS.Imaging.VCDSwitchProperty TriggerEnable;

    Trigger = icImagingControl1.VCDPropertyItems.FindItem(TIS.Imaging.VCDIDs.VCDID_TriggerMode);

    TriggerEnable = (TIS.Imaging.VCDSwitchProperty)Trigger.Elements.FindInterface(
        TIS.Imaging.VCDIDs.VCDElement_Value + ":" +
        TIS.Imaging.VCDIDs.VCDInterface_Switch);

    // The GUID identifies the software trigger element.
    SW_Trigger = (TIS.Imaging.VCDButtonProperty)Trigger.Elements.FindInterface(
        "{FDB4003C-552C-4FAA-B87B-42E888D54147}:" +
        TIS.Imaging.VCDIDs.VCDInterface_Button);

    if (SW_Trigger == null)
    {
        MessageBox.Show("Software Trigger is not supported by the current device!");
        return; // do not enable trigger mode without software trigger support
    }

    TriggerEnable.Switch = true;
}

public void Trigger()
{
    if (icImagingControl1.DeviceValid && SW_Trigger != null)
    {
        try
        {
            SW_Trigger.Push();
        }
        catch (Exception Ex)
        {
            Console.WriteLine("Push failed: " + Ex.Message);
        }
    }
}


You should implement the ImageAvailable event as a callback if you use the software trigger. Do not use MemorySnapImage().
How to call the Trigger() function in its own thread can be found in the MSDN.
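Firing both software triggers from separate threads, as described above, can be sketched with the Task Parallel Library (TriggerBoth and the two Action parameters are illustrative names; each action would wrap a per-camera Trigger() method like the one shown):

```csharp
using System;
using System.Threading.Tasks;

static class StereoTrigger
{
    // Fire both software triggers as close together as possible. Each
    // Push() takes a few milliseconds to return, so pushing them serially
    // would skew the two exposures; parallel tasks avoid that.
    public static void TriggerBoth(Action triggerLeft, Action triggerRight)
    {
        Task left = Task.Run(triggerLeft);
        Task right = Task.Run(triggerRight);
        Task.WaitAll(left, right); // both triggers have been pushed on return
    }
}
```

After TriggerBoth returns, the images themselves still arrive asynchronously through the two ImageAvailable callbacks.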

florixyz
January 30, 2014, 11:40:29
Hi,
I've just read this thread on timing and triggering. My issue is somewhat related, but also a bit off-topic. I'll post it here anyway, as it might also be of interest to this thread:

In short: I've got the DFK 42BUC03 USB 2.0 camera, and occasionally get dropped frames.
My setup:
- I run Processor Idle State Manager (have an intel core i3 cpu on a samsung paris 480 pro laptop, 64-bit, Win 7)
- External trigger with a +5v pulse (rising edge - I assume this is the default?) from a reed switch triggered by a magnet
- Full 1.3 MP resolution, Y800 format
- Ring buffer Size of 200
- Using the ImageAvailable event to save buffer pointers in a queue, and lock the corresponding buffers
- I compute the time between two ImageAvailable events using the system timer.
- It generally appears to be stable at around 120 ms, which corresponds to the rate of my trigger pulses.
- If the time between events is larger than 1.2 times the previous time, I increase a dropped frames counter and do not update the time.
- In a lower priority background thread, I save the queued locked buffers to an eSATA SSD, then unlock them.
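The interval-based drop estimate described in the steps above can be expressed compactly. A sketch of the same heuristic (DropEstimator is an illustrative name; the 1.2x threshold and 120 ms nominal period are taken from the post):

```csharp
using System;
using System.Collections.Generic;

static class DropEstimator
{
    // Estimate dropped frames from inter-frame arrival intervals:
    // an interval longer than 1.2x the previous accepted interval counts
    // as (at least) one drop, and the reference interval is not updated.
    public static int CountDrops(IEnumerable<double> intervalsMs, double initialMs)
    {
        double reference = initialMs;
        int drops = 0;
        foreach (double interval in intervalsMs)
        {
            if (interval > 1.2 * reference)
                drops++;              // gap too long: frame(s) missing here
            else
                reference = interval; // normal frame: track the current rate
        }
        return drops;
    }
}
```

One limitation of this heuristic: a gap of two or more consecutive drops is still counted as a single drop, so it gives a lower bound.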

Now my observations:
- I capture ~4000 frames.
- My dropped-frames estimator counts ~20 dropped frames there.
- I can say with 100% certainty that the time between images is never lower than 0.8 * 120 ms, because otherwise the lower time would be used as the reference and the dropped frames counter would go up continuously.
- The dropped frames counter in icImagingControl is always 0
- The bufferIndex counters in the image buffers handed to ImageAvailable are always continuous (except for the reset to 0 at the ring buffer end, of course), which also indicates no missed events.
- Occasionally there are 2-10 corrupted lines in some of the (non-discarded) images.
- This leaves three possibilities:
- The reed switch fails sometimes (however I am quite certain that this is not the case, but I will investigate to rule this out)
- USB transmission errors occur which lead to discarded frames
- A lag in the camera that causes it to miss triggers

I have tried:
- 2 different USB cables (one brand new, the other from a USB 2.0 hard drive)
- Running the capture application with real-time priority as administrator (Windows 7)
- Disabling the write cache on the drive where the data is written to

None of the above has brought my estimated dropped frame count to 0. The estimated required transmission time should be < 50 ms and indeed I can decrease my trigger period down to ~50-55 ms without a significant increase in dropped frames.

Any ideas what could be going wrong and how to debug this better? I assume there is no "missed triggers" counter on the camera, nor a "corrupted frames" counter in the driver, nor any other way to identify this? I don't have a USB 3.0 port (yet) to test on.

I think these issues might also be of interest to voreno, as dropped images might break his sync of the stereo images. On the other hand, he is using a USB 3.0 camera, so things might be different there.

Thanks for your time!
Florian

Stefan Geissler
January 30, 2014, 12:41:44
Florian,

The stripes and frame drops happen when the USB controller does not pick up all data packets from the camera, so the camera gets a buffer overrun in its own USB controller.
The camera does not miss any trigger pulse that arrives while it is not busy with exposure and image data transfer. I tested this intensively. (I used a frequency generator, external pulse-counter hardware, and software that counts incoming frames. I ran 150000 trigger pulses and received 150000 images.)
With USB 3 cameras, or even just a USB 3.0 port, things are better, as far as I have seen.

florixyz
January 30, 2014, 13:42:40
Dear Stefan,

thanks for this answer. So you can confirm that not only do the stripes (of less concern) happen when the controller fails to pick up some data, but also that full frames are sometimes dropped even if part of the frame was received by the driver?

In that case, is there a way to control this behaviour, i.e. return the corrupted frame instead of discarding it?

Is it only a "host side" issue, i.e. the camera transmits data without feedback (like UDP) and if the data is lost, it is lost? Or does the camera also buffer the data and retry the transmission of an image as long as it hasn't been successfully transferred (discarding other images in the meantime, or aborting the retry on the next external trigger)?

By the way, relating to my last post on the lost exposure settings: this still happens occasionally, and it's probably a mixture of too long a delay in applying the setting and USB data loss. It would therefore be a nifty future feature to have an exposure series per trigger in the firmware. Say you have a 2-bit exposure counter register, and the camera configuration allows you to set 4 exposure values, one for each state of the register, which are applied reliably in the camera without further host-based interaction.

Stefan Geissler
January 31, 2014, 09:39:21
Hello


I have recently ordered a pair of DFK 23UP021 cameras for a research project to develop a stereo camera system (cameras in parallel next to each other for 3D reconstruction purposes).

This camera model has no trigger support. Therefore, exact synchronization is difficult, if not impossible. You may have decided on the wrong camera model and should exchange them for models with trigger support, e.g. the DFK 23U445.

voreno
February 12, 2014, 16:14:05
You should implement the ImageAvailable event as a callback if you use the software trigger. Do not use MemorySnapImage().
How to call the Trigger() function in its own thread can be found in the MSDN.


Dear Stefan,
Thank you very much for this reply, it has been very useful. Being new to C#, I am finding it a bit difficult to properly implement the ImageAvailable event as a callback and to implement multi-threading.

If I understood correctly, my program logic should be:

1. Initialize both cameras via two separate ICImagingControl objects
2. Enable software triggering for both cameras
3. Start a timer with a tick handler at the desired FPS (say 5 fps). At the same time, call the LiveStart() method on both ICImagingControl objects so that both cameras are running
4. Once the timer ticks every 0.2 s, call the Trigger method for both cameras in separate threads
5. At this point I have to handle the separate ImageAvailable events using a callback (via a delegate).

Am I correct in using this logic? Would you have a very basic example of how to implement the ImageAvailable callback? (I'm not too sure how to create the delegate and set everything up to work properly!)

All the best and thank you for your support!
Best wishes!

Stefan Geissler
February 12, 2014, 17:54:26
Hi

Your logic looks fine to me.


5. It is at this point that I have to handle the separate ImageAvailable events using a callback (via delegate).

In the forms editor, select the IC Imaging Control, show its properties and select the events. ImageAvailable will be listed there. Double-click it and the ImageAvailable event handler will be added to your source. Do the same with the second ImagingControl.
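Alternatively, the handlers can be attached in code rather than through the designer. A minimal sketch, assuming two ICImagingControl instances named icImagingControl1 and icImagingControl2 (the exact event-args type of ImageAvailable may differ between SDK versions, so check the class reference; the lambda bodies here are placeholders):

```csharp
// One ImageAvailable handler per control; each fires in its own
// callback when that camera delivers a triggered frame.
icImagingControl1.ImageAvailable += (sender, e) =>
{
    // Handle the frame from camera 1 here (e identifies the image buffer).
};

icImagingControl2.ImageAvailable += (sender, e) =>
{
    // Handle the frame from camera 2 here.
};
```

Once both handlers have fired for a given trigger, you have a stereo pair taken at nearly the same point in time, as described earlier in the thread.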