libdc1394 v1.x :: FAQ
This FAQ is provided on an "as is" basis. There is no warranty whatsoever, either express or implied, regarding the FAQ, including warranties with respect to its merchantability or fitness for any particular purpose.
Questions & Answers

What is libdc1394?
libdc1394 is a library intended to provide a high-level programming interface for application developers who wish to control IEEE 1394 based cameras that conform to the 1394-based Digital Camera Specification. The library is developed as an open-source project (homepage sourceforge.net/projects/libdc1394) licensed under the GNU Lesser General Public License (LGPL) www.gnu.org/copyleft/lesser.html. It was originally designed for the Linux platform but the new API version 2 is now also compatible with OSX.

What is IEEE 1394?
IEEE 1394 is the name of a serial bus standard designed to be a versatile, high-speed and inexpensive method of interconnecting a variety of consumer electronics devices and personal computers. Apple Computer calls exactly the same thing FireWire® (developer.apple.com/firewire) and Sony Corporation (www.sony.net) calls it iLink®. The IEEE 1394 standard is mostly about the electrical specifications of the drivers, receivers and cables. Different camera manufacturers implement different higher-level protocols for controlling their cameras; see "How do I know if my camera is supported?".

Can I use it for my DV camcorder?
No, libdc1394 is not for camcorders. It is meant for uncompressed video cameras, which are mostly used for scientific, industrial, microscopy, webcam and photography applications.

How do I know if my camera is supported?
libdc1394 supports cameras that are compliant with the IIDC 1394-Based Digital Camera Specifications, also known as the DCAM spec. This is a CSR (control and status register) layer on top of IEEE 1394 and is used by many camera manufacturers. Check your camera user's manual or consult the excellent IEEE 1394 Digital Camera List (damien.douxchamps.net/ieee1394/cameras/).

Can I run more than one camera simultaneously?
Yes, as many as 62 per interface card, subject to available IEEE 1394 bandwidth. The number 62 comes from 64 (the theoretical maximum) minus the interface card (which is also a device) and minus the reserved device 63 (which is a virtual broadcast device).

Can I use any IEEE 1394 host card?
The best set-up is for your host card to be OHCI (open host controller interface) compliant. The Adaptec AIC-5800 PCI-IEEE1394 and Texas Instruments PCILynx chips are also supported. You can get drivers (ohci1394, aic5800, pcilynx) from the IEEE 1394 for Linux homepage www.linux1394.org if they are not already in your distribution.

Can I install more than one IEEE 1394 host card?
Yes, as many as you have space for. Each card typically becomes a separate character device in /dev/video1394, numbered by its port number (for example /dev/video1394/0 for the first card). When you set up your cameras with one of the dc1394_dma_setup_...capture() functions, it will by default open the device corresponding to the port argument. If you need to, you can specify a device name explicitly in the dma_device_file member of the dc1394_cameracapture structure; otherwise you can set it to the null pointer for the default behaviour.

Will libdc1394 work with my version of the kernel?
Kernel version 2.4.21 or later is strongly recommended. IEEE 1394 support in earlier kernel versions was incomplete and buggy. You are unlikely to get help from the mailing lists if you have an older kernel.

Where can I find more documentation?
Good starting points are the IIDC/DCAM specification itself (available from the 1394 Trade Association) and Damien Douxchamps' IEEE 1394 digital camera pages (damien.douxchamps.net/ieee1394/).
Where can I get the latest libdc1394?
From the libdc1394 project homepage sourceforge.net/projects/libdc1394/.

What else do I need apart from libdc1394?
You need libraw1394 and a kernel with the IEEE 1394 subsystem enabled, including the raw1394 module and, for DMA capture, the video1394 module.
Is there example code to get me started?
Yes: the libdc1394 source distribution includes small example programs (such as grab_gray_image.c) in its examples directory.
Why are there so many different set-up and capture functions?
libdc1394 provides capture functions for:
- Formats 0 to 5 (fixed image sizes) and Format 7 (scalable image size);
- direct transfer and DMA transfer (through the video1394 kernel module);
- single cameras and multiple cameras;
- blocking and polling behaviour (DMA only).

These combinations give rise to four set-up functions and six capture functions:
Set-up: dc1394_setup_capture(), dc1394_dma_setup_capture(), dc1394_setup_format7_capture(), dc1394_dma_setup_format7_capture().
Capture: dc1394_single_capture(), dc1394_multi_capture(), dc1394_dma_single_capture(), dc1394_dma_multi_capture(), dc1394_dma_single_capture_poll(), dc1394_dma_multi_capture_poll().
The table shows which capture functions go with which set-up functions: do not mix DMA and non-DMA functions.

    Set-up function                        Compatible capture functions
    dc1394_setup_capture(),                dc1394_single_capture(),
    dc1394_setup_format7_capture()         dc1394_multi_capture()
    dc1394_dma_setup_capture(),            dc1394_dma_single_capture(),
    dc1394_dma_setup_format7_capture()     dc1394_dma_multi_capture(),
                                           dc1394_dma_single_capture_poll(),
                                           dc1394_dma_multi_capture_poll()

How does the DMA ring buffer work?
This FAQ mostly assumes that you are using DMA to transfer frames from the host card to the frame buffers. DMA frame transfer is provided by the video1394 kernel module (and deeper levels of the Linux 1394 subsystem). Once the camera is set up (with dc1394_dma_setup_capture() or dc1394_dma_setup_format7_capture()), a contiguous block of memory-mapped frame buffers is waiting to be filled. It is organized as a ring buffer with each buffer internally set to a QUEUED state. Filling of the first buffer can start as soon as you start frame transmission with dc1394_start_iso_transmission(). Each buffer is set to the READY state as soon as it is filled. Frame transmission continues until you call dc1394_stop_iso_transmission(). If all of the buffers are filled then any further frames are thrown away (buffer overflow) until you stop transmission or make space by calling some capture functions.

The first time you call a DMA capture function (see the list above) it returns a capture_buffer pointer to the first frame buffer (in the dc1394_cameracapture structure). Whether the function waits for a frame or returns immediately depends on whether the capturing is blocking or polling (see below). Each subsequent time you call a capture function it returns a pointer to the next frame in the DMA ring buffer (unless a polling call fails, see below). The accessed frame buffer is internally set to a FREE state.

After a successful capture function call, the capture_buffer pointer and the frame buffer it points to are available to you for reading and writing. It will not be overwritten with a newer frame while it is allocated to you (FREE), even if the ring buffer overflows. Once you have finished with it, and before you make the next capture function call, you must release it with a call to dc1394_dma_done_with_buffer(). This resets the frame buffer to the QUEUED state so that it can again receive a new frame. DMA capture function calls and dc1394_dma_done_with_buffer() calls must alternate strictly. Expect undefined behaviour if you forget to call dc1394_dma_done_with_buffer().

What is polling (non-blocking) capturing?
Polling applies only to the DMA capture functions (the ones with "_dma_" in their names). The polling capture functions (also called non-blocking capture functions) do not wait for frames if the DMA ring buffer is empty. They return immediately, either with the first frame waiting in the DMA ring buffer or with a special return value of DC1394_NO_FRAME (instead of DC1394_SUCCESS). The polling capture functions are dc1394_dma_single_capture_poll() and dc1394_dma_multi_capture_poll().

What is non-polling (blocking) capturing?
Blocking, like polling, applies only to the DMA capture functions (the ones with "_dma_" in their names). The non-polling capture functions (also called blocking capture functions) always return a valid pointer to a frame (barring some other error). If a frame is not yet available (because the camera is waiting for an external trigger, or the frame rate is slow, or the latest frame is still being transmitted from the camera to the computer) then the blocking capture functions wait (block) until a frame is ready. The blocking capture functions are dc1394_dma_single_capture() and dc1394_dma_multi_capture().
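Putting the ring-buffer rules together, here is a minimal sketch of a blocking capture loop. This is not from the original FAQ: it assumes a raw1394 handle "handle" and a dc1394_cameracapture structure already initialized with dc1394_dma_setup_capture(), and a hypothetical process_frame() defined elsewhere.

#include <libdc1394/dc1394_control.h>

extern void process_frame(int *frame);   /* hypothetical user function */

/* Grab and process n frames, strictly alternating capture and release */
void grab_frames(raw1394handle_t handle, dc1394_cameracapture *camera, int n)
{
    int i;
    dc1394_start_iso_transmission(handle, camera->node);
    for (i = 0; i < n; ++i) {
        if (dc1394_dma_single_capture(camera) != DC1394_SUCCESS)
            break;
        /* camera->capture_buffer points at the frame until it is released */
        process_frame(camera->capture_buffer);
        dc1394_dma_done_with_buffer(camera);    /* buffer back to QUEUED */
    }
    dc1394_stop_iso_transmission(handle, camera->node);
}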
When should I use polling capture functions?
Polling (non-blocking) capture functions are useful if you need to check whether a frame is available in the DMA ring buffer but you don't want to wait if there is none. A typical example is flushing (capturing and discarding) frames from the DMA ring buffer until it is empty, as in the sketch after this section.

When should I use blocking capture functions?
Blocking (non-polling) capture functions are the ones to use for most applications. They free your program from worrying about whether a frame is available and automatically synchronize your processing with the availability of frames. Your computer is not slowed down while a capture function blocks on a frame: other processes can continue while the capture process sleeps.

Why is DMA capture always one or more frames behind?
It depends what you mean by "behind". Somewhere in your initialization sequence you would call dc1394_start_iso_transmission(). This causes the camera to start spewing out images until you tell it to stop with dc1394_stop_iso_transmission(). The capture functions can't start or stop the camera. The expected behaviour is for the first frame to be current and the next frame to be one frame time later, no matter when you actually call the capture function. The word "capture" in the names of the capture functions is misleading: the frame would have been captured by the camera at some earlier time. All the "capture" functions do is request a pointer to the next waiting frame in the DMA ring buffer.

What happens when the DMA ring buffer overflows?
If the camera is running (ISO transmission is on) and you request a frame after a snooze, the DMA buffer is probably overflowing and throwing away the latest frames arriving from the camera. The frames you get from the capture functions will always be in chronological order but they may be old and/or irregularly spaced in time. Buffered (READY) frames in the DMA ring buffer can never be overwritten by newer frames.

What does drop_frames do?
Setting the drop_frames member of the dc1394_cameracapture structure to a non-zero value causes the DMA capture functions to throw away all frames buffered in the DMA ring buffer except the latest one, which is returned to you. But, and this is a big but, you will only get the newest, freshest frame if the DMA ring buffer has not filled up completely. See "What happens when the DMA ring buffer overflows?" and "Why don't I get the latest frame even with drop frames set?".
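The flushing example mentioned under "When should I use polling capture functions?" can be written as a two-line loop. A minimal sketch, assuming a dc1394_cameracapture structure "camera" set up for DMA capture as in the earlier sketch:

/* Capture and discard every frame queued in the DMA ring buffer */
while (dc1394_dma_single_capture_poll(&camera) == DC1394_SUCCESS)
    dc1394_dma_done_with_buffer(&camera);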
How can I make absolutely sure I am getting the latest frame?
Set drop_frames to a non-zero value and make sure the ring buffer never overflows: if you need the very latest available frame every time you call a capture function, you must allocate a large enough number of DMA buffers and never allow them to fill up completely. See "What happens when the DMA ring buffer overflows?" and "Why don't I get the latest frame even with drop frames set?". As always, DMA capture function calls and dc1394_dma_done_with_buffer() calls must alternate strictly.

Why don't I get the latest frame even with drop frames set?
The DMA ring buffer may be overflowing because you have not requested enough buffers or you are not calling capture functions often enough.

How do I capture just one fresh frame from the camera?
Use your camera's one-shot function: stop the camera with dc1394_stop_iso_transmission(); flush unwanted frames from the DMA ring buffer if necessary; then call dc1394_set_one_shot() and a non-polling capture function every time you need a new frame. Things to watch out for: the one-shot register is ignored if ISO transmission is on; the frame will take at least one frame period (as set in the camera) to be transmitted to your computer; and remember to call dc1394_dma_done_with_buffer() once for every successful call to a DMA capture function.

Don't I need to unset one-shot after using it?
No. The one-shot register in the camera self-clears after the camera has transmitted the frame to the computer. The only purpose of the dc1394_unset_one_shot() function is to ensure that the camera does not transmit a spurious frame after it is stopped with dc1394_stop_iso_transmission().

How do I know if my camera supports the one-shot function?
Check the One_Shot_Inq bit of the camera's BASIC_FUNC_INQ register, which can be read with dc1394_query_basic_functionality(); see the IIDC specification for the register layout.
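The one-shot recipe above, as a sketch. Again this is not from the original FAQ and assumes the "handle" and DMA-capture "camera" variables of the earlier sketches:

/* Grab exactly one fresh frame with the one-shot function */
dc1394_stop_iso_transmission(handle, camera.node);

/* Flush stale frames, as in the flushing sketch above */
while (dc1394_dma_single_capture_poll(&camera) == DC1394_SUCCESS)
    dc1394_dma_done_with_buffer(&camera);

dc1394_set_one_shot(handle, camera.node);
if (dc1394_dma_single_capture(&camera) == DC1394_SUCCESS) {
    /* use camera.capture_buffer here */
    dc1394_dma_done_with_buffer(&camera);
}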
How do I set the frame rate?
In Formats 0 to 5, frame rate is set with dc1394_set_video_framerate(). Its frame_rate argument is an index to the actual frame rate: use the FRAMERATE_... enumeration in dc1394_control.h. In Format 7, frame rate is set (believe it or not) by setting the IEEE 1394 packet size; dc1394_set_video_framerate() will not work. The packet size is one of the arguments ("bytes_per_packet") supplied to the Format 7 set-up functions dc1394_setup_format7_capture() and dc1394_dma_setup_format7_capture(). See "How can I work out the packet size for a wanted frame rate?". One consequence of this design is that it is not possible to change the frame rate in Format 7 without releasing the camera (dc1394_...release_camera()) and setting it up again (dc1394_...setup_format7_capture()). This is a limitation of libdc1394, not of the IIDC DCAM specification, and it is quite computationally expensive, especially when using DMA transfer.

In any Format, make sure that your integration time (shutter speed) is shorter (faster) than your frame period, otherwise frames will not be ready for transmission fast enough or your camera may misbehave.

Unfortunately, the term "frame rate" in IEEE 1394 transmission is misleading. It should really be called "transfer rate", because it is mostly determined by the IEEE 1394 packet size, not the rate at which frames arrive in the computer. Since exactly one packet is transmitted per camera per bus cycle, the transfer rate is determined by the number of bytes in each packet. The transfer rate is meaningful even if the camera is using an external trigger or one-shot mode: it then determines the time it takes to transmit each frame. In this case you should make sure that the "frame rate" is faster than the fastest expected external trigger frequency or one-shot requests, otherwise the camera may produce frames faster than they are transmitted.

Why is my measured frame rate slower than the one I set?
Your camera is probably sending frames slower than you expect. A common cause is a shutter speed (integration time) slower than the reciprocal of the frame rate. If the camera is set up to produce frames faster than the transfer rate (see "How do I set the frame rate?"), then the frame rate will be equal to the transfer rate. If the camera produces frames slower than the transfer rate, the frame rate will be determined by the camera, not the transfer rate.

How can I work out the packet size for a wanted frame rate?
This applies only to Format 7 (scalable image size). In the other Formats the frame rate is set by dc1394_set_video_framerate(): see "How do I set the frame rate?". It is a two-step calculation: first work out how many packets are transmitted per frame period, and then work out how big each packet has to be to contain the frame, as in the sketch below.
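A sketch of the two steps, reconstructed from the surrounding text rather than copied from the original FAQ. Here depth is in bytes per pixel, bus_period is in seconds (see the table under the next question), and the variables are assumed to be declared elsewhere:

/* Step 1: packets available per frame period (round down) */
num_packets = (int)(1.0/(frame_rate*bus_period));

/* Step 2: packet size needed to hold a whole frame (round up) */
packet_size = (width*height*depth + num_packets - 1)/num_packets;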
The IIDC DCAM specification imposes some limitations on the allowed values of some of the variables: the packet size must be a non-zero multiple of the unit packet size reported by the camera, and it may not exceed either the maximum the camera reports or the bus maximum packet size (4096 bytes per packet at 400 Mb/s; see the table below).
The function dc1394_query_total_bytes() returns the needed frame size (width*height*depth) observing these limitations. You may still prefer to work it out yourself, because it has been suggested that different manufacturers interpret the meaning of the underlying camera register differently. The IEEE 1394 bus cycle rate is fixed at 8000 cycles per second (a bus_period of 125 microseconds); what depends on the bus speed is the maximum packet size:

    Bus speed (Mb/s)    Bus period    Max. packet size (bytes)
    100                 125 us        1024
    200                 125 us        2048
    400                 125 us        4096
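As a worked example (not from the original FAQ): a single camera at 400 Mb/s that fills its maximum 4096-byte packet on every one of the 8000 bus cycles per second can transfer at most 4096*8000 = 32768000 bytes, or about 32.8 megabytes of image data, per second.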
Note that "speed" does not equal "bandwidth". If you are interested in working out the bandwidth available for frame transmission from this table, keep in mind that 80% of the cycle period is reserved for isochronous transmission (image frames) and 20% for asynchronous transmission (camera commands). In Format 7 the total number of packets needed per frame (num_packets) is calculated by the camera (for current settings of frame size and pixel depth) and can be read with dc1394_query_format7_packet_per_frame(). This is less useful than it sounds because to use it you first have to set up the camera with a dc1394_...setup_format7_capture() function which needs the packet size as an argument. How can I work out the frame rate from the packet size?This is a two-step calculation: first work out how many packets are required to transmit a frame, and then work out how long the transmission takes:
The IIDC DCAM specification imposes some limitations on the allowed values of some of the variables: see "How can I work out the packet size for a wanted frame rate?".

How do I minimize frame latency?
It depends what you mean. Some systems only need accurate synchronization of past events, in which case it does not matter how long it takes to capture an image as long as you have an accurate timestamp for it. In other systems you need to process the image as quickly as possible, in which case you probably don't care much about the accuracy of the timestamp but want a quick answer. For this discussion we can take latency to mean the time between the trigger instant and the instant the filltime is written into the dc1394_cameracapture structure at the end of transmission. Latency is dominated by:
(1) the integration time (shutter speed);
(2) the frame transmission time; and
(3) operating system latency (the scheduling of your capture process).
For (1), use the shortest practical integration time. For (2), you can get the frame transmission time as low as possible by setting the number of packets per frame to a minimum (by setting the camera's frame rate as fast as possible: see "How do I set the frame rate?"). The fastest possible case has a single camera per IEEE 1394 host, using all the available bandwidth. Of course your computer may not appreciate frames repeatedly shoved down its throat at that fast rate, in which case you can either use an external trigger or one-shot to control the trigger instants, or set drop_frames (and a large DMA buffer) to get only the freshest frame when you are ready to process it. And for (3), typical computer system ticks have a 10 millisecond period. Your frame capture process or thread may be denied processing time for several milliseconds at a time (sometimes much longer when other processes are busy) and there is not much you can do about it. You can use a real-time operating system to get control of the system tick, or you can mess with the configuration of your current OS: the Linux kernel can be configured to use various process prioritization strategies, for example.

Can I measure the actual frame latency?
It is possible to measure latency very accurately if your computer has a spare serial port, if you use DMA transfer, and if your camera has an external trigger input. The latency typically comprises the camera's trigger response and integration time, the sensor read-out and transmission set-up time, the frame transmission time on the bus, and the interrupt handling time before the filltime is written.
First check if your camera's external trigger input is compatible with the signal levels from your computer's serial port. It very likely is (most are opto-isolated) but you would want to minimize any risk to the camera. Connect the serial port data-terminal-ready (DTR) line (pin 4 on D-sub-9 connector, pin 20 on DB-25 connector) and the signal ground line (pin 5 on D-sub-9, pin 7 on DB-25) to your camera's external trigger input. Set the camera to use the external trigger, edge triggered, rising edge:
dc1394_set_trigger_mode(port, node, TRIGGER_MODE_0);
dc1394_set_trigger_on_off(port, node, DC1394_TRUE);
dc1394_set_trigger_polarity(port, node, DC1394_TRUE);

Set up the camera to capture 100 images at a frame rate faster than 10 fps and display their filltimes (from the dc1394_cameracapture structure). Compile and run the following little program to generate 100 trigger pulses on the serial port's DTR line and display their timestamps:
#include <stdio.h>
#include <stdlib.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <time.h>
#include <sys/time.h>

/* Default trigger period in seconds */
#define T 0.1

/* Helper function: set or clear the DTR line */
static void setTrig(const int fd, const int level)
{
    int sts;
    ioctl(fd, TIOCMGET, &sts);
    if (level)  sts |= TIOCM_DTR;
    else        sts &= ~TIOCM_DTR;
    ioctl(fd, TIOCMSET, &sts);
}

int main(int argc, char *argv[])
{
    int fd, i;
    double T2 = T/2, t1, t2;
    struct timespec halfT;
    struct timeval start, finish;

    fd = open("/dev/ttyS0", O_RDWR);    /* serial port */
    if (fd < 0) { perror("open"); return 1; }

    /* Trigger half-period, optionally from the command line */
    if (argc > 1)  T2 = atof(argv[1])/2;
    halfT.tv_sec = T2;                  /* truncates to whole seconds */
    halfT.tv_nsec = 1e9*(T2 - (double)halfT.tv_sec);

    /* Generate 100 triggers */
    setTrig(fd, 0);
    nanosleep(&halfT, 0);
    for (i = 0; i < 100; ++i) {
        gettimeofday(&start, 0);
        setTrig(fd, 1);                 /* trigger: rising edge */
        gettimeofday(&finish, 0);
        t1 = start.tv_sec + 1e-6*start.tv_usec;
        t2 = finish.tv_sec + 1e-6*finish.tv_usec;
        printf("%.6f %.6f\n", (t1+t2)/2, (t2-t1)/2);
        fflush(stdout);
        nanosleep(&halfT, 0);
        setTrig(fd, 0);                 /* second half-period */
        nanosleep(&halfT, 0);
    }
    close(fd);
    return 0;
}

Now you can compare the trigger timestamps to the corresponding filltimes. If you repeat this experiment with different image heights (number of lines) you should be able to work out parameters like the sensor line transfer time (slope) and the transmission set-up time (offset).
Some of these delays are not well documented in the manufacturers' manuals, or are inaccurately reported. It may be better to measure them yourself for your specific camera.

Can I find out exactly when a frame was acquired?
On return from a successful dc1394_dma_...capture() function, the filltime member of the dc1394_cameracapture structure contains the time when the frame's DMA buffer was filled, that is, when frame transmission was completed. On most systems it should be accurate to a few tens of microseconds. If you are interested in the time the camera actually acquired the frame (the timestamp of the trigger or start of integration), you have to work backwards from the filltime by subtracting the delays mentioned in "Can I measure the actual frame latency?". Interfaces like Camwire can be configured to provide the time of the trigger signal as a frame timestamp. Even if you have well-measured latency components, you do not have any control over the timing of the IEEE 1394 bus cycle: the camera has to wait for its isochronous time slot before the frame packets can hop on the bus. The upshot is that the frame timestamp has an inherent uncertainty of at least plus or minus half the IEEE 1394 bus period. Bus periods are listed in the table in "How can I work out the packet size for a wanted frame rate?".
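A sketch of reading the timestamp, assuming a DMA-capture "camera" structure as in the earlier sketches and that filltime is a struct timeval (as it is on Linux):

/* Print the completion time of the next frame */
if (dc1394_dma_single_capture(&camera) == DC1394_SUCCESS) {
    double t = camera.filltime.tv_sec + 1e-6*camera.filltime.tv_usec;
    printf("frame transmission completed at %.6f s\n", t);
    dc1394_dma_done_with_buffer(&camera);
}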
How do I flush the DMA ring buffer?
Call a polling capture function repeatedly, releasing each frame as you go, until it returns DC1394_NO_FRAME (the loop shown in the flushing sketch earlier in this FAQ). Remember also to call dc1394_dma_done_with_buffer() after every successful call to a DMA capture function.

How do I ensure that I don't lose any frames?
Set drop_frames to 0, allocate a large DMA ring buffer (say 50 buffers) when you call the set-up function, and make sure you call capture functions faster than the camera frame rate.

How can I find out if a frame was dropped?
After each DMA capture function call, you can check the value returned in the num_dma_buffers_behind member of the dc1394_cameracapture structure. If drop_frames had been set to zero, you have not dropped any frames and num_dma_buffers_behind is the number of buffers still queued in the DMA ring buffer. If drop_frames was non-zero, num_dma_buffers_behind is the number of frames that were dropped. However, these statements are only true as long as the DMA ring buffer does not overflow. Checking num_dma_buffers_behind after each frame capture can help you keep track of the ring buffer status. If you have reason to believe that frames are transmitted regularly (for example on an external trigger signal) then you could also check the filltime member returned in dc1394_cameracapture for any irregularities in the time series.

Why do I get the error: "VIDEO1394_IOC_LISTEN_CHANNEL failed"?
You might get a message like "(dc1394_capture.c) VIDEO1394_IOC_LISTEN_CHANNEL ioctl failed!" after your program was interrupted for any reason and did not close down video1394 properly. The reason is that video1394 is still listening on that channel and won't re-initialize. The fix is to call ioctl(open(device,O_RDONLY),VIDEO1394_IOC_UNLISTEN_CHANNEL,&channel), where "device" is a char pointer to your video1394 character device (by default "/dev/video1394/0" if your host card is port 0) and "channel" is an int variable initialized to the IEEE 1394 ISO channel previously allocated to your camera (usually numbered from 0 on each host; see "What is the ISO channel variable for?"). You could also try to do the same thing with dc1394_dma_unlisten(), but then you will need the previous raw1394 handle and the contents of the dc1394_cameracapture structure, which may no longer be available.

What is the ISO channel variable for?
As FireWire is a shared bus, each camera must place a unique identifier in each packet so that the software can identify which camera a frame came from. The unique identifier used here is known as the ISO channel, and is just an integer. As well as writing the ISO channel number to the camera, you must also tell the video1394 layer which channel(s) it should listen for packets on. There is no automatic channel allocation, so you must control this manually (future versions of libdc1394 will handle this dynamically). You must first choose a unique channel number for the camera you are retrieving images from. If you are looping around an array of cameras, your loop variable "i" or "i+1" could be a suitable unique ISO channel identifier, or you could use the raw1394 (nodeid_t) node number as the channel number, which allows re-use of channel numbers on separate ports (host adaptors). When you set up the capture (e.g. with dc1394_dma_setup_capture()) you must pass this unique ISO channel number as a parameter. libdc1394 will write this channel number to the camera for you, and will use that number when communicating with video1394.
If you wish to manipulate the channel number manually, you can use the dc1394_video_set_iso_channel_and_speed() and dc1394_video_get_iso_channel_and_speed() functions, remembering that this will only change or report what is set in the camera, not in video1394.

Is it safe to multi-thread with libdc1394?
Yes. (But see "Can I detect another process using my camera?" too.)

Can I detect another process using my camera?
The IIDC DCAM specification (and hence libdc1394) does not provide a way of finding out if a camera is already in use by another process. When another process tries to use your camera (usually by calling one of the libdc1394 capture set-up functions), the camera is typically rendered useless for both processes. However, you may be able to find out if any camera is in use by checking if /dev/raw1394 or /dev/video1394 has been opened, with "/usr/sbin/lsof | grep 1394".

Thanks
A special thanks to Johann Schoonees for maintaining the FAQ until October 2006.
© 2024 Damien Douxchamps. All rights reserved.