Can GigE Vision Lose Frames?

I recently read an interesting post by B. Grey entitled “Can GigE be trusted?”. The central theme of the discussion is the reliability of GigE for transferring images. I thought I’d add my 2 cents here.

First, why can’t we find any information about GigE cameras dropping frames? I see two possibilities: either MV companies want to hide what must be a devastating problem, or, more simply, it’s not an issue. Since I don’t subscribe to conspiracy theories, let’s walk through some catastrophic scenarios where a GigE Vision camera might lose a frame.

It all starts with image acquisition. An easy way to lose a frame, of course, is not to capture one in the first place. Steve Maves correctly points out where this in fact happens: over-triggering the camera. The way a camera deals with too many triggers is manufacturer-specific: an extra trigger can simply be ignored (with or without an error message) or it can be buffered to be executed as soon as possible after the current frame. Unfortunately, you need to read the fine print in the camera user manual to learn how it is implemented, since this is not part of the standard.
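
To get a feel for where over-triggering starts, here is a minimal back-of-the-envelope sketch in Python. The exposure, readout and transfer figures are illustrative assumptions, not values from any particular camera; check your own camera's manual for which phases can overlap.

    # Back-of-the-envelope check for over-triggering (illustrative numbers only).
    exposure_s = 0.005                               # 5 ms exposure
    readout_s = 0.010                                # 10 ms sensor readout
    transfer_s = (1936 * 1216 * 1) / (125e6 * 0.95)  # ~2.3 MB mono8 frame over ~95% of 1 Gbps

    # Assuming exposure cannot overlap with readout/transfer; many cameras do
    # overlap these phases, in which case the limit is the slowest single phase.
    min_period_s = exposure_s + max(readout_s, transfer_s)
    max_trigger_hz = 1.0 / min_period_s
    print(f"Maximum safe trigger rate: {max_trigger_hz:.1f} Hz")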

So let’s assume the image was captured by the camera: what else can go wrong? There are a few situations where a camera might be forced to drop a frame before transmission on the gigabit link.

  • First, if a link is too slow to support the acquisition bandwidth (let’s say you connected your GigE camera to a 100 Mbps Fast Ethernet port), then the camera will likely need to drop a few frames to throttle the acquisition bandwidth down to the available transmission bandwidth. A similar situation can happen if too many cameras are connected, through an Ethernet switch, to a single Ethernet port of the NIC (a quick link-budget sanity check follows this list). These are system design issues.
  • There is one more situation where a GigE camera might not send all the captured images to the PC. This has to do with a race condition between the trigger and the stop of an acquisition. Let’s assume that the camera has some on-board buffers (most GigE cameras do). The system starts a triggered acquisition and the camera starts an exposure each time it receives a trigger. But what happens if the PC sends an early StopAcquisition command after a valid trigger sequence has been initiated, but before the image was transferred? Well, once again, it depends on the camera vendor’s implementation. Some vendors might decide to stop any pending transfers to the PC, and the image being exposed will be trashed. In this case, the number of images received at the PC will not match the number of triggers. But again, this is a special case, and a well-designed application would deal with such a scenario in a more elegant manner.
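
Here is the link-budget sanity check mentioned in the first bullet, as a small Python sketch. The camera count, resolution and frame rate are illustrative assumptions.

    # Rough link-budget check: will N cameras sharing one switch uplink
    # oversubscribe the NIC port? All numbers below are assumptions.
    link_bps = 1e9 * 0.95            # usable GigE payload, ~95% after protocol overhead
    frame_bytes = 1280 * 1024 * 1    # 1.3 MP, 8-bit mono
    fps = 30
    cameras = 4

    required_bps = cameras * frame_bytes * 8 * fps
    print(f"Required: {required_bps/1e6:.0f} Mbps vs available: {link_bps/1e6:.0f} Mbps")
    if required_bps > link_bps:
        print("Oversubscribed: expect dropped frames unless frame rate, resolution or camera count is reduced")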

At this point, we can consider that all captured images have been transmitted by the camera. So what else can go wrong? We all know Windows and Linux are not real-time operating systems. This is why you should not expect real-time acquisition without using GigE Vision software designed to minimize CPU usage. A good counter-example is using the socket API (hence the Windows communication stack) to capture GigE images in real time. It might work up to the point where either the transmission bandwidth consumes too much CPU or other pieces of software running on the PC block the Windows thread responsible for image acquisition. This is why GigE software vendors have invested in creating so-called High Performance and Filter drivers. These special drivers run at the kernel level and thus have higher priority than most of the other threads scheduled by the operating system. This leaves them plenty of time to process incoming packets and copy them to the image buffers. But even such optimized drivers might be limited by other ill-behaved kernel drivers that consume CPU bandwidth for too long and prevent them from running for extended periods of time (remember we’re talking about milliseconds here). In these theoretical situations, the optimized driver risks losing precious packets and might not even have the time to ask for packet resends. But I have yet to see such a situation.
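
To illustrate why the plain socket API is the fragile path, here is a minimal user-space UDP receive loop in Python. It is nowhere near a real GVSP implementation; the port number and buffer sizes are assumptions for illustration only.

    # Minimal sketch of a user-space UDP receiver. Every packet crosses the
    # kernel/user boundary and the receiving thread competes with everything
    # else the operating system schedules, which is exactly where drops come from.
    import socket

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # A large kernel receive buffer absorbs short scheduling hiccups, but it only
    # postpones the problem if the receiving thread is starved for too long.
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, 8 * 1024 * 1024)
    sock.bind(("0.0.0.0", 50010))    # assumed streaming port

    received = 0
    while received < 10000:
        packet, _ = sock.recvfrom(9000)   # jumbo-frame-sized reads
        received += 1
        # Real GigE Vision software would parse the GVSP header here, detect
        # gaps in packet IDs and request resends; a filter driver does this in
        # kernel context and is far less likely to miss its window.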

Even if the GigE driver was able to receive all images and put them into the image buffers, the final weak point is the application software not being able to process that data fast enough. The number of host buffers is limited, and at some point none might be available to accommodate the next image. In that situation, the GigE driver will have to discard an image (by the way, the same would be true for a frame grabber based application). Again, this should be taken care of in the system design.
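
The host-side scenario boils down to a fixed buffer pool that the application must recycle fast enough. Here is a rough Python sketch of that mechanism; the names and the pool size are illustrative, not any particular SDK's API.

    # Sketch of a host buffer pool: the driver grabs a free buffer per incoming
    # frame, and the application must return buffers once processing is done.
    from collections import deque

    NUM_BUFFERS = 4
    free_buffers = deque(range(NUM_BUFFERS))   # buffers currently available to the driver
    dropped = 0

    def on_frame_from_driver(frame_id):
        """Called for every complete frame the driver receives."""
        global dropped
        if not free_buffers:
            dropped += 1          # no buffer left: the driver has to discard this image
            return None
        return free_buffers.popleft()

    def application_done_with(buffer_index):
        """The application hands the buffer back after processing."""
        free_buffers.append(buffer_index)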

I am sure there are a few other scenarios where an image could be dropped. My experience tells me there is always a way to get around the problem in a well designed MV system. This explains why you cannot find articles discussing GigE dropping frames. Well, until today that is!

Cheers!

About Eric

Eric is in charge of R&D activities at the Montreal office of Teledyne DALSA where he is surrounded by talented people working on the technologies of tomorrow. Chair of the GigE Vision committee, he enjoys reading and writing machine vision standards, especially the thicker ones.

One Response to "Can GigE Vision Lose Frames?"

  1. Vincent Rowley says:

    Dear Eric,

    Thanks for your post.

    There is one topic that I think we should discuss, and that is the loss of packets and frames that can occur in harsh environments. Given that GigE Vision is often used in industrial environments, this topic deserves some discussion. For instance, large EMI sources such as motors can cause images to be lost between the GigE Vision camera and the receiving PC. Likewise, external disturbances such as shock and vibration can cause frame loss in an in-vehicle military vision system.

    Even though these issues can rightly be considered system issues, the integrator needs to be aware of them. Some will use fiber optics at the back of their GigE Vision devices to increase immunity to noise. However, the GigE Vision packet resend feature can enable one to recover the frames lost during these periods of external disturbance, which are generally bursty and random in nature. That is, provided that your GigE Vision device has enough on-board memory to resend the lost data. Knowledge of your system will be paramount in helping you determine how much on-board memory your GigE Vision devices require if you cannot afford to lose frames at the system level.
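
    As a rough way to size that on-board memory, here is a small Python sketch; the burst duration, stream rate and safety margin are illustrative assumptions only.

        # Rough sizing of camera on-board memory needed to ride out a burst of
        # interference, assuming the resend mechanism must be able to retransmit
        # everything sent during the disturbance.
        burst_s = 0.050          # assumed worst-case disturbance: 50 ms
        stream_bps = 800e6       # sustained stream rate of the camera
        safety_factor = 2.0      # margin for back-to-back bursts and resend latency

        memory_bytes = burst_s * stream_bps / 8 * safety_factor
        print(f"On-board buffer needed: about {memory_bytes/1e6:.0f} MB")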

    The last topic worth discussing is packet reordering.

    The GigE Vision streaming protocol is based on UDP, which does not guarantee ordered data delivery. In small networks, packet reordering is very unlikely to occur. In large commercial Ethernet systems, packet reordering is more likely to occur than packet loss.

    If you want to deploy GigE Vision in large Ethernet networks, then you should make sure that the SW you use can adequately deal with packet reordering. Some GigE Vision SW packages cannot. Such packages will generally issue packet resend requests as soon as packets arrive out of order, which can lead to an avalanche of resend requests in some situations and hurt network bandwidth utilization.
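
    A receiver that tolerates mild reordering might look roughly like this Python sketch: it waits a short grace period for a late packet before declaring it lost and requesting a resend. The names and the timeout value are illustrative, not taken from the GVSP specification.

        import time

        PENDING_TIMEOUT_S = 0.002   # grace period before a missing packet is declared lost
        pending = {}                # packet_id -> time the gap was first observed

        def handle_packet(packet_id, next_expected_id):
            """Returns the list of packet IDs that should now be requested for resend."""
            now = time.monotonic()
            if packet_id > next_expected_id:
                # A gap in the sequence: remember the missing IDs instead of
                # requesting a resend immediately (they may just be reordered).
                for missing in range(next_expected_id, packet_id):
                    pending.setdefault(missing, now)
            pending.pop(packet_id, None)   # a late arrival fills its gap
            # Only packets still missing after the grace period trigger a resend request.
            return [pid for pid, t in pending.items() if now - t > PENDING_TIMEOUT_S]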

    I would also be interested to hear other perspectives on the topic.

    Vincent Rowley
