When Machine Vision meets the NFL

Where were you Sunday evening? As a football fan, I was watching the Dallas Cowboys’ season opener against the New York Giants on NBC Sunday Night Football. This was a defining moment for Machine Vision: the first time that Replay Technologies’ “FreeD” was deployed on NFL turf to display “Matrix-like” 3D renderings of the action. And it happens to be powered by Teledyne DALSA technology! I’m not used to seeing our products showcased during prime time on a major US network, hence my enthusiasm for this particular game (granted, I always enjoy the performance of the Cowboys cheerleaders, who wouldn’t?).

Machine Vision (MV) has always been about high-bandwidth, high-resolution image acquisition. What’s interesting now is how other markets are starting to use MV technology to create systems that Star Trek fans (count me in, Mr. Spock!) have been dreaming of. And you can expect this trend to grow as the wild folks from Hollywood and elsewhere find new ways to innovate with high-frame-rate cameras. That’s the difference between artists and engineers, I guess!

How does this work? Well, the folks at Replay Technologies are certainly smarter than I am. They combine the images from twelve Falcon2 12-megapixel CMOS color cameras, each connected to an Xcelera frame grabber, to create a 3D rendering of the live action. This lets viewers watch replays from angles that cannot be obtained with traditional TV broadcast systems. Exactly like in a video game. And you thought stereo vision was complex! Using their algorithm, they can interpolate images when moving from one camera to the next, providing views that, until last Sunday, could only be seen on the holodeck. :-)
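
To give a flavor of the interpolation idea, and nothing more, here is a minimal sketch of a “virtual viewpoint” sweep in Python with OpenCV. It is emphatically not Replay Technologies’ algorithm: it just cross-fades between two neighboring camera frames, and the frame sizes, file names and step count are all made up for illustration.

```python
# A minimal sketch, not Replay Technologies' algorithm: a crude "virtual camera"
# sweep that simply cross-fades between two neighboring camera frames.
# The frames below are synthetic placeholders; in a real rig they would be
# images grabbed from two adjacent cameras in the ring.
import cv2
import numpy as np

def interpolate_views(img_a, img_b, t):
    """Blend two neighboring views; t=0.0 returns camera A, t=1.0 camera B."""
    return cv2.addWeighted(img_a, 1.0 - t, img_b, t, 0.0)

# Placeholder frames standing in for two adjacent cameras
# (scaled down here to keep the example light).
cam_a = np.zeros((1080, 1920, 3), dtype=np.uint8)      # "camera A" frame
cam_b = np.full((1080, 1920, 3), 200, dtype=np.uint8)  # "camera B" frame

# Sweep the virtual viewpoint from camera A to camera B in ten steps.
for i in range(11):
    frame = interpolate_views(cam_a, cam_b, i / 10.0)
    cv2.imwrite(f"virtual_view_{i:02d}.png", frame)
```

A plain cross-fade only hints at the effect; the real system builds a 3D rendering from all twelve views, which is what lets the viewpoint fly anywhere around the play.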

For the record, this started with baseball – at Yankee Stadium. And now, who knows where this could lead? The Machine Vision world continues to deliver innovations that are leveraged by other markets: security, traffic surveillance, entertainment. These are new opportunities for all of us: when adjacent markets benefit from our work, there is a larger pie to share. And I love sharing, as long as there is more for me! Where will MV take us next? Who knows? I have ideas, but I am sure you have a few of your own.

Time’s up. Beam me up, Scotty!

About Eric

Eric is in charge of R&D activities at the Montreal office of Teledyne DALSA where he is surrounded by talented people working on the technologies of tomorrow. Chair of the GigE Vision committee, he enjoys reading and writing machine vision standards, especially the thicker ones.

4 Responses to "When Machine Vision meets the NFL"

  1. Jordi Mendoza says:

    I am a little lost on that technology application.
    Camera Link cameras only allow about 6 m from camera to frame grabber, and you say you installed 24 cameras. Does that mean 24 computers next to the cameras?
    Plus a central server to collect them all?
    I’ve read that the Xcelera is a PCIe x8 card, and no embedded computers have that kind of expansion slot. Am I missing something there?

    • Eric says:

      Without going into the specifics of this particular implementation, you are correct that Camera Link has limited cable length. On top of that, there are only so many PCIe x8 slots in a given PC. So you do need multiple PCs to deal with such a large-scale application. This is why it is so cool! (A rough back-of-the-envelope sketch of that math follows the comments below.)

  2. Grant Gerke, Contributing Writer says:

    Eric,
    Very interesting application; I saw it too on Sunday night. I’m a contributing writer for Automation World magazine and was wondering if there are examples of this technology being used in an industrial or factory application? Or what could possibly be done with this application in the future?

    • Eric says:

      The application itself was developed by Replay Technologies using high-performance machine vision components. Could something like this be used on the factory floor? Well, with today’s technology, the rendering from all those cameras takes some time, so we would be far from real time in an inspection application. But as technology improves, who knows? Perhaps one day we will simply sweep a mouse around to visualize and zoom in on a specific area. Obviously, we would be talking about something quite large to inspect!
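
Incidentally, the exchange with Jordi above lends itself to a quick back-of-the-envelope calculation. The sketch below is not the actual FreeD topology; the slot and grabber counts are pure assumptions, meant only to show why cable-length and slot limits push a rig of this size toward several capture PCs plus a machine to gather the streams.

```python
# A back-of-the-envelope sketch of the capture-topology question raised in the
# comments. All numbers are assumptions for illustration, not the real setup.
import math

num_cameras = 12          # the ring of Falcon2 cameras described in the post
usable_slots_per_pc = 2   # assumed free PCIe x8 slots in each capture PC
cameras_per_grabber = 1   # assumed one Camera Link camera per Xcelera board

cameras_per_pc = usable_slots_per_pc * cameras_per_grabber
capture_pcs = math.ceil(num_cameras / cameras_per_pc)

print(f"Roughly {capture_pcs} capture PCs, each placed within Camera Link "
      f"cable reach of its cameras, plus a machine to aggregate the streams.")
```

Change the assumed numbers and the count changes, but the conclusion in the reply above stands: cable-length and slot limits make a rig this size a distributed, multi-PC affair.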
