Eye trackers monitor either the point of gaze (where one is looking) or the motion of the eyes relative to the head. These devices are used in research on the visual system, in psychology, in cognitive linguistics, and in product design.
Because of their increased sophistication and accessibility, eye-tracking technologies have generated a great deal of interest in the commercial sector in recent years. Applications include web usability, advertising, marketing, video games, and automotive safety systems (anything from fatigue detection to letting the car know if you've seen a threat it has detected).
I was just talking with the folks at EyeTech Digital Systems. They told me that traditional systems use a camera that sends image data to the host PC, which performs all the image processing, number crunching, and eye gaze analysis. This usually works pretty well, but there are some limitations. Not all computers are created equal; some respond faster than others, and maximum transfer speeds over USB can vary. Also, the intense image processing required for eye tracking can be an excessive burden for computers with low-power processors, leaving little power for other user applications.
The solution is the Zynq-7000 All Programmable SoC from Xilinx, this site's sponsor. This little beauty (which we've discussed before) has allowed EyeTech Digital to create an eye-tracking module that overcomes these problems. The Zynq's programmable logic (PL) provides the parallel processing capabilities needed to filter and process images as fast as the sensor can supply them. The hard dual-core ARM Cortex-A9 processing system (PS) can then take the processed images and calculate gaze information with ease.
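The post doesn't describe EyeTech's actual algorithm, but to give a flavor of the PS side's job: a common approach in eye tracking is pupil-center corneal-reflection (PCCR), where the vector between the detected pupil center and the corneal glint is mapped to screen coordinates using coefficients found during a calibration step. Here's a minimal sketch of that idea (the function, the affine model, and all the numbers are illustrative assumptions, not EyeTech's implementation):

```python
# Hypothetical sketch of a PS-side gaze calculation using the common
# pupil-center corneal-reflection (PCCR) method. In this architecture the
# PL would deliver filtered images (or extracted pupil/glint centers),
# and the PS would map the pupil-glint vector to a point on the screen.

def gaze_point(pupil, glint, coeffs):
    """Map the pupil-glint vector (dx, dy) to screen (x, y) with a
    simple affine calibration model: x = a0 + a1*dx + a2*dy (likewise y)."""
    dx = pupil[0] - glint[0]
    dy = pupil[1] - glint[1]
    ax, ay = coeffs  # (a0, a1, a2) per axis, obtained from calibration
    x = ax[0] + ax[1] * dx + ax[2] * dy
    y = ay[0] + ay[1] * dx + ay[2] * dy
    return x, y

# Illustrative calibration coefficients for a 1920x1080 screen:
coeffs = ((960, 40.0, 0.0), (540, 0.0, 40.0))
print(gaze_point(pupil=(312, 240), glint=(310, 242), coeffs=coeffs))
# → (1040.0, 460.0)
```

Real systems use higher-order polynomials and multiple calibration points, but the division of labor is the point here: the pixel-level work of finding the pupil and glint is the parallel-friendly part that belongs in the PL, while this lightweight geometry is a natural fit for the ARM cores.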
The EyeTech Digital SmartTracker module (shown with a quarter for size comparison).
Having the PL and the PS on the same chip (and allowing them to access the same memory) greatly eases development. The system also uses many of the hard peripherals available on the Zynq, such as USB, I2C, GPIO, and the XADC.
EyeTech Digital says performing all the processing on the Zynq can minimize the amount of data that needs to be transferred to the host computer. Also, because the system is not performing intense processing on the host side, the company can expand the range of host devices that can be augmented with eye-tracking capabilities, including tablets and other mobile devices.
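A quick back-of-the-envelope calculation shows why on-module processing opens the door to tablets and other modest hosts. If the host only receives gaze records instead of raw video, the USB traffic drops by several orders of magnitude (sensor resolution, frame rate, and record size below are illustrative assumptions, not EyeTech specifications):

```python
# Back-of-envelope: data crossing USB with and without on-module processing.
# All numbers are illustrative assumptions, not EyeTech specifications.
width, height, bytes_per_pixel = 640, 480, 1   # assumed grayscale sensor
fps = 60

# Traditional approach: ship every raw frame to the host for processing.
raw_bytes_per_sec = width * height * bytes_per_pixel * fps

# On-module approach: ship only a small gaze record per frame,
# e.g. a timestamp plus (x, y) coordinates.
gaze_record_bytes = 16
gaze_bytes_per_sec = gaze_record_bytes * fps

print(raw_bytes_per_sec)   # → 18432000  (~18.4 MB/s)
print(gaze_bytes_per_sec)  # → 960       (under 1 KB/s)
```

Under these assumptions the raw stream needs roughly 18 MB/s of sustained USB bandwidth plus heavy host-side image processing, while the processed stream is trivial for even a low-power mobile device to consume.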
I can think of all sorts of applications for this sort of technology. Just off the top of my head, for example, I'm thinking about Duane Benson's robot avatar project. When you are interacting with a person, that person responds to the things you do, like glancing at something or squeezing your eyelids closed if you are in pain. (I do this a lot when my wife is lecturing me about something.) Equipping this robot with an eye-tracking capability would make its interactions and responses much more lifelike.
Can you think of any interesting applications for this sort of technology?