Thursday, September 5, 2013

Introducing The Eye Tribe Tracker

It is with great pride that I introduce the Eye Tribe Tracker today. It's the world's smallest remote eye tracker, the first to use USB 3.0, and the only one priced below $100. It's not targeting the research community; instead it breaks new ground by aiming at developers of next-generation gaze interaction applications. I will let the academic crowd determine whether it meets their requirements; I'm too biased to claim that it's better than this tracker or that one. The only way to properly evaluate eye trackers is through standardized evaluation carried out by independent parties.

On a personal level, today marks an important milestone. I built my first gaze interaction software back in 2008, titled Neovisus, as the outcome of my MSc at Lund University. During that work I realized that gaze interaction could be a natural interaction element, not just for a specific user group but for everyone. At the time, eye trackers were unfortunately really hard to come by; the one I used cost $25,000 (and still does). Javier San Agustin and I attempted to fix this with our R&D on the ITU GazeTracker, an open-source eye tracking software. In many ways we succeeded, but it lacked critical features: you had to order components to assemble your own rig, it was difficult to set up, and tracking was far from robust compared to commercial alternatives.

Overall, the ITU GazeTracker was a great learning experience. It evolved to become the most widely distributed open-source eye tracking software and gathered an active community. At the same time, we learned what it would take to build something great: it would require us to focus and make a full-time commitment.

Here we are, two years later. With the launch of a truly affordable eye tracker, we have taken a big step towards realizing the vision we burn for. No longer does a prohibitive barrier prevent developers from exploring the many benefits eye tracking can bring to their applications.

Best of all, this is still the beginning. I can't wait to get this into the hands of all the developers who placed a $99 bet on the future.

Tech specs (preliminary)

Sampling rate: 40 Hz and 60 Hz modes
Accuracy: 0.5° (average)
Spatial resolution: 0.1° (RMS)
Latency: <20 ms at 60 Hz
Calibration: 5, 9, or 12 points
Operating range: 45 cm – 75 cm
Tracking area: 40 cm x 40 cm at 65 cm distance
Screen sizes: up to 24”
API/SDK: C++, C# and Java included
Data output: binocular gaze data
Dimensions (W/H/D): 20 x 1.9 x 1.6 cm (7.9 x 0.75 x 0.66 inches)
Connection: USB 3.0 SuperSpeed
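The included SDKs all talk to a local tracker server that streams binocular gaze data. As a rough sketch of what consuming that data looks like — assuming a JSON-over-TCP protocol on localhost port 6555, as the shipped server exposed it; the exact message fields here are illustrative, not authoritative:

```python
import json

TRACKER_HOST = "localhost"  # assumed: the tracker server runs locally
TRACKER_PORT = 6555         # assumed default port of the tracker server


def make_frame_request():
    """Build a JSON request asking the tracker for the latest gaze frame."""
    return json.dumps({
        "category": "tracker",
        "request": "get",
        "values": ["frame"],
    })


def parse_gaze(reply_text):
    """Extract the averaged binocular gaze point (pixels) from a reply."""
    reply = json.loads(reply_text)
    frame = reply["values"]["frame"]
    avg = frame["avg"]  # (x, y) averaged over both eyes
    return avg["x"], avg["y"]


if __name__ == "__main__":
    # Actually connecting requires a running tracker server, e.g.:
    #   import socket
    #   sock = socket.create_connection((TRACKER_HOST, TRACKER_PORT))
    #   sock.sendall(make_frame_request().encode())
    # Here we only parse a reply of the assumed shape:
    sample = json.dumps({
        "category": "tracker",
        "statuscode": 200,
        "values": {"frame": {"avg": {"x": 512.3, "y": 384.7}}},
    })
    x, y = parse_gaze(sample)
    print(f"gaze at ({x}, {y})")
```

In the real SDKs you would of course subscribe to a pushed frame stream rather than poll for single frames, but the data shape is the same.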

Tuesday, July 30, 2013

Duke University eye tracking Peacocks

  • "Through their eyes: selective attention in peahens during courtship," Jessica Yorzinski, Gail Patricelli, Jason Babcock, John Pearson, Michael Platt. Journal of Experimental Biology, July 24, 2013.

Monday, March 18, 2013

ITU 2xPhD positions: Eye Tracking for mobile devices

From the IT University of Copenhagen (GazeGroup and EyeInfo) comes an offer of two fully funded PhD positions on the topic of eye tracking for mobile devices. This is an excellent opportunity to work with domain experts on the development of next-generation eye tracking systems.
The aims of the project are to develop eye tracking algorithms and hardware for mobile devices, to apply eye movement signals to games, toys, device interaction, and augmented reality, and to combine these signals with existing sensors in mobile devices, such as GPS, gyroscope, and accelerometer, and/or with brain activity measured by EEG (electroencephalography).
We are seeking two excellent PhD students to do research within one of two areas:
1) Development and testing of new, robust eye tracking and gaze estimation algorithms, and optimization of these algorithms for low power consumption.
2) Exploration of novel ways of applying gaze interaction on smartphones, tablets and smartglasses.
The project requires a willingness to cooperate closely with the industrial partners involved in this project, i.e. The Eye Tribe, LEGO System A/S and Serious Games Interactive.
The ideal candidate will have a strong background in both computer science (especially computer vision) and interaction design (with an experimental approach), or excellent competence in at least one of them.
Contact person: John Paulin Hansen
Research Group: PitLab

Wednesday, January 2, 2013