Thursday, September 5, 2013

Introducing The Eye Tribe Tracker

It's with great pride that I introduce the Eye Tribe Tracker today. It's the world's smallest remote eye tracker, the first to use USB 3.0 and the only one priced below $100. It's not targeted at the research community; instead it breaks new ground by aiming at developers of next-generation gaze interaction applications. I'll let the academic crowd determine whether it meets their requirements. I'm too biased to claim that it's better than this or that tracker; the only way to properly evaluate eye trackers is through standardized evaluations carried out by independent parties.


On a personal level, today marks an important milestone. I built my first gaze interaction software, titled Neovisus, back in 2008 as the outcome of my MSc at Lund University. During this work I realized that gaze interaction could be a natural interaction element, not just for a specific user group but for everyone. At the time, eye trackers were unfortunately really hard to come by; the one I used cost $25,000 (and still does). Javier San Agustin and I attempted to fix this during our R&D of the ITU GazeTracker, an open source eye tracking software. In many ways we succeeded, but it lacked critical features: you had to order components to assemble your own rig, it was difficult to set up, and tracking was far from robust compared to commercial alternatives.

Overall, the ITU GazeTracker was a great learning experience. It evolved to become the most widely distributed open source eye tracking software and gathered an active community. At the same time, we learned what it would take to build something great: it would require us to focus and make a full-time commitment.

Here we are, two years later. With the launch of a truly affordable eye tracker, we have taken a big step towards realizing the vision we burn for. No longer is there a prohibitive barrier preventing developers from exploring the many benefits eye tracking can bring to their applications.

Best of all, this is only the beginning. I can't wait to get the tracker into the hands of all the developers who placed a $99 bet on the future.

Tech specs (preliminary)

  • Sampling rate: 40 Hz and 60 Hz modes
  • Accuracy: 0.5° (average)
  • Spatial resolution: 0.1° (RMS)
  • Latency: <20 ms at 60 Hz
  • Calibration: 5, 9 or 12 points
  • Operating range: 45 cm – 75 cm
  • Tracking area: 40 cm × 40 cm at 65 cm distance
  • Screen sizes: up to 24"
  • API/SDK: C++, C# and Java included
  • Data output: binocular gaze data
  • Dimensions (W/H/D): 20 × 1.9 × 1.6 cm (7.9 × 0.75 × 0.66 in)
  • Weight: 130 g
  • Connection: USB 3.0 SuperSpeed
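
To put the accuracy figure in perspective: at the 65 cm reference distance, 0.5° corresponds to roughly 65 × tan(0.5°) ≈ 0.57 cm on screen, or about 20 pixels on a typical 24" 1080p display. And since the SDK ships with C++, C# and Java bindings, here is a rough sketch of what consuming gaze data could look like in Java. To be clear, GazeData and GazeListener below are illustrative placeholders of mine, not the actual API; consult the SDK documentation for the real types and registration calls.

    import javax.swing.SwingUtilities;

    // Hypothetical sketch of consuming gaze data from a tracker SDK.
    // GazeData and GazeListener are illustrative placeholders, not the real API.
    public class GazeDemo {

        // One gaze sample: smoothed on-screen coordinates in pixels.
        record GazeData(double x, double y) {}

        interface GazeListener {
            void onGazeUpdate(GazeData gaze);
        }

        public static void main(String[] args) {
            GazeListener listener = gaze ->
                // At 60 Hz this fires roughly every 16.7 ms; keep the handler
                // light and marshal any UI work onto the event dispatch thread.
                SwingUtilities.invokeLater(() ->
                    System.out.printf("Gaze at (%.0f, %.0f)%n", gaze.x(), gaze.y()));

            // Registration with the real tracker server would happen here;
            // for now, feed the listener one simulated sample.
            listener.onGazeUpdate(new GazeData(960, 540));
        }
    }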


Tuesday, July 30, 2013

Duke University tracking peacocks



  • "Through their eyes: selective attention in peahens during courtship," Jessica Yorzinski, Gail Patricelli, Jason Babcock, John Pearson, Michael Platt. Journal of Experimental Biology, July 24, 2013.

Monday, March 18, 2013

ITU: Two PhD positions on eye tracking for mobile devices


From the IT University of Copenhagen (GazeGroup and EyeInfo) comes an offer of two fully funded PhD positions on the topic of eye tracking for mobile devices. This is an excellent opportunity to work alongside domain experts on the development of the next generation of eye tracking systems.
The aims of the project are to develop eye tracking algorithms and hardware for mobile devices, to apply eye movement signals to games, toys, device interaction and augmented reality, and to combine these signals with existing sensors in mobile devices, such as GPS, gyroscopes and accelerometers, and/or with brain activity measured by electroencephalography (EEG).
We are seeking two excellent PhD students to do research within one of two areas:
1) Development and testing of new, robust eye tracking and gaze estimation algorithms, and optimization of eye tracking algorithms for low power consumption.
2) Exploration of novel ways of applying gaze interaction on smartphones, tablets and smartglasses.
The project requires a willingness to cooperate closely with the industrial partners involved in this project, i.e. The Eye Tribe, LEGO System A/S and Serious Games Interactive.
The ideal candidate will have either a strong background in both computer science (especially computer vision) and interaction design (with an experimental approach), or excellent competence in at least one of them.
Contact person: John Paulin Hansen
Research Group: PitLab

Tuesday, January 15, 2013

The Eye Tribe @ CES 2013



Monday, November 5, 2012

Lund University HumLab's eye tracking equipped classroom/lab

What's better than an eye tracker in a lab? A room full of them! On Thursday, the new eye tracking lab at Lund University's HumLab was opened. It's housed in the basement of the Center for Language and Linguistics, close to the existing eye tracking lab where I did my Master's thesis on gaze interaction in 2008. The new lab is termed "the digital classroom" and features 25 eye tracking equipped computers for large studies on electronic media and education. Over the last ten years, the HumLab group has pursued research on educational processes: how students read educational material and how their reading style evolves during university studies. The digital classroom contains eye trackers from the German manufacturer SMI (RED-M) and is co-financed by the Wallenberg Foundation and Lund University, for a total investment of 2.2 million SEK (US$328k). In January, a new project starts that aims at improving learning in elementary and high school. Big congrats to Kenneth Holmqvist and the team. Very exciting to see the output of this!


Kenneth Holmqvist and Jana Holsanova (on the right)

By the way, I'm in the process of reading a book by the same group, titled "Eye Tracking: A Comprehensive Guide to Methods and Measures". It is, by far, the most accurate, comprehensive and well-written publication on eye tracking and associated research to date. A must-read for any serious researcher and/or developer.

Tuesday, October 23, 2012

Gaze-controlled drone

From Alexandre Alapetite and John Paulin Hansen, with whom I previously built an eye-controlled robot, comes a demo showing gaze control of a drone. The user's gaze is determined by an eye tracker (from Alea Technologies) situated below the display, and the drone flies in the direction the user is looking. In the demo the operator is located near the drone, but he could be situated anywhere, even hundreds of miles away. Gaze Controlled Flying was presented as an interactive demo at the NordiCHI 2012 conference, October 16, at the IT University of Copenhagen, Denmark. Cool guys!
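
The principle is straightforward: the offset of the gaze point from the center of the display is translated into a steering command. Below is a minimal sketch of that mapping, assuming hypothetical GazeTracker and Drone interfaces of my own; the actual demo used an Alea Technologies tracker and its own control stack.

    // Sketch of gaze-to-drone steering: the gaze offset from screen center
    // becomes a roll/pitch command. Both interfaces are hypothetical stand-ins.
    public class GazeFlight {

        // Hypothetical gaze sample in normalized screen coordinates [0..1].
        interface GazeTracker { double gazeX(); double gazeY(); }

        // Hypothetical drone accepting roll/pitch commands in [-1..1].
        interface Drone { void steer(double roll, double pitch); }

        static final double DEAD_ZONE = 0.1; // ignore jitter around the center

        static void update(GazeTracker tracker, Drone drone) {
            // Looking right of center rolls right; looking above center
            // pitches forward. Scale offsets from [-0.5..0.5] to [-1..1].
            double dx = tracker.gazeX() - 0.5;
            double dy = 0.5 - tracker.gazeY();
            drone.steer(2 * applyDeadZone(dx), 2 * applyDeadZone(dy));
        }

        static double applyDeadZone(double v) {
            return Math.abs(v) < DEAD_ZONE ? 0 : v;
        }
    }

The dead zone around the center gives the operator somewhere neutral to rest their eyes, which matters when the input channel is also the channel you observe with.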

European Conference on Eye Movements 2013 announced


The European Conference on Eye Movements 2013 will be held in Lund, Sweden, from August 11th to 16th, 2013. ECEM is the largest and oldest eye tracking conference in the world. The conference webpage is now public: http://ecem2013.eye-movements.org/

Next year's conference will include four panel discussions, nine keynote speakers and a large number of sessions of four to six talks. We also offer pre-conference methods workshops taught by top experts in the field on diverse topics related to eye movements and eye tracking, open to all researchers at every level and to members of industry, running from August 7th to 10th. You can see a list of these topics and the teachers here: http://ecem2013.eye-movements.org/workshops.
Important dates include the following:
  • Oct 15, 2012: Submission of proposals and abstracts will open.
  • Jan 15, 2013: Deadline for proposals for symposia.
  • Feb 25, 2013: Notification on acceptance for symposia.
  • March 1, 2013: Deadline for 2-page extended abstract for talks and 200 word abstracts for posters.
  • April 1, 2013: Registration opens.
  • April 15, 2013: Notification on acceptance for talks and posters.
  • May 1, 2013: Last day for reduced registration fee.
Organising committee
  • Conference Chairs: Kenneth Holmqvist and Arantxa Villanueva
  • Conference Organiser: Fiona Mulvey
  • Scientific Board: Halszka Jarodzka, Ignace Hooge, Rudolf Groner and Päivi Majaranta
  • Exhibition Chairs: John Paulin Hansen and Richard Andersson
  • Method Workshop Organisers: Marcus Nyström and Dan Witzner Hansen
  • Web Masters: Nils Holmberg and Detlev Droege
  • Proceedings Editors: Roger Johansson and Richard Dewhurst
  • Registration Managers: Kerstin Gidlöf and Linnéa Larsson
  • Student Volunteer Managers: Linnéa Larsson, Richard Dewhurst and Kerstin Gidlöf
  • Social Program Organisers: Richard Andersson, Jana Holsanova and Kerstin Gidlöf
Contact
  • Conference chairs and organiser: management at/på/an ecem2013.eye-movements.org
  • Exhibition: exhibition at/på/an ecem2013.eye-movements.org
  • Method workshops: workshops at/på/an ecem2013.eye-movements.org
  • The web page: webmaster at/på/an ecem2013.eye-movements.org

Tuesday, October 2, 2012

Fujitsu tablet and monitor

Today the first public demonstrations of the Fujitsu/Docomo/Tobii tablet came online, all from the CEATEC 2012 expo in Japan. The prototype tablet, called iBeam, is designed by Fujitsu for Docomo and contains an eye tracking module from Swedish Tobii, namely the IS-20 which was introduced earlier this year. The form factor appears a bit on the large side, with a bump towards the edge where the eye tracking module is placed; it looks somewhat like a tablet inside another case. On the software side, the tablet runs Android with a gaze marker overlaid on the interface. Selection is performed using simple dwell activation, which is known for being both stressful and error-prone. The sample apps contain the usual suspects: panning of photos and maps, a scrolling browser and an image viewer. Pretty neat for a prototype.
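
For those unfamiliar with dwell activation: a selection fires once the gaze point has remained within a target region for a fixed time, which is exactly why it feels stressful; look at anything long enough and it activates (the classic "Midas touch" problem). Here is a minimal sketch of the logic, with an illustrative 500 ms threshold of my own choosing rather than anything Fujitsu disclosed.

    import java.awt.Rectangle;

    // Minimal dwell-activation sketch: fire a selection when the gaze stays
    // inside a target for DWELL_MS. Threshold and hit test are illustrative;
    // real systems also smooth the gaze signal before testing it.
    public class DwellSelector {

        static final long DWELL_MS = 500; // illustrative dwell threshold

        private final Rectangle target;
        private long enteredAt = -1; // -1 means the gaze is currently outside

        public DwellSelector(Rectangle target) {
            this.target = target;
        }

        // Feed one gaze sample; returns true when a dwell selection fires.
        public boolean onGaze(int x, int y, long timestampMs) {
            if (!target.contains(x, y)) {
                enteredAt = -1; // gaze left the target: reset the timer
                return false;
            }
            if (enteredAt < 0) {
                enteredAt = timestampMs;
            }
            if (timestampMs - enteredAt >= DWELL_MS) {
                enteredAt = -1; // require re-entry before the next trigger
                return true;
            }
            return false;
        }
    }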




Fujitsu also demonstrated an LCD monitor with an embedded eye tracking camera system, while the actual gaze estimation algorithms run on an embedded Windows computer. This display does not use the Tobii IS-20 but a system developed by Fujitsu themselves, which is stated to be low-cost. The question is why they didn't use this for the tablet. From what I can tell, it does not provide the same level of accuracy; it appears to be a rough up/down, left/right type of gaze estimation, which explains why the demo apps only handle panning of maps and images.