Thursday, October 28, 2010

Gaze Tracker 2.0 Preview

On my 32nd birthday I'd like to celebrate by sharing this video highlighting some of the features in the latest version of the GT2.0 that I've been working on with Javier San Agustin and the GT forum. Open source eye tracking has never looked better. Enjoy!


HD video available (click 360p and select 720p)

Tuesday, August 17, 2010

How to build low cost eye tracking glasses for head mounted system (M. Kowalik, 2010)

Michał Kowalik of the Faculty of Computer Science and Information Technology at the West Pomeranian University of Technology in Szczecin, Poland, has put together a great DIY guide for a head-mounted system using the ITU Gaze Tracker. The camera of choice is the Microsoft LifeCam VX-1000, modified by removing the casing and IR filter. In addition, three IR LEDs illuminate the eye, drawing power from the USB cable. This is then mounted on a pair of safety glasses, just as Jason Babcock & Jeff Pelz have done previously. Total hardware cost: less than €50. Neat. Thanks, Michał.

Download instructions as PDF (8.1Mb)
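If you want to size the current-limiting resistor for USB-powered IR LEDs in a rig like this, the arithmetic is simple. A minimal sketch, with illustrative assumptions only (5 V USB supply, ~1.4 V forward voltage, 60 mA target current; none of these values are taken from Kowalik's instructions, so check your LED datasheet before building):

```python
# Rough series-resistor sizing for USB-powered IR LEDs.
# All component values below are illustrative assumptions.

V_USB = 5.0   # USB bus voltage (volts)
V_F = 1.4     # typical forward voltage of an 850 nm IR LED (volts)
I_F = 0.060   # target forward current (amps), well under USB power limits

def series_resistor(n_leds, v_supply=V_USB, v_f=V_F, i_f=I_F):
    """Resistance (ohms) for n_leds IR LEDs wired in series."""
    v_drop = v_supply - n_leds * v_f
    if v_drop <= 0:
        raise ValueError("supply voltage too low for this many LEDs in series")
    return v_drop / i_f

# Three LEDs in series drop 4.2 V, leaving 0.8 V across the resistor,
# so roughly a 13 ohm resistor (use the next standard value up).
r = series_resistor(3)
```

Wiring the three LEDs in parallel instead would need one resistor per LED; series wiring keeps the current equal through all of them.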

    Monday, August 16, 2010

    Call for Papers: ACM Transactions Special Issue on Eye Gaze

    ACM Transactions on Interactive Intelligent Systems
    Special Issue on Eye Gaze in Intelligent Human-Machine Interaction

    Aims and Scope

    Partly because of the increasing availability of nonintrusive and high-performance eye tracking devices, recent years have seen a growing interest in incorporating human eye gaze in intelligent user interfaces. Eye gaze has been used as a pointing mechanism in direct manipulation interfaces, for example, to assist users with “locked-in syndrome”. It has also been used as a reflection of information needs in web search and as a basis for tailoring information presentation. Detection of joint attention as indicated by eye gaze has been used to facilitate computer-supported human-human communication. In conversational interfaces, eye gaze has been used to improve language understanding and intention recognition. On the output side, eye gaze has been incorporated into the multimodal behavior of embodied conversational agents. Recent work on human-robot interaction has explored eye gaze in incremental language processing, visual scene processing, and conversation engagement and grounding.

    This special issue will report on state-of-the-art computational models, systems, and studies that concern eye gaze in intelligent and natural human-machine communication. The nonexhaustive list of topics below indicates the range of appropriate topics; in case of doubt, please contact the guest editors. Papers that focus mainly on eye tracking hardware and software as such will be relevant (only) if they make it clear how the advances reported open up new possibilities for the use of eye gaze in at least one of the ways listed above.

    Topics

    • Empirical studies of eye gaze in human-human communication that provide new insight into the role of eye gaze and suggest implications for the use of eye gaze in intelligent systems. Examples include new empirical findings concerning eye gaze in human language processing, in human-vision processing, and in conversation management.
    • Algorithms and systems that incorporate eye gaze for human-computer interaction and human-robot interaction. Examples include gaze-based feedback to information systems; gaze-based attention modeling; exploiting gaze in automated language processing; and controlling the gaze behavior of embodied conversational agents or robots to enable grounding, turn-taking, and engagement.
    • Applications that demonstrate the value of incorporating eye gaze in practical systems to enable intelligent human-machine communication.

    Guest Editors

    • Elisabeth André, University of Augsburg, Germany (contact: andre[at]informatik[dot]uni-augsburg.de)
    • Joyce Chai, Michigan State University, USA

    Important Dates

    • By December 15th, 2010: Submission of manuscripts
    • By March 23rd, 2011: Notification about decisions on initial submissions
    • By June 23rd, 2011: Submission of revised manuscripts
    • By August 25th, 2011: Notification about decisions on revised manuscripts
    • By September 15th, 2011: Submission of manuscripts with final minor changes
    • Starting October, 2011: Publication of the special issue on the TiiS website and subsequently in the ACM Digital Library and as a printed issue
     Source http://tiis.acm.org/special-issues.html

    Tuesday, August 10, 2010

    Eye control for PTZ cameras in video surveillance

    Bartosz Kunka, a PhD student at the Gdańsk University of Technology, has employed a remote gaze-tracking system called Cyber-Eye to control PTZ cameras in video surveillance and video-conference systems. The video was prepared for the system's presentation at the Research Challenge at SIGGRAPH 2010 in Los Angeles.

    Wednesday, August 4, 2010

    EOG used to play Super Mario

    Came across some fun work by Waterloo Labs that demos how to use a set of electrodes and a custom processing board to perform signal analysis and estimate eye-movement gestures through measuring EOG (electrooculography). It means you'll have to glance at the ceiling or floor to issue commands (there is no gaze point-of-regard estimation). The good thing is that the technology doesn't suffer from the issues with light, optics, and sensors that often make video-based eye tracking and gaze point-of-regard estimation complex. The bad thing is that it requires custom hardware and the mounting of electrodes and wires, and the interaction style appears to involve looking away from what you are really interested in.
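The gesture-from-EOG idea boils down to thresholding deflections in the measured potential. A toy sketch in that spirit (the Waterloo Labs signal chain runs on their custom board; the function name, threshold, and labels here are illustrative assumptions, and a real system would also have to filter out drift and blinks):

```python
# Toy threshold-based EOG gesture detection on the vertical channel.
# Thresholds and units are illustrative assumptions.

def detect_gesture(samples, threshold=200.0):
    """Classify a window of vertical-channel EOG samples (microvolts).

    A large positive deflection reads as a glance up, a large negative
    one as a glance down; anything else issues no command.
    """
    peak = max(samples)
    trough = min(samples)
    if peak > threshold and peak > -trough:
        return "look_up"
    if trough < -threshold:
        return "look_down"
    return "none"
```

This is exactly why the interaction style forces you to look away from the screen: only exaggerated deflections clear the threshold reliably.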

    Monday, June 28, 2010

    Video-games can be beneficial!

    It appears video games can be beneficial for your eyes, despite what mother said. Came across this article in the British Daily Mail, found it inspiring, and believe it could be done even better with an interactive application using real-time gaze-tracking input. Direct quote:

    "A six-year-old boy who nearly went blind in one eye can now see again after he was told to play on a Nintendo games console. Ben Michaels suffered from amblyopia, or severe lazy eye syndrome in his right eye from the age of four. His vision had decreased gradually in one eye and without treatment his sight loss could have become permanent. His GP referred him to consultant Ken Nischal who prescribed the unusual daily therapy. Ben, from Billericay, Essex, spends two hours a day playing Mario Kart on a Nintendo DS with his twin Jake. Ben wears a patch over his good eye to make his lazy one work harder. The twins' mother, Maxine, 36, said that from being 'nearly blind' in the eye, Ben's vision had 'improved 250 per cent' in the first week. She said: 'When he started he could not identify our faces with his weak eye.  Now he can read with it although he is still a way off where he ought to be. 'He was very cooperative with the patch, it had phenomenal effect and we’re very pleased.' Mr Nischal of Great Ormond Street Children's Hospital, said the therapy helped children with weak eyesight because computer games encourage repetitive eye movement, which trains the eye to focus correctly. 'A games console is something children can relate to. It allows us to deliver treatment quicker,' he said. 'What we don’t know is whether improvement is solely because of improved compliance, ie the child sticks with the patch more, or whether there is a physiological improvement from perceptual visual learning.' The consultant added that thousands of youngsters and adults could benefit from a similar treatment." (source)

    Tuesday, June 15, 2010

    Speech Dasher: Fast Writing using Speech and Gaze (K. Vertanen & D. MacKay, 2010)

    A new version of the Dasher typing interface utilizes speech recognition provided by the CMU PocketSphinx software and doubles typing performance measured in words per minute, from a previous 20 WPM to 40 WPM, close to what a professional keyboard jockey may produce.



    Abstract
    Speech Dasher allows writing using a combination of speech and a zooming interface. Users first speak what they want to write and then they navigate through the space of recognition hypotheses to correct any errors. Speech Dasher's model combines information from a speech recognizer, from the user, and from a letter-based language model. This allows fast writing of anything predicted by the recognizer while also providing seamless fallback to letter-by-letter spelling for words not in the recognizer's predictions. In a formative user study, expert users wrote at 40 (corrected) words per minute. They did this despite a recognition word error rate of 22%. Furthermore, they did this using only speech and the direction of their gaze (obtained via an eye tracker).

      
    • Speech Dasher: Fast Writing using Speech and Gaze
      Keith Vertanen and David J.C. MacKay. CHI '10: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, To appear. [Abstract+videos, PDF, BibTeX]
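The abstract's point about combining recognizer output with a letter-based language model can be sketched as a simple interpolation of next-letter distributions. A minimal sketch only: the function name and the interpolation weight are assumptions for illustration, not the authors' implementation.

```python
# Blend next-letter predictions from a letter language model with
# predictions derived from the speech recognizer's hypotheses.

def mix_predictions(lm_probs, rec_probs, weight=0.7):
    """Interpolate two letter distributions and renormalize.

    lm_probs, rec_probs: dicts mapping letters to probabilities.
    weight: how much trust is placed in the recognizer's predictions.
    """
    letters = set(lm_probs) | set(rec_probs)
    mixed = {c: weight * rec_probs.get(c, 0.0)
                + (1 - weight) * lm_probs.get(c, 0.0)
             for c in letters}
    total = sum(mixed.values())
    return {c: p / total for c, p in mixed.items()}
```

Because letters the recognizer never predicted still keep some language-model mass, the user can always fall back to spelling a word letter by letter, which is the "seamless fallback" the abstract describes.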

    Wednesday, May 26, 2010

    Abstracts from SWAET 2010


    The booklet containing the abstracts for the Scandinavian Workshop on Applied Eye Tracking (SWAET) is now available for download (55 pages, about 1 MB). The abstracts span a wide range from gaze interaction to behavior and perception. The short one-page format makes it attractive to venture into a multitude of domains and acts as a nice little starting point for digging deeper. Shame I couldn't attend; maybe next year. Kudos for making this booklet available.




     Title – Authors
     Eye movements during mental imagery are not perceptual re-enactments – R. Johansson, J. Holsanova, K. Holmqvist
     Practice eliminates "looking at nothing" – A. Scholz, K. Mehlhorn, J.F. Krems
     Learning Perceptual Skills for Medical Diagnosis via Eye Movement Modeling Examples on Patient Video Cases – H. Jarodzka, T. Balslev, K. Holmqvist, K. Scheiter, M. Nyström, P. Gerjets, B. Eika
     Objective, subjective, and commercial information: The impact of presentation format on the visual inspection and selection of Web search results – Y. Kammerer, P. Gerjets
     Eye Movements and levels of attention: A stimulus driven approach – F.B. Mulvey, K. Holmqvist, J.P. Hansen
     Player's gaze in a collaborative Tetris game – P. Jermann, M-A. Nüssli, W. Li
     Naming associated objects: Evidence for parallel processing – L. Mortensen, A.S. Meyer
     Reading Text Messages - An Eye-Tracking Study on the Influence of Shortening Strategies on Reading Comprehension – V. Heyer, H. Hopp
     Eye movement measures to study the online comprehension of long (illustrated) texts – J. Hyönä, J.K. Kaakinen
     Self-directed Learning Skills in Air-traffic Control: A Cued Retrospective Reporting Study – L.W. van Meeuwen, S. Brand-Gruwel, J.J.G. van Merriënboer, J.J.P.R. de Bock, P.A. Kirschner
     Drivers' characteristic sequences of eye and head movements in intersections – A. Bjelkemyr, K. Smith
     Comparing the value of different cues when using the retrospective think aloud method in web usability testing with eye tracking – A. Olsen
     Gaze behavior and instruction sensitivity of Children with Autism Spectrum Disorders when viewing pictures of social scenes – B. Rudsengen, F. Volden
     Impact of cognitive workload on gaze-including interaction – S. Trösterer, J. Dzaack
     Interaction with mainstream interfaces using gaze alone – H. Skovsgaard, J.P. Hansen, J.C. Mateo
     Stereoscopic Eye Movement Tracking: Challenges and Opportunities in 3D – G. Öqvist Seimyr, A. Appelholm, H. Johansson, R. Brautaset
     Sampling frequency – what speed do I need? – R. Andersson, M. Nyström, K. Holmqvist
     Effect of head-distance on raw gaze velocity – M-A. Nüssli, P. Jermann
     Quantifying and modelling factors that influence calibration and data quality – M. Nyström, R. Andersson, J. van de Weijer