Friday, December 11, 2009

PhD Defense: Off-the-Shelf Gaze Interaction

Javier San Agustin will defend his PhD thesis, "Off-the-Shelf Gaze Interaction", at the IT University of Copenhagen on the 8th of January from 13.00 to (at most) 17.00. The program consists of a one-hour presentation followed by a discussion with the committee, formed by Andrew Duchowski, Bjarne Kjær Ersbøll, and Arne John Glenstrup. Afterwards, a traditional reception with snacks and drinks will be held.

Update: The thesis is now available as PDF, 179 pages, 3.6MB.

Abstract of the thesis:


People with severe motor-skill disabilities are often unable to use standard input devices such as a mouse or a keyboard to control a computer, and they are therefore in strong need of alternative input devices. Gaze tracking offers them the possibility to use the movements of their eyes to interact with a computer, thereby making them more independent. Considerable effort has been put into improving the robustness and accuracy of the technology, and many commercial systems are now available on the market.

Despite the great improvements that gaze tracking systems have undergone in recent years, high prices have prevented gaze interaction from becoming mainstream. The use of specialized hardware, such as industrial cameras or infrared light sources, increases the accuracy of the systems, but also the price, which prevents many potential users from having access to the technology. Furthermore, the different components are often required to be placed in specific locations, or are built into the monitor, thus decreasing the flexibility of the setup.

Gaze tracking systems built from low-cost and off-the-shelf components have the potential to facilitate access to the technology and bring the prices down. Such systems are often more flexible, as the components can be placed in different locations, but also less robust, due to the lack of control over the hardware setup and the lower quality of the components compared to commercial systems.

The work developed for this thesis deals with some of the challenges introduced by the use of low-cost and off-the-shelf components for gaze interaction. The main contributions are:
  • Development and performance evaluation of the ITU Gaze Tracker, an off-the-shelf gaze tracker that uses an inexpensive webcam or video camera to track the user's eye. The software is readily available as open source, offering the possibility to try out gaze interaction for a low price and to analyze, improve and extend the software by modifying the source code.
  • A novel gaze estimation method based on homographic mappings between planes. No knowledge about the hardware configuration is required, allowing for a flexible setup where camera and light sources can be placed at any location.
  • A novel algorithm to detect the type of movement that the eye is performing, i.e. fixation, saccade or smooth pursuit. The algorithm is based on eye velocity and movement pattern, and makes it possible to smooth the signal appropriately for each kind of movement, removing jitter due to noise while maximizing responsiveness.
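The classification idea in the last contribution can be illustrated with a toy velocity-threshold classifier. The thresholds and the single-pass structure below are illustrative assumptions on my part; the actual algorithm in the thesis also takes the movement pattern into account.

```python
# Minimal sketch of velocity-based eye-movement classification.
# The thresholds (30 and 100 deg/s) are illustrative assumptions,
# not the thesis' exact parameters.

def classify_samples(gaze, dt, fix_thresh=30.0, sac_thresh=100.0):
    """Label each inter-sample interval as fixation, smooth pursuit
    or saccade from its angular velocity (deg/s).

    gaze: list of (x, y) gaze angles in degrees; dt: sample interval in s.
    """
    labels = []
    for (x0, y0), (x1, y1) in zip(gaze, gaze[1:]):
        v = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / dt
        if v > sac_thresh:
            labels.append("saccade")   # fast ballistic jump
        elif v > fix_thresh:
            labels.append("pursuit")   # moderate, sustained motion
        else:
            labels.append("fixation")  # essentially stationary
    return labels
```

Once samples are labeled, each class can be smoothed differently, e.g. heavy averaging during fixations and no smoothing during saccades, which is the point made above about removing jitter while staying responsive.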

Tuesday, December 8, 2009

Scandinavian Workshop on Applied Eye-tracking (SWAET) 2010

The first call for papers for the annual Scandinavian Workshop on Applied Eye-Tracking (SWAET), organized by Kenneth Holmqvist and the team at the Lund University Humanities Laboratory, has just been announced. SWAET 2010 will be held in Lund, Sweden, on May 5-7. The invited speakers are Gerry Altmann (blog) from the Dept. of Psychology at the University of York, UK, and Ignace Hooge (s1, s2) from the Dept. of Psychology at Utrecht University, the Netherlands.

Visit the SWAET website for more information.

Update: Download the abstracts (pdf, 1Mb)

Tuesday, November 24, 2009

Remote tracker and 6DOF using a webcam

The following video clips demonstrate a Master's thesis project from the AGH University of Science and Technology in Cracow, Poland. The method provides 6 degrees of freedom head tracking and 2D eye tracking using a simple, low-resolution 640x480 webcam. Under the hood it is based on Lucas-Kanade optical flow and POSIT. A great start, as the head tracking seems relatively stable. Imagine it with IR illumination, a camera with slightly higher resolution and a narrow-angle lens, and of course pupil + glint tracking algorithms for calibrated gaze estimation.
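To get a feel for the Lucas-Kanade component, here is a minimal single-step, global-translation version in Python/NumPy, tested on a synthetic Gaussian blob. Real trackers (the project above included) run it per feature point over image pyramids; the window, test pattern and function names here are illustrative assumptions, not the project's code.

```python
import numpy as np

def lucas_kanade_step(I, J):
    """Estimate a single global translation (dx, dy) between two
    frames I and J with one Lucas-Kanade least-squares step."""
    Iy, Ix = np.gradient(I)          # spatial gradients (rows = y)
    It = J - I                       # temporal gradient
    # Normal equations of min sum (Ix*dx + Iy*dy + It)^2
    A = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
                  [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
    b = -np.array([np.sum(Ix * It), np.sum(Iy * It)])
    return np.linalg.solve(A, b)     # (dx, dy)

def blob(cx, cy, size=64, sigma=6.0):
    """Synthetic test frame: a Gaussian blob centered at (cx, cy)."""
    y, x = np.mgrid[0:size, 0:size]
    return np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * sigma ** 2))
```

Shifting the blob by a known subpixel amount and running one step recovers the shift closely, which is why the method works well for the small frame-to-frame motions of a head in front of a webcam.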


Monday, November 23, 2009

ITU GazeTracker in the wild

Came across these two YouTube videos from students using the ITU GazeTracker in their HCI projects. By now the software has been downloaded 3000 times and the forum has seen close to three hundred posts. It has been a good start, and better yet, a new version is in the making. It offers a complete network API for third-party applications, improved tracking performance, better camera control and a number of bug fixes (thanks for your feedback). It will be released when it's ready.

Thanks for posting the videos!

Wednesday, October 21, 2009

Nokia near-eye display gaze interaction update

The Nokia near-eye gaze interaction platform that I tried in Finland last year has been further improved. The cap used to support the weight has been replaced with a sturdy frame, and the overall prototype seems lighter and now incorporates headphones. The new gaze-based navigation interface supports photo browsing based on the Image Space application, allowing location-based access to user-generated content. See the concept video at the bottom for their futuristic vision. Nokia research website. The prototype will be displayed at the International Symposium on Mixed and Augmented Reality conference in Orlando, October 19-22.

Medical Image Perception Society 2009 - Day three

Session 10. Displays and Tools. Chair: Kevin Berbaum, PhD
  • Objective methodology to compare clinical value of computed tomography artifact reduction algorithms. G Spalla, C Marchessoux, M Vaz, A Ricker, & T Kimpe
  • LCD Spatial Noise Suppression: Large-field vs. ROI Image Processing. WJ Dallas, H Roehrig, J Fan, EA Krupinski, & J Johnson
Session 11. Displays and Tools. Chair: Miguel Eckstein, PhD
  • Stereoscopic Digital mammography: Improved Accuracy of Lesion Detection in Breast Cancer Screening. DJ Getty, CJ D’Orsi, & RM Pickett
  • Detectability in tomosynthesis projections, slices and volumes: Comparison of human observer performance in a SKE detection task. I Reiser, K Little, & RM Nishikawa
Thanks Craig, Miguel and Elizabeth for a wonderful event; I learned so much in just three days. Plenty of inspiration for future research.

Medical Image Perception Society 2009 - Day two

Session 6. Performance Measurement II. Chair: Matthew Freedman, MD, MBA
  • Coding of FDG Intensity as a 3-D Rendered Height Mapping to Improve Fusion Display of Co-Registered PET-CT Images. RM Shah, C Wood, YP Hu, & LS Zuckier
  • Estimation of AUC from Normally Distributed Rating Data with Known Variance Ratio. A Wunderlich & F Noo
  • Using the Mean-to-Variance Ratio as a Diagnostic for Unacceptably Improper Binormal ROC Curves. SL Hillis & KS Berbaum
Session 7. Performance Measurement II. Chair: Stephen Hillis, PhD
  • BI-RADS Data Should Not be Used to Estimate ROC Curves. Y Jiang & CE Metz

  • Estimating the utility of screening mammography in large clinical studies. CK Abbey, JM Boone, & MP Eckstein

  • Issues Related to the Definition of Image Contrast, DL Leong & PC Brennan
Session 8. Models of Perceptual processing. Chair: Yulei Jiang, PhD
  • Channelized Hotelling Observers for Detection Tasks in Multi-Slice Images. L Platiša, B Goossens, E Vansteenkiste, A Badano & W Philips

  • Channelized Hotelling observers adapted to irregular signals in breast tomosynthesis detection tasks. I Diaz, P Timberg, CK Abbey, MP Eckstein, FR Verdun, C Castella, FO Bochud

  • Detecting Compression Artifacts in Virtual Pathology Images Using a Visual Discrimination Model. J Johnson & EA Krupinski

  • Automatic MRI Acquisition Parameters Optimization Using HVS-Based Maps. J Jacobsen, P Irarrázabal, & C Tejos

  • Parametric Assessment of Lesion Detection Using a Pre-whitened Matched Filter on Projected Breast CT Images. N Packard, CK Abbey, & JM Boone

  • Model Observers for Complex Discrimination Tasks: Deployment Assessment of Multiple Coronary Stents. S Zhang, CK Abbey, X Da, JS Whiting, & MP Eckstein
Session 9. Special Invited Session on Neuroscience and Medical Image Perception. Chair: Miguel Eckstein, PhD
  • Decoding Information Processing When Attention Fails: An Electrophysiological Approach. B Giesbrecht
  • Some Neural Bases of Radiological Expertise. SA Engel

Tuesday, October 20, 2009

Medical Image Perception Society 2009 - Day one

The first day of the Medical Image Perception Society conference, held biennially and this year in Santa Barbara, was filled with interesting talks, including plenty of research utilizing eye tracking as a means of obtaining data. The conference is hosted by Craig Abbey and Miguel Eckstein at the Department of Psychology at the University of California, Santa Barbara, in cooperation with Elizabeth Krupinski (book1, book2) from the University of Arizona, who has performed extensive research on eye movements (among other things) in relation to medical imaging and radiology.

Session 1. Visual Search. Chair: Claudia Mello-Thoms, PhD
Session 2. Visual Search. Chair: Elizabeth Krupinski, PhD
  • Visual Search Characteristics of Pathology Residents Reading Dermatopathology Slides. J Law & C Mello-Thoms
  • Are you a good eye-witness? Perceptual differences between physicians and lay people. C Mello-Thoms
  • Eye movements and computer-based mammographic interpretation training. Y Chen & A Gale
Session 3. Perceptual Effects. Chair: David Manning, PhD
  • Nuisance levels of noise effects Radiologists Performance. MF Mc Entee, A O'Beirne, J Ryan, R Toomey, M Evanoff, D Chakraborty, D Manning, & PC. Brennan
  • Observer Performance in Stroke Interpretation: The Influence of Experience and Clinical Information in Multidimensional Magnetic Resonance Imaging. L Cooper, A Gale, J Saada, S Gedela, H Scott, & A Toms
  • Interpretation of wrist radiographs: A comparison between final year medical and radiography students. L Hutchinson, P Brennan & L Rainford
  • Tumor measurement for revised TNM staging of lung cancer. FL Jacobson, A Sitek, D Getty, & SE Seltzer
  • Does Reader Visual Fatigue Impact Performance? EA Krupinski & KS Berbaum
  • Ambient Temperature is an Important Consideration in the Radiology Reading Room. MF Mc Entee & S Gafoor
Session 4. Performance Measurement I. Chair: Dev Chakraborty, PhD
  • Perceptual indicators of the holistic view in pulmonary nodule detection. MW Pietrzyk, DJ Manning, T Donovan, & Alan Dix
  • An e-learning tutorial demonstrates significant improvements in ROC performance amongst naive observers in breast image interpretation. PBL Soh, PC Brennan, A Poulos, W Reed
  • Is an ROC-type response truly always better than a binary response? D Gur, AI Bandos, HE Rockette, ML Zuley, CM Hakim, DM Chough, MA Ganott
  • Recognition of Images in Reader Studies: How Well Can We Predict Which Will Be Remembered? T Miner Haygood, P O’Sullivan, J Ryan, E Galvan, J-M Yamal, M Evanoff, M McEntee, J Madewell, C Sandler, E Lano, & P Brennan
Session 5. Performance Measurement I. Chair: Alastair Gale, PhD
  • New classes of models with monotonic likelihood ratios. F Samuelson
  • Sample size estimation procedure for free-response (FROC) studies. DP Chakraborty & M Bath
  • Comparison of Four Methods (ROC, JAFROC, IDCA, and ROI) for Analysis of Free Response Clinical Data. F Zanca, DP Chakraborty, J Jacobs, G. Marchal, and H Bosmans
Feel free to post additional links in the comments. Slides will be posted as they become available.

Thursday, October 8, 2009

DoCoMo EOG update

While eye movement detection using EOG is nothing new, the latest demonstration by Japanese NTT DoCoMo illustrates recent developments in the field. The innovation here is the form factor, which is quite impressive. Typically, EOG is detected using electrodes placed around the eyes, as in Andreas Bulling's prototype demonstrated at CHI 09 in Boston. Now it can be done using tiny sensors inside the ear. Just compare it to the prototype demonstrated last year!

Thanks Roman for the links!

Monday, September 28, 2009

Wearable Augmented Reality System using Gaze Interaction (Park, Lee & Choi)

Came across this paper on a wearable system that employs a small eye tracker and a head-mounted display for augmented reality. I've previously posted a video on the same system. It's a future technology with great potential; only imagination sets the limit here. There is a lot of progress in image/object recognition and location awareness taking place right now (with all the associated non-trivial problems to solve!)


Abstract
"Undisturbed interaction is essential to provide immersive AR environments. There have been a lot of approaches to interact with VEs (virtual environments) so far, especially in hand metaphor. When the user's hands are being used for hand-based work such as maintenance and repair, necessity of alternative interaction technique has arisen. In recent research, hands-free gaze information is adopted to AR to perform original actions in concurrence with interaction. [3, 4]. There has been little progress on that research, still at a pilot study in a laboratory setting. In this paper, we introduce such a simple WARS (wearable augmented reality system) equipped with an HMD, scene camera, eye tracker. We propose 'Aging' technique improving traditional dwell-time selection, demonstrate AR gallery – dynamic exhibition space with wearable system."
  • Park, H. M., Seok Han Lee, and Jong Soo Choi 2008. Wearable augmented reality system using gaze interaction. In Proceedings of the 2008 7th IEEE/ACM international Symposium on Mixed and Augmented Reality - Volume 00 (September 15 - 18, 2008). Symposium on Mixed and Augmented Reality. IEEE Computer Society, Washington, DC, 175-176. DOI= http://dx.doi.org/10.1109/ISMAR.2008.4637353

Friday, September 18, 2009

The EyeWriter project

For some time I've been following the EyeWriter project which aims at enabling Tony, who has ALS, to draw graffiti using eye gaze alone. The open source eye tracker is available at Google code and is based on C++, OpenFrameworks and OpenCV. The current version supports basic pupil tracking based on image thresholding and blob detection but they are aiming for remote tracking using IR glints. Keep up the great work guys!
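Threshold-plus-blob pupil detection of the kind mentioned above can be sketched in a few lines of plain Python. This is a toy on a list-of-rows grayscale image, not the EyeWriter's C++/OpenCV code; the threshold value and image format are illustrative assumptions.

```python
# Toy threshold + blob detection: the pupil is assumed to be the
# largest connected dark region, and its centroid is the pupil center.

def find_pupil(image, threshold):
    """Return the centroid (row, col) of the largest dark blob,
    or None if no pixel falls below the threshold."""
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    best = []
    for r in range(rows):
        for c in range(cols):
            if image[r][c] < threshold and not seen[r][c]:
                # Flood-fill one connected dark region.
                stack, region = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    region.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and image[ny][nx] < threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                if len(region) > len(best):
                    best = region
    if not best:
        return None
    cy = sum(p[0] for p in best) / len(best)
    cx = sum(p[1] for p in best) / len(best)
    return cy, cx
```

Remote tracking with IR glints, which the project is aiming for, adds a second detection stage for the small bright corneal reflections and uses the pupil-glint vector for gaze estimation.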

The Eyewriter from Evan Roth on Vimeo.

eyewriter tracking software walkthrough from thesystemis on Vimeo.

More information is found at http://fffff.at/eyewriter/

Monday, September 14, 2009

GaZIR: Gaze-based Zooming Interface for Image Retrieval (Kozma L., Klami A., Kaski S., 2009)

From the Helsinki Institute for Information Technology, Finland, comes a research prototype called GaZIR for gaze-based image retrieval, built by Laszlo Kozma, Arto Klami and Samuel Kaski. The GaZIR prototype uses a light-weight logistic regression model to predict relevance from eye movement data (such as viewing time, revisit counts, fixation length etc.), all occurring online in real time. The system is built around the PicSOM (paper) retrieval engine, which is based on tree-structured self-organizing maps (TS-SOMs). When provided with a set of reference images, the PicSOM engine goes online to download a set of similar images (based on color, texture or shape).
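Predicting relevance from gaze features with logistic regression can be sketched as follows. The feature names and the plain stochastic-gradient training loop are illustrative assumptions in the spirit of GaZIR, not the PicSOM/GaZIR implementation.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(samples, labels, lr=0.5, epochs=500):
    """Fit logistic-regression weights by stochastic gradient ascent.

    samples: list of gaze feature vectors, e.g.
             [viewing_time_s, revisit_count, mean_fixation_len_s]
    labels:  1 = image was relevant, 0 = not relevant
    """
    w = [0.0] * (len(samples[0]) + 1)   # bias + one weight per feature
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            p = sigmoid(w[0] + sum(wi * xi for wi, xi in zip(w[1:], x)))
            err = y - p                  # gradient of the log-likelihood
            w[0] += lr * err
            for i, xi in enumerate(x):
                w[i + 1] += lr * err * xi
    return w

def predict(w, x):
    """Probability that the image described by features x is relevant."""
    return sigmoid(w[0] + sum(wi * xi for wi, xi in zip(w[1:], x)))
```

In GaZIR this predicted relevance replaces the explicit point-and-click feedback normally fed to the retrieval engine.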

Abstract
"We introduce GaZIR, a gaze-based interface for browsing and searching for images. The system computes on-line predictions of relevance of images based on implicit feedback, and when the user zooms in, the images predicted to be the most relevant are brought out. The key novelty is that the relevance feedback is inferred from implicit cues obtained in real-time from the gaze pattern, using an estimator learned during a separate training phase. The natural zooming interface can be connected to any content-based information retrieval engine operating on user feedback. We show with experiments on one engine that there is sufficient amount of information in the gaze patterns to make the estimated relevance feedback a viable choice to complement or even replace explicit feedback by pointing-and-clicking."


Fig1. "Screenshot of the GaZIR interface. Relevance feedback gathered from outer rings influences the images retrieved for the inner rings, and the user can zoom in to reveal more rings."

Fig2. "Precision-recall and ROC curves for userindependent relevance prediction model. The predictions (solid line) are clearly above the baseline of random ranking (dash-dotted line), showing that relevance of images can be predicted from eye movements. The retrieval accuracy is also above the baseline provided by a naive model making a binary relevance judgement based on whether the image was viewed or not (dashed line), demonstrating the gain from more advanced gaze modeling."

Fig 3. "Retrieval performance in real user experiments. The bars indicate the proportion of relevant images shown during the search in six different search tasks for three different feedback methods. Explicit denotes the standard point-and-click feedback, predicted means implicit feedback inferred from gaze, and random is the baseline of providing random feedback. In all cases both actual feedback types outperform the baseline, but the relative performance of explicit and implicit feedback depends on the search task."
  • László Kozma, Arto Klami, and Samuel Kaski: GaZIR: Gaze-based Zooming Interface for Image Retrieval. To appear in Proceedings of 11th Conference on Multimodal Interfaces and The Sixth Workshop on Machine Learning for Multimodal Interaction (ICMI-MLMI), Boston, MA, USA, November 2-6, 2009. (abstract, pdf)

Friday, September 11, 2009

An Adaptive Algorithm for Fixation, Saccade, and Glissade Detection in Eye-Tracking Data (Nyström M. & Holmqvist K, 2009)

From Marcus Nyström and Kenneth Holmqvist at the Lund University Humanities Lab (HumLab) in Sweden comes an interesting paper on a novel algorithm capable of detecting glissades (aka dynamic overshoot) in eye tracker data. These are wobbling eye movements often found at the end of saccades, and they have previously been considered errors in saccadic programming with limited value. Whatever their function, the phenomenon does exist and should be accounted for. The paper reports finding glissades following half of all saccades while reading or viewing scenes, with an average duration of 24 ms. This work is important as it extends the default categorization of eye movements, e.g. fixation, saccade, smooth pursuit, and blink. The algorithm is based on velocity-based saccade detection and is driven by the data itself, with only a limited number of subjective settings. It contains a number of improvements, such as thresholds for peak and saccade onset/offset detection, adaptive threshold adjustment based on local noise levels, physical constraints on eye movements to exclude noise and jitter, and new recommendations for minimum allowed fixation and saccade durations. Note that the data was obtained using a high-speed 1250 Hz SMI system; how the algorithm performs on a typical remote tracker running at 50-250 Hz has yet to be determined.
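The data-driven threshold adaptation mentioned above can be sketched roughly as follows: the saccade peak-velocity threshold is repeatedly re-estimated from the samples below it until it settles at the noise level of the recording. The constants (initial threshold, the factor k, the tolerance) are illustrative, and the published algorithm does considerably more (onset/offset detection, glissade classification).

```python
def adaptive_peak_threshold(velocities, pt=100.0, k=6.0, tol=1.0):
    """Adapt a saccade peak-velocity threshold (deg/s) to the data.

    Each iteration recomputes the threshold as mean + k*std of the
    sub-threshold (noise/fixation) samples, so noisy recordings end
    up with a higher threshold than clean ones.
    """
    while True:
        below = [v for v in velocities if v < pt]
        mu = sum(below) / len(below)
        sd = (sum((v - mu) ** 2 for v in below) / len(below)) ** 0.5
        new_pt = mu + k * sd
        if abs(new_pt - pt) < tol:   # converged
            return new_pt
        pt = new_pt
```

On a recording where fixation noise sits around a few deg/s with occasional saccade peaks of hundreds of deg/s, the threshold drops from its generic starting value to just above the noise floor, which is what makes the detection robust across trackers with different noise levels.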

Wednesday, September 9, 2009

Psychnology Journal: Gaze control for work and play

"PsychNology Journal (ISSN 1720-7525) is a quadrimestral, international, peer-reviewed journal on the relationship between humans and technology. The name 'PsychNology' emphasizes its multidisciplinary interest in all issues related to the human adoption and development of technologies. Its broad scope allows to host in a sole venue advances and ideas that would otherwise remain confined within separate communities or disciplines. PNJ is an independent, electronic publication that leaves the copyright to authors, and provides wide accessibility to their papers through the Internet and several indexing and abstracting services including PsycInfo and EBSCO."

The Psychnology Journal special edition on Gaze control for work and play is now available online. It contains some of the highlights from the COGAIN conference last year in an extended journal format. For the COGAIN people this is old news; for the rest it's hopefully interesting stuff. The NeoVisus prototype I presented in Prague should have appeared as well, but unfortunately I did not have the time to make the necessary changes. More information on the scrollable keyboard, and on text entry by gaze in general, is available in Päivi's excellent Ph.D. thesis. Also, rumor has it that Javier San Agustin's Ph.D. thesis on gaze interaction as a low-cost alternative is getting closer to D-day. We're all looking forward to it, hang in there mate =)

Thursday, August 20, 2009

A geometric approach to remote eye tracking (Villanueva et al, 2009)

Came across this paper today. It's good news and a great achievement, especially since consumer products for recording high-definition video over a plain USB port have begun to appear. For example, the upcoming Microsoft Lifecam Cinema HD provides 1,280 x 720 at 30 frames per second. It is to be released on September 9th at a reasonable US$80. Hopefully it will allow a simple modification to remove the infrared-blocking filter. Things are looking better and better for low-cost eye tracking; keep up the excellent work, it will make a huge difference for all of us.

Abstract
"This paper presents a principled analysis of various combinations of image features to determine their suitability for remote eye tracking. It begins by reviewing the basic theory underlying the connection between eye image and gaze direction. Then a set of approaches is proposed based on different combinations of well-known features and their behaviour is evaluated, taking into account various additional criteria such as free head movement, and minimum hardware and calibration requirements. The paper proposes a final method based on multiple glints and the pupil centre; the method is evaluated experimentally. Future trends in eye tracking technology are also discussed."


The algorithms were implemented in C++ running on a Windows PC equipped with a Pentium 4 processor at 3 GHz and 1 GB of RAM. The camera of choice delivers 15 frames per second at 1280 x 1024. The optimal distance from the screen is 60 cm, which is rather typical for remote eye trackers. This provides a track-box volume of 20 x 20 x 20 cm. Within this area the algorithms produce an average accuracy of 1.57 degrees. An accuracy of 1 degree may be achieved if the head is in the same position as it was during calibration. Moving the head parallel to the monitor plane increases the error by 0.2-0.4 degrees, while moving closer or further away introduces a larger error of 1-1.5 degrees (mainly due to the camera's focus range). Note that no temporal filtering was used in the reporting. All in all, these results are not far from what typical remote systems produce.


The limitation of 15 fps stems from the frame rate of the camera; the software itself is able to process over 50 images per second on the specified machine. That leaves it to our imagination what frame rates might be achieved with a fast Intel Core i7 processor with four cores.


  • A. Villanueva, G. Daunys, D. Hansen, M. Böhme, R. Cabeza, A. Meyer, and E. Barth, "A geometric approach to remote eye tracking," Universal Access in the Information Society. [Online]. Available: http://dx.doi.org/10.1007/s10209-009-0149-0

Tuesday, August 18, 2009

COGAIN Student Competition Results

Lasse Farnung Laursen, a Ph.D. student with the Department of Informatics and Mathematical Modeling at the Technical University of Denmark, won this year's COGAIN student competition with a leisure application called GazeTrain.

"GazeTrain (illustrated in the screenshot below) is an action oriented puzzle game, that can be controlled by eye movements. In GazeTrain you must guide a train by placing track tiles in front of it. As you guide the train, you must collect various cargo and drop them off at the nearest city thereby earning money. For further details regarding how to play the game, we encourage you to read the tutorial accessible from the main menu. The game is quite customizable as the dwell time and several other parameters can be adjusted to best suit your play-style." (Source)

The GazeTrain game.

The runners-up, sharing second place, were:

Music Editor, developed by Ainhoa Yera Gil, Public University of Navarre, Spain. Music Editor is a gaze-operated application that allows the user to compose, edit and play music by eye movements. The reviewers appreciated it that "a user can not only play but can actually create something" and that "Music Editor is well suited for gaze control".

Gaze Based Sudoku, developed by Juha Hjelm and Mari Pesonen, University of Tampere, Finland. The game can be operated by eye movements and it has three difficulty levels. Reviewers especially appreciated how "the separation between viewing and controlling and between sudoku grid and number selection panel is solved" and that the game "has no time constraints" so it is "relaxing" to play.

Tuesday, August 11, 2009

ALS Society of British Columbia announces Engineering Design Awards (Canadian students only)

"The ALS Society of British Columbia has established three Awards to encourage and recognize innovation in technology to substantially improve the quality of life of people living with ALS (Amyotrophic Lateral Sclerosis, also known as Lou Gehrig’s Disease). Students at the undergraduate or graduate level in engineering or a related discipline at a post-secondary institution in British Columbia or elsewhere in Canada are eligible for the Awards. Students may be considered individually or as a team. Mentor Awards may also be given to faculty supervising students who win awards" (see Announcement)


Project ideas:
  • Low-cost eye tracker
    • Issue: Current commercial eye-gaze tracking systems cost thousands to tens of thousands of dollars. The high cost of eye-gaze trackers prevents potential users from accessing eye-gaze tracking tools. The hardware components required for eye-gaze tracking do not justify the price, and a lower-cost alternative is desirable. Webcams may be used for low-cost imaging, along with simple infrared diodes for system lighting. Alternatively, visible light systems may also be investigated. Open-source eye-gaze tracking software is also available (ed: ITU GazeTracker, OpenEyes, TrackEye, OpenGazer and MyEye (free, no source)).
    • Goal: The goal of this design project is to develop a low-cost and usable eye-gaze tracking system based on simple commercial off-the-shelf hardware.
    • Deliverables: A working prototype of a functional, low-cost (< $200), eye-gaze tracking system.
  • Eye-glasses compensation
    • Issue: The use of eye-glasses can cause considerable problems in eye-gaze tracking. The issue stems from reflections off the eye-glasses due to the use of controlled infrared lighting (on and off axis light sources) used to highlight features of the face. The key features of interest are the pupils and glints (or reflections of the surface of the cornea). Incorrectly identifying the pupils and glints then results in invalid estimation of the point-of-gaze.
    • Goal: The goal of this design project is to develop techniques for either: 1) avoiding image corruption with eye-glasses on a commercial eye-gaze tracker, or 2) developing a controlled lighting scheme to ensure that the pupil and glints are correctly identified in the presence of eye-glasses.
    • Deliverables: Two forms of deliverables are possible: 1) A working prototype illustrating functional eye-gaze tracking in the presence of eye-glasses with a commercial eye-gaze tracker, or 2) A working prototype illustrating accurate real-time identification of the pupil and glints using controlled infrared lighting (on and off axis light sources) in the presence of eye-glasses.
  • Innovative selection with ALS and eye gaze
    • Issue: As mobility steadily decreases in the more advanced stages of ALS, alternative techniques for selection are required. Current solutions include head switches, sip-and-puff switches and dwell-time activation, to name a few, depending on the degree of mobility loss. The use of dwell time requires no mobility other than eye motion; however, this technique suffers from 'lag', in that the user must wait the dwell-time duration for each selection, as well as from the 'Midas touch' problem, in which unintended selections occur if the gaze point is stationary for too long.
    • Goal: The goal of this design project is to develop a technique for improved selection with eye-gaze for individuals with only eye-motion available. Possible solutions may involve novel HCI designs for interaction, including various adaptive and predictive technologies, the consideration of contextual cues, and the introduction of ancillary inputs, such as EMG, EEG.
    • Deliverables: A working prototype illustrating eye-motion only selection with a commercial eye-gaze tracking system.
  • Novel and valuable eye-gaze tracking applications and application enhancements
    • Issue: To date, relatively few gaze-tracking applications have been developed. These include relatively simplistic applications such as the tedious typing of words, and even in such systems, little is done to ease the effort required, e.g., systems typically do not allow for the saving and reuse of words and sentences.
    • Goal: The goal of this design project is to develop one or more novel applications or application enhancements that take gaze as input, and that provide new efficiencies or capabilities that could significantly improve the quality of life of those living with ALS.
    • Deliverables: A working prototype illustrating one or more novel applications that take eye-motion as an input. The prototype must be developed and implemented to the extent that an evaluation of the potential efficiencies and/or reductions in effort can be evaluated by persons living with ALS and others on an evaluation panel.
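As a baseline for the selection project above, dwell-time activation can be sketched as a small state machine: a target fires once gaze has rested on it for the dwell duration, and fires only once until gaze leaves it (a crude guard against the Midas touch). The event representation and timing scheme are illustrative assumptions.

```python
# Minimal dwell-time selection loop. A selection event is emitted when
# gaze has stayed on the same target for at least `dwell` seconds.

def dwell_select(samples, dwell=0.5):
    """samples: list of (timestamp_s, target_id or None) gaze hits,
    in time order. Returns a list of (timestamp_s, target_id)
    selection events."""
    selections = []
    current, since, fired = None, None, False
    for t, target in samples:
        if target != current:
            # Gaze moved to a new target (or off all targets): restart.
            current, since, fired = target, t, False
        elif (current is not None and not fired
                and t - since >= dwell):
            selections.append((t, current))
            fired = True    # suppress repeat fires while gaze lingers
    return selections
```

The 'lag' complaint in the issue text is visible here: every selection costs at least `dwell` seconds, which is exactly what the proposed ancillary inputs (EMG, EEG) or predictive techniques would try to shortcut.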

See the Project Ideas for more information. For contact information see page two of the announcement.

Lund Eye-Tracking Academy (LETA)

Kenneth Holmqvist and his team at the Humanities Lab at Lund University, Sweden, will host another three-day LETA training course in eye tracking and analysis of eye movement data. This is an excellent opportunity to get hands-on experience using state-of-the-art equipment and setting up experiments. The course is held on 23rd-25th September and registration is open.

Course contents
• Pros and cons of head-mounted, remote and contact eye-trackers.
• High sampling speed and detailed precision – who needs it?
• Gaze-overlaid videos vs datafiles – what can you do with them?
• How to set up and calibrate on a variety of subjects on different eye-trackers?
• Glasses, lenses, mascara, and drooping eye-lids – what to do?
• How to work with stimulus programs, and synchronize them with eye-tracking recording?
• How to deal with the consent forms and ethical issues?
• Short introduction to experimental design: Potentials and pitfalls.
• Visualisation of data vs number crunching.
• Fast data analysis of multi-user experiments.
• Fixation durations, saccadic amplitudes, transition diagrams, group similarity measures, and all the other measures – what do they tell us? What are the pitfalls?

Teaching methods
Lectures on selected topics (8h)
Hands-on work in our lab on prespecified experiments: receiving and recording on a subject (9h). Hands-on initial data analysis (3h).

Eye-tracking systems available for this training
2*SMI HED 50 Hz with Polhemus Head-tracking
3*SMI HiSpeed 240/1250 Hz
1*SMI RED-X remote 50 Hz
2*SMI HED-mobile 50/200 Hz

Thursday, August 6, 2009

Päivi Majaranta's PhD Thesis on Text Entry by Eye Gaze

The most complete publication on gaze typing is now available: Päivi Majaranta at the University of Tampere has successfully defended her PhD thesis. It summarizes previous work and discusses and exemplifies important topics such as word prediction, layout, feedback and user aspects. The material is presented in a straightforward manner with a clear structure and excellent illustrations. It will without doubt be useful for anyone who is about to design and develop a gaze-based text entry interface. Congratulations, Päivi, on such a well-written thesis.



Friday, July 31, 2009

SMI RED 250!


Today SMI announced the new RED250 which, as the name suggests, has an impressive 250 Hz sampling rate. It has an accuracy of 0.5 degrees or better (typ.), less than 10 ms latency, and operates at a head distance of 60-80 cm. The track-box is 40x40 cm at 70 cm distance, and the system recovers tracking faster than the previous model. No details on pricing yet, but top-of-the-line performance never comes cheap. Get the flyer as PDF.

Survey on gaze visualization in 3D virtual environments

Got an email today from Sophie Stellmach, a PhD student in the User Interface & Software Engineering group at the Otto-von-Guericke University in Germany. She has posted an online survey and would like some input from eye tracking specialists on 3D gaze visualization.

"In the course of my work I have developed several gaze visualizations for facilitating eye tracking studies in (static) three-dimensional virtual environments. In order to evaluate the potential utility of these techniques, I am conducting an online survey with eye tracking researchers and professionals. I would like to invite you to this survey as I think that your answers are highly valuable for this investigation. The survey should take less than 10 minutes of your time! Your answers will be stored anonymously. You can access the survey under the following link: http://gamescience.bth.se/survey/index.php?sid=27319=en "

Wednesday, July 22, 2009

Telegaze update

Remember the TeleGaze robot developed by Hemin Omer, which I wrote about last September? Today there is a new video available showing an updated interface which appears to be somewhat improved; no further information is available.
Update: The new version includes an automatic "person-following" mode which can be turned on or off through the interface. See the video below.

Gaze Interaction in Immersive Virtual Reality - 3D Eye Tracking in Virtual Worlds

Thies Pfeiffer (blog), working in the A.I. group at the Faculty of Technology, Bielefeld University, Germany, has presented some interesting research on 3D gaze interaction in virtual environments. As the video demonstrates, they have achieved high accuracy for gaze-based pointing and selection. This opens up a wide range of interesting man-machine interaction where digital avatars may mimic natural human behavior. Impressive.



Publications
  • Pfeiffer, T. (2008). Towards Gaze Interaction in Immersive Virtual Reality: Evaluation of a Monocular Eye Tracking Set-Up. In Virtuelle und Erweiterte Realität - Fünfter Workshop der GI-Fachgruppe VR/AR, 81-92. Aachen: Shaker Verlag GmbH. [Abstract] [BibTeX] [PDF]
  • Pfeiffer, T., Latoschik, M.E. & Wachsmuth, I. (2008). Evaluation of Binocular Eye Trackers and Algorithms for 3D Gaze Interaction in Virtual Reality Environments. Journal of Virtual Reality and Broadcasting, 5 (16), dec. [Abstract] [BibTeX] [URL] [PDF]
  • Pfeiffer, T., Donner, M., Latoschik, M.E. & Wachsmuth, I. (2007). 3D fixations in real and virtual scenarios. Journal of Eye Movement Research, Special issue: Abstracts of the ECEM 2007, 13.
  • Pfeiffer, T., Donner, M., Latoschik, M.E. & Wachsmuth, I. (2007). Blickfixationstiefe in stereoskopischen VR-Umgebungen: Eine vergleichende Studie. In Vierter Workshop Virtuelle und Erweiterte Realität der GI-Fachgruppe VR/AR, 113-124. Aachen: Shaker. [Abstract] [BibTeX] [PDF]
List of all publications available here.

Wednesday, July 15, 2009

Gaze & Voice recognition game development blog

Jonathan O'Donovan, a master's student in Interactive Entertainment Technology at Trinity College Dublin, has recently started a blog for his thesis, which will combine gaze and voice recognition in a new video game. So far the few posts available have mainly concerned the underlying framework, but a proof-of-concept combining gaze and voice is demonstrated. The project will be developed on a Microsoft Windows platform and utilizes the XNA game development framework for graphics and the Microsoft Speech SDK for voice input. The eye tracker of choice is a Tobii T60 provided by Acuity ETS (Reading, UK). The thesis will be supervised by Veronica Sundstedt at the Trinity College Computer Science department.
Keep us posted, Jonathan; excited to see what you'll come up with!





Update: 
The project resulted in the Rabbit Run game which is documented in the following publication:

  • J. O’Donovan, J. Ward, S. Hodgins, V. Sundstedt (2009) Rabbit Run: Gaze and Voice Based Game Interaction (PDF). 

Monday, July 13, 2009

Oculis Labs' Chameleon prevents over-the-shoulder reading

"Two years ago computer security expert Bill Anderson read about scientific research on how the human eye moves as it reads and processes text and images. 'This obscure characteristic... suddenly struck me as (a solution to) a security problem,' says Anderson. With the help of a couple of software developers, Anderson developed a software program called Chameleon that tracks a viewer's gaze patterns and only allows an authorized user to read text on the screen, while everyone else sees gibberish. Chameleon uses gaze-tracking software and camera equipment to track an authorized reader's eyes to show only that one person the correct text. After a 15-second calibration period in which the software learns the viewer's gaze patterns, anyone looking over that user's shoulder sees dummy text that randomly and constantly changes. To tap the broader consumer market, Anderson built a more consumer-friendly version called PrivateEye, which can work with a simple Webcam to blur a user's monitor when he or she turns away. It also detects other faces in the background, and a small video screen pops up to alert the user that someone is looking at the screen. 'There have been inventions in the space of gaze-tracking. There have been inventions in the space of security,' says Anderson. 'But nobody has put the two ideas together, as far as we know.'" (source)

Patent application
Article by Baltimore Sun

Monday, June 29, 2009

Video from COGAIN2009

John Paulin Hansen has posted a video showing some highlights from the annual COGAIN conference. It demonstrates three available gaze interaction solutions: the COGAIN GazeTalk interface, Tobii Technologies' MyTobii and Alea Technologies' IG-30. These interfaces rely on dwell-activated on-screen keyboards (i.e. same procedure as last year).
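The dwell-activation principle these keyboards share is simple enough to sketch. The toy class below is purely illustrative: the names and the 0.8-second default are my own and not taken from GazeTalk, MyTobii or the IG-30.

```python
import time

class DwellButton:
    """Sketch of dwell-time selection: an on-screen key activates when
    gaze rests on it continuously for `dwell` seconds."""

    def __init__(self, label, dwell=0.8):
        self.label = label
        self.dwell = dwell
        self._enter_time = None  # when gaze last entered the button

    def update(self, gaze_on_button, now=None):
        """Call once per gaze sample; returns True on activation."""
        now = time.monotonic() if now is None else now
        if not gaze_on_button:
            self._enter_time = None          # gaze left: reset the timer
            return False
        if self._enter_time is None:
            self._enter_time = now           # gaze just arrived
        if now - self._enter_time >= self.dwell:
            self._enter_time = None          # fire once, then re-arm
            return True
        return False
```

A keyboard built on this would call `update()` for every incoming gaze sample; resetting the timer whenever gaze leaves the key is what makes the scheme tolerant of brief glances.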


Monday, June 1, 2009

COGAIN 2009 Proceedings now online

There is little reason to doubt the vitality of the COGAIN network. This year's proceedings present an impressive 18 papers spread over one hundred pages, covering a wide range of areas, from low-cost eye tracking, text entry, gaze input for gaming and multimodal interaction to environment control, clinical assessments and case studies. Unfortunately I was unable to attend the event this year (recently relocated), but with the hefty proceedings available online there is plenty of material to read through (program and links to authors here). Thanks to Arantxa Villanueva, John Paulin Hansen and Bjarne Kjaer Ersboll for the editorial effort.

Tuesday, May 26, 2009

Toshiba eye tracking for automotive applications

Seen this one coming for a while. Wonder how stable it would be in a real-life scenario...
Via Donald Melanson at Engadget:
"We've seen plenty of systems that rely on facial recognition for an interface, but they've so far been a decidedly rarer occurrence when it comes to in-car systems. Toshiba looks set to change that, however, with it now showing off a new system that'll not only let you control the A/C or radio with the glance of your eye, but alert you if you happen to take your eyes off the road for too long. That's done with the aid of a camera mounted above the steering wheel that's used to identify and map out the driver's face, letting the car (or desktop PC in this demonstration) detect everything from head movement and eye direction to eyelid blinks, which Toshiba says could eventually be used to alert drowsy drivers. Unfortunately, Toshiba doesn't have any immediate plans to commercialize the technology, although it apparently busily working to make it more suited for embedded CPUs." (source)

Tuesday, May 19, 2009

Hands-free Interactive Image Segmentation Using Eyegaze (Sadeghi, M. et al., 2009)

Maryam Sadeghi, a master's student at the Medical Image Analysis Lab at Simon Fraser University in Canada, presents an interesting paper on using eye tracking for gaze-driven image segmentation. The research has been performed in cooperation with Geoff Tien (PhD student), Dr. Hamarneh and Stella Atkins (principal investigators). More information is to be published on this page. Geoff Tien completed his M.Sc. thesis on gaze interaction in March under the title "Building Interactive Eyegaze Menus for Surgery" (abstract); unfortunately I have not been able to locate an electronic copy of that document.

Abstract
"This paper explores a novel approach to interactive user-guided image segmentation, using eyegaze information as an input. The method includes three steps: 1) eyegaze tracking for providing user input, such as setting object and background seed pixel selection; 2) an optimization method for image labeling that is constrained or affected by user input; and 3) linking the two previous steps via a graphical user interface for displaying the images and other controls to the user and for providing real-time visual feedback of eyegaze and seed locations, thus enabling the interactive segmentation procedure. We developed a new graphical user interface supported by an eyegaze tracking monitor to capture the user's eyegaze movement and fixations (as opposed to traditional mouse moving and clicking). The user simply looks at different parts of the screen to select which image to segment, to perform foreground and background seed placement and to set optional segmentation parameters. There is an eyegaze-controlled "zoom" feature for difficult images containing objects with narrow parts, holes or weak boundaries. The image is then segmented using the random walker image segmentation method. We performed a pilot study with 7 subjects who segmented synthetic, natural and real medical images. Our results show that getting used to the new interface takes only about 5 minutes. Compared with traditional mouse-based control, the new eyegaze approach provided an 18.6% speed improvement for more than 90% of images with high object-background contrast. However, for low contrast and more difficult images it took longer to place seeds using the eyegaze-based "zoom" to relax the required eyegaze accuracy of seed placement." Download paper as PDF.

The custom interface is used to place background (red) and object (green) seeds which are used in the segmentation process. The custom fixation detection algorithm triggers a mouse click at the gaze position if 20 of the previous 30 gaze samples lie within a 50-pixel radius.
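As a rough sketch, that trigger rule can be written as a rolling dispersion test. Only the 20-of-30-samples, 50-pixel criterion comes from the paper; the function structure and names below are my own.

```python
import math

def make_fixation_trigger(window=30, required=20, radius=50.0):
    """Return a feed(x, y) function that yields the fixation centroid
    when `required` of the last `window` gaze samples fall within
    `radius` pixels of their centroid, and None otherwise."""
    samples = []

    def feed(x, y):
        samples.append((x, y))
        if len(samples) > window:
            samples.pop(0)                   # keep only the last `window` samples
        if len(samples) < window:
            return None                      # not enough history yet
        cx = sum(p[0] for p in samples) / window
        cy = sum(p[1] for p in samples) / window
        hits = sum(1 for px, py in samples
                   if math.hypot(px - cx, py - cy) <= radius)
        return (cx, cy) if hits >= required else None

    return feed
```

In the application described above, a non-None result would be translated into a mouse click at the returned centroid; requiring only 20 of 30 samples makes the trigger robust to a few noisy outliers.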


The results indicate a certain degree of feasibility for gaze-assisted segmentation; however, real-life situations often contain more complex images where the borders of objects are less defined. This is also indicated in the results, where the CT brain scan represents the difficult category. For an initial study the results are interesting, and it is likely that we'll see more gaze interaction within domain-specific applications in the near future.


  • Maryam Sadeghi, Geoff Tien, Ghassan Hamarneh, and Stella Atkins. Hands-free Interactive Image Segmentation Using Eyegaze. In SPIE Medical Imaging 2009: Computer-Aided Diagnosis. Proceedings of the SPIE, Volume 7260 (pdf)

Wednesday, May 13, 2009

GaCIT 2009 : Summer School on Gaze, Communication, and Interaction Technology

"The GaCIT summer school offers an intensive one-week camp where doctoral students and researchers can learn and refresh skills and knowledge related to gaze-based text entry under the tutelage of leading experts in the area. The program will include theoretical lectures and hands-on exercises, an opportunity for participants to present their own work, and a social program enabling participants to exchange their experiences in a relaxing and inspiring atmosphere."

The GaCIT workshop is organized by the graduate school on User-Centered Information Technology at the University of Tampere, Finland (map). The workshop runs July 27-31. I attended last year and found it to be a great week with interesting talks and social events. See the day-by-day coverage of GaCIT 2008.

Topics and speakers:
  • Introduction to Gaze-based Communication (Howell Istance)

  • Evaluation of Text Entry Techniques (Scott MacKenzie)
    Survey of text entry methods. Models, metrics, and procedures for evaluating text entry methods.

  • Details of Keyboards and Users Matter (Päivi Majaranta)
    Issues specific to eye-tracker use of soft keyboards, special issues in evaluating text entry techniques with users that use eye trackers for communication.

  • Communication by Eyes without Computers (TBA)
    Introduction to eye-based communication using low-tech devices.

  • Gesture-based Text Entry Techniques (Poika Isokoski)
    Overview of studies evaluating techniques such as Dasher, Quikwriting and EdgeWrite in the eye-tracker context.

  • Low-cost Devices and the Future of Gaze-based Text Entry (John Paulin Hansen)
    Low-cost eye tracking and its implications for text entry systems. Future of gaze-based text entry.

  • Dwell-free text entry techniques (Anke Huckauf)
    Introduction to gaze-based techniques that do not utilize the dwell-time protocol for item selection.

Visit the GaCIT 2009 website for more information.

Hi-fi eyetracking with a lo-fi eyetracker: An experimental usability study of an eyetracker built from a standard web camera (Barret, M., 2009)

Marie Barret, a master's student at the ITU Copenhagen, has now finished her thesis. It evaluates eye typing performance using the ITU Gaze Tracker (a low-cost webcam eye tracker) with the StarGazer and GazeTalk interfaces. The thesis is written in Danish (113 pages), but I took the liberty of translating two of its charts, found below. The results will be presented in English at the COGAIN 2009 conference on May 26th (session three, track one, at 1:50 PM). For now I quote the abstract:

"Innovation has facilitated sufficient mainstream technology to build eyetrackers from off-the-shelf components. Prices for standard eyetrackers start at around € 4000. This thesis describes an experimental usability study of gazetyping with a new input device built from a standard web camera without hardware modifications. Cost: € 20. Mainstreaming of assistive technologies holds potential for faster innovation, better service, lower prices and increased accessibility. Off-the-shelf eyetrackers must be usability competitive with standard eyetrackers in order to be adopted, as eyetracking - even with expensive hardware - presents usability issues. Usability is defined as effectiveness, efficiency and user satisfaction (ISO 9241-11, 1998).

Results from the 2 * 2 factors experiment significantly indicate how the new input device can reach the usability standards of expensive eyetrackers. This study demonstrates that the off-the-shelf-eyetracker can achieve efficiency similar to an expensive eyetracker with no significant effect from any of the tested factors. All four factors have significant impact on effectiveness. A factor that can eliminate the effectiveness difference between the standard hardware and an expensive eyetracker is identified. Another factor can additionally improve effectiveness.

Two gazetyping systems specifically designed for noisy conditions, e.g. due to bad calibration and jolting, are tested. StarGazer uses a zooming interface and GazeTalk uses large buttons in a static graphical user interface. GazeTalk is significantly more effective than StarGazer. The large on-screen buttons and static interface of GazeTalk with dwell-time activation absorb the noise from the input device, and typing speeds obtained are comparable to prior research with a regular eyetracker. Click activation has for years (Ware & Mikaelian, 1987) proved to improve the efficiency of gaze-based interaction. This experiment demonstrates that this result significantly applies to off-the-shelf eyetrackers as well. The input device relies on the user to compensate for offset with head movements. The keyboards should support this task with a static graphical user interface." Download thesis as PDF (in Danish)

Tuesday, May 12, 2009

BBC News: The future of gadget interaction

Dan Simmons at the BBC reports on future technologies from the Science Beyond Fiction 2009 conference in Prague. The news item includes a section on the GazeCom project, which won the 2nd prize for its exhibit "Gaze-contingent displays and interaction". Their website hosts additional demonstrations.

"Gaze tracking is well-established and has been used before now by online advertisers who use it to decide the best place to put an advert. A novel use of the system tracks someone's gaze and brings into focus the area of a video being watched by blurring their peripheral vision. In the future, the whole image could also be panned left or right as the gaze approaches the edge of the screen. Film producers are interested in using the system to direct viewers to particular parts within a movie. However, interacting with software through simply looking will require accurate but unobtrusive eye tracking systems that, so far, remain on the drawing board... The European Commission (EC) is planning to put more cash into such projects. In April it said it would increase its investment in this field from 100m to 170m euros (£89m-£152m) by 2013." (BBC source) More information about the EC CORDIS : ICT program.

External link: BBC reporter Dan Simmons tests a system designed to use a driver's peripheral vision to flag up potential dangers on the road. It was recorded at the Science Beyond Fiction conference in Prague.
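The gaze-contingent blurring idea can be illustrated with a toy per-pixel weighting function: sharp inside a foveal region around the gaze point, increasingly blurred further out. The radii below are made-up values for illustration, not GazeCom's actual parameters.

```python
def blur_weight(px, py, gx, gy, fovea=100.0, falloff=200.0):
    """Amount of blur (0.0 = sharp, 1.0 = fully blurred) to apply at
    pixel (px, py) given the current gaze point (gx, gy). The image
    stays sharp within `fovea` pixels of the gaze, then blur ramps up
    linearly over `falloff` pixels."""
    d = ((px - gx) ** 2 + (py - gy) ** 2) ** 0.5
    if d <= fovea:
        return 0.0
    return min(1.0, (d - fovea) / falloff)
```

A renderer would evaluate this for every pixel (or per image tile, for speed) on each new gaze sample and blend between a sharp and a pre-blurred frame accordingly; a smooth ramp rather than a hard edge is what keeps the effect unobtrusive.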

The GazeCom project involves the following partners:

ETRA 2010 Call for papers

ETRA 2010 will be the sixth biennial symposium in a series that focuses on all aspects of eye movement research across a wide range of disciplines. The goal of ETRA is to bring together computer scientists, engineers and behavioral scientists in support of a common vision of enhancing eye tracking research and applications. ETRA 2010 is being organized in conjunction with the European Communication by Gaze Interaction (COGAIN) research network that specializes in gaze-based interaction for the benefit of people with physical disabilities.

Update: List of accepted and presented papers.

Symposium Themes
  • Advances in Eye Tracking Technology and Data Analysis
    Eye tracking systems, calibration algorithms, data analysis techniques, noise reduction, predictive models, 3D POR measurement, low cost and natural light systems.
  • Visual Attention and Eye Movement Control
    Studies of eye movements in response to natural stimuli, driving studies, web use and usability studies.
  • Eye Tracking Applications
    Gaze-contingent displays, attentive user interfaces, gaze-based interaction techniques, security systems, multimodal interfaces, augmented and mixed reality systems, ubiquitous computing.
  • Special Theme: Eye Tracking and Accessibility
    Eye tracking has proved to be an effective means of making computers more accessible when the use of keyboards and mice is hindered by the task itself (such as driving) or by physical disabilities. We invite submissions that explore new methodological strategies, applications, and results that use eye tracking in assistive technologies for access to desktop applications, for environment and mobility control, and for gaze control of games and entertainment.
Two categories of submissions are being sought – Full Papers and Short Papers.
Full papers must be submitted electronically through the ETRA 2010 website and conform to the ACM SIGGRAPH proceedings category 2 format. Full paper submissions can have a maximum length of eight pages and should be made in double-blind format, hiding the authors' names and affiliations and all references to the authors' previous work. Those wishing to submit a full paper must submit an abstract in advance to facilitate the reviewing process. Accepted papers will be published in the ETRA 2010 proceedings, and the authors will give a 20-minute oral presentation of the paper at the conference.

Short papers may present work that has a smaller scope than a full paper or may present late-breaking results. These must be submitted electronically through the ETRA 2010 submission website and conform to the ACM SIGGRAPH proceedings category 3 format. Short paper submissions have a maximum length of four pages (but can be as short as a one-page abstract). Given the time constraints of this type of paper, submissions must be made in camera-ready format including authors' names and affiliations. Accepted submissions will be published in the ETRA 2010 proceedings. Authors will present a poster at the conference, and authors of the most highly rated submissions will give a 10-minute presentation of the paper in a Short Papers session. All submissions will be peer-reviewed by members of an international review panel and members of the program committee. Best Paper Awards will be given to the most highly ranked Full Papers and Short Papers.

Full Papers Deadlines
  • Sep. 30th, 2009 Full Papers abstract submission deadline
  • Oct. 7th, 2009 Full Papers submission deadline
  • Nov. 13th, 2009 Acceptance notification
Short Papers Deadlines
  • Dec. 2nd, 2009 Short Papers submission deadline
  • Jan. 8th, 2010 Short Papers acceptance notification
  • Jan. 15th, 2010 All camera ready papers due
More information on the ETRA website.