Monday, June 29, 2009

Video from COGAIN2009

John Paulin Hansen has posted a video showing some highlights from the annual COGAIN conference. It demonstrates three available gaze interaction solutions: the COGAIN GazeTalk interface, Tobii Technology's MyTobii and Alea Technologies' IG-30. These interfaces rely on dwell-activated on-screen keyboards (i.e. the same procedure as last year).
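All three systems rely on the same dwell principle: a key is selected once the gaze has rested on it longer than a fixed threshold. A minimal sketch of such a selection loop, assuming a 50 Hz tracker, a 500 ms dwell threshold and a hit-test callback (none of these figures come from the systems shown in the video):

```python
# Dwell-activation sketch. The sample rate, threshold and hit_test callback
# are illustrative assumptions, not details of GazeTalk, MyTobii or the IG-30.
DWELL_MS = 500            # gaze must stay on a key this long to select it
SAMPLE_INTERVAL_MS = 20   # assuming a 50 Hz eye tracker

def dwell_select(gaze_samples, hit_test):
    """gaze_samples: iterable of (x, y) positions; hit_test: (x, y) -> key or None.
    Yields a key each time the gaze has dwelled on it long enough."""
    current_key, elapsed = None, 0
    for x, y in gaze_samples:
        key = hit_test(x, y)
        if key is not None and key == current_key:
            elapsed += SAMPLE_INTERVAL_MS
            if elapsed >= DWELL_MS:
                yield key        # selection triggered
                elapsed = 0      # reset so the key is not re-selected immediately
        else:
            current_key, elapsed = key, 0
```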


Monday, June 1, 2009

COGAIN 2009 Proceedings now online

There is little reason to doubt the vitality of the COGAIN network. This year's proceedings present an impressive 18 papers spread over one hundred pages, covering a wide range of areas: low-cost eye tracking, text entry, gaze input for gaming, multimodal interaction, environment control, clinical assessments and case studies. Unfortunately I was unable to attend the event this year (I recently relocated), but with the hefty proceedings available online there is plenty of material to read through (program and links to authors here). Thanks to Arantxa Villanueva, John Paulin Hansen and Bjarne Kjaer Ersboll for the editorial effort.

Tuesday, May 26, 2009

Toshiba eye tracking for automotive applications

Seen this one coming for a while. I wonder how stable it would be in a real-life scenario...
Via Donald Melanson at Engadget:
"We've seen plenty of systems that rely on facial recognition for an interface, but they've so far been a decidedly rarer occurrence when it comes to in-car systems. Toshiba looks set to change that, however, with it now showing off a new system that'll not only let you control the A/C or radio with the glance of your eye, but alert you if you happen to take your eyes off the road for too long. That's done with the aid of a camera mounted above the steering wheel that's used to identify and map out the driver's face, letting the car (or desktop PC in this demonstration) detect everything from head movement and eye direction to eyelid blinks, which Toshiba says could eventually be used to alert drowsy drivers. Unfortunately, Toshiba doesn't have any immediate plans to commercialize the technology, although it apparently busily working to make it more suited for embedded CPUs." (source)

Tuesday, May 19, 2009

Hands-free Interactive Image Segmentation Using Eyegaze (Sadeghi, M. et al., 2009)

Maryam Sadeghi, a Master's student at the Medical Image Analysis Lab at Simon Fraser University in Canada, presents an interesting paper on using eye tracking for gaze-driven image segmentation. The research was performed in cooperation with Geoff Tien (Ph.D. student), Dr. Hamarneh and Stella Atkins (principal investigators). More information is to be published on this page. Geoff Tien completed his M.Sc. thesis on gaze interaction in March under the title "Building Interactive Eyegaze Menus for Surgery" (abstract); unfortunately I have not been able to locate an electronic copy of that document.

Abstract
"This paper explores a novel approach to interactive user-guided image segmentation, using eyegaze information as an input. The method includes three steps: 1) eyegaze tracking for providing user input, such as setting object and background seed pixel selection; 2) an optimization method for image labeling that is constrained or affected by user input; and 3) linking the two previous steps via a graphical user interface for displaying the images and other controls to the user and for providing real-time visual feedback of eyegaze and seed locations, thus enabling the interactive segmentation procedure. We developed a new graphical user interface supported by an eyegaze tracking monitor to capture the user's eyegaze movement and fixations (as opposed to traditional mouse moving and clicking). The user simply looks at different parts of the screen to select which image to segment, to perform foreground and background seed placement and to set optional segmentation parameters. There is an eyegaze-controlled "zoom" feature for difficult images containing objects with narrow parts, holes or weak boundaries. The image is then segmented using the random walker image segmentation method. We performed a pilot study with 7 subjects who segmented synthetic, natural and real medical images. Our results show that getting used the new interface takes about only 5 minutes. Compared with traditional mouse-based control, the new eyegaze approach provided a 18.6% speed improvement for more than 90% of images with high object-background contrast. However, for low contrast and more difficult images it took longer to place seeds using the eyegaze-based "zoom" to relax the required eyegaze accuracy of seed placement." Download paper as pdf.

The custom interface is used to place background (red) and object (green) seeds, which are used in the segmentation process. The custom fixation detection algorithm triggers a mouse click at the gaze position if 20 of the previous 30 gaze samples lie within a 50-pixel radius.
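A sketch of that fixation test as described; the only addition of my own is using the centroid of the window as the click position, since the text just says the click goes to the gaze position:

```python
# Fixation-triggered click following the description above: a click fires when
# 20 of the last 30 gaze samples lie within a 50 px radius.
from collections import deque
import math

class FixationClicker:
    WINDOW, REQUIRED, RADIUS = 30, 20, 50.0

    def __init__(self):
        self.history = deque(maxlen=self.WINDOW)

    def update(self, x, y):
        """Feed one gaze sample; returns a click position or None."""
        self.history.append((x, y))
        if len(self.history) < self.WINDOW:
            return None
        cx = sum(px for px, _ in self.history) / len(self.history)
        cy = sum(py for _, py in self.history) / len(self.history)
        inside = sum(1 for px, py in self.history
                     if math.hypot(px - cx, py - cy) <= self.RADIUS)
        # Click position: window centroid (my assumption; the paper only says
        # the click is sent to the gaze position).
        return (cx, cy) if inside >= self.REQUIRED else None
```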


The results indicate a certain degree of feasibility for gaze-assisted segmentation; however, real-life situations often involve more complex images where object borders are less well defined. This is also reflected in the results, where the CT brain scan represents the difficult category. For an initial study the results are interesting, and it is likely that we will see more gaze interaction within domain-specific applications in the near future.


  • Maryam Sadeghi, Geoff Tien, Ghassan Hamarneh, and Stella Atkins. Hands-free Interactive Image Segmentation Using Eyegaze. In SPIE Medical Imaging 2009: Computer-Aided Diagnosis. Proceedings of the SPIE, Volume 7260 (pdf)

Wednesday, May 13, 2009

GaCIT 2009 : Summer School on Gaze, Communication, and Interaction Technology

"The GaCIT summer school offers an intensive one-week camp where doctoral students and researchers can learn and refresh skills and knowledge related to gaze-based text entry under the tutelage of leading experts in the area. The program will include theoretical lectures and hands-on exercises, an opportunity for participants to present their own work, and a social program enabling participants to exchange their experiences in a relaxing and inspiring atmosphere."

The GaCIT workshop is organized by the graduate school on User-Centered Information Technology at the University of Tampere, Finland (map). The workshop runs July 27-31. I attended last year and found it to be a great week with interesting talks and social events. See the day-by-day coverage of GaCIT 2008.

Topics and speakers:
  • Introduction to Gaze-based Communication (Howell Istance)

  • Evaluation of Text Entry Techniques (Scott MacKenzie)
    Survey of text entry methods; models, metrics, and procedures for evaluating them (a sketch of two common metrics follows this list).

  • Details of Keyboards and Users Matter (Päivi Majaranta)
    Issues specific to eye-tracker use of soft keyboards, special issues in evaluating text entry techniques with users that use eye trackers for communication.

  • Communication by Eyes without Computers (TBA)
    Introduction to eye-based communication using low-tech devices.

  • Gesture-based Text Entry Techniques (Poika Isokoski)
    Overview of studies evaluating techniques such as Dasher, Quikwriting and EdgeWrite in the eye-tracker context.

  • Low-cost Devices and the Future of Gaze-based Text Entry (John Paulin Hansen)
    Low-cost eye tracking and its implications for text entry systems. Future of gaze-based text entry.

  • Dwell-free text entry techniques (Anke Huckauf)
    Introduction to gaze-based techniques that do not utilize the dwell-time protocol for item selection.
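As a taste of the kind of metrics covered in the text entry evaluation sessions, here is a small sketch of the two most common ones: entry rate in words per minute (using the usual five-characters-per-word convention) and the minimum string distance error rate between presented and transcribed phrases:

```python
# Common text-entry metrics: words per minute (one "word" = 5 characters,
# discounting the first character since timing starts at the first keystroke)
# and the minimum string distance (MSD) error rate.
def wpm(transcribed: str, seconds: float) -> float:
    return ((len(transcribed) - 1) / 5.0) * (60.0 / seconds)

def msd(a: str, b: str) -> int:
    """Levenshtein / minimum string distance between two strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def msd_error_rate(presented: str, transcribed: str) -> float:
    return 100.0 * msd(presented, transcribed) / max(len(presented), len(transcribed))

# Example: a 26-character phrase typed in 60 seconds -> 5.0 WPM
print(wpm("the quick brown fox jumped", 60.0))
```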

Visit the GaCIT 2009 website for more information.

Hi-fi eyetracking with a lo-fi eyetracker: An experimental usability study of an eyetracker built from a standard web camera (Barret, M., 2009)

Marie Barret, a master's student at the IT University of Copenhagen, has now finished her thesis. It evaluates eye typing performance using the ITU Gaze Tracker (a low-cost webcam eye tracker) with the StarGazer and GazeTalk interfaces. The thesis is written in Danish (113 pages), but I took the liberty of translating two of its charts, found below. The results will be presented in English at the COGAIN 2009 conference on May 26th (session three, track one, at 1:50 PM). For now I quote the abstract:

"Innovation has facilitated sufficient mainstream technology to build eyetrackers from off-the-shelf-components. Prices for standard eyetrackers start at around € 4000. This thesis describes an experimental usabilty study of gazetyping with a new input device built from a standard web camera without hardware modifications. Cost: € 20. Mainstreaming of assistive technologies holds potential for faster innovation, better service, lower prices and increased accessibility. Off-the-shelf-eyetrackers must be usability competitive to standard eyetrackers in order to be adopted, as eyetracking - even with expensive hardware - presents usability issues. Usability is defined as effectiveness, efficiency and user satisfaction (ISO 9242-11, 1998).

Results from the 2 * 2 factors experiment significantly indicate how the new input device can reach the usability standards of expensive eyetrackers. This study demonstrates that the off-the-shelf-eyetracker can achieve efficiency similar to an expensive eyetracker with no significant effect from any of the tested factors. All four factors have significant impact on effectiveness. A factor that can eliminate the effectiveness difference between the standard hardware and an expensive eyetracker is identified. Another factor can additionally improve effectiveness.

Two gazetyping systems specifically designed for noisy conditions, e.g. due to bad calibration and jolting, are tested. StarGazer uses a zooming interface and GazeTalk uses large buttons in a static graphic user interface. GazeTalk is significantly more effective than StarGazer. The large onscreen buttons and static interface of GazeTalk with dwell time activation absorb the noise from the input device and typing speeds obtained are comparable to prior research with a regular eyetracker. Click activation has for years (Ware & Mikaelian 1987) proved to improve efficiency of gaze-based interaction. This experiment demonstrates that this result significantly applies to off-the-shelf eyetrackers as well. The input device relies on the user to compensate for offset with head movements. The keyboards should support this task with a static graphic user interface." Download thesis as pdf (in Danish)
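The point about GazeTalk's large buttons absorbing tracker noise is easy to illustrate with a small simulation: given a gaze estimate offset by Gaussian noise, the hit rate on a square button grows rapidly with button size. The noise level and button sizes below are arbitrary assumptions of mine, not figures from the thesis:

```python
# Illustration (not from the thesis): hit rate of a noisy gaze estimate on a
# centred square button, for a few button sizes. Noise sigma is an assumption.
import random

def hit_rate(button_px: float, sigma_px: float, trials: int = 100_000) -> float:
    hits = sum(abs(random.gauss(0, sigma_px)) <= button_px / 2
               and abs(random.gauss(0, sigma_px)) <= button_px / 2
               for _ in range(trials))
    return hits / trials

for size in (60, 120, 250):   # px; 250 px is roughly a GazeTalk-sized cell (assumed)
    print(f"{size:4d} px button: {hit_rate(size, sigma_px=40):.0%} hit rate")
```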

Tuesday, May 12, 2009

BBC News: The future of gadget interaction

Dan Simmons at the BBC reports on future technologies from the Science Beyond Fiction 2009 conference in Prague. The news item includes a section on the GazeCom project, which won the 2nd prize for its exhibit "Gaze-contingent displays and interaction". Their website hosts additional demonstrations.

"Gaze tracking is well-established and has been used before now by online advertisers who use it to decide the best place to put an advert. A novel use of the system tracks someone's gaze and brings into focus the area of a video being watched by blurring their peripheral vision.In the future, the whole image could also be panned left or right as the gaze approaches the edge of the screen. Film producers are interested in using the system to direct viewers to particular parts within a movie. However, interacting with software through simply looking will require accurate but unobtrusive eye tracking systems that, so far, remain on the drawing board... The European Commission (EC) is planning to put more cash into such projects. In April it said it would increase its investment in this field from 100m to 170m euros (£89m-£152m) by 2013. " (BBC source ) More information about the EC CORDIS : ICT program.

External link: BBC reporter Dan Simmons tests a system designed to use a driver's peripheral vision to flag up potential dangers on the road. The clip was recorded at the Science Beyond Fiction conference in Prague.

The GazeCom project involves a number of European partners; see the project website for the full list.

ETRA 2010 Call for papers

ETRA 2010 will be the sixth biennial symposium in a series that focuses on all aspects of eye movement research across a wide range of disciplines. The goal of ETRA is to bring together computer scientists, engineers and behavioral scientists in support of a common vision of enhancing eye tracking research and applications. ETRA 2010 is being organized in conjunction with the European Communication by Gaze Interaction (COGAIN) research network that specializes in gaze-based interaction for the benefit of people with physical disabilities.

Update: List of accepted and presented papers.

Symposium Themes
  • Advances in Eye Tracking Technology and Data Analysis
    Eye tracking systems, calibration algorithms, data analysis techniques, noise reduction, predictive models, 3D POR measurement, low cost and natural light systems.
  • Visual Attention and Eye Movement Control
    Studies of eye movements in response to natural stimuli, driving studies, web use and usability studies.
  • Eye Tracking Applications
    Gaze-contingent displays, attentive user interfaces, gaze-based interaction techniques, security systems, multimodal interfaces, augmented and mixed reality systems, ubiquitous computing.
  • Special Theme: Eye Tracking and Accessibility
    Eye tracking has proved to be an effective means of making computers more accessible when the use of keyboards and mice is hindered by the task itself (such as driving), or by physical disabilities. We invite submissions that explore new methodological strategies, applications, and results that use eye tracking in assistive technologies for access to desktop applications, for environment and mobility control, and for gaze control of games and entertainment.
Two categories of submissions are being sought: Full Papers and Short Papers.
Full papers must be submitted electronically through the ETRA 2010 website and conform to the ACM SIGGRAPH proceedings category 2 format. Full paper submissions can have a maximum length of eight pages and should be made in double-blind format, hiding the authors' names and affiliations and all references to the authors' previous work. Those wishing to submit a full paper must submit an abstract in advance to facilitate the reviewing process. Accepted papers will be published in the ETRA 2010 proceedings, and the authors will give a 20-minute oral presentation of the paper at the conference.

Short papers may present work of smaller scope than a full paper or late-breaking results. These must be submitted electronically through the ETRA 2010 submission website and conform to the ACM SIGGRAPH proceedings category 3 format. Short paper submissions have a maximum length of four pages (but can be as short as a one-page abstract). Given the time constraints for this category, submissions must be made in camera-ready format, including authors' names and affiliations. Accepted submissions will be published in the ETRA 2010 proceedings. Authors will present a poster at the conference, and authors of the most highly rated submissions will give a 10-minute presentation of the paper in a Short Papers session. All submissions will be peer-reviewed by members of an international review panel and members of the program committee. Best Paper Awards will be given to the most highly ranked Full Papers and Short Papers.

Full Papers Deadlines
  • Sep. 30th, 2009 Full Papers abstract submission deadline
  • Oct. 7th, 2009 Full Papers submission deadline
  • Nov. 13th, 2009 Acceptance notification
Short Papers Deadlines
  • Dec. 2nd, 2009 Short Papers submission deadline
  • Jan. 8th, 2010 Short Papers acceptance notification
  • Jan. 15th, 2010 All camera ready papers due
More information on the ETRA website.

Thursday, May 7, 2009

Interactive Yarbus at MU, Netherlands

An interactive art exhibition by Christien Meindertsma at MU in the Netherlands generates scanpaths in real time to draw images similar to those presented in Yarbus' classic work. The main purpose is to illustrate individual differences in the way we look at objects (such as faces, umbrellas, cups, etc.). These images are then printed directly and become part of the exhibition. The exhibition runs until June 14th (location: Eindhoven).

A scanpath from Yarbus (1967) for comparison.
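The Yarbus-style rendering itself (fixations as circles scaled by duration, connected by saccade lines) takes only a few lines of plotting code. A sketch with made-up fixation data, purely for illustration:

```python
# Yarbus-style scanpath rendering: circles scaled by fixation duration,
# connected by saccade lines. The fixation data below is invented.
import matplotlib.pyplot as plt

fixations = [(120, 340, 0.25), (300, 310, 0.60), (340, 180, 0.20),
             (520, 200, 0.45), (410, 420, 0.30)]    # (x, y, duration in seconds)

xs, ys, durations = zip(*fixations)
plt.plot(xs, ys, linewidth=1)                        # saccades
plt.scatter(xs, ys, s=[d * 500 for d in durations])  # fixations
plt.gca().invert_yaxis()                             # screen coordinates: y grows downward
plt.axis("equal")
plt.show()
```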

Wednesday, May 6, 2009

COGAIN 2009 Program announced

This year's Communication by Gaze Interaction conference is held on the 26th of May in Lyngby, Denmark, in connection with VisionDay (a four-day event on computer vision). Registration should be made on or before May 14th. Download the program as pdf.

Update: the proceedings can be downloaded as pdf.


The program for May 26th
  • 08.00 Registration, exhibition, demonstrations, coffee, and rolls
SESSION I
  • 09.00 Welcome and introduction (Lars Pallesen, Rector @ DTU)
  • 09.10 Eye guidance in natural behaviour (B. W. Tatler)
  • 09.50 Achievements and experiences in the course of COGAIN (K. Raiha)
  • 10.30 Coffee, exhibition, demonstrations
SESSION II
  • 11.00 Joys and sorrows in communicating with gaze (A. Lykke-Larsen)
  • 11.30 An introduction to the 17 papers presented in the afternoon
  • 12.00 Lunch, exhibition, demonstrations, posters
SESSION III Track 1
SESSION III Track 2
  • 14.50 Coffee, exhibition, demonstrations, posters

SESSION IV Track 1
  • 15.30 Gameplay experience in a gaze interaction game (L. Nacke, S. Stellmach, D. Sasse & C. A. Lindley)
  • 15.50 Select commands in 3D game environments by gaze gestures (S. Vickers, H. Istance & A. Hyrskykari)
  • 16.10 GazeTrain: A case study of an action oriented gaze-controlled game (L. F. Laursen & B. Ersbøll)
  • 16.30 Detecting Search and Rescue Targets in Moving Aerial Images using Eye-gaze (J. Mardell, M. Witkowski & R. Spence)
  • 16.50 Feasibility Study for the use of Eye-Movements in Estimation of Answer Correctness (M. Nakayama & Y. Hayashi)
SESSION IV Track 2
  • 15.30 Eye Tracker Connectivity (G. Daunys & V. Vysniauskas)
  • 15.50 SW tool supporting customization of eye tracking algorithms (P. Novák & O. Štepánková)
  • 16.10 Multimodal Gaze-Based Interaction (S. Trösterer & J. Dzaack)
  • 16.30 Gaze Visualization Trends and Techniques (S. Stellmach, L. Nacke, R. Dachselt & C. A. Lindley)
  • 19.00 COGAIN2009 dinner at Brede Spisehus