Monday, February 16, 2009

ID-U Biometrics: Eye movement based access control

Daphna Palti-Wasserman and Yoram Wasserman at ID-U Biometrics have developed a system that provides secure signatures for access control based on individual eye movement patterns. The subject's response to a dynamic stimulus yields a unique characteristic. Since the stimulus changes every session, the subject's response differs each time, but the pattern of eye movements and the user's eye characteristics remain the same. The result is a "code" that is neither entered nor consciously controlled by the user, which reduces the risk of spoofing. The system is currently at the proof-of-concept stage; the 100% accurate and stable eye tracking that identification would require has yet to be achieved (by any eye tracking platform, for that matter). However, this method of user identification could be applied in situations beyond the ATM (I guess that's why they won the GLOBES start-up competition).
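ID-U has not published its matching algorithm, but the challenge-response idea can be sketched: extract stimulus-independent features from each stimulus/response pair (here, pursuit gain and latency, both my own assumptions) and compare them against an enrolled profile. Everything in the sketch below, including the tolerances and data, is hypothetical.

```python
# Toy illustration of the challenge-response idea behind gaze biometrics.
# Feature choice, tolerances, and data are all hypothetical; ID-U has not
# published its actual algorithm.

def response_features(stimulus, gaze):
    """Reduce a stimulus/gaze pair to stimulus-independent features:
    pursuit gain (amplitude ratio) and lag (delay in samples)."""
    gain = max(map(abs, gaze)) / max(map(abs, stimulus))
    best_lag, best_score = 0, float("-inf")
    for lag in range(5):  # find the shift that best aligns gaze with stimulus
        score = sum(s * g for s, g in zip(stimulus, gaze[lag:]))
        if score > best_score:
            best_lag, best_score = lag, score
    return gain, best_lag

def matches(enrolled, live, gain_tol=0.1, lag_tol=1):
    """Accept if the live features fall within tolerance of the enrolled ones."""
    (g1, l1), (g2, l2) = enrolled, live
    return abs(g1 - g2) <= gain_tol and abs(l1 - l2) <= lag_tol

# Two different stimuli; the same simulated user tracks both with the
# same gain (about 0.9) and lag (1 sample), so the features match.
stim_a = [0, 1, 2, 3, 2, 1, 0, -1, -2, -1]
gaze_a = [0, 0, 0.9, 1.8, 2.7, 1.8, 0.9, 0, -0.9, -1.8]
stim_b = [0, -1, -2, -1, 0, 1, 2, 1, 0, -1]
gaze_b = [0, 0, -0.9, -1.8, -0.9, 0, 0.9, 1.8, 0.9, 0]
print(matches(response_features(stim_a, gaze_a),
              response_features(stim_b, gaze_b)))  # → True
```

The point of the exercise: the raw gaze traces differ between sessions because the stimulus differs, yet the derived features stay stable for the same user, which is what makes the "code" hard to replay or spoof.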

Tuesday, February 10, 2009

COGAIN 2009 (26th May) "Gaze interaction for those who want it most".

"The 5th international COGAIN conference on eye gaze interaction emphasises user needs and future applications of eye tracking technology. Robust gaze interaction methods have been available for some years, with substantial amounts of applications to support communication, learning and entertainment already being used. However, there are still some uncertainties about this new technology among communication specialists and funding institutions. The 5th COGAIN conference will focus on spreading the experiences of people using gaze interaction in their daily life to potential users and specialists who have yet to benefit from it. Case studies from researchers and manufacturers working on new ways of making gaze interaction available for all, as well as integrating eye gaze with other forms of communication technology are also particularly welcome. We also encourage papers and posters which reach beyond the special case of eye control for people with disabilities into mainstream human-computer interaction development, for instance using eye tracking technology to enhance gaming experience and strategic play."

Themes:

  • Gaze-based access to computer applications
  • Gaze and environmental control
  • Gaze and personal mobility control
  • User experience studies
  • Innovations in eye tracking systems
  • Low cost gaze tracking systems
  • Attentive interfaces and inferring user intent from gaze
  • Gaze-based interaction with virtual worlds
  • Gaze and creativity
  • Gaming using gaze as an input modality
  • Gaze interaction with wearable displays
  • Using gaze with other modalities including BCI

"Papers which deal with the use of eye gaze to study the usability of mainstream applications and websites are not normally considered for inclusion in the conference". For more information see the COGAIN 2009 Call for Papers

Important dates:

Paper submission deadline: 28th February. Notification of acceptance: 15th April. The conference will be held on 26th May at the Technical University of Denmark (DTU) in connection with the Visionday event.

Friday, January 30, 2009

SWAET 2009 Announced

The Scandinavian Workshop on Applied Eye-Tracking aims to be a meeting place for graduate students, researchers, and others using eye tracking as a measurement tool. It will be held at the University of Stavanger (May 6th-7th). Keynote speakers at SWAET 2009 are Dr. Benjamin Tatler (University of Dundee) and Prof. Jukka Hyönä (University of Turku).

Suggested topics for workshop presentations:
  • Reading in various contexts
  • Psycholinguistics
  • Integration of pictures and language
  • Face-to-face interaction and other social contexts
  • Attention (such as top-down/bottom-up factors)
  • Controlling interfaces with eye-tracking
  • Viewer behaviour towards images and video
  • Vehicle and traffic research
  • Human factors; such as air traffic control, ship navigation and pilots
  • Evaluation of user interfaces
  • Cognitive processes such as navigation, planning, problem solving, mental imagery, memory etc.
If you wish to present your research, submit an abstract no later than March 15th, 2009. Notifications of acceptance will be sent on April 1st.

Registration at the conference is € 50 for all delegates except graduate and undergraduate students, who participate free of charge. After April 10th, expect to pay € 80 (students € 30).

Wednesday, January 21, 2009

Wearable EOG Goggles: Eye-Based Interaction in Everyday Environments

Andreas Bulling in the Wearable Computing Group at the Swiss Federal Institute of Technology (ETH) is working on a new Electrooculography (EOG) based eye tracking system. The technique relies on the eye's small but measurable standing electrical potential: the eyeball acts as an electrical dipole, so rotating it changes the voltage measured by electrodes attached to the skin around the eyes. After signal processing, this data can be used to control computer interfaces or other devices. The obvious advantage of this method over the more traditional corneal-reflection video-based approaches is that it is not sensitive to sunlight and may therefore be used outdoors. However, to my knowledge it provides lower accuracy, which is why most EOG interfaces rely on eye gestures rather than gaze fixations.
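To illustrate why EOG lends itself to gesture-based rather than fixation-based control: horizontal saccades show up as sharp steps in the electrode signal, which a simple derivative threshold can pick out. This is only a toy sketch, not Bulling's implementation; the threshold value and units are assumptions.

```python
# Illustrative sketch (not the ETH implementation): detecting left/right
# eye gestures from a horizontal EOG channel by thresholding the
# sample-to-sample change in the signal. Threshold and units are made up.

def detect_gestures(eog, threshold=50.0):
    """Return a list of 'L'/'R' events wherever the change between
    consecutive EOG samples (in microvolts) exceeds the threshold."""
    events = []
    for prev, cur in zip(eog, eog[1:]):
        delta = cur - prev
        if delta > threshold:
            events.append("R")   # sharp positive step: saccade to the right
        elif delta < -threshold:
            events.append("L")   # sharp negative step: saccade to the left
    return events

# A right-left-right gesture sequence riding on a noisy baseline:
signal = [0, 2, 1, 120, 118, 121, 3, 1, 2, 125, 124]
print(detect_gestures(signal))  # → ['R', 'L', 'R']
```

A sequence of such discrete events (e.g., R-L-R) can then be matched against a gesture vocabulary, sidestepping the accuracy needed to resolve individual fixation points.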

"We want to introduce the paradigm of visual perception and investigations on eye movements as new methods to implement novel and complement current context-aware systems. Therefore, we will investigate the potential but also possible limitations of using eye movements to perform context and activity recognition in wearable settings. Besides recognizing individual activities another focus will be put on long-term eye movement analysis." More information.

Recently Andreas got a paper accepted for the CHI 2009 conference in Boston (April 4-9th) where the system will be demonstrated during the interactivity session. Andreas and the team at ETH are planning to investigate attentive user interfaces (AUI) in mobile settings using wearable systems, such as the prototype demonstrated in the video below.

View on YouTube

SMI gets the International Forum Design Award

Congratulations to the guys at SensoMotoric Instruments (SMI) for winning the International Forum 2009 Product Design Award with their iView X™ RED eye tracker.

"The unobtrusive yet elegant design for the stand-alone as well as for the monitor-attached configuration of the eye tracking system convinced the jury. "

The award will be presented on the first day of CeBIT (3rd of March) in Hanover. The system will also be on display for those of you who are attending CeBIT. More information on the International Forum Award.

Saturday, January 3, 2009

The Argentinian Eye Mouse software released (Amaro & Ponieman)

Nicolás Amaro and Nicolás Ponieman at ORT Argentina recently received the Argentine-German Chamber of Industry and Trade Award for Innovation 2008 for their work on a low-cost, webcam-based, head-mounted corneal-reflection solution. Best of all, the software can be downloaded, which will directly benefit those who need such a system but cannot afford the state-of-the-art products currently on the market. As the video below demonstrates, it is capable of running grid-based interfaces, so it should be adequate for GazeTalk and similar applications.

View on YouTube

Friday, January 2, 2009

An Unobtrusive Method for Gaze Tracking (N. Chitrik & Y. Schwartzburg)

Nava Chitrik and Yuliy Schwartzburg, in partial fulfillment of their Senior Design Project requirements at the Cooper Union for the Advancement of Science and Art, Electrical Engineering Department, have constructed a low-cost approach to remote eye tracking.

"The line of a person's gaze is known to have many important applications in artificial intelligence (AI) and video conferencing but determining where a user is looking is still a very challenging problem. Traditionally, gaze trackers have been implemented with devices worn around the user's head, but more recent advances in the field use unobtrusive methods, i.e. an external video camera, to obtain information about where a person is looking. We have developed a simplified gaze tracking system using a single camera and a single point source mounted compactly in the view of the user, a large simplification over previous methods which have used a plurality of each. Furthermore, our algorithms are robust enough to allow head motion and our image processing functions are designed to extract data even from low-resolution or noisy video streams. Our system also has the computational advantage of working with very small image sizes, reducing the amount of resources needed for gaze tracking, freeing them up for applications that might utilize this information.

To reiterate: The main differences between this implementation and similar implementations are that this system uses a histogram method as opposed to edge detection to work with very low resolution video extremely quickly. However, it requires an infrared camera and infrared LED's. (Which can be purchased for less than 25 dollars online.)"
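As a rough illustration of the histogram idea (the authors' code is not available, so the details below are assumed): under IR illumination the pupil is typically the darkest region of the frame, so instead of running edge detection one can pick an intensity threshold from the image histogram and take the centroid of the pixels below it, which stays cheap even on tiny, noisy frames.

```python
# Hypothetical sketch of histogram-based pupil localization, not the
# authors' implementation. The pupil is assumed to be the darkest blob,
# so we threshold at the darkest few percent of the intensity histogram
# and return the centroid of those pixels. No edge detection involved.

def pupil_centroid(frame, dark_fraction=0.05):
    """frame: 2-D list of grayscale values (0-255).
    Returns the (row, col) centroid of the darkest dark_fraction of pixels."""
    pixels = sorted(v for row in frame for v in row)
    threshold = pixels[int(len(pixels) * dark_fraction)]
    ys, xs, n = 0, 0, 0
    for y, row in enumerate(frame):
        for x, v in enumerate(row):
            if v <= threshold:
                ys += y
                xs += x
                n += 1
    return (ys / n, xs / n) if n else None

# 6x6 toy frame with a dark 2x2 "pupil" centered at (2.5, 2.5):
frame = [[200] * 6 for _ in range(6)]
for y in (2, 3):
    for x in (2, 3):
        frame[y][x] = 10
print(pupil_centroid(frame))  # → (2.5, 2.5)
```

Because only a sort and one pass over the pixels are needed, this kind of approach scales down gracefully to the very small image sizes the authors mention.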

View on YouTube

Monday, December 8, 2008

Journal of Eye Movement Research: Special issue on eye tracking now online.

The special issue on "Eye Tracking and Usability Research" in the Journal of Eye Movement Research is now online. It features the following articles:
  • Helmert, J. R., Pannasch, S. & Velichkovsky, B. M. (2008). Eye tracking and usability research: An introduction to the special issue (editorial). Download as PDF.

  • Castellini, C. (2008). Gaze tracking in semi-autonomous grasping. Download as PDF.

  • Helmert, J. R., Pannasch, S. & Velichkovsky, B. M. (2008). Influences of dwell time and cursor control on the performance in gaze driven typing. Download as PDF.

  • Huckauf, A. & Urbina, M. H. (2008). On object selection in gaze controlled environments. Download as PDF.

  • Hyrskykari, A., Ovaska, S., Majaranta, P., Räihä, K.-J. & Lehtinen, M. (2008). Gaze path stimulation in retrospective think-aloud. Download as PDF.

  • Pannasch, S., Helmert, J. R., Malischke, S., Storch, A. & Velichkovsky, B. M. (2008). Eye typing in application: A comparison of two systems with ALS patients. Download as PDF.

  • Zambarbieri, D., Carniglia, E. & Robino, C. (2008). Eye tracking analysis in reading online newspapers. Download as PDF.

Monday, November 24, 2008

Our gaze controlled robot on the DR News

The Danish National Television news programme "TV-Avisen" broadcast its episode on our gaze controlled robot on Friday, 21st November, during the nine o'clock news. Alternative versions (resolutions) of the video clip can be found on the DR site.

View video

Friday, November 21, 2008

Eye movement control of remote robot

Yesterday we demonstrated our gaze-navigated robot at the Microsoft Robotics event here at ITU Copenhagen. The "robot" transmits a video stream which is displayed on a client computer. Using an eye tracker, we can direct the robot towards wherever the user is looking, allowing human-machine interaction with a direct mapping of the user's intention. The Danish National TV (DR) came by today and recorded a demonstration, which will be shown tonight on the nine o'clock news. Below is a video that John Paulin Hansen recorded yesterday demonstrating the system. Please note that the frame rate of the video stream was well below average at the time of recording; it worked better today. In the coming week we'll look into alternative streaming solutions (suggestions appreciated). The project has been carried out in collaboration with Alexandre Alapetite from DTU, whose low-cost, LEGO-based rapid mobile robot prototype gives interesting possibilities for testing human-computer and human-robot interaction.
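Our control code is not public, so purely as a hypothetical sketch: one simple way to map a gaze point on the video frame to a differential-drive command is to derive a turn rate from the horizontal gaze offset and a forward speed from the vertical gaze position. All names and scale factors below are illustrative.

```python
# Hypothetical gaze-to-steering mapping for a differential-drive robot.
# Not the ITU/DTU implementation; frame size and scaling are assumptions.

def gaze_to_motors(gx, gy, width=640, height=480, max_speed=1.0):
    """Map a gaze point (gx, gy) on the robot's video frame to
    (left, right) wheel speeds: looking towards the top of the frame
    drives forward, looking left/right steers the robot."""
    turn = (gx / width) * 2 - 1       # -1 at left edge .. +1 at right edge
    forward = 1 - (gy / height)       # 1 at top of frame .. 0 at bottom
    left = max_speed * (forward + turn)
    right = max_speed * (forward - turn)

    def clamp(v):
        return max(-max_speed, min(max_speed, v))

    return clamp(left), clamp(right)

print(gaze_to_motors(320, 0))    # → (1.0, 1.0)  centre-top: full speed ahead
print(gaze_to_motors(640, 240))  # right edge: left wheel faster, robot turns right
```

The appeal of this kind of mapping is exactly what the demo shows: the user simply looks where they want the robot to go, with no explicit command vocabulary in between.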

The virgin tour around the ITU office corridor (on YouTube)

Available on YouTube