Tuesday, July 22, 2008

Eye gestures (Hemmert, 2007)

Fabian Hemmert at the Potsdam University of Applied Sciences published his MA thesis in 2007. He has put up a site with extensive information and demonstrations of his research on eye gestures such as winks, squints, and blinks. See the videos or the thesis. Good work and a great approach!

One example:

"Looking with one eye is a simple action. Seeing the screen with only one eye might therefore be used to switch the view to an alternate perspective on the screen contents: a filter for quick toggling. In this example, closing one eye filters out information on screen to a subset of the original data, such as an overview over the browser page or only the five most recently edited files. It was to see how the users would accept the functionality at the cost of having to close one eye, a not totally natural action." (Source)

Monday, July 21, 2008

SMI Experiment Suite 360

The video demonstrates the easy workflow of Experiment Suite 360: Experiment Builder, the iView X RED non-invasive eye tracker, and the BeGaze analysis software. It provides a set of examples of what eye tracking can be used for. Furthermore, the remote-based system (iView X RED) is the same eye tracker that was used for developing the NeoVisus prototype (although the interface works on multiple systems).


Tuesday, July 15, 2008

Sébastien Hillaire at IRISA Rennes, France

Sébastien Hillaire is a Ph.D. student at IRISA in Rennes, France, a member of the BUNRAKU team and of France Telecom R&D. His work centers on using eye trackers to improve depth-of-field rendering of 3D scenes in virtual environments. He has published two papers on the topic:

Automatic, Real-Time, Depth-of-Field Blur Effect for First-Person Navigation in Virtual Environment (2008)

"We studied the use of visual blur effects for first-person navigation in virtual environments. First, we introduce new techniques to improve real-time Depth-of-Field blur rendering: a novel blur computation based on the GPU, an auto-focus zone to automatically compute the user’s focal distance without an eye-tracking system, and a temporal filtering that simulates the accommodation phenomenon. Secondly, using an eye-tracking system, we analyzed users’ focus point during first-person navigation in order to set the parameters of our algorithm. Lastly, we report on an experiment conducted to study the influence of our blur effects on performance and subjective preference of first-person shooter gamers. Our results suggest that our blur effects could improve fun or realism of rendering, making them suitable for video gamers, depending however on their level of expertise."

Screenshot from the algorithm implemented in Quake 3 Arena.

  • Sébastien Hillaire, Anatole Lécuyer, Rémi Cozot, Géry Casiez
    Automatic, Real-Time, Depth-of-Field Blur Effect for First-Person Navigation in Virtual Environment. To appear in IEEE Computer Graphics and Applications (CG&A), 2008, pp. ??-??
    Source code (please refer to my IEEE VR 2008 publication)
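The paper itself should be consulted for the actual algorithms, but to give a feel for the "temporal filtering that simulates the accommodation phenomenon", here is a rough sketch of one way such a filter can work. The first-order low-pass model and the time constant are my assumptions, not values from the paper.

```python
import math

# Sketch of a temporal filter for the rendered focal distance: the eye
# needs time to re-accommodate, so the focus should not jump instantly
# to a new target depth. A first-order low-pass is one simple model of
# that lag; TAU below is an assumed time constant, not from the paper.

TAU = 0.3  # assumed accommodation time constant, in seconds

def filtered_focal_distance(current, target, dt, tau=TAU):
    """Move the rendered focal distance toward the target depth with an
    exponential lag, one frame at a time."""
    alpha = 1.0 - math.exp(-dt / tau)
    return current + alpha * (target - current)

# Example: refocusing from 2 m to 20 m at 60 fps.
d = 2.0
for frame in range(30):
    d = filtered_focal_distance(d, 20.0, dt=1.0 / 60.0)
print(round(d, 2))  # well on its way to 20 m after half a second
```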

Using an Eye-Tracking System to Improve Depth-of-Field Blur Effects and Camera Motions in Virtual Environments (2008)

"
We describes the use of user’s focus point to improve some visual effects in virtual environments (VE). First, we describe how to retrieve user’s focus point in the 3D VE using an eye-tracking system. Then, we propose the adaptation of two rendering techniques which aim at improving users’ sensations during first-person navigation in VE using his/her focus point: (1) a camera motion which simulates eyes movement when walking, i.e., corresponding to vestibulo-ocular and vestibulocollic reflexes when the eyes compensate body and head movements in order to maintain gaze on a specific target, and (2) a Depth-of-Field (DoF) blur effect which simulates the fact that humans perceive sharp objects only within some range of distances around the focal distance.

Second, we describe the results of an experiment conducted to study users’ subjective preferences concerning these visual effects during first-person navigation in VE. It showed that participants globally preferred the use of these effects when they are dynamically adapted to the focus point in the VE. Taken together, our results suggest that visual effects exploiting the user’s focus point could be used in several VR applications involving first-person navigation, such as visits of architectural sites, training simulations, video games, etc."

  • Sébastien Hillaire, Anatole Lécuyer, Rémi Cozot, Géry Casiez
    Using an Eye-Tracking System to Improve Depth-of-Field Blur Effects and Camera Motions in Virtual Environments. Proceedings of IEEE Virtual Reality (VR), Reno, Nevada, USA, 2008, pp. 47-51. Download paper as PDF.

QuakeIII DoF&Cam sources (depth-of-field, auto-focus zone and camera motion algorithms are under GPL with APP protection)

Passive eye tracking while playing Civilization IV

While the SMI iView X RED eye tracker in this video is not used to drive the interaction, it showcases how eye tracking can be used for usability evaluations in interaction design. (Civilization does steal my attention on occasion; Sid Meier is just a brilliant game designer.)

Thursday, July 10, 2008

Eye Gaze Interactive Air Traffic Controllers workstation (P. Esser & T.J.J. Bos, 2007)

P. Esser and T.J.J. Bos at Maastricht University have developed a prototype aimed at reducing the repetitive strain injuries Air Traffic Controllers sustain while operating their systems. The research was conducted at the National Aerospace Laboratory in the Netherlands. The results indicate a clear advantage over the traditional roller/trackball, especially for large distances. This is expected, since Fitts's law does not apply to eye movement in the same manner as to physical limb/hand movement: eye movements over longer distances do take more time than short ones, but the difference is nothing like that between moving your arm one inch and moving it one meter. Certainly there are more applications that could benefit from gaze-assisted interaction; medical imaging in the field of radiology is one (CT and MRI produce very high resolution images, up to 4096 x 4096 pixels).
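To make the Fitts's law point above concrete, here is a back-of-the-envelope comparison. The constants are typical textbook values, not numbers from the Esser & Bos thesis.

```python
import math

# Hand/mouse movements follow Fitts's law, MT = a + b * log2(D/W + 1),
# so movement time grows with distance. Saccade duration grows only
# weakly with amplitude (a common approximation is ~25 ms base plus
# ~2.5 ms per degree). All constants here are illustrative.

def fitts_mt(distance, width, a=0.1, b=0.15):
    """Movement time (s) for a hand/mouse reach to a target of `width`."""
    return a + b * math.log2(distance / width + 1)

def saccade_duration(amplitude_deg, base=0.025, slope=0.0025):
    """Approximate saccade duration (s) for an amplitude in degrees."""
    return base + slope * amplitude_deg

# ~1 inch vs ~1 meter of screen travel (1 cm target, rough gaze angles):
for d_cm, amp_deg in [(2.5, 2.0), (100.0, 40.0)]:
    print(f"{d_cm:6.1f} cm: hand {fitts_mt(d_cm, 1.0):.2f} s, "
          f"eye {saccade_duration(amp_deg):.3f} s")
# The hand's time roughly triples; the eye's stays around a tenth of a second.
```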


Summary of the thesis "Eye Gaze Interactive ATC workstation"
"Ongoing research is devoted to finding ways to improve performance and reduce workload of Air Traffic Controllers (ATCos) because their task is critical to the safe and efficient flow of air traffic. A new intuitive input method, known as eye gaze interaction, was expected to reduce the work- and task load imposed on the controllers by facilitating the interaction between the human and the ATC workstation. In turn, this may improve performance because the freed mental resources can be devoted to more critical aspects of the job, such as strategic planning. The objective of this Master thesis research was to explore how human computer interaction (HCI) in the ATC task can be improved using eye gaze input techniques and whether this will reduce workload for ATCos.


In conclusion, the results of eye gaze interaction are very promising for selection of aircraft on a radar screen. For entering instructions it was less advantageous. This is explained by the fact that in the first task the interaction is more intuitive while the latter is more a conscious selection task. For application in work environments with large displays or multiple displays eye gaze interaction is considered very promising. "



Download the thesis as PDF

Wednesday, July 9, 2008

GazeTalk 5

The GazeTalk system is one of the most comprehensive open solutions for gaze interaction today. It has been developed with disabled users in mind and supports a wide range of everyday tasks, which can dramatically increase the quality of life for people living with ALS or similar conditions. The following information is quoted from the COGAIN website.

Information about Gazetalk 5 eye communication system

GazeTalk is a predictive text entry system that has a restricted on-screen keyboard with ambiguous layout for severely disabled people. The main reason for using such a keyboard layout is that it enables the use of an eye tracker with a low spatial resolution (e.g., a web-camera based eye tracker).

The goal of the GazeTalk project is to develop an eye-tracking based AAC system that supports several languages, facilitates fast text entry, and is both sufficiently feature-complete to be deployed as the primary AAC tool for users, yet sufficiently flexible and technically advanced to be used for research purposes. The system is designed for several target languages, initially Danish, English, Italian, German and Japanese.
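As an aside, the core idea of a prediction-driven layout with a few large, easy-to-hit buttons is simple to sketch. The six-button count and the toy bigram model below are my assumptions for illustration; GazeTalk's real predictor is considerably more sophisticated.

```python
from collections import Counter

# Illustrative sketch: with only a handful of large buttons (reachable
# even with a low-resolution tracker), show the most likely next letters
# and hide the rest behind a "rest of alphabet" button.

N_LETTER_BUTTONS = 6  # assumed button count, not GazeTalk's actual layout

def train_bigrams(corpus):
    """Count how often each letter follows each other letter."""
    return Counter(zip(corpus, corpus[1:]))

def predicted_buttons(bigrams, last_char, n=N_LETTER_BUTTONS):
    """Choose the letters for the large buttons, given the last typed char."""
    followers = [(nxt, count) for (prev, nxt), count in bigrams.items()
                 if prev == last_char]
    followers.sort(key=lambda item: -item[1])
    return [nxt for nxt, _ in followers[:n]]

bigrams = train_bigrams("the quick brown fox jumps over the lazy dog " * 50)
print(predicted_buttons(bigrams, 't'))  # 'h' ranks first after 't' here
```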

Main features

  • type-to-talk
  • writing
  • email
  • web browser
  • multimedia player
  • PDF reader
  • letter and word prediction, and word completion
  • speech output
  • can be operated by gaze, head tracking, mouse, joystick, or any other pointing device
  • supports step-scanning (new!)
  • supports users with low precision in their movements, or trackers with low accuracy
  • allows the user to use Dasher inside GazeTalk and to transfer the text written in Dasher back to GazeTalk

GazeTalk 5.0 has been designed and developed by the Eye Gaze Interaction Group at the IT University of Copenhagen and the IT-Lab at the Royal School of Library and Information Science, Copenhagen.


Screenshots: GazeTalk v5, and GazeTalk v5 linked with Dasher.

Read more about GazeTalk or view the GazeTalk manual (PDF)

Short manual on data recording in GazeTalk (PDF)

GazeTalk Videos

Download GazeTalk

Eye tracking using a web camera

From the French engineering school ESIEA comes a custom-developed eye tracker based on a simple web camera. It has head tracking capabilities and works in low-light conditions.



More information:
Hubert Wassner's blog (professor of computer science): French / English (automatic translation)

Sunday, July 6, 2008

Eye tracking in space

The Eye Tracking Device (ETD) is used to determine the influence of prolonged microgravity, and the accompanying vestibular (inner ear) adaptation, on the orientation of Listing's Plane (a coordinate framework used to define the movement of the eyes in the head).

"The working hypothesis is that in microgravity the orientation of Listings Plane is altered, probably to a small and individually variable degree. Further, with the loss of the otolith-mediated gravitational reference, it is expected that changes in the orientation of the coordinate framework of the vestibular system occur, and thus a divergence between Listing?s Plane and the vestibular coordinate frame should be observed. While earlier ground-based experiments indicate that Listing?s Plane itself is to a small degree dependent on the pitch orientation to gravity, there is more compelling evidence of an alteration of the orientation of the vestibulo-ocular reflex (VOR), reflex eye movement that stabilizes images on the retina during head movement by producing an eye movement in the direction opposite to head movement, thus preserving the image on the center of the visual field, in microgravity.

Furthermore, changes in bodily function with relation to eye movement and spatial orientation that occur during prolonged disturbance of the vestibular system most likely play a major role in the problems with balance that astronauts experience following re-entry from space.

In view of the much larger living and working space in the ISS, and the extended program of spacewalks (EVAs) being planned, particular care must be given to assessing the reliability of functions related to eye movement and spatial orientation.

The performance of the experiments in space is therefore of interest for its expected contribution to basic research knowledge and to the improvement and assurance of human performance under weightless conditions."


NASA Image: ISS011E13710 - Cosmonaut Sergei K. Krikalev, Expedition 11 Commander representing Russia's Federal Space Agency, uses the Eye Tracking Device (ETD), a European Space Agency (ESA) payload in the Zvezda Service Module of the International Space Station. The ETD measures eye and head movements in space with great accuracy and precision.

"The ETD consists of a headset that includes two digital camera modules for binocular recording of horizontal, vertical and rotational eye movements and sensors to measure head movement. The second ETD component is a laptop PC, which permits digital storage of all image sequences and data for subsequent laboratory analysis. Listing's Plane can be examined fairly simply, provided accurate three-dimensional eye-in-head measurements can be made. Identical experimental protocols will be performed during the pre-flight, in-flight and post-flight periods of the mission. Accurate three-dimensional eye-in-head measurements are essential to the success of this experiment. The required measurement specifications (less than 0.1 degrees spatial resolution, 200 Hz sampling frequency) are fulfilled by the Eye Tracking Device (ETD)."

More information:
http://www.nasa.gov/mission_pages/station/science/experiments/ETD.html
http://www.spaceflight.esa.int/delta/documents/factsheet-delta-hp-etd.pdf
http://www.energia.ru/eng/iss/researches/medic-65.html

Realtime computer interaction via eye tracking (Dubey, 2004)

Premnath Dubey conducted research on eye tracking and gaze interaction for his master's thesis in 2004 at the Department of Computing at Curtin University in Australia.

Abstract
"This thesis presents a computer vision-based eye tracking system for human computer interaction. The eye tracking system allows the user to indicate a region of interest in a large data space and to magnify that area, without using traditional pointer devices. Presented is an iris tracking algorithm adapted from Camshift; an algorithm originally designed for face or hand tracking. Although the iris is much smaller and highly dynamic. the modified Camshift algorithm efficiently tracks the iris in real-time. Also presented is a method to map the iris centroid, in video coordinates to screen coordinates; and two novel calibration techniques, four point and one-point calibration. Results presented show that the accuracy for the proposed one-point calibration technique exceeds the accuracy obtained from calibrating with four points. The innovation behind the one-point calibration comes from using observed eye scanning behaviour to constrain the calibration process. Lastly, the thesis proposes a non-linear visualisation as an eye-tracking application, along with an implementation."

Download paper as PDF.
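OpenCV ships a stock CamShift implementation, so a rough approximation of the idea can be sketched in a few lines. Note that this is plain CamShift fed with an improvised "dark pixels are probably iris" probability image; the thesis's modifications and its calibration techniques are not reproduced here, and the camera index and initial window are assumptions.

```python
import cv2

# Rough sketch of iris tracking with stock CamShift: invert the grayscale
# frame so the dark iris/pupil region scores high in the "probability"
# image, then let CamShift follow that blob frame to frame.

cap = cv2.VideoCapture(0)                     # assumed webcam index
track_window = (280, 200, 80, 80)             # assumed initial (x, y, w, h)
criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1.0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    prob = cv2.bitwise_not(gray)              # dark iris -> high "probability"
    rect, track_window = cv2.CamShift(prob, track_window, criteria)
    (cx, cy), _, _ = rect                     # rotated rect: center, size, angle
    cv2.circle(frame, (int(cx), int(cy)), 4, (0, 255, 0), -1)
    cv2.imshow("iris", frame)
    if cv2.waitKey(1) & 0xFF == 27:           # Esc to quit
        break
cap.release()
```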

Thursday, July 3, 2008

Low cost eye tracking

Marcelo from Argentina has developed a low-cost solution using the Logitech QuickCam Express web camera. The video it produces has a resolution of 352 x 288 pixels. The camera is mounted close to the eye, with extra illumination from two lamps. Marcelo's crude eye tracker relies on elliptic fitting of the pupil in the visible light spectrum, which differs from most commercial alternatives (these use infrared light to create reflections on the eyeball, typically as a second step to increase accuracy).

Considering the low resolution of the camera and the simplicity of the setup, the results are noteworthy. I hope to see more development of this!
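A visible-light pupil fit of this kind is straightforward to prototype with OpenCV. The pipeline below (threshold the dark pupil, take the largest contour, fit an ellipse) is my guess at the general approach, with assumed parameter values; Marcelo's actual code may well differ.

```python
import cv2

# Sketch of visible-light pupil detection by elliptic fitting: blur,
# threshold the dark pupil region, then fit an ellipse to the largest
# plausible contour. Uses the OpenCV 4.x findContours signature.

def fit_pupil_ellipse(gray):
    """Return ((cx, cy), (w, h), angle) for the best pupil candidate, or None."""
    blurred = cv2.GaussianBlur(gray, (7, 7), 0)
    _, mask = cv2.threshold(blurred, 40, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    candidates = [c for c in contours
                  if len(c) >= 5 and cv2.contourArea(c) > 100]
    if not candidates:                 # fitEllipse needs at least 5 points
        return None
    biggest = max(candidates, key=cv2.contourArea)
    return cv2.fitEllipse(biggest)

gray = cv2.imread("eye.png", cv2.IMREAD_GRAYSCALE)  # hypothetical test image
if gray is not None:
    print(fit_pupil_ellipse(gray))
```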

Wednesday, July 2, 2008

Hot Zone prototype (Wakaruru)

The following video demonstrates a prototype of the Hot Zone system for controlling Windows applications. The Hot Zone is called up by pressing a single hotkey and blinking. The menu can be closed by looking outside the zone and blinking. Submitted to YouTube by "Wakaruru"



The second video demonstrates how Hot Zone could be used with real-world applications (PowerPoint). The center of the zone is located at the current gaze position when it is called up (like a context menu), and the commands in the five zones depend on the currently selected object (that is, on what you are looking at). No mouse is needed (the cursor is bound to the gaze position through an API). A blink without the hotkey pressed is treated as a single mouse click, so blinking on the text area starts text editing. The keyboard is used only for typing and for the hotkey that calls up the zone. The eye tracking device used is an ASL EH6000.


Submitted to YouTube by "Wakaruru"
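Since the videos do not spell out the exact zone geometry, here is one plausible reading of how gaze could be mapped to the five command zones: a central region plus four surrounding sectors. The labels and the radius are illustrative assumptions only, not the actual Hot Zone design.

```python
import math

# Hypothetical zone layout: a circular center region around the call-up
# point, with the remaining plane split into up/right/down/left sectors.

CENTER_RADIUS = 60  # pixels; inside this, no command is chosen yet

def zone_for_gaze(gaze, center):
    """Map the gaze point to 'center', 'up', 'right', 'down' or 'left',
    relative to where the Hot Zone was called up."""
    dx, dy = gaze[0] - center[0], gaze[1] - center[1]
    if math.hypot(dx, dy) <= CENTER_RADIUS:
        return "center"
    if abs(dx) > abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"   # screen y grows downward

print(zone_for_gaze(gaze=(500, 280), center=(400, 300)))  # -> right
```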