An array of technical sensors controls the screen and records data about the users' viewing habits.

Configuration

The two participants are located in front of the screen (fig. 1), and the movements of their eyes are tracked with a video-based eye-tracking device (fig. 2). There are no instructions: the participants recognize their effect on the stage and begin to realize that their own points of view determine the path the experience is going to take.

The eye-tracking system captures gaze information 60 times per second. All eye-driven events are triggered by a pointed gaze that lasts at least 100 ms (as suggested in [9]). In this application there is no need to differentiate between saccades and fixations.
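The dwell-based trigger described above can be sketched as follows, assuming the tracker delivers (x, y) gaze samples at 60 Hz. The names (SAMPLE_RATE_HZ, TOLERANCE_PX) and the pixel tolerance are illustrative assumptions, not details of the actual system.

```python
SAMPLE_RATE_HZ = 60   # sampling rate stated in the text
DWELL_MS = 100        # minimum dwell time for an eye-driven event
TOLERANCE_PX = 30     # assumed radius within which the gaze counts as "pointed"

def dwell_events(samples):
    """Yield the sample index at which each qualifying dwell begins.

    `samples` is a sequence of (x, y) gaze positions taken 1/60 s apart.
    A dwell qualifies once the gaze has stayed within TOLERANCE_PX of its
    anchor point for at least DWELL_MS.
    """
    # 7 samples at 60 Hz span exactly 100 ms (6 inter-sample intervals)
    needed = int(DWELL_MS / 1000 * SAMPLE_RATE_HZ) + 1
    anchor = None
    start = 0
    for i, (x, y) in enumerate(samples):
        if anchor is None:
            anchor, start = (x, y), i
            continue
        ax, ay = anchor
        if (x - ax) ** 2 + (y - ay) ** 2 <= TOLERANCE_PX ** 2:
            if i - start + 1 == needed:
                yield start  # dwell threshold reached; fire the event
        else:
            anchor, start = (x, y), i  # gaze moved away; restart the dwell
```

A steady gaze thus fires one event per dwell, while continuous scanning fires none.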

Data Collection

A linear, time-based map of each user's eye movements provides a rich bed for data analysis. The eye-position history of each user will be recorded for later analysis and used in future versions of the system.
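A minimal sketch of such a linear, time-based record, assuming samples arrive on a fixed 60 Hz clock; the function and record layout are illustrative assumptions.

```python
SAMPLE_RATE_HZ = 60  # sampling rate stated in the text

def timestamp_history(samples, t0=0.0):
    """Turn a stream of (x, y) gaze samples into (t, x, y) records.

    Each sample is stamped on the 60 Hz clock, starting at time t0,
    so the history can be stored and replayed for later analysis.
    """
    return [(t0 + i / SAMPLE_RATE_HZ, x, y) for i, (x, y) in enumerate(samples)]
```

The resulting rows can be appended to a per-user log file (e.g. CSV) so that every analysis below runs offline against the same record.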

Per-Image and Per-User Data Analysis

Total Pixel Coverage of the User vs. Total Number of Pixels in the Image -- A way of measuring how much of the image the user has explored.

Time Until First Fixation -- How long does the user take to find an interesting area of the image?

Number of Fixations in Specific Areas -- How often does the user return to the same area after leaving it?

User Variance -- Does the user tend to move around the image or focus on one area?

Average Area of Interest -- What is the average center point around which the user moves his/her eyes?

Average Size of Area of Interest -- Once the user has found an interesting area, how far away from it does he/she wander?

Gaze Duration -- How long does the user rest on each specific area of the image?

Duration of Saccadic Movements -- How long does the user take to settle on one area of the image?
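Three of the per-user measures above can be sketched as follows, assuming gaze samples are stored as (t, x, y) tuples and fixations as (t_start, t_end, x, y) tuples; these data shapes and the coverage radius are assumptions for illustration, not the system's actual format.

```python
def pixel_coverage(samples, image_w, image_h, radius=10):
    """Fraction of image pixels covered by a disc of `radius` around each sample."""
    covered = set()
    for _, x, y in samples:
        for dx in range(-radius, radius + 1):
            for dy in range(-radius, radius + 1):
                if dx * dx + dy * dy <= radius * radius:
                    px, py = int(x) + dx, int(y) + dy
                    if 0 <= px < image_w and 0 <= py < image_h:
                        covered.add((px, py))
    return len(covered) / (image_w * image_h)

def time_to_first_fixation(fixations, image_onset_t):
    """Seconds from image onset until the earliest fixation begins."""
    return min(f[0] for f in fixations) - image_onset_t

def mean_gaze_duration(fixations):
    """Average fixation length in seconds."""
    return sum(t_end - t_start for t_start, t_end, _, _ in fixations) / len(fixations)
```

Each metric is computed per image and per user from the recorded history, so results are comparable across sessions.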

Comparative Analysis between Two Users

Distance Graph -- How does the distance between the two users' gazes change over time?

User Correlation -- How often do the two users share the same view?

Are there certain images, or areas within images, that promote a "shared view," or does it depend on the users?

How do different users navigate in space?

In the final scene, how long does it take for both users to negotiate a resolution? How long does it take to blow the image up to its maximum size? Do users generally agree or disagree on whether to do so?
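The two comparative measures above can be sketched as follows, assuming both trackers run on the same 60 Hz clock so samples align by index; the shared-view threshold is an illustrative assumption.

```python
import math

def distance_graph(samples_a, samples_b):
    """Per-sample Euclidean distance between the two users' gaze points."""
    return [math.dist(a, b) for a, b in zip(samples_a, samples_b)]

def shared_view_fraction(samples_a, samples_b, threshold_px=50):
    """Fraction of samples in which both gazes fall within `threshold_px`."""
    distances = distance_graph(samples_a, samples_b)
    return sum(d <= threshold_px for d in distances) / len(distances)
```

Plotting the distance graph over a session shows when the two views converge, and the shared-view fraction gives one number per image for comparing how strongly different images promote a shared view.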

Budget

  • Two Head-Mounted Eye-Tracking Devices (Hardware & Software) $60,000 
  • 20ft wide screen / HD Projector $10,000 
  • One programmer and eye tracking operator for six months $10,000 
  • One user-interface designer for six months $10,000

Fig 1. Blueprint of the system configuration.

Fig 2. A head-mounted eye-tracking unit.

Fig 3. Eye-driven events will be triggered when a user's gaze rests on a 'hotspot' for at least 100 ms.