Crowdsourcing Gaze Data Collection

Collective Intelligence 2012

Dmitry Rudoy

Technion

Dan B Goldman

Adobe

Eli Shechtman

Adobe

Lihi Zelnik-Manor

Technion

Abstract

Knowing where people look is useful in many image and video applications. However, traditional gaze tracking hardware is expensive and requires local study participants, so acquiring gaze location data from a large number of participants is difficult. In this work we propose a crowdsourced method for acquiring gaze direction data from a virtually unlimited number of participants, using a robust self-reporting mechanism. Our system collects temporally sparse but spatially dense points of attention in any visual content. We apply our approach to an existing video data set and demonstrate that we obtain results similar to traditional gaze tracking. We also explore the parameter ranges of our method, and collect gaze tracking data for a large set of YouTube videos.

Paper

BibTeX

Slides

Demo (video)

Results (video)

References

Live Demo

To try the system, go to: Live Demo

The entire live demo takes only a few minutes. It consists of two parts: a short tutorial session and a video experiment. At the end, you will be shown your results compared to the data collected in our experiments.