
At CES, EyeTech had a Google Glass and a Samsung VR headset in its booth.

What use cases do you imagine for eye tracking in the Virtual Reality / Augmented Reality industries?

What are the constraining factors on eye tracking in wearables and headsets?

asked 23 Feb '15, 18:24

phyatt ♦♦


For this post, wearables, head-mounted displays (HMDs), virtual reality (VR) headsets, and augmented reality (AR) devices are all lumped into a single category and referred to as wearables.

Existing User Interactions on Wearables

Right now, some of the most common interactions with AR and VR are found on Google Cardboard, Google Glass, and the Oculus Rift.

With Google Cardboard, you reach up and slide a magnet to indicate a selection.

With Google Glass, you use your voice, saying "ok glass", or you tap or slide your finger on the right side of the frame.

With the Oculus Rift, you are often playing a first-person video game of sorts, with selection handled by a traditional gamepad or a keyboard and mouse.

A number of AR/VR systems have put a gesture recognition system on the front plate to watch where your hands are placed in front of you. Leap Motion and SoftKinetic are examples of this kind of gesture recognition.

Selection Mechanisms with an Eye Tracker

Right now, in desktop applications of eye tracking, there are a few different selection mechanisms (a minimal dwell-selection sketch follows the list):

  • blink
  • dwell
  • an off-screen location
  • an on-screen target
  • some pattern or eye gesture
  • a hardware button or mouse button
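
As an illustration of the dwell mechanism, here is a minimal sketch in Python. The threshold, radius, and the `get_gaze_point()` callable are hypothetical placeholders, not part of any EyeTech API; a real implementation would pull gaze samples from the tracker's SDK.

```python
import time

DWELL_TIME_S = 1.0    # hypothetical dwell threshold; tune per application
DWELL_RADIUS_PX = 40  # tolerance for gaze jitter around the fixation point

def dwell_select(get_gaze_point):
    """Return a gaze point once the user has fixated it long enough.

    get_gaze_point is a hypothetical callable returning the current
    (x, y) gaze coordinates in screen pixels.
    """
    anchor = get_gaze_point()
    start = time.time()
    while True:
        x, y = get_gaze_point()
        ax, ay = anchor
        if (x - ax) ** 2 + (y - ay) ** 2 > DWELL_RADIUS_PX ** 2:
            # Gaze wandered off; restart the dwell timer at the new point.
            anchor, start = (x, y), time.time()
        elif time.time() - start >= DWELL_TIME_S:
            return anchor
        time.sleep(1 / 60)  # poll at roughly the tracker's frame rate
```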

Pointing with an eye tracker is intuitive and easy, making it ideal for indicating your intent. Typically, for end-user use, you want the targets to be fairly large, comparable to touch-screen buttons.

To get an idea of the size of buttons to use, see the YouTube video of QuickACCESS in action from EyeTech.
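
For a rough sense of that geometry, a target subtending a visual angle θ at viewing distance d spans 2·d·tan(θ/2) on the screen. A small sketch; the 2° target, 600 mm distance, and pixel density are assumed example values, not EyeTech recommendations:

```python
import math

def target_size_px(angle_deg, distance_mm, px_per_mm):
    """Pixel size of a target subtending angle_deg at distance_mm."""
    size_mm = 2 * distance_mm * math.tan(math.radians(angle_deg) / 2)
    return size_mm * px_per_mm

# Assumed example: 2 degree target, 600 mm viewing distance,
# ~96 DPI monitor (about 3.78 px/mm) -> roughly 79 px.
print(round(target_size_px(2.0, 600, 3.78)))
```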

Example Interactions using an Eye Tracker

Specifically for eye tracking with a wearable screen, you don't always want to hold your hands straight out in front of you to indicate that you want to interact with an object. If several objects are presented in your view and you don't want to turn your head to point directly at one of them, eye tracking lets you easily indicate which one you are interested in.

Then you could select with a head nod, a long blink, or simply by looking at the object for a prolonged period of time. Or looking at it could present a menu above it, and you would look at the popped-out menu to finalize your selection.
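
A minimal sketch of that indicate-then-confirm pattern: hit-test the gaze point against on-screen objects, then let a separate action (nod, blink, or dwell) confirm. The object names and rectangles are invented for illustration.

```python
# Hypothetical objects as (name, left, top, width, height) in screen pixels.
OBJECTS = [
    ("thermostat", 100, 100, 160, 120),
    ("light",      400, 100, 160, 120),
    ("door lock",  700, 100, 160, 120),
]

def object_under_gaze(gx, gy):
    """Return the name of the object the gaze point falls inside, or None."""
    for name, left, top, width, height in OBJECTS:
        if left <= gx <= left + width and top <= gy <= top + height:
            return name
    return None

# Highlight the indicated object, then confirm with a nod, blink, or dwell.
print(object_under_gaze(450, 150))  # -> "light"
```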

Constraints for Eye Tracking on a Wearable

USB Bandwidth

Most eye tracking systems are bandwidth heavy, meaning that they compete with the other devices on the USB bus of the host computer. If you have gesture recognition and a front-facing camera both running and you try to run an eye tracker as well, you are looking at possible issues unless you use EyeTech's AEye camera.

EyeTech has moved its algorithms off of the host computer and onto a chip, making the tracker lightweight on the USB bus. All the image processing happens on board, and the end measurements are extremely small packets, which is ideal for applications competing for USB bandwidth.
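
To see why on-board processing helps, note that a finished gaze measurement fits in a handful of bytes. Here is a sketch of what such a packet could look like; this layout is invented for illustration and is not EyeTech's actual wire format:

```python
import struct

# Hypothetical 16-byte gaze packet: timestamp in ms, gaze x/y in pixels,
# and left/right pupil diameter in tenths of a millimetre.
GAZE_FMT = "<Iffhh"  # little-endian: uint32, float, float, int16, int16

def encode_gaze(ts_ms, x, y, left_pupil, right_pupil):
    return struct.pack(GAZE_FMT, ts_ms, x, y, left_pupil, right_pupil)

def decode_gaze(packet):
    return struct.unpack(GAZE_FMT, packet)

pkt = encode_gaze(123456, 512.5, 384.0, 42, 41)
print(len(pkt), decode_gaze(pkt))  # 16 bytes per sample
```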

Power

Adding an eye tracker does require additional power to run its camera and processing.

Mounting

How to mount an eye tracker for use with a wearable is another challenge. The infrared lighting and camera must coexist with the lenses, projections, and screens already on the wearable, which can make integration difficult. EyeTech has experience mounting and integrating eye trackers into several different wearables. Please contact EyeTech for more information about how to get started with mounting eye tracking on your wearable.

Operating System

Most eye trackers only support Windows. EyeTech has supported Windows since 1996 and Android since 2013. Android examples can be found by searching the public GitLab repository. Other related wearables projects may be available upon request.

Because the AEye is a standalone single-board computer, it can work with almost any protocol, though there may be some up-front engineering cost to add support for your intended protocol. For example, the AEye can be run entirely over UART, Bluetooth, Wi-Fi, SPI, or I2C, but these are all non-standard in the eye tracking world, so those interfaces aren't fully supported yet.
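
As a sketch of what a UART integration could look like, here is a read loop built on the pyserial library, reusing the hypothetical 16-byte packet from the USB Bandwidth section above. The port name, baud rate, and framing are assumptions, not a documented AEye interface:

```python
import struct
import serial  # pyserial: pip install pyserial

GAZE_FMT = "<Iffhh"                      # same hypothetical layout as above
PACKET_SIZE = struct.calcsize(GAZE_FMT)  # 16 bytes

def stream_gaze(port="/dev/ttyUSB0", baud=115200):
    """Yield decoded gaze samples from a serial link (assumed framing)."""
    with serial.Serial(port, baud, timeout=1.0) as link:
        while True:
            packet = link.read(PACKET_SIZE)
            if len(packet) < PACKET_SIZE:
                continue  # timeout or partial read; try again
            yield struct.unpack(GAZE_FMT, packet)

# for ts_ms, x, y, left_pupil, right_pupil in stream_gaze(): ...
```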

Low Bandwidth Protocols

When switching to a low-bandwidth protocol, streaming high-resolution images over the interface in real time probably won't work well. But because EyeTech supports running the eye tracker without sending the pixel data, you can run very lean; for debugging, you can query the eye positions in the sensor view and send those over the low-bandwidth protocol.
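
A back-of-the-envelope comparison makes the point; the sensor resolution and frame rate here are assumed example figures:

```python
# Assumed example: a 640x480, 8-bit sensor at 60 fps versus
# 16-byte gaze packets at the same rate.
pixels_per_s = 640 * 480 * 1 * 60    # ~18.4 MB/s of raw image data
packets_per_s = 16 * 60              # 960 B/s of gaze measurements
print(pixels_per_s / packets_per_s)  # a reduction of 19200x
```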


answered 17 Apr '15, 11:09

phyatt ♦♦

There is a full post on eyetechds.com about HMDs, too, including a video from May 2015: http://www.eyetechds.com/wearables--hmd.html

(09 Jun '15, 18:53) phyatt ♦♦


