
Why Your Eyes Will Replace Your Keyboard


So much will soon be obsolete as the Age of Context changes how people and technology interact. Among the most imminent victims are the keyboard and the mouse. Not only are both too cumbersome for our new mobility; they are also becoming too slow for the traditional productivity desktop.

Our new mobile and wearable devices make those venerable peripherals objects of frustration and ridicule. How many iPhone email signatures have you seen lately apologizing for typos caused by one of those itsy-bitsy, teeny-weeny keyboards?

As we adopt newer wearable devices, including bracelets, digital eyewear, lanyards and jewelry, we are going far beyond the smartphone and tablet. There are anti-rape bras as well as sensor-enabled tattoos. We are using gestures to play games and change TV channels. We are getting special deals on items we look at on retail shelves, and we are opening locked doors in our homes and automobiles using mobile apps, sensors, location technologies and facial recognition.

One of the most frequently discussed innovations is gesture control, which is rapidly gaining in popularity, partly because it speeds up gaming and increases user immersion. But there is something far more natural than gestures, or even the awkward but occasionally successful conversations we are starting to have with our technologies.

I refer to the human eyes. I learned in a recent interview with Carl Korobkin of Tobii, the world leader in eye tracking and gaze interaction technologies, that our eyes are the fastest-moving part of the human body and have the most direct access to the brain. The eyes are a fundamental underpinning of gestures and voice interaction. For technology to understand the context of what you want when you tap, move your hand or tell a sensor-enabled device to open, it needs to know what you are looking at.
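For the technically curious, here is a rough sketch of gaze-as-context: the same generic command resolves to different actions depending on where the user is looking. Everything in it (the region names, the coordinates, the layout) is my own illustration, not anything from Tobii.

```python
# Illustrative sketch: a gaze point gives a generic command its context.

from dataclasses import dataclass

@dataclass
class Region:
    name: str
    x: float   # left edge, screen pixels
    y: float   # top edge
    w: float   # width
    h: float   # height

    def contains(self, gx: float, gy: float) -> bool:
        return self.x <= gx <= self.x + self.w and self.y <= gy <= self.y + self.h

def resolve_command(command: str, gaze_xy: tuple, regions: list) -> str:
    """The same word or tap means different things depending on gaze."""
    gx, gy = gaze_xy
    for region in regions:
        if region.contains(gx, gy):
            return f"{command} -> {region.name}"
    return f"{command} -> no target under gaze"

# Hypothetical targets: two smart locks shown on a home-control screen.
regions = [Region("front door lock", 0, 0, 200, 200),
           Region("garage door", 220, 0, 200, 200)]
print(resolve_command("open", (100, 80), regions))   # open -> front door lock
print(resolve_command("open", (300, 50), regions))   # open -> garage door
```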

Korobkin is vice president of business development for Tobii's OEM business unit in Mountain View, California, a new part of an older company. The privately held, Stockholm-headquartered firm is nearly 15 years old and has 400 employees and offices worldwide. It is the world leader in these visual technologies, with a market share of roughly 75 percent. The market is relatively small today, but it looks extremely well positioned for the sea change that context is relentlessly pushing.

The company started by providing computer systems for physically impaired patients who could not otherwise operate computers. By developing a way to control PCs with the eyes, it made the digital world accessible to many thousands of people who would otherwise have been locked out.

The company provides technology to many of the world's leading companies, such as Procter & Gamble, which uses Tobii digital eyewear to track what people look at. Many universities use the technology for analysis that lets researchers and clinicians understand behavior by seeing what a patient is looking at when he or she takes a particular action.

Yet Tobii's brand is not exactly a household word. I had never heard of the company until its PR agency invited me to meet at CES, where Tobii was co-announcing the first eye-tracking gaming gear with gaming peripheral maker SteelSeries.

The partnership will bring to market a sensor bar that looks a lot like the Wii's. It will be equipped with a camera and infrared sensors that follow gamers' eyes in action, and it works with many popular existing games without adding complexity. According to Korobkin, it is a faster, more natural input method than gestures, which also require eye-tracking technology.
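A toy example makes the idea easy to picture. In the sketch below the gaze stream is simulated; real hardware such as the SteelSeries bar would deliver camera- and infrared-derived gaze coordinates instead. The targets, radius and noise levels are all assumptions of mine.

```python
# Toy sketch of gaze-driven target selection in a game loop.
# The gaze samples are simulated; real hardware would stream (x, y)
# coordinates from the camera/infrared sensor bar.

import math
import random

TARGETS = {"enemy_a": (120, 300), "enemy_b": (640, 180), "door": (900, 500)}

def simulated_gaze():
    """Stand-in for the sensor bar: noisy gaze samples near one target."""
    tx, ty = TARGETS["enemy_b"]
    return tx + random.gauss(0, 15), ty + random.gauss(0, 15)

def nearest_target(gx, gy, max_radius=60):
    """Select the on-screen object closest to the gaze point, if close enough."""
    best, best_d = None, max_radius
    for name, (tx, ty) in TARGETS.items():
        d = math.hypot(gx - tx, gy - ty)
        if d < best_d:
            best, best_d = name, d
    return best

for frame in range(5):                 # a few frames of a game loop
    gx, gy = simulated_gaze()
    print(f"frame {frame}: gaze target = {nearest_target(gx, gy)}")
```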

While I found that interesting, it was the picture of games to come that really caught my attention. Korobkin described a future mystery game where the player enters a room.

“There’s a dead body on the floor. You look around and see three characters. You notice one is sweating profusely and avoids your gaze…”

It would be like stepping into a movie. I could see similar applications where students could be taken to great moments in history, or first responders trained to make the right decisions before entering a burning building. The same eye tracking could be used to let remote experts assist in brain surgery, auto repair or gourmet cooking.

I met up with Korobkin in Mountain View shortly after CES. He convinced me that many such changes are afoot. Some reminded me of an earlier conversation with Microsoft's Stefan Weiss, who had talked about someday soon being able to buy a sweater in a store window simply by looking at it, then tapping or telling a device to buy it.

I was also reminded of my conversations with Ford, Audi and Toyota while researching Age of Context, my recent book with Robert Scoble. They had talked about contextual windshields of the near-term future that show drivers dashboard information without the little glances that take the eyes off the road. The data would appear when you looked at a specific place on the windshield, and you could adjust the radio or climate controls by voice.

Korobkin also told me about Tobii EyeX, a software engine that allows people to operate computers by eye control. It can be added to Windows 8, he said, without Microsoft having to make a single change. Users will be able to use eye controls to play hundreds of existing games without upgrading the game software.
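How can an eye-control layer drive software that was never written for it? One common approach is to translate gaze into input events applications already understand. The sketch below is purely illustrative, not the actual EyeX engine: it turns a steady gaze, or "dwell," into a synthetic click, with thresholds I have made up.

```python
# Illustrative only (not the EyeX engine): a gaze layer can drive unmodified
# software by translating gaze dwells into ordinary pointer events, so the
# OS and games see nothing but a normal mouse.

DWELL_FRAMES = 30     # assumption: ~0.5 s at 60 Hz counts as a deliberate look
DWELL_RADIUS = 40     # assumption: gaze must stay within 40 px to count

class DwellClicker:
    def __init__(self):
        self.anchor = None   # where the current dwell started
        self.count = 0

    def feed(self, gx, gy):
        """Feed one gaze sample; return a synthetic click position or None."""
        if self.anchor is None or abs(gx - self.anchor[0]) > DWELL_RADIUS \
                or abs(gy - self.anchor[1]) > DWELL_RADIUS:
            self.anchor, self.count = (gx, gy), 0   # gaze moved: restart dwell
            return None
        self.count += 1
        if self.count >= DWELL_FRAMES:
            self.count = 0
            return self.anchor                      # emit a "click" here
        return None

clicker = DwellClicker()
for _ in range(31):                                 # steady gaze at (500, 400)
    click = clicker.feed(500, 400)
print("synthetic click at:", click)
```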

Intel is both a customer and a $21 million investor. Having so far missed the great migration into wearables, the company made clear at CES its intention of getting into the game, and eye controls will play an obvious role.

There is also a great deal of new competition, including SR Research, Eye Mobil and Eagle Eye Tracking, to name a few. Competition is almost always good for the user: it speeds up innovation while simultaneously reducing costs.

Korobkin gave me a brief demo on a Windows-based PC equipped with a Tobii sensor panel. It took about 30 seconds for the machine to map out my field of vision. I was told it could tell where I was looking to within about 5 mm.
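That 30-second step is a calibration. As a rough illustration of what such a step can involve (my sketch, not Tobii's method), a tracker can ask the user to look at a few known dots and then fit a transform from raw eye readings to screen coordinates; the simplest useful version is an affine least-squares fit.

```python
# Minimal calibration sketch: fit an affine map from raw eye-tracker
# readings to screen pixels using a few known calibration dots.
# All numbers below are made up for illustration.

import numpy as np

# (raw_x, raw_y) readings recorded while the user stared at known points.
raw = np.array([[0.11, 0.09], [0.92, 0.12], [0.10, 0.88],
                [0.90, 0.91], [0.52, 0.49]])
screen = np.array([[50, 40], [1870, 60], [60, 1040],
                   [1860, 1050], [960, 540]])      # screen pixels

# Solve screen ~ [raw, 1] @ A (a 3x2 affine transform) by least squares.
design = np.hstack([raw, np.ones((len(raw), 1))])
A, *_ = np.linalg.lstsq(design, screen, rcond=None)

def gaze_to_screen(rx, ry):
    """Map one raw reading to an estimated on-screen gaze point."""
    return np.array([rx, ry, 1.0]) @ A

print(gaze_to_screen(0.5, 0.5).round())   # roughly the screen center
```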

Once the technology had mapped my eyes, it could tell from where I looked that I wanted to play or change music, check stocks or check the weather. In short, it knew my intent from where I looked.

I also got to play with a prototype set of digital eyewear that Tobii is working on. While the company already sells eyewear for the professional research and analysis markets, relentless reductions in size and cost mean it will soon be playing in the consumer marketplace. The photos, above, show what the sensor kit looked like a couple of years ago and what it looks like now. Prices have come down proportionately.

Like Google Glass, the device has a tiny camera that watches the user's eye. It is more evidence that competition will drive down prices and accelerate adoption of contextual technologies.

Tobii is one more reason why I believe the future is arriving sooner than I think. I regret that Scoble and I did not know about the company when we wrote our recent book. The upside is that technologies like Tobii's motivate us to do a sequel sooner rather than later.