The next big thing in mobile interfaces

I’m a big fan of futuristic-everything. I worked with user interfaces back when I lived in the UK, and since then I’ve always tried to think of ways to improve how we interact with machines.

First of all, the future is mobile, and there’s little dispute about this. Desktop machines will only be used for very heavy, specialized purposes, the same way trucks are used today. Most people will just own fast-enough mobile, portable devices rather than desktops. Even tablets will hit the ceiling of what people are willing to carry with them. Basically, anything bigger than a smartphone with a 4.5″-5″ screen will be too much to carry around. It will be seen no differently than the way we feel today about 1981’s businessmen carrying around the Osborne 1 “portable” computer.

The biggest problem such small devices will face is their minuscule screen, while the web expands to SXGA and higher resolutions as a minimum. Sure, they will all use very high-resolution panels (1080p is around the corner), but the small physical screen poses problems for applications that require a lot of widgets and a large working area of screen real estate (e.g. serious video/photo editors, 3D editors, complex JavaScript web apps, etc.).

There are two ways to deal with the problem. One is holographic projected displays. Unfortunately, we’re technologically far from anything like that. The second way, which is closer within our reach, is a display projected via glasses.

The idea is this:
- A smartphone in the pocket connects wirelessly, via Bluetooth or a similar protocol, to special glasses.
- When the glasses are activated, a transparent, high-resolution computer screen is shown in front of the user, at a certain perceived distance.
- The glasses feature an HD camera (or dual HD cameras for 3D, located where each eye would normally be) that captures the real world ahead of the user at 120 fps. The “real world” view is overlaid with the computer screen view. This way, the user can still walk the streets of NY and use his cellphone at the same time, without having to look down at it. Update: There’s no reason to capture/overlay the real world; you can have transparent glass that changes its transparency to solid on demand. We already have the technology.
- Input is done using gestures: the user places his fingers on the virtual screen, and by “reading back” their position via sonar (“acoustic location”) and/or the cameras, the phone knows what was clicked and carries out the action.
- Voice recognition will be an alternative way to use the system; there have been major strides in it over the last 2-3 years.
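The gesture step above boils down to a hit test: once the tracking hardware (cameras or sonar) reports a fingertip position, the system checks whether it lies on the perceived screen plane and, if so, which virtual widget it touches. Here is a minimal Python sketch of that idea; the `Widget` class, `hit_test` function, coordinate system, and all dimensions are hypothetical, and real fingertip tracking is assumed to be handled elsewhere.

```python
from dataclasses import dataclass

@dataclass
class Widget:
    """A rectangular control on the virtual screen (metres, head coordinates)."""
    name: str
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

SCREEN_DISTANCE = 0.6   # assumed perceived distance of the virtual screen (m)
TOUCH_TOLERANCE = 0.05  # how close a fingertip must be to the screen plane (m)

def hit_test(fingertip, widgets):
    """Given a tracked fingertip (x, y, z) with z = depth from the glasses,
    return the name of the widget being 'touched', or None."""
    x, y, z = fingertip
    if abs(z - SCREEN_DISTANCE) > TOUCH_TOLERANCE:
        return None  # finger is not on the perceived screen plane
    for widget in widgets:
        if widget.contains(x, y):
            return widget.name
    return None

buttons = [Widget("answer_call", 0.00, 0.00, 0.10, 0.05),
           Widget("hang_up",     0.12, 0.00, 0.10, 0.05)]
print(hit_test((0.05, 0.02, 0.61), buttons))  # fingertip touching "answer_call"
```

The same hit-test logic works regardless of whether the fingertip position comes from stereo cameras or acoustic location; only the tracking front-end changes.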

Of course, it would look a bit funny at first, seeing people on the streets wave their hands around like idiots, but if enough of them bite (and Apple has an uncanny way of making people try new things), it can be deemed “normal.” Besides, silliness didn’t stop people from wearing Bluetooth headsets that made them look like they were talking to themselves. And Bluetooth headsets are less useful than this idea, which could greatly improve productivity and usability on the go.

So, I expect such a device to be a reality before 2015. We already have the technology to build all of this; it’s just that the experience won’t be perfect yet. It’s only a matter of time, though.

Soon, the glasses will just be transparent screens that double as your regular reading glasses, so they won’t look outlandish or silly at all. The smartphone itself will shrink into a wristwatch-like device, since the main interface (previously optional) will have moved to the glasses. Its other use would be to act as a webcam.

After that, the whole system will move inside your eye (like Futurama’s “eyephone”). And you won’t need your hands to control your device anymore: a simple implant at the back of your head (or integrated into the “eyephone” itself) will be able to read brainwaves. We already have this technology, on a smaller scale, mostly for medical purposes.

By 2050 (maybe sooner), you won’t even need a mobile device with you to pair your eyes with. The device inside your eyes will be able to wirelessly connect to your data/network, using your body as an antenna for the closest “tower”. It (you) will be a dumb terminal for the most part.

The era of true Cyborgs will begin.

1 Comment »

Michael C. wrote on October 9th, 2010 at 12:29 AM PST:

The helmet mounted sight with eye tracking was pioneered in MiG-29 about 20 years ago, so no hand-waving will be needed.
