I have long been an advocate of going beyond the current paradigms, i.e. the mouse and the desktop.
The current evolution for 2009 is touch screens, popularized by Apple with the iPhone.

March 2009 may be remembered as the real start of a breakthrough in user interfaces, as Vuzix and Metaio have just joined forces to release the first commercial augmented reality user interface.

Portability and pixels to be told by the computer

Vuzix has been developing a "head-up display" that is the ultimate answer for screens. Sitting right on your nose, it combines ultra-portability, or rather wearability, with a huge number of pixels, letting you see as much information as on a desktop monitor.
The revolution is to pair it with a digital camera from Metaio, the "CamAR", which captures what you are seeing and overlays additional information on top of it.

CamAR.jpg
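To make the idea concrete, here is a minimal sketch of such a see-through overlay loop: grab what the camera sees, draw information on top, and send the result to the display. This is only an illustration written with OpenCV in Python, not the real CamAR or Metaio software; the camera index and the label text are placeholders I chose for the example.

```python
# A minimal sketch of the "see-through" overlay idea: read frames from a
# camera and draw information on top before showing them. Uses OpenCV as a
# stand-in for the actual Metaio pipeline; camera index and label are
# assumptions made for this demo.
import cv2

def overlay_loop(camera_index=0):
    cap = cv2.VideoCapture(camera_index)       # the head-mounted camera feed
    if not cap.isOpened():
        raise RuntimeError("No camera found at index %d" % camera_index)
    try:
        while True:
            ok, frame = cap.read()              # what the wearer is seeing
            if not ok:
                break
            # A real AR pipeline would run image recognition to decide what
            # to annotate; here we simply stamp a fixed label on every frame.
            cv2.putText(frame, "Point of interest ahead", (20, 40),
                        cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 0), 2)
            cv2.imshow("wearable display", frame)   # would go to the HUD
            if cv2.waitKey(1) & 0xFF == ord('q'):   # press 'q' to quit
                break
    finally:
        cap.release()
        cv2.destroyAllWindows()

if __name__ == "__main__":
    overlay_loop()
```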



Portability and maneuverability to tell the computer

At this point, you may say that you have already seen this kind of system.
But here, the "be told" part of the interface is complemented by a new way to tell the computer: the PhasAR, a kind of joystick that controls what you are seeing on the wearable screen.

PhasAR.jpg
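The input side is just as simple in principle: poll the controller's axes and turn them into a pan offset for whatever is drawn on the wearable screen. The sketch below uses pygame's generic joystick API as a stand-in, since the actual PhasAR protocol is not documented here; the axis numbers and the 30 Hz polling rate are my own assumptions.

```python
# Rough sketch of the input side: read a hand-held controller's axes and
# map them to a pan offset for the overlay. pygame's joystick API is used
# as a generic substitute for the PhasAR; axis assignments are assumptions.
import pygame

def read_pan_offsets():
    pygame.init()
    pygame.joystick.init()
    if pygame.joystick.get_count() == 0:
        raise RuntimeError("No joystick-like controller detected")
    stick = pygame.joystick.Joystick(0)
    stick.init()
    clock = pygame.time.Clock()
    while True:
        pygame.event.pump()                 # refresh the device state
        dx = stick.get_axis(0)              # left/right, in [-1.0, 1.0]
        dy = stick.get_axis(1)              # up/down, in [-1.0, 1.0]
        # A real UI would shift the overlay or move a cursor on the HUD;
        # here we just print the offsets.
        print("pan offset: %+.2f, %+.2f" % (dx, dy))
        clock.tick(30)                      # poll at roughly 30 Hz

if __name__ == "__main__":
    read_pan_offsets()
```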



From niche innovation to mainstream commercial success

As I said before, this is just the beginning.
The hardware may still be too bulky to win over anyone beyond the people not afraid to look like cyborgs.
The CPU power needed to understand that data as well as what Intel demonstrated long ago at an Asian IDF, while still fitting in a pocket, may not be there yet.
And most of all, no OS has ever been built from scratch for this medium, one that would be to it what Mac OS was to the desktop and what webOS intends to be to the touch palmtop.
It took around 20 years to go from Apple's first personal computer to the definitive mainstream success of personal computers with... Windows 95.
Let's hope that some smart people will make this transition a little faster...