But if all of this sounds too energetic, or perhaps dangerous, for an open-plan office, there are other, less exhausting ways of communicating with your computer. For example, systems that track eye movements are already used to help those with disabilities, in applications ranging from controlling wheelchairs to typing in place of a traditional keyboard. Steven Feiner, professor of computer science at Columbia University and a leading augmented reality researcher, believes eye-tracking interfaces will become commonplace. "If you look in a particular place or direction, that could be the signal to call up certain information," he says. "For example, you could look into the top right of your field of vision to call up the time, or perhaps your latest email message."
Under the skin
Augmented reality, as this is known, does away with the need for separate screens by overlaying information from computers directly onto a user's field of vision using glasses, contact lenses or some form of head-mounted visor. "What we are trying to do is integrate information with the world around you, so that instead of looking at information in a separate place to a thing, you see it in the space of the thing," he says.
The field was first envisaged as far back as the 1960s, but only now is the technology becoming available to make it practical. Some recent innovations have come close. Earlier this year Google demonstrated Project Glass, a system which projects information such as incoming calls and emails onto the user's glasses. This isn't really augmented reality because the content is unrelated to what the wearer is looking at. Some argue that mobile phone apps that provide information related to tourist attractions or the night sky when these are viewed through the device's camera also fall short because they track the position and orientation of the camera rather than the user's eye movements.
True augmented reality will allow a maintenance engineer to look at an engine, read repair instructions and see specific components highlighted, all within their field of vision. It could also be used in corporate meetings, or even at dinner parties, to call up people's names and other information about them just by looking at them.
The logical conclusion to this form of augmented reality design might be a system embedded within the user's eyes. While such technology is far enough off to belong in the realm of science fiction, embedding other interface devices into other parts of the body, such as just under the skin, may not be. For example, tiny embedded computers or sensors that interface with the human body itself could prove extremely useful or even lifesaving. Some researchers have already begun to explore this notion, implanting tiny chips under their skin to control doors and lights in a building. Imagine red LEDs lighting up beneath your skin when you've drunk too much to drive, or a device that warns you when your blood pressure or cholesterol level becomes dangerously high.
Of course, these specialist applications are a far cry from something that would replace the keyboard and mouse, says Autodesk's Fitzmaurice. "You could have buttons to control an MP3 player, for example, and the benefit would be that they would always be with you and would never get lost, but would you really want them under your skin?"
In most cases, the answer is likely no. Perhaps it is best to hold on to that QWERTY keyboard for a while longer yet.