In virtual reality, headsets are only half the story. To feel fully immersed in their virtual surroundings, users need their physical movements to map accurately to what happens on-screen, which is where motion tracking comes into play.

We've already seen some impressive input tools for VR headsets, most notably HTC's Vive controllers, which are accurate enough to bring an added layer of believability to its VR experience. Nevertheless, Microsoft is working on a future where the only thing we'll need to interact with virtual environments is our hands and a well-placed camera.

Microsoft's Handpose technology is able to accurately track precise, detailed hand movements for interacting with computers, with the ultimate goal of moving users away from the keyboard, mouse and controller. Microsoft believes that accurate hand tracking, alongside speech and facial recognition, is the final ingredient required to let consumers interact with computers more organically.

In virtual reality, this would mean being able to interact with and manipulate even small objects as we do in real life, such as picking up tools, pushing buttons, flipping switches and so on.

Jamie Shotton, a principal researcher in computer vision at Microsoft's UK research lab in Cambridge, said: "How do we interact with things in the real world? Well, we pick them up, we touch them with our fingers, we manipulate them. We should be able to do exactly the same thing with virtual objects. We should be able to reach out and touch them."

Microsoft has been working on its Handpose hand tracking for some time, but released two new videos this week to demonstrate the recent progress it has made. One shows Handpose being used to interact with a virtual control board, with the tracking tech appearing to have no problem picking up the small hand movements needed to manipulate various dials, knobs and sliders. In another, a researcher is able to use a set of virtual turntables.

Handy controls

Microsoft is also looking at how developers could create tools that allow computers to understand specific gestures and associate them with certain functions, with one example being that mimicking hanging up a phone could end a Skype call.

The goal of the research project, called Project Prague, is to enable computers to recognise not just users' hands, but also the intent behind their gestures. According to Microsoft, developers would be able to make custom gestures for their own apps with "very little additional programming or expertise", and the system would work with an off-the-shelf 3D camera.
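Microsoft hasn't published the details of Project Prague's developer interface, but the idea of associating a custom gesture with an app function can be illustrated with a small hypothetical sketch. Every name below (the `GestureRegistry` class, the `"hang_up_phone"` gesture label, the callback shape) is an assumption for illustration, not Microsoft's actual API:

```python
# Hypothetical sketch of a gesture-to-action mapping layer.
# None of these names come from Project Prague's real API, which
# has not been publicly documented in this article.
from typing import Callable, Dict


class GestureRegistry:
    """Maps named gestures to app-supplied callbacks."""

    def __init__(self) -> None:
        self._actions: Dict[str, Callable[[], str]] = {}

    def register(self, gesture: str, action: Callable[[], str]) -> None:
        # A developer associates a custom gesture with a function,
        # mirroring the article's Skype hang-up example.
        self._actions[gesture] = action

    def on_gesture_detected(self, gesture: str) -> str:
        # Would be called by the (hypothetical) hand-tracking runtime
        # once the camera recognises the user's intent.
        action = self._actions.get(gesture)
        return action() if action else "unrecognised gesture"


registry = GestureRegistry()
registry.register("hang_up_phone", lambda: "Skype call ended")
print(registry.on_gesture_detected("hang_up_phone"))  # Skype call ended
```

The point of such a layer is that the app never touches raw hand-pose data; it only names a gesture and supplies the function to run, which is what "very little additional programming" would imply.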

In virtual reality applications, it could be argued that a lack of something tangible to hold could detract from the immersion – say, for instance, when you're meant to be holding a weapon of some kind. Yet because Microsoft's system is so accurate, it allows for input gestures like touching your fingers together, which the researchers believe can give the impression of touching a solid object. The system will also rely on other sensory cues, such as sound and sight, to help convince users they're touching a physical object when they're not.

While we probably won't be seeing the death of the keyboard any time soon, Microsoft believes it is on the cusp of making accurate hand-tracking tools commercially available.

"This has been a research topic for many, many years, but I think now is the time where we're going to see real, usable, deployable solutions for this," said Shotton.