Xbox One Kinect helps researchers develop sophisticated hand-tracking tech

21.04.2015
Microsoft's research team have created highly detailed hand-tracking technology using the Xbox One Kinect - opening up opportunities for truly immersive virtual reality.

While virtual reality technology has made strides - allowing users to conjure up a beach from a bus stop, ride a roller coaster from their living room and even hurtle into space from the office - the ability to interact with a virtual object on screen has long defeated engineers.

However, researchers are one step closer to interacting with the screen thanks to Microsoft's flexible hand-tracking system, called Handpose.

Handpose works with a completely standard, out-of-the-box Xbox One Kinect, combining the console's single depth camera with software created by the research team to track hand and finger movement.

These movements are then translated to the screen, replicating the exact motion of the hand and fingers in real time.
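The article doesn't detail Handpose's algorithm, but depth-based hand trackers of this kind typically work by fitting an articulated 3D hand model to each incoming depth frame, using the previous frame's pose as the starting point. The Python sketch below is a minimal, hypothetical illustration of that render-and-compare loop; the function names, the 26-parameter pose vector and the naive random-search optimiser are all assumptions made for demonstration, not Microsoft's implementation.

    import numpy as np

    def depth_stream(n_frames=3, resolution=(424, 512)):
        # Stand-in for the Kinect's depth feed; the Kinect v2 sensor
        # delivers 512x424 depth frames at 30 fps. Here we emit random
        # frames purely so the sketch runs end to end.
        rng = np.random.default_rng(0)
        for _ in range(n_frames):
            yield rng.random(resolution).astype(np.float32)

    def render_depth(pose, resolution=(424, 512)):
        # Hypothetical stand-in: a real tracker would rasterise a 3D
        # articulated hand mesh in this pose to a synthetic depth image.
        rng = np.random.default_rng(abs(int(np.sum(pose) * 1e6)) % (2**32))
        return rng.random(resolution).astype(np.float32)

    def energy(pose, observed):
        # How badly the rendered model matches the observed depth frame;
        # lower is better. Real systems use robust per-pixel terms.
        return float(np.mean(np.abs(render_depth(pose) - observed)))

    def track_frame(observed, prev_pose, iters=50, step=0.05):
        # Fit the pose to one frame by naive random search, seeded with
        # the previous frame's pose so tracking stays temporally coherent.
        rng = np.random.default_rng(1)
        best, best_e = prev_pose, energy(prev_pose, observed)
        for _ in range(iters):
            cand = best + rng.normal(0.0, step, size=best.shape)
            e = energy(cand, observed)
            if e < best_e:
                best, best_e = cand, e
        return best

    pose = np.zeros(26)  # assumed 26-parameter articulated hand model
    for frame in depth_stream():
        pose = track_frame(frame, pose)
        print("energy after fit:", energy(pose, frame))

What the sketch captures is the shape of the pipeline - a depth frame comes in, candidate poses are rendered and scored against it, and the best pose is carried into the next frame - which is broadly how a single depth camera can recover full finger articulation in real time.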

While hand movement replication has been seen before from researchers across Europe and the US, Microsoft said that its latest development is truly innovative. Users, it said, can walk several metres away from the camera and their gestures will still be read accurately.

Further, the camera picks up hand gestures even when it isn't held completely stable, making the system well suited to everyday use.

The Kinect already tracks full-body movements, but not details as fine as individual wrist and finger motions. A system like Handpose could eventually be used in gaming systems such as the Kinect, or in a virtual reality headset like the Oculus Rift, allowing people to reach out a hand to lift, move and place objects on the screen - an entirely immersive experience.

It has practical workplace uses too, including robotics. A person could remotely handle hazardous substances in a factory, for example, by moving their hands in front of a screen.

Microsoft said it hoped Handpose would enable robots to mimic the dexterity of the human hand, so that a machine could twist the lid off a jar, for example.

Combining this with artificial intelligence "provides another step toward helping computers interpret our body language, including everything from what kind of mood we are in to what we want them to do when we point at something," the firm said.

The system will be presented at the Computer-Human Interaction (CHI) conference in Seoul this week.

(www.techworld.com)

Margi Murphy
