At some point in the evolution of VR and AR, controllers will fade away and the headsets themselves will have sensors that track our bare hands in what’s essentially real time. One firm already has an SDK – launching today – that allows developers to bake hand tracking directly into their apps, using the camera that’s already on your smartphone, tablet or laptop.

Swedish startup ManoMotion has existed for around two years, though the technology underlying its hand tracking has been incubating for around seven. Like many tech companies, ManoMotion aims to take the next steps in making our interactions with technology more human. “How can we make technology more intuitive, more interactive?” asks co-founder and CEO Daniel Carlman, summing up his company’s primary goal. “Make it easier to interact with technology?”

ManoMotion’s SDK (software development kit), which launches today, can use just about any existing device camera to track bare hands in near real time for use in virtual or augmented reality apps – no controllers required.

While the first thing I pictured this being used on was mobile VR – like the Gear VR or Daydream – the company sees its tech being implemented first in smartphone-based augmented reality (AR): services like Snapchat and Facebook, which are expected to keep incorporating more phone AR elements into their apps over the coming months and years.

While it may be neat to hold your phone up to see some virtual characters scurrying around your real environment, wouldn’t it be much cooler to pick those characters up with your real hands? Squeeze them and see them react accordingly? That’s what ManoMotion believes its SDK can do.

“We can understand dynamic gesture,” explains Carlman. “How much you’re grabbing or pushing something. And depth.”

Carlman says the hand tracking is low latency: less than 10 ms lag on iOS and a bit more than 17 ms on a Galaxy S6. Keep in mind that’s a two-year-old Android phone; Carlman says the better the phone’s processing capabilities, the lower the latency. He believes that the tracking performance is in a good place right now and isn’t his company’s top concern at the moment – but that it can be pushed further in the future.

One of the most interesting aspects of ManoMotion’s approach is that it doesn’t require any extra hardware or participation from hardware manufacturers. Any Unity developer can bake it into their app; all they need is access to the phone’s camera – a permission that any AR app would already be using.

Unfortunately we haven’t had a chance to demo the company’s tracking system yet. The accuracy and latency would need to hit a certain bar in order to make it worthwhile: Products like PlayStation VR illustrate how virtual tracking that’s just “almost good” can make for a clunky, frustrating experience for the user. HoloLens is another example: As exciting as the headset’s tech is, its gesture tracking requires precise finger positions to work properly. To be completely intuitive, tracking tech needs to be so smart that it gets out of the way and allows human instinct to roam uninhibited.

At least on paper, though, ManoMotion’s approach sounds like a promising solution that doesn’t require any extra accessories or upgrades.

If you’re a Unity developer who wants to add hand tracking to your app, you can download ManoMotion’s SDK from the company website below. Or if you’re like me and are just intrigued by where AR and VR can go next, perhaps you can take this as a hint of the controller-free future we’ll eventually enjoy inside our virtual and augmented reality.

Developer page: ManoMotion
