It’s the latest example of a technology convergence between AR/VR and robotics, one that’s helping reduce longstanding barriers to entry for industrial robots and may help hasten automation adoption among small- and mid-sized manufacturers.
Programming by demonstration (PbD) has been employed in a number of collaborative robots in recent years. But the process is clunky, requiring an end user to physically move a robotic arm through the steps of a task.
Thus far, PbD hasn’t been well-suited to fine manipulation. Instead, the technique has been a starting point in a more complex programming procedure, as Professor Maya Cakmak of the University of Washington, an expert in PbD, told me last year.
“The software is still difficult for most robots,” Professor Cakmak explained. “Some of the more accurate robots out there have 300-page user manuals. I’ve seen some code for these, and you have to know algebra and matrix transformations to still be able to do anything.”
That has kept many companies from investing in collaborative robots: firms unwilling to hire an expert or to bring in contractors every time they change their processes. More effective PbD would open up new markets for the technology, spurring on an already strong industrial automation sector.
Virtual reality and augmented reality may be the missing piece of the puzzle.
A company called Covariant.ai has been combining reinforcement learning with an off-the-shelf VR headset to teach robots how to perform tasks. Researchers at Elon Musk's OpenAI have similarly been using virtual reality to teach robots how to grasp objects and move along prescribed paths.
Concurrent AR/VR advances in motion sensors and time-of-flight cameras, along with sophisticated control systems built for gaming, have created a readily adoptable hardware set that can be employed to train robots.
Sisu's VUDU system enables users to control grasping and articulation speed with a pressure-sensitive trigger on the joystick. With Sixense technology onboard, the system's frame of reference constantly adjusts to the user's position, making it practical and intuitive for an average person to use in a real-world environment.
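To make the two ideas concrete, here is a minimal sketch of that kind of control mapping. This is not Sisu's actual VUDU code; the function names, the linear pressure-to-speed mapping, and the yaw-only user frame are all illustrative assumptions about how such a system could work.

```python
import math

def gripper_speed(trigger_pressure, max_speed=0.2):
    """Scale gripper closing speed (m/s) by trigger pressure.

    Hypothetical mapping: pressure is clamped to [0, 1] and scaled
    linearly, so a light squeeze moves the gripper slowly and a full
    squeeze moves it at max_speed.
    """
    p = min(max(trigger_pressure, 0.0), 1.0)  # clamp to valid range
    return p * max_speed

def to_user_frame(controller_xyz, user_xyz, user_yaw):
    """Re-express a tracked controller position relative to the user.

    Subtracting the user's position and rotating by -user_yaw about
    the vertical axis makes "forward" always mean "away from the user",
    so commands stay intuitive as the operator walks around the robot.
    """
    dx = controller_xyz[0] - user_xyz[0]
    dy = controller_xyz[1] - user_xyz[1]
    dz = controller_xyz[2] - user_xyz[2]
    c, s = math.cos(-user_yaw), math.sin(-user_yaw)
    return (c * dx - s * dy, s * dx + c * dy, dz)

# Example: a controller 1 m "north" of a user who has turned 90 degrees
# reads as being to the user's side, not in front.
print(gripper_speed(0.5))                                  # half speed
print(to_user_frame((1.0, 0.0, 0.0), (0.0, 0.0, 0.0), math.pi / 2))
```

The point of the second function is the one the article makes: without the user-relative transform, a fixed world frame forces the operator to mentally rotate every command as they move around the workcell.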
“In the past, conditions on factory and shop floors have made it nearly impossible to deploy advanced motion capture technology that is essential for enabling easy programming of industrial robots,” said Russell Aldridge, Co-founder and CEO of Sisu. “With Sixense’s motion tracking technology, we are able to bypass these issues, paving the way for countless organizations to increase their efficiency through precision automation.”
It’s a lofty boast, but eliminating programming barriers helped spur a business computing revolution, and it’s likely to have a seismic impact on how readily a large segment of the market adopts automation.