OpenAI Robotic Hand Learns How to Work Without Human Examples

You pick up things so frequently throughout the day that the act seems simple. However, it’s just the end result of a network of nerves, tendons, and muscles that you’ve honed all your life. Making a robot that can pick things up with the same reliability has proven difficult, and even small changes can make a carefully designed robot hand all thumbs. A company called OpenAI says it has developed a robot hand that grips objects in a more human-like way, and it didn’t have to be taught by humans — it learned all on its own.

For your entire life, your brain has been learning how to pick up different objects. On a conscious level, there’s no difference between picking up a wooden block and an apple. You just do it. Translating human demonstrations into machine instructions would be unnecessarily complicated, so OpenAI decided to skip the human element altogether. It let a robot hand try and fail over and over in a simulation until it slowly learned how to pick up various objects.

The simulated robot hand didn’t have to operate in real time, so researchers were able to compress roughly 100 years of trial and error into about 50 hours. It took some serious computing hardware to make that happen: 6,144 CPU cores and 8 GPUs powered the learning phase. OpenAI calls this system Dactyl, and it’s moved beyond the simulation.
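The core idea of trial-and-error learning can be illustrated with a toy sketch. The snippet below is not OpenAI’s actual training setup (Dactyl used large-scale reinforcement learning in a physics simulator); it is a minimal stand-in in which a single “policy” parameter is nudged at random, and changes are kept only when the simulated reward improves. The `simulate` function and its `ideal` grip value are invented purely for illustration.

```python
import random

def simulate(policy):
    # Toy stand-in for a physics simulator: reward is higher the
    # closer the policy parameter is to an (unknown) ideal grip value.
    ideal = 0.7
    return -abs(policy - ideal)

def train(episodes=5000, step=0.05, seed=0):
    # Random hill climbing: propose a small random change each
    # episode and keep it only if the simulated reward improves.
    rng = random.Random(seed)
    policy = 0.0
    best = simulate(policy)
    for _ in range(episodes):
        candidate = policy + rng.uniform(-step, step)
        reward = simulate(candidate)
        if reward > best:
            policy, best = candidate, reward
    return policy

trained = train()
```

Because the simulator runs faster than real time, thousands of these episodes can be evaluated in moments, which is the same leverage that let Dactyl pack decades of simulated practice into a couple of days of wall-clock time.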

Turned loose on a physical robot hand, Dactyl is capable of remarkably human-like movements. Something we take for granted, like spinning an object around to look at the other side, is tedious for most robots. Dactyl can do it with ease, but it has advanced hardware to help. The Shadow Dexterous Hand has 24 degrees of freedom compared with seven for most robot arms. The robot knows the position of each finger, and three camera feeds help it orient the object.

Importantly, this system isn’t stuck with a single type of object. It can grip and manipulate anything that fits in its hand. This is called “generalization,” and it’s an essential capability as we integrate robots into our lives. You don’t want to have to train a robot to do every single thing it might need to do in a day. Ideally, it should be able to figure out a task if it’s similar to one it’s already performed. For example, if your robot butler can pour your orange juice in the morning, it should be able to pour you a scotch in the evening without being trained on each drink separately.

Dactyl isn’t going to pour you any drinks quite yet, but maybe someday.
