OpenAI Robotic Hand Learns How to Work Without Human Examples

You pick up things so frequently throughout the day that the act seems simple. In reality, it's the end result of a network of nerves, tendons, and muscles that you've honed all your life. Building a robot that can pick things up with the same reliability has proven difficult, and even a small change in an object's shape or position can make a carefully designed robot hand all thumbs. A company called OpenAI says it has developed a robot hand that grips objects in a more human-like way, and it didn't have to be taught by humans: it learned all on its own.

For your entire life, your brain has been learning how to pick up different objects. On a conscious level, there's no difference between picking up a wooden block and picking up an apple; you just do it. Translating human movements directly into a machine would be unnecessarily complicated, so OpenAI decided to skip the human element altogether. Its researchers let a robot hand try and fail, over and over, in a simulation until it slowly learned how to pick up various objects.
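
To make that trial-and-error loop concrete, here is a deliberately minimal sketch in Python. The `GraspSimulator` and `Policy` classes and the single `skill` parameter are hypothetical stand-ins for illustration, not OpenAI's code; the real system trains a neural-network controller with large-scale reinforcement learning.

```python
import random

class GraspSimulator:
    """Toy stand-in for a physics simulator: a grasp succeeds with a
    probability equal to the policy's current skill level."""
    def attempt_grasp(self, skill: float) -> float:
        # Reward is 1.0 for a successful grasp, 0.0 for a failure.
        return 1.0 if random.random() < skill else 0.0

class Policy:
    """Toy stand-in for the learned controller."""
    def __init__(self):
        self.skill = 0.05  # starts out clumsy

    def update(self, reward: float, learning_rate: float = 0.0001):
        # Nudge behavior toward whatever earned a reward; a real RL
        # algorithm would do a gradient update on network weights here.
        self.skill = min(1.0, self.skill + learning_rate * reward)

env = GraspSimulator()
policy = Policy()
for attempt in range(100_000):  # the real system ran vastly more attempts
    reward = env.attempt_grasp(policy.skill)
    policy.update(reward)

print(f"skill after training: {policy.skill:.2f}")
```

The key design point is that no human demonstration appears anywhere in the loop: the only teaching signal is the reward for a successful grasp.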

The simulated robot hand didn't have to operate in real time, so researchers were able to compress about 100 years of trial and error into about 50 hours. It took some serious computing hardware to make that happen: 6,144 CPU cores and 8 GPUs powered the learning phase. OpenAI calls this system Dactyl, and it has moved beyond the simulation.
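
For a rough sense of what that compression means, a quick back-of-the-envelope calculation using only the figures above:

```python
# How much faster than real time did the simulated training run?
simulated_years = 100
wall_clock_hours = 50
hours_per_year = 365.25 * 24                  # ~8,766 hours in a year
speedup = simulated_years * hours_per_year / wall_clock_hours
print(f"~{speedup:,.0f}x real time")          # roughly 17,500x
```

That roughly 17,500-fold speedup comes from running many copies of the simulation in parallel across all those CPU cores, not from any single simulation running that fast.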

Turned loose on a physical robot hand, Dactyl is capable of remarkably human-like movements. Something we take for granted, like spinning an object around to look at the other side, is tedious for most robots. Dactyl does it with ease, though it has advanced hardware to help: the Shadow Dexterous Hand has 24 degrees of freedom, compared with seven for a typical robot arm. The system tracks the position of each fingertip, and a feed from three camera angles helps it determine the object's orientation.
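
As a rough illustration of the kind of sensor data such a controller might consume at each step, here is a hedged sketch. The field names and shapes are assumptions for illustration only, not OpenAI's actual interface:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class HandObservation:
    """Illustrative observation bundle for one control step; the
    layout here is an assumption, not OpenAI's real data format."""
    fingertip_positions: np.ndarray   # e.g. shape (5, 3): x/y/z per fingertip
    joint_angles: np.ndarray          # e.g. shape (24,): one per degree of freedom
    camera_frames: list[np.ndarray]   # three RGB views used to estimate the object's pose
```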

Importantly, this system isn't stuck with a single type of object. It can grip and manipulate anything that fits in its hand. This is called "generalization," and it's an essential aspect of robotics as we integrate machines into our lives. You don't want to have to train a robot to do every single thing it might need to do in a day; ideally, it should be able to figure out a task on its own if it's similar to one it has already performed. For example, if your robot butler can pour your orange juice in the morning, it should be able to pour you a scotch in the evening without being taught each pour separately.

Dactyl isn’t going to pour you any drinks quite yet, but maybe someday.
