Google’s New Robot Is Better at Tossing Things Than You Are

Robots have been able to pick up and move objects for decades, but only if they were programmed with exact instructions to complete the task. Neural networks have opened up new frontiers in robotics research, giving machines the ability to cope with oddly sized objects and unexpected conditions. Google sold off some of its robotics projects a few years back and refocused its efforts on machine learning. Now, Google AI researchers have built a simple robot arm that understands the physics of tossing objects, which is more impressive than you might assume.
We all know how to toss something into a bin — it’s a skill you pick up as a child. There’s an intuitive aspect to it that’s hard to teach a machine, though. You might change the way you hold an object based on its shape or density, and the weight affects where in the swing you release your grip. The “TossingBot” from Google’s robotics lab uses multiple neural networks to toss objects even more effectively than the average engineer.
When TossingBot was first presented with a bin of random items, it didn't know how to throw them into the right containers. It learned how to deal with those objects by integrating simple physics with deep learning. The system has an overhead camera feed showing the tossable objects and three neural networks that tell it what to do. One network recognizes objects inside the bin, a second determines how to pick them up, and a third determines how to throw each one into the right container. The networks are refined by repeated attempts, most of which are failures early on. According to the engineers, TossingBot is now better at throwing things than the humans who designed it.
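The "simple physics plus deep learning" combination can be illustrated with a short sketch: basic projectile motion gives a baseline release speed for a given target distance, and a learned per-object correction is added on top of it. This is only an illustration of the idea, not Google's implementation; the function names and the `velocity_correction` feature below are hypothetical stand-ins, and the real system predicts its corrections from camera images.

```python
import math

G = 9.81  # gravitational acceleration in m/s^2

def ballistic_release_speed(distance, drop=0.0, angle=math.radians(45)):
    """Release speed (m/s) so a projectile launched at `angle` lands
    `distance` meters away, `drop` meters below the release point,
    ignoring air resistance.

    From d = v*cos(a)*t and -drop = v*sin(a)*t - 0.5*g*t^2:
        v^2 = g*d^2 / (2*cos(a)^2 * (d*tan(a) + drop))
    """
    denom = 2 * math.cos(angle) ** 2 * (distance * math.tan(angle) + drop)
    if denom <= 0:
        raise ValueError("target not reachable at this launch angle")
    return math.sqrt(G * distance ** 2 / denom)

def learned_residual(object_features):
    """Stand-in for the learned per-object correction. A trained network
    would predict this from the overhead camera; here it is a fixed value
    read from a hypothetical feature dictionary."""
    return object_features.get("velocity_correction", 0.0)

def release_speed(distance, drop, object_features):
    # Physics supplies the baseline; the learned term nudges it per object,
    # e.g. to compensate for how a screwdriver's weight shifts mid-throw.
    return ballistic_release_speed(distance, drop) + learned_residual(object_features)
```

The appeal of this split is that the physics term generalizes to any target location immediately, so the learned part only has to account for object-specific quirks such as grip point and mass distribution.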
According to Google, tossing can be a good strategy for a robot when moving certain (non-fragile) objects. We humans do it all the time because it's faster and gets things to areas we can't reach. That's also why TossingBot is designed to aim for compartments outside its physical reach. It can process more than 500 objects an hour, whereas a robot that sets things down can only manage 200 to 300 per hour.
Google’s also designing these new robots to be good at their assigned tasks and not to look like humans or animals. We’ve all seen the weirdly organic robots from Boston Dynamics, which Google sold several years ago. Those machines might be able to navigate a human-dominated world more effectively, but they’re not particularly good at any specific task. Atlas can pick up a box and move it to another location, but the TossingBot can sort through a whole bin of different objects in the same amount of time. The warehouses of the future will probably be automated by devices like TossingBot powered by neural networks rather than humanoid Atlas robots.