Grasping Robots Compete to Rule Amazon’s Warehouses


From Wired:

Amazon employs 45,000 robots, but they all have something missing: hands.

Squat wheeled machines carry boxes around in more than 20 of the company’s cavernous fulfillment centers across the globe. But it falls exclusively to humans to do things like pulling items from shelves or placing them into those brown boxes that bring garbage bags and pens and books to our homes. Robots able to help with so-called picking tasks would boost Amazon’s efficiency—and make it much less reliant on human workers. It’s why the company has invited a motley crew of mechanical arms, grippers, suction cups—and their human handlers—to Nagoya, Japan, this week to show off their manipulation skills.

The Amazon Robotics Challenge starts Thursday and tasks teams with picking up objects ranging from towels to toilet brushes and moving them between storage bins and boxes. The handiest contestants stand to win prizes from a pool totaling $250,000—and perhaps a shot at helping refine what happens when you ask Alexa to restock your paper towels. The showdown is taking place in Nagoya because it is part of this year's RoboCup, a festival of robotics competitions that includes events for rescue, domestic, and soccer robots.

. . . .

One change Amazon has made to this year's contest is to give the robots less space to work with than in previous years. They now have to deal with objects right next to or on top of each other, as a human worker packing a bin of varied products into a box might. A bigger change is that half the objects a robot has to handle in a given round of the contest will be revealed only 30 minutes before it starts.

That’s a headache for the teams but a better match for conditions inside Amazon’s warehouses, where grasping robots will need to be quick studies. A fulfillment center might receive tens of thousands of new objects every day, says Alberto Rodriguez, a roboticist at MIT who is part of an advisory committee that helped Amazon design this year’s contest. Teams have had to develop workflows in which photos of new objects, snapped from different angles, are fed into machine learning software so a robot can figure out how to grab something it had never seen half an hour earlier.
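The quoted workflow — registering a never-before-seen object from a few photos, then recognizing it minutes later — can be sketched in miniature. This is not any team's actual pipeline; it is a toy illustration where a random projection stands in for a learned feature extractor, and recognition is nearest-neighbor matching over the stored feature vectors. The names (`ObjectLibrary`, `register`, `identify`) and the fake 64-pixel "images" are all hypothetical:

```python
# Toy sketch of the "quick study" workflow: register a new object from a few
# photos shortly before use, then identify it by nearest-neighbor matching.
# NOTE: the feature extractor here is a fixed random projection, a stand-in
# for the learned vision models real teams would use.

import numpy as np

rng = np.random.default_rng(0)
PROJECTION = rng.standard_normal((64, 16))  # hypothetical feature extractor


def features(image: np.ndarray) -> np.ndarray:
    """Map a flattened 64-'pixel' image to a unit-norm 16-d feature vector."""
    v = image @ PROJECTION
    return v / np.linalg.norm(v)


class ObjectLibrary:
    """Stores labeled feature vectors; new objects can be added at any time."""

    def __init__(self) -> None:
        self.labels: list[str] = []
        self.vectors: list[np.ndarray] = []

    def register(self, label: str, photos: list[np.ndarray]) -> None:
        # Each photo (a different "angle") contributes one reference vector.
        for photo in photos:
            self.labels.append(label)
            self.vectors.append(features(photo))

    def identify(self, photo: np.ndarray) -> str:
        # Return the label whose stored vector is most similar (cosine) to
        # the query photo's features.
        q = features(photo)
        sims = [float(q @ v) for v in self.vectors]
        return self.labels[int(np.argmax(sims))]


# "30 minutes before the round": two never-seen objects arrive as photos.
towel = rng.standard_normal(64)
brush = rng.standard_normal(64)

lib = ObjectLibrary()
lib.register("towel", [towel + 0.05 * rng.standard_normal(64) for _ in range(3)])
lib.register("toilet brush", [brush + 0.05 * rng.standard_normal(64) for _ in range(3)])

# During the round: a fresh, slightly different view of the towel.
print(lib.identify(towel + 0.05 * rng.standard_normal(64)))
```

The point of the sketch is the shape of the workflow, not the matching method: new objects enter the library incrementally, so the system never needs retraining from scratch when a fulfillment center's inventory changes.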

Link to the rest at Wired