Automation: Machines Compete in Amazon Robotic Challenge
08/01/2017
The latest Amazon Robotic Challenge took place July 27-30 in Nagoya, Japan, hosting a competition to develop a grasping machine that can pack boxes for shipment in the company’s warehouses. Or as ZeroHedge put it so clearly: Amazon Hosts Robotics Competition To Figure Out How To Replace 230,000 Warehouse Workers.

The contest used to be called the Amazon Picking Challenge, but the big management brains may have thought it was time for a classier name. I see the earlier URL for the event, AmazonPickingChallenge.org, has been transformed into AmazonRobotics.com.

“Picking” is the term for pulling items from the warehouse inventory to be packed into boxes for customers’ orders. But as the video below shows, packing may be the harder challenge. The robot sucks up objects well enough and then drops them into a large box, with no attempt to use space efficiently. Amazon may have to order a lot of extra large boxes if this sort of machine is adopted.

I’ve blogged about this competition over the last three years and can report no stunning breakthroughs. For example, the robots in the following video aren’t able to pick up the objects on every try:

Below is the winner, Cartman from the Australian Centre for Robotic Vision, which does the basic grab-and-drop pretty well, but no human pickers need to be worried about their jobs just yet.

The upshot is that the amazing dexterity of the human hand, coupled with our brains, is very hard to recreate in a machine. However, the machines are going gangbusters in many other areas of work, from farms to factories, so the fact remains that America should seriously reduce the number of immigrant workers imported by the government. We have plenty of them already.

Amazon’s New Robo-Picker Champion Is Proudly Inhuman, MIT Technology Review, July 31, 2017

It only needs to see seven images of a new object before it can reliably spot and grab it.

A robot that owes rather a lot to an annoying arcade game has captured victory in Amazon’s annual Robotics Challenge.

E-commerce companies like Amazon and Ocado, the world’s largest online-only grocery retailer, currently boast some of the most heavily automated warehouses in the world. But items for customers’ orders aren’t picked by robots, because machines cannot yet reliably grasp a wide range of different objects.

That’s why Amazon gathers together researchers each year to test out machines that pick and stow objects. It’s a tough job, but one that could ultimately help the company to fully automate its warehouses. This year the task was made even harder than usual: teams had only 30 minutes for their robots to familiarize themselves with the objects before trying to pick them out of a jumble of items. That, says Amazon, is supposed to better simulate warehouse conditions, where new stock is arriving all the time and pallets may not be neatly organized.

The winner, a robot called Cartman, was built by the Australian Centre for Robotic Vision. Unlike many competitors, which used robot arms to carry out the tasks, Cartman is distinctly inhuman, with its grippers moving in 3-D along straight lines like an arcade claw crane. But it works far, far better. According to Anton Milan, one of Cartman’s creators, the device’s computer-vision systems were crucial to the victory. “One feature of our system was that it worked off a very small amount of hand annotated training data,” he explained to TechAU. “We only needed just seven images of each unseen item for us to be able to detect them.”
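The claw-crane design is more than a visual gag: a Cartesian gantry's inverse kinematics is trivial, since each motor axis maps directly to one coordinate of the target, while even a simple two-joint arm must solve nonlinear trigonometric equations to reach a point. A minimal sketch of that contrast, assuming illustrative link lengths and function names (none of this is from Cartman's actual software):

```python
import math

def gantry_ik(x, y, z):
    """Claw-crane style robot: each axis position simply equals the
    corresponding target coordinate -- no equations to solve."""
    return {"x_axis": x, "y_axis": y, "z_axis": z}

def two_link_arm_ik(x, y, l1=0.5, l2=0.5):
    """Planar 2-joint arm with link lengths l1, l2 (meters, assumed):
    reaching (x, y) requires the law-of-cosines solution below
    (elbow-down branch shown). Returns (shoulder, elbow) in radians."""
    d2 = x * x + y * y
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if abs(cos_elbow) > 1:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow)
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow
```

The gantry also keeps the gripper's orientation fixed as it moves, which sidesteps another whole class of planning problems that jointed arms face.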

That kind of fast learning is a huge area of research for machine-learning experts. Last year, DeepMind showed off a so-called “one-shot” learning system that can identify objects in an image after having seen them only once before. But the need to identify objects that are obscured by other items and pick them up means that Cartman needs a little more data than that.
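One common way to recognize an object from only a handful of examples is nearest-neighbor matching over feature embeddings: compare a new image's feature vector against the few stored examples per label and pick the closest. The article does not say this is how Cartman works; the sketch below is a generic toy, with short hand-made vectors standing in for real image embeddings:

```python
import math

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def classify(query, support):
    """support maps label -> list of a few example embeddings (the
    'seven images' regime). Returns the label whose closest example
    is most similar to the query embedding."""
    best_label, best_score = None, -2.0
    for label, examples in support.items():
        score = max(cosine(query, e) for e in examples)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

# Hypothetical toy embeddings for two warehouse items:
support = {"mug": [[1.0, 0.1, 0.0]], "sponge": [[0.0, 1.0, 0.2]]}
```

In a real system the embeddings would come from a neural network trained on many other objects, so that only the handful of images of each *new* item needs hand annotation.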

(Read more: TechAU, “Robot, Get the Fork Out of My Sink,” “Machines Can Now Recognize Something After Seeing It Once,” “Inside Amazon”)
