This Robot Assembled an IKEA Chair in 20 Minutes
A pair of stationary robotic arms successfully executed the roughly 50 steps required to put together an IKEA STEFAN chair.
Anyone who’s ever assembled IKEA furniture knows how daunting the task can be. The pile of loose parts, the wonky shapes, and the pages of instructions that are at once rudimentary and confusing are enough to make you second-guess your intelligence and spatial reasoning.
A new study in the journal Science Robotics shows that robots can do it just as well as humans. Built with off-the-shelf hardware, 3D cameras, and force sensors, two factory robot arms put together a STEFAN chair from IKEA in about 20 minutes.
The system demonstrates that these assembly line robots can use a combination of different skills, including vision, touch, and force, to undertake complex tasks originally designed for people. Factory robots may be capable of working in unstructured settings, which could bring automation to areas of manufacturing, such as parts of the electronics and aircraft industries, where it currently doesn’t exist.
If the idea of a robot building a piece of IKEA furniture sounds familiar, it’s been done before. Back in 2013, a team from MIT used two mobile robots to assemble a table. That system, however, required customized grippers on the ends of the robot arms and a motion-capture setup with reflective markers on the table that the vision software could track.
“We didn’t customize anything for this task,” team leader Francisco Suárez-Ruiz, a research fellow at Nanyang Technological University in Singapore, told Seeker.
The setup included two stationary robotic arms affixed to tabletops and positioned across from one another, about three feet apart. A stereo camera was set up on a tripod about five feet away. The device is actually two cameras that film the same scene, each from a slightly different position. Software compares the pixels in the two views and, based on the differences between them, determines how far away objects are. In this way, the robot arms can calculate where they are in relation to each other and to the chair parts placed between them.
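The geometry behind that comparison is simple: the farther apart a point appears in the two views (its disparity), the closer it is to the camera. A minimal sketch, not taken from the study's software, with the focal length and camera baseline as assumed placeholder values:

```python
def depth_from_disparity(disparity_px, focal_length_px=700.0, baseline_m=0.1):
    """Depth from stereo disparity: Z = f * B / d.

    disparity_px    -- pixel shift of the same point between the two views
    focal_length_px -- focal length in pixels (assumed value)
    baseline_m      -- distance between the two cameras (assumed value)
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px
```

With these placeholder parameters, a part whose image shifts by 20 pixels between the two views would sit about 3.5 meters away; nearby parts shift more and resolve to shorter distances.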
From previous work, the researchers had programmed the robot arms to perform numerous elementary tasks, such as picking up an object, turning it, grasping a peg, and inserting it into a hole. But in this task, the robot arms had to combine those skills and figure out the best way to execute them without colliding with each other or snapping the wood.
Like many IKEA assembly projects, this one was a team effort. The researchers would tell the robot to do something, such as, “grab the back of the chair” or “insert the peg.” The robot then had to find the objects, plan the motion, and execute the task.
Of the roughly 20 minutes it took to assemble the chair, locating the right parts took just three seconds in total. Motion planning took the longest: 11 minutes, 21 seconds. An algorithm called Bidirectional Rapidly Exploring Random Tree, or Bi-RRT, allowed the robotic arms to search, computationally speaking, for a feasible path from the beginning of a move to its end. Executing the motions took eight minutes, 55 seconds.
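The idea behind Bi-RRT is to grow one random tree of reachable configurations from the start of a move and another from its goal, and keep extending each toward the other until they meet. This is an illustrative toy version, not the study's planner; the step size, iteration budget, and collision checker are placeholders:

```python
import random

def bi_rrt(start, goal, collision_free, sample, step=0.5, iters=2000):
    """Toy bidirectional RRT: grow trees from start and goal, alternating,
    and return a path once the two trees come within one step of each other."""
    trees = [{start: None}, {goal: None}]  # each tree maps node -> parent

    def dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5

    def nearest(tree, q):
        return min(tree, key=lambda n: dist(n, q))

    def steer(a, b):
        d = dist(a, b)
        if d <= step:
            return b
        return tuple(ai + step * (bi - ai) / d for ai, bi in zip(a, b))

    for _ in range(iters):
        a, b = trees
        q_rand = sample()
        q_near = nearest(a, q_rand)
        q_new = steer(q_near, q_rand)
        if collision_free(q_near, q_new):
            a[q_new] = q_near
            # try to pull the other tree toward the freshly added node
            q_near_b = nearest(b, q_new)
            q_new_b = steer(q_near_b, q_new)
            if collision_free(q_near_b, q_new_b):
                b[q_new_b] = q_near_b
                if dist(q_new, q_new_b) <= step and collision_free(q_new, q_new_b):
                    full = trace(a, q_new) + trace(b, q_new_b)[::-1]
                    return full if full[0] == start else full[::-1]
        trees.reverse()  # swap which tree gets extended first next round
    return None  # no path found within the iteration budget

def trace(tree, node):
    """Walk parent pointers back to the tree's root; return root-to-node path."""
    out = []
    while node is not None:
        out.append(node)
        node = tree[node]
    return out[::-1]
```

In the real system the configurations are joint angles of the arms and the collision checker models the other arm and the chair parts; here any callable that accepts two points works.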
In all, the robot arms had to execute about 50 different steps. During the early stages of the experiment, each robotic arm was programmed to carry out its tasks with precision. But the researchers soon realized that the arms would fight for control when holding the same piece, breaking the wood. There would have to be some compromise. So the researchers programmed one arm to execute the motion with precision and the other to give in a bit if it felt resistance.
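That "give in a bit" behavior can be sketched as a simple compliance rule: when the force sensor reads above some threshold, the yielding arm shifts its commanded position along the force to relieve the stress. This is a hypothetical one-axis illustration, not the study's controller; the threshold and gain are made-up values:

```python
def compliant_correction(force_n, threshold_n=5.0, gain_m_per_n=0.001):
    """Return a position offset (meters) for the yielding arm.

    Below the force threshold the arm holds its commanded position exactly;
    above it, the arm moves proportionally to the excess force, in the
    direction the force is pushing. Threshold and gain are assumed values.
    """
    if abs(force_n) <= threshold_n:
        return 0.0  # small contact forces: stay precise
    excess = force_n - threshold_n if force_n > 0 else force_n + threshold_n
    return gain_m_per_n * excess  # yield along the force direction
```

For example, a 15 N push would nudge this arm 1 cm in the push's direction, while a 2 N contact force produces no correction at all, so the arm stays precise until the piece is actually under stress.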
“At the end, they cooperate,” said Suárez-Ruiz.
The researchers next plan to incorporate machine learning into the system, which could speed up the assembly time. Artificial intelligence could give the camera the ability to recognize parts it has seen before and improve how the arms plan their motions, grasp objects, insert pegs and, perhaps best of all, interpret the instructions.