Case Studies


Furniture assembly with Ensenso N35


Who has not experienced this: the new piece of furniture is just waiting to be assembled. You lay out the parts, consult the assembly instructions and… it takes longer than expected!

Scientists at Nanyang Technological University, Singapore (NTU Singapore) have developed a robot that can independently assemble the individual components of a chair without interruption. The robot consists of an Ensenso N35 3D camera and two robot arms equipped with grippers for picking up objects.

Application

Furniture assembly with Ensenso N35
The scientists at Nanyang Technological University

To enable the robot to assemble the IKEA chair, the team from the "School of Mechanical and Aerospace Engineering" wrote algorithms using three different open-source libraries. The robot hardware is designed to mimic how people assemble objects: the "eyes" are replaced by a 3D camera and the "arms" by industrial robot arms capable of moving in six axes. Each arm is equipped with parallel grippers for picking up objects. Force sensors attached to the wrists determine how strongly the "fingers" grip and how firmly they bring objects into contact with each other.

The robot starts the assembly process by taking 3D images of the parts lying on the ground to create a map of the estimated positions of the various components. This task is performed by an Ensenso 3D camera. The camera works according to the "projected texture stereo vision" principle, which imitates human vision. Two cameras acquire images of the same scene from two different positions. Although the cameras see the same scene content, the objects appear at different image positions according to the cameras' projection rays. Matching algorithms compare the two images, search for corresponding points and record all point displacements in a disparity map. From this, the Ensenso software can determine the 3D coordinates of each individual image pixel or object point, in this case the chair components.
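The relationship between disparity and depth described above can be sketched in a few lines. This is a generic illustration of the rectified-stereo relation Z = f · B / d, not the Ensenso software's actual implementation; the focal length, baseline and disparity values below are invented for the example.

```python
import numpy as np

def disparity_to_depth(disparity, focal_px, baseline_m):
    """Convert a disparity map (pixels) to a depth map (metres).

    Rectified stereo relation: Z = f * B / d. Pixels with zero
    disparity (no correspondence found) are marked as NaN.
    """
    disparity = np.asarray(disparity, dtype=float)
    depth = np.full_like(disparity, np.nan)
    valid = disparity > 0
    depth[valid] = focal_px * baseline_m / disparity[valid]
    return depth

# Illustrative values only -- not actual N35 calibration data.
disp = np.array([[16.0, 8.0],
                 [4.0, 0.0]])        # disparities in pixels
depth = disparity_to_depth(disp, focal_px=1600.0, baseline_m=0.1)
print(depth)
```

Note how larger disparities map to smaller depths: points that shift more between the two views are closer to the camera pair.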

Challenge

Furniture assembly with Ensenso N35
Two gripper arms are used to hold the components

The challenge is to locate the components as precisely, quickly and reliably as possible in a cluttered environment. This is ensured by the Ensenso camera's high-intensity projector. Using a pattern mask, it produces a high-contrast texture on the object surface, even under difficult lighting conditions. The projected texture supplements the weak or non-existent surface structure of the IKEA chair's components.

Although not required for this application, the N35 model used here goes one step further: thanks to the integrated FlexView projector technology, the pattern projected onto the components' surfaces can be shifted to vary the texture. Acquiring multiple image pairs of the same scene with different textures produces many more image points. The chair components are thus rendered in 3D at a much higher resolution, making them easier for the robot to recognize. Another advantage is the robot hand-eye calibration function of the Ensenso software. Using a calibration plate, it determines the position of the camera coordinate system (here, the stationary camera) with respect to the base coordinate system (the position of the component). This enables the robot's hand to react precisely to the image information and reach its target exactly.
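What hand-eye calibration buys the robot can be illustrated with a small sketch: once the camera-to-base transform is known, any part position detected in camera coordinates can be mapped into the robot's base frame for grasping. The 4x4 transform and point below are hypothetical values chosen for the example, not the NTU setup's actual calibration.

```python
import numpy as np

def to_base_frame(T_base_cam, p_cam):
    """Map a 3D point from camera coordinates into the robot base frame
    using the homogeneous transform obtained from hand-eye calibration."""
    p_h = np.append(np.asarray(p_cam, dtype=float), 1.0)  # homogeneous coords
    return (T_base_cam @ p_h)[:3]

# Hypothetical calibration result: camera mounted 1.2 m above the base,
# rotated 180 degrees about x so its optical axis looks down at the workspace.
T_base_cam = np.array([
    [1.0,  0.0,  0.0, 0.0],
    [0.0, -1.0,  0.0, 0.0],
    [0.0,  0.0, -1.0, 1.2],
    [0.0,  0.0,  0.0, 1.0],
])

p_cam = [0.10, 0.05, 0.90]  # chair part seen 0.9 m in front of the camera
p_base = to_base_frame(T_base_cam, p_cam)
print(p_base)  # target position for the gripper, in the base frame
```

The same transform is applied to every detected component, so a single calibration with the plate keeps camera measurements and arm motions in one consistent coordinate system.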

"For a robot, putting together an IKEA chair with such precision is more complex than it looks," explains Professor Pham Quang Cuong of NTU. "The job of assembly, which may come naturally to humans, has to be broken down into different steps, such as identifying where the different chair parts are, the force required to grip the parts, and making sure the robotic arms move without colliding into each other. Through considerable engineering effort, we developed algorithms that will enable the robot to take the necessary steps to assemble the chair on its own." The result: the NTU robot installs the "Stefan" chair from Ikea in just 8 minutes and 55 seconds.

Outlook

Furniture assembly with Ensenso N35
Precise assembly of chair through robot arms

According to Professor Pham Quang Cuong, artificial intelligence will make the application even more independent and promising in the future: "We are looking to integrate more artificial intelligence into this approach to make the robot more autonomous so it can learn the different steps of assembling a chair through human demonstration or by reading the instruction manual, or even from an image of the assembled product."

The robot developed by the scientists at NTU Singapore is used for research into dexterous manipulation, an area of robotics that requires precise control of the forces and movements of specialized robot hands or fingers. This demands flawless interaction of all hardware and software components. 3D image processing using Ensenso stereo 3D cameras is a key part of the solution, convincing not only in accuracy but also in economy and speed.

This marks real progress in furniture assembly, and not only there.

Ensenso N35 at a glance: fast and precise 3D vision.

Ensenso N35
  • With GigE interface – versatile and flexible
  • Compact, robust aluminum housing
  • IP65/67
  • Global Shutter CMOS sensors and pattern projector, optionally with blue or infrared LEDs
  • Max. fps (3D): 10 (2x Binning: 30) and 64 disparity levels
  • Max. fps (offline processing): 30 (2x Binning: 70) and 64 disparity levels
  • Designed for working distances of up to 3,000 mm (N35) and variable fields of view
  • Output of a single 3D point cloud with data from all cameras used in multi-camera mode
  • Live composition of the 3D point clouds from multiple viewing directions
  • Pre-calibrated and therefore easy to set up
  • Integrated function for robot hand-eye calibration with calibration plate
  • Integration of uEye industrial cameras on the software side, for example, to capture additional color information or barcodes
  • Subsampling and binning for flexible data and frame rates
  • Integrated FlexView technology for more detailed accuracy of the point cloud and higher robustness of 3D data on difficult surfaces
  • "Projected texture stereo vision" process for capturing untextured surfaces
  • Capture of both stationary and moving objects
  • Free software package with driver and API for Windows and Linux
  • One software package supports USB and GigE models
  • HALCON, C, C++ and C# sample programs with source code

NTU Nanyang Technological University Singapore


School of Mechanical and Aerospace Engineering
50 Nanyang Avenue
Singapore 639798

www.mae.ntu.edu.sg