Liberty and fraternity: the story of robots and vision

There’s a liberation taking place in factory automation – one that is winning new freedoms and accessibility to robots for users in almost every kind of production environment, says Neil Sandhu of Sick.

Robots need ‘eyes’ in the form of machine vision, and the two technologies have always developed symbiotically. Now, together, they are shrugging off the shackles of their somewhat exclusive, and often expensive, past and are being delivered into the hands of the many. But eyes can’t just be ‘bolted on’; you need to add intelligence and communication to make them guides. It’s these robot guidance systems that have the potential to open up countless more applications to many more new users and uses – replacing all those heavy or repetitive manual tasks, such as picking from a bin or feeder and tending at a machine.

Just as importantly, for vision to truly bring the power of robotics to the people, vision systems need to be simple to install and use. Until recently, most vision systems required a heavy investment in expert support to design and install, and a great deal of programming knowledge and external computing power to set up. Now vision systems are becoming ‘plug and play’ – easy to install and commission.

There are plenty of cameras to choose from, and they are all very good at what they do. But at Sick, a few years ago, we started to think about vision systems in a slightly different way, thinking about a camera as being like a smart phone. We all have our favourite smart phones and they are all, essentially, quite similar. We treat them as a ‘blank canvas’ onto which we download the ‘app’ we need.

So, we started to think about vision sensors in the same way. We developed a range of cameras, together with a concept where a developer or programmer can create the functionality needed for the application, fully supported by remote communication to a cloud-based gateway. The solution lies in the software, not the hardware, and it is deployed onto the camera just like an app onto a smart phone. The end user doesn’t need to worry about programming it or getting it up and running. What they get is the device with the app – out of the box – and the set-up is intuitive, just like when you buy a new phone.

The beauty of this approach is that a small range of cameras can be put to work on a myriad of different robot guidance tasks, and even be adapted to switch seamlessly between more than one application. We are excited about the potential of this software development platform, called Sick AppSpace, to open up the accessibility and usability of more 2D and 3D vision-guided solutions to machine builders and end users alike.

In a typical cobot application developed in AppSpace, a single camera with an app talks to the robot. It can be ‘trained’ to find the shape of a part or product and tell the robot how to pick it up and where to place it, very accurately. The vision system tells the robot where the part is in the X coordinate, in the Y coordinate, and even in its rotation. Critically, the camera talks directly to the robot; there is no control system in between. Thus, we can easily support belt picking, picking from feeders, packaging, robot machine tending and picking up kits of parts. We can train all these things into the system, and it’s very easy for the user to configure, because it is just like using an app.
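To make the idea concrete, the sketch below shows the general shape of that direct camera-to-robot exchange: a robot-side program requests the pose of the next part from the vision sensor over TCP/IP and converts the reply into a pick command. The command string, reply format, addresses and helper names are hypothetical placeholders for illustration, not Sick’s actual protocol.

```python
# Purely illustrative sketch: a robot-side client asks a vision sensor for the
# pose of the next part over plain TCP/IP and turns the reply into a pick
# command. The "LOCATE" command, the "x;y;rotation" reply format, the sensor
# address and move_to_pick() are all hypothetical, not Sick's protocol.
import socket

SENSOR_ADDR = ("192.168.0.10", 2114)   # hypothetical sensor IP address and port


def request_pick_pose():
    """Ask the sensor where the next part is: x and y in mm, rotation in degrees."""
    with socket.create_connection(SENSOR_ADDR, timeout=2.0) as conn:
        conn.sendall(b"LOCATE\n")                  # hypothetical trigger command
        reply = conn.recv(1024).decode().strip()   # e.g. "152.4;87.1;-12.5"
    x, y, rotation = (float(field) for field in reply.split(";"))
    return x, y, rotation


def move_to_pick(x, y, rotation):
    """Placeholder for the robot motion command built from the located pose."""
    print(f"Pick at x={x:.1f} mm, y={y:.1f} mm, rotation={rotation:.1f} deg")


if __name__ == "__main__":
    move_to_pick(*request_pick_pose())
```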

In the case of Universal Robots (UR), Sick has also worked to develop an interface that makes configuring a vision task on a cobot very easy indeed. The Sick Inspector URCap software has been developed to ensure easy integration between a UR3, UR5 or UR10 robot and the Sick Inspector PIM60 2D vision sensor. It is a simple yet powerful toolkit for creating vision-guided robot pick and place, quality inspection and measurement with minimum time and effort.

The Sick Inspector PIM60 URCap is quick and easy to program and configure without the need for a separate PC or specialist software expertise. Standard configurations such as changing jobs and pick-points, calibration and alignment are done directly from the robot control pendant, making everyday operations fast and straightforward. More advanced operations, such as inspection and dimension measurement of objects prior to picking, can be done through SOPAS – Sick’s standard device configuration tool. The Sick Inspector URCap is also ready to expand through extra data fields that can accommodate results from both detailed object inspections and measurements.

To fully appreciate the scope of this functionality, cast your mind back to a game you may have played in your childhood called ‘pick up sticks’. You may even remember ‘Jack Straws’, a version of the game where you had to pick up mini plastic spades, tools, crutches and swords from a pile, one at a time, using a tiny metal hook. Even quite a small child has the keen vision and dexterity to pick up the uppermost piece from a pile of randomly-arranged objects without disturbing the others. But this simple game might seem like the ultimate task for a robot to conquer.

In fact, there are already numerous automated industrial applications where robots are the perfect candidates for the job of picking randomly-arranged parts or products. The need to pick up components that have been delivered to the factory in a container, bin or stillage and transfer them onto a conveyor belt for onward processing is a very common task. However, until recently it would have taken a great deal of money, programming complexity and sophisticated, heavyweight robot hardware to replicate a task that is, literally, child’s play. Previously, most 3D part localisation systems were developed for larger-scale, heavyweight industrial robot applications, many in the automotive industry. Now, using the AppSpace software development platform, Sick has developed both 2D and 3D vision-guided part localisation systems that can be deployed flexibly on smaller-scale robots and cobots. This sort of robotics is not a high-speed substitute for manual picking, but replaces a human’s repetitive and mundane task with a safer, highly consistent alternative. It opens up new applications, such as picking specific small parts like bolts from a deep mixed-parts bin and placing them on a conveyor, or selecting part-completed items and placing them on a press or machining centre.

The Sick PLOC2D is an easy set-up vision system for 2D localisation of parts, products or packages to be picked from a static workstation, moving belt, or feeder system. The Sick PLB 520 uses a stereoscopic vision camera to enable 3D vision-guided bin picking applications of much smaller objects than was previously possible.

The PLOC2D and the PLB 520 have been developed to be directly compatible and simple to integrate with most leading industrial robot systems, including cobots such as Universal Robots. They can be rapidly and easily connected directly to the robot control, without programming skills or training, and are ready to use almost immediately. With installed software and an SD card, both systems have an easy-to-use interface compatible with webserver, Ethernet TCP/IP robot and PLC interfaces, allowing on-site or remote configuration.
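As a purely illustrative example of what remote configuration over such an interface might look like, the sketch below switches the active job on a sensor through an assumed HTTP endpoint on its built-in webserver. The URL path and parameter name are invented for the example and are not taken from Sick’s documentation.

```python
# Illustrative only: change the active job on a vision sensor from a remote host
# via its built-in web interface. The "/cmd?job=" endpoint and the sensor address
# are assumptions made for this example, not a documented Sick API.
import urllib.request

SENSOR_URL = "http://192.168.0.10"   # hypothetical sensor address


def select_job(job_id: int) -> str:
    """Request that the sensor load a different picking or inspection job."""
    with urllib.request.urlopen(f"{SENSOR_URL}/cmd?job={job_id}", timeout=2.0) as response:
        return response.read().decode()


if __name__ == "__main__":
    print(select_job(2))   # e.g. switch from belt picking to machine tending
```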

In the future, greater product diversification, customisation and smaller production batches will become the norm across many manufacturing environments. As part of this process, seamless connectivity between auto-identification and imaging devices is also essential. Software integration platforms such as Sick’s 4DPRO facilitate this integration. Real-time communication and data transfer can be enabled between devices, along with full connectivity to the factory network and compatibility with all standard communications protocols and fieldbuses.

Fully-connected robot vision guidance systems collect, record and store data ‘in the cloud’. In this way, even simple systems have the power to monitor and track quality inspection trends, provide system diagnostics and be accessed remotely in real-time.

The ‘democratisation’ of vision-guided automation is underway. Soon, almost no job will be too small for your robot to be given at the beginning of a shift – maybe even switching between jobs as needed all along the production, packaging and warehouse process. With off-the-shelf hardware and ready-to-use ‘apps’, the versatility and flexibility promised for end users makes the future look very exciting indeed.
