Neal Welch, Mitsubishi Electric Business Manager for Life Science, explores how we interact with collaborative robots and how long it will be until voice activation becomes commonplace.
Automation has always lent itself to the manufacture of mass-produced disposable medical devices and the bulk processing of samples, for two main reasons: initially for speed and repeatability, and then because human presence poses one of the biggest contamination risks in a clean production or processing environment.
We have already reached a point where servo-controlled automation equipment – whether that is a packaging machine, or a robot pick-and-place cell – can move much faster than the physical production or loading process that feeds it. Hence, there is a limiting factor to further increases in speed and repeatability for fully automated processes.
Many repetitive tasks that have thus far been carried out by personnel, however, can still be transferred to robots as they become more flexible, compact and easier to teach. Automated systems are also making use of AI, and as such are becoming better at decision making, whether that is in developing new, more efficient workflows or in arranging for machines to call for service and maintenance on demand, rather than having it carried out on a routine schedule.
In both instances operational efficiency is increased, and as a consequence so is competitiveness and ultimately profitability. What is changing, though, is the level at which a robot can be integrated effectively into an environment that still contains people. Taking the functionality offered by Mitsubishi Electric’s MELFA family of robots as an example, the latest generation combines the benefits of industrial machines such as speed and accuracy, with the safety of cooperative and collaborative robots.
Standard articulated arm and SCARA style robots are available and are designed specifically to work in clean room environments; by adding safety sensors and safety control via the PLC or robot controller they can work at full speed without any physical guarding. When personnel approach for a visual inspection, loading, unloading or maintenance checks, for example, the robot can react by automatically slowing down and then coming to a stop as the person gets closer. As soon as they leave the vicinity and are safe, the robot is back up to full speed without any re-start routine or manual safety interlocks to worry about.
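The slow-down-then-stop behaviour described above can be pictured as a simple speed-and-separation rule: the controller scales the robot's speed from a distance measurement supplied by the safety sensors. The sketch below is illustrative only; the zone sizes and the linear ramp are assumptions for the example, not Mitsubishi Electric's actual safety parameters.

```python
def speed_override(distance_m, stop_zone=0.5, slow_zone=2.0):
    """Return a speed scaling factor (0.0 to 1.0) from the measured
    human-robot separation distance, in metres.

    Inside the stop zone the robot halts; between the stop and slow
    zones its speed ramps down linearly; beyond the slow zone it runs
    at full speed. Zone thresholds here are illustrative only.
    """
    if distance_m <= stop_zone:
        return 0.0            # person too close: stop
    if distance_m >= slow_zone:
        return 1.0            # area clear: full speed
    # Linear ramp between the two thresholds
    return (distance_m - stop_zone) / (slow_zone - stop_zone)
```

Note that the same function also captures the automatic resume: as soon as the measured distance grows past the slow zone, the override returns to full speed with no restart routine required.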
Flexible and programmable
Because robots are flexible and programmable, each application can be programmed to work and interact with people in different modes of operation, changing how they move and react to suit the task the person has to complete, not the other way around. This is already having a significant impact on the market, accelerating the up-take of robots in the medical device manufacturing industry.
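One way to think about these modes of operation is as a lookup from the person's task to a motion profile the robot adopts. The mode names and parameter values below are purely hypothetical, invented for this sketch; they are not part of any MELFA programming interface.

```python
from dataclasses import dataclass

@dataclass
class MotionProfile:
    max_speed: float   # fraction of the robot's rated speed
    guarded: bool      # whether proximity-based slow-down is active

# Hypothetical task modes; names and values are illustrative only
TASK_MODES = {
    "unattended":        MotionProfile(max_speed=1.0, guarded=False),
    "visual_inspection": MotionProfile(max_speed=0.3, guarded=True),
    "manual_loading":    MotionProfile(max_speed=0.1, guarded=True),
}

def profile_for(task: str) -> MotionProfile:
    """Select the motion profile for the task the person is performing,
    so the robot adapts to them rather than the other way around."""
    return TASK_MODES[task]
```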
We are supplying everything from a robot arm that sits on a packaging machine and inserts devices into blister packs ready for sealing, through to a robot arm mounted on an AGV that can move around a product area autonomously delivering parts and assemblies to different workstations, some automatic and others manned.
The next stage is the integration of IIoT and Industry 4.0 style systems, where robots and machines are interconnected in a way that transcends their physical location. Mitsubishi Electric, for example, is already using edge computing and various forms of AI, such as the IBM Watson online AI service, built-in machine learning and physical teaching functions, to establish complex processes quickly without having to hard-code routines and parameters as we have done in the past.
This level of digitalisation has already allowed us to create interactive safety glasses with augmented reality displays for routine servicing and voice activation for robot function control, so it’s safe to assume that in the future, we will be looking at our robots in a different way and talking to them about what they are doing.
To learn more please visit www.automation.mitsubishielectric.co.uk.