A Vision for Modern Manufacturing
Deep Learning for Electronics Inspections
By Kelly McSweeney
Powerful robots will soon get artificial eyes and a brain. This next-gen upgrade is what engineers at Northrop Grumman are developing through sophisticated deep learning algorithms. Building on the automation already used in manufacturing, machine vision will allow robots to autonomously inspect and classify electronic components.
Currently, an operator has to sit at a microscope all day to inspect electronic parts such as circuit boards for tiny defects. A speck of dust, for example, may cause failure if trapped inside a circuit board. And when safety is the utmost priority, a circuit board inside a military flight vehicle must be perfect.
“If the circuit card fails, you’re causing the system to fail, which is a pretty serious consequence,” said Ashley Strong, automation engineer at Northrop Grumman. Strong is part of a small team working on a machine vision algorithm to give robots the ability to see and decide. The system will enhance the automation that Northrop Grumman already uses in manufacturing. In other words, powerful robots will soon get artificial eyes and a brain.
Prepping Robots for Accuracy Using 3D Simulation and Digital Twins
“We’re automating the dirty, dull and dangerous tasks that operators are currently doing manually,” Strong said.
Strong and her colleagues create automated stations that use six-axis robots. They also use 3D simulation software to test different scenarios before real-life application. For example, before equipping a robot with a new tool (called an “end effector” in robotics lingo), they run a simulation to ensure the tool won’t accidentally interfere with a nearby robot or person.
“We use a ‘digital twin’ to put our actual station into a virtual environment and run checks and begin our development process prior to actually receiving any product or hardware,” she said.
Robots are great at doing the same thing repeatedly, which is why they’re common fixtures in industrial settings such as automotive and aerospace manufacturing. But until recently, inspections were too nuanced to automate, so the task was left to humans. Ideally, robots would both build the product and inspect it, bringing the same consistency to both steps.
Designing Upgrades for Robots: Machine Vision
Strong and her colleagues are designing a machine vision system that can be built into new robots, added to existing robots or run entirely on its own.
If a human operator spots a defective part under a microscope, they can quickly give the product a failing grade. For a computer, this is a two-step process: first the camera photographs the product, then the machine learning algorithm analyzes the image for defects and decides whether to “pass” or “fail” it. Because these inspections are more nuanced than a black-and-white pass/fail, the computer must learn the subtle distinctions a human inspector makes. This is why Strong and her team turned to a deep learning algorithm to capture the intricacies of the inspection.
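The two-step flow described above can be sketched in a few lines of Python. Everything here is a hypothetical stand-in: the camera capture is faked, and the defect-scoring model is a trivial placeholder for the deep network the article describes, not Northrop Grumman's actual system.

```python
def capture_image(camera_id: int) -> list[list[float]]:
    """Step 1: the camera photographs the product.
    Here we fabricate a tiny grayscale image; a real station
    would grab a frame from an industrial camera."""
    return [[0.1, 0.9], [0.2, 0.8]]


def defect_score(image: list[list[float]]) -> float:
    """Step 2: a trained model scores the image for defects.
    A real system would run a deep network; this stand-in just
    averages pixel intensity as a placeholder."""
    pixels = [p for row in image for p in row]
    return sum(pixels) / len(pixels)


def inspect(camera_id: int, threshold: float = 0.6) -> str:
    """Combine both steps into a pass/fail decision."""
    image = capture_image(camera_id)
    return "fail" if defect_score(image) >= threshold else "pass"


print(inspect(camera_id=0))
```

The point of the sketch is the division of labor: image acquisition and defect analysis are separate stages, and the final verdict is just a threshold on the model's score.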
The team is currently collecting large amounts of data for deep learning. The more data collected, the easier it is to train the algorithm to think like a human. But unlike a human, once the robot is equipped with machine learning capabilities, it can repeat the inspection process endlessly without getting tired, bored or distracted.
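To show what "collecting data for deep learning" means in practice, here is a hedged sketch of preparing labeled inspection records for training. The file names, labels and 80/20 split are illustrative assumptions, not details of the team's actual process.

```python
import random

# Each record pairs an image with the label a human operator assigned,
# so the model can learn from past inspections.
labeled_data = [
    {"image": f"board_{i:03d}.png", "label": "pass" if i % 4 else "fail"}
    for i in range(100)
]

random.seed(0)
random.shuffle(labeled_data)

# Hold out 20% for validation so the model is judged on boards
# it has never seen, mimicking a real inspection.
split = int(len(labeled_data) * 0.8)
train_set, val_set = labeled_data[:split], labeled_data[split:]

print(len(train_set), len(val_set))  # 80 20
```

The more labeled records like these the team gathers, the more of the inspection's subtle variation the training set can cover.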
Northrop Grumman’s six-axis robots are outfitted with machine vision for inspecting electronic parts, but in the near future this inspection will be taken to the next level with deep learning. Strong also suggested that the same technology could be used for predictive maintenance on machines and trend analysis to improve performance. For example, if a certain part is placed a centimeter to the left, could it improve overall performance? “We can find trends to ultimately affect the manufacturing process to make the performance better,” she said.