Those who grow their own fruit and veg, or even those who rear their own meat on smallholdings or family farms, will know that homegrown produce is very different from what you find on supermarket shelves. The crates of tomatoes or apples stacked in the fruit and veg aisle of a supermarket are filled with items that are reasonably consistent in terms of size, shape, colour, ripeness, etc. Compare that to homegrown produce, where a whole host of environmental conditions, such as weather or pests and diseases, lead to large variations in the items that are picked and dug up. Supermarkets require uniformity in the products they sell because that is what the customer wants, and they will use various technological methods, including machine vision, to achieve this.
Much of the uniformity in the fruit and vegetables sold by supermarkets relies on growing, harvesting and transport conditions – pesticides, fertilisers and storage conditions all play a key role. Inspection systems using machine vision also play a part, sorting produce and identifying contaminants to ensure homogeneity throughout the product group.
JAI, a digital camera manufacturer based in Copenhagen, Denmark, produces a multi-spectral camera that simultaneously measures both the visible and near infrared (NIR) light spectra through a single lens using two channels. The AD-080CL camera is particularly suited to sorting and grading fruit and vegetables, where the visible channel is used to determine the ripeness of the fruit, whilst images from the NIR channel reveal any bruising or tissue damage under the skin. ‘When the tissue walls collapse the light [from the NIR spectrum] reflects differently from that reflected by healthy tissue,’ explains Gunnar Jonson, director of product marketing at JAI. This allows damaged fruit to be identified.
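The principle can be sketched in a few lines of code: regions that look normal in the visible channel but reflect unusually little NIR light are flagged as potential bruising. The threshold and reflectance values below are purely illustrative, not JAI's actual processing.

```python
import numpy as np

def flag_bruising(nir_reflectance, threshold=0.35):
    """Return a mask of pixels whose NIR reflectance is suspiciously low.

    Collapsed tissue under the skin reflects NIR light differently from
    healthy tissue, so dark regions in the NIR channel can reveal bruising
    that the visible channel misses. The threshold is purely illustrative.
    """
    return nir_reflectance < threshold

# Toy 2x2 NIR image: one dark pixel marks a bruise under otherwise healthy skin
nir = np.array([[0.7, 0.7],
                [0.2, 0.7]])
mask = flag_bruising(nir)
print(int(mask.sum()))  # number of flagged pixels
```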
The system could be set up using two cameras, but this would be a more complex operation, as the two cameras would need to be carefully aligned, and there would be added cost. Jonson says that the AD-080CL is easier to implement from a logistics standpoint and more reliable, since focus and iris are set on a single common lens.
Fruit and vegetable vendors are looking for an even distribution of quality within their pool of produce, explains Jonson. He states that it’s not just separating good from bad either, but also grading the produce to ensure top-quality items are sold for premium prices.
e2v, a manufacturer of electronic tubes, sensors and semiconductors, has developed similar multispectral technology in its latest camera family, the Eliixa. The camera family contains a multi-linear sensor that features either tri-linear RGB, quadri-linear RGB+monochrome, or RGB+near infrared (NIR) light, the latter being particularly suitable for sorting fruit and vegetables. ‘Food processing is a niche application for us [e2v] but is something that the Eliixa is suitable for,’ says Florence Forthoffer, marketing engineer at e2v.
The camera can also be used to check bottles of drink as they pass on a conveyor belt. The NIR vision can see the liquid inside bottles that are opaque and can analyse the contents to ensure quality control standards are maintained.
The Eliixa can also capture images very quickly as produce rolls down a conveyor belt, thanks to the small 20μm distance between its four sensor lines. This two-pixel interline distance is, according to Forthoffer, unique to the Eliixa camera.
A further inspection application that JAI is involved in is sorting eggs. JAI supplies six CVA1 cameras as part of an inspection unit developed by the Danish machine vision company IH Food. The FoodInspector is a fully automated system for inspecting and sorting foodstuffs and has become a standard component in egg processing plants, where it has been dubbed the ‘EggInspector’.
The 1.4 Megapixel cameras are positioned in such a fashion as to capture images from every angle as the eggs roll down a conveyor belt. The cameras monitor the quality of eggs passing through the system and the images are analysed digitally, with complex algorithms identifying any hairline cracks or detritus on the egg’s surface. Eggs that do not pass the inspection are removed by a robot.
‘How the eggs are illuminated is key to identifying cracks in the shell,’ explains Jonson of JAI. Light is shone through the egg from the rear thereby illuminating any cracks or debris. The EggInspector can sort up to 120,000 eggs an hour.
Manual inspection of eggs is an extremely laborious process and is prone to error. By automating the process the level of accuracy in identifying defective eggs increases, the rate of sorting is higher, and an otherwise unpleasant task for humans can now be carried out by a machine. ‘This is a nice example of how vision technology, together with automation, can increase productivity in a process that’s not pleasant for employees,’ comments Jonson.
The volume of dough in automated baking procedures can be monitored using laser triangulation to build up a 3D image. Image courtesy of Sick IVP.
The Swedish company Sick IVP, part of the global company Sick, which sells industrial sensors for factory automation, supplies machine vision systems for many different applications within the food and beverage industry. Its 3D smart camera builds up a 3D image of an object moving along a conveyor belt by a process termed ‘laser triangulation’. Here, a laser line is projected across the surface of the object, whilst a camera views the line at an angle and analyses its shape. Any object moving across the laser line disrupts its profile, with undulations in the object’s 3D geometry reflected in the shape of the line falling across it. Sick’s IVC-3D smart camera processes images of the laser line at a rate of 35,000 profiles per second, thereby building up a 3D image of the object.
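The geometry behind laser triangulation can be sketched in a few lines. Assuming the laser is projected straight down and the camera views the line at a known angle, a raised surface shifts the line sideways in the image by an amount proportional to its height. The parameter values here are illustrative, not Sick's actual calibration.

```python
import math

def height_from_displacement(pixel_shift, pixel_size_mm, camera_angle_deg):
    """Convert the laser line's sideways shift in the image into object height.

    With the laser projected vertically and the camera viewing the line at
    an angle, a surface raised by height h displaces the line sideways by
    h * tan(angle); inverting that gives height from the measured shift.
    All parameters are illustrative.
    """
    displacement_mm = pixel_shift * pixel_size_mm
    return displacement_mm / math.tan(math.radians(camera_angle_deg))

# At a 45-degree viewing angle, sideways shift and height are equal:
# a 10-pixel shift at 0.1 mm/pixel corresponds to a 1 mm bump
h = height_from_displacement(pixel_shift=10, pixel_size_mm=0.1, camera_angle_deg=45)
print(round(h, 3))  # 1.0
```

Stacking one such height profile per laser scan, at tens of thousands of profiles per second, is what builds the full 3D image of the object.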
This method is ideal for measuring the shape of, and identifying any defects in, food produce moving along a conveyor belt. One such application is the portioning of fillets from fish, poultry or meat. Supermarkets demand precision in their portion sizes, so that fillets identical in weight can be sold at a fixed price. Sick’s system calculates the volume, and from this the weight, of each virtual portion as the fillet moves along the conveyor belt. This information is relayed back to a control system, which then instructs an automated knife to portion the fillets accordingly. The system operates to an accuracy of three to five per cent per portion.
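The volume-to-weight portioning logic might look something like the following sketch, which accumulates the volume of each scanned cross-section slice and places a cut whenever a portion reaches its target weight. The slice volumes, density and target weight are hypothetical values, not Sick's actual algorithm.

```python
def portion_cuts(slice_volumes_cm3, density_g_per_cm3, target_g):
    """Return slice indices at which to cut so each portion reaches the target weight.

    Each entry in slice_volumes_cm3 is the volume of one laser-scan
    cross-section slice of the fillet. Weight is volume times density;
    a cut is placed as soon as the accumulated weight reaches the target.
    """
    cuts, accumulated = [], 0.0
    for i, volume in enumerate(slice_volumes_cm3):
        accumulated += volume * density_g_per_cm3
        if accumulated >= target_g:
            cuts.append(i)
            accumulated = 0.0
    return cuts

# Ten uniform 12.5 cm3 slices at roughly fish density, cut into 50 g portions
cuts = portion_cuts([12.5] * 10, density_g_per_cm3=1.05, target_g=50.0)
print(cuts)  # cut positions along the fillet
```

A real fillet tapers, so the slice volumes vary and the cut spacing adapts accordingly; that adaptivity is what keeps each portion within a few per cent of the target.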
Anders Murhed, business development manager at Sick IVP, notes that one of the key applications of this technology is avoiding blockages in the machinery. He cites checking cookie height and shape as an example. ‘It may sound strange, but several companies are doing this,’ he says, going on to explain that even millimetre excesses in height, multiplied across the number of cookies in a box, can lead to packaging malfunctions further down the production line, as the cookies won’t fit into the box.
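The arithmetic behind Murhed's point is simple but worth making concrete: a tolerance that looks negligible per cookie becomes a hard failure once the stack is taller than the box. The heights and box depth below are invented for illustration.

```python
def box_fits(cookie_heights_mm, box_depth_mm):
    """Check whether a stack of cookies will fit within the box depth.

    Sub-millimetre excesses per cookie add up across the whole stack,
    which is why per-cookie height inspection on the line matters.
    """
    return sum(cookie_heights_mm) <= box_depth_mm

# Twelve nominal 10 mm cookies in a 121 mm box: fits
print(box_fits([10.0] * 12, 121.0))
# The same count at 10.5 mm each stacks to 126 mm: no longer fits
print(box_fits([10.5] * 12, 121.0))
```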
The volume of dough in automated baking procedures can also be monitored using laser triangulation. This, again, avoids stoppages in the production line, as any misshapen or wrongly portioned dough can be removed before baking.
The 3D images captured by the IVC-3D camera can be processed using 2D tools allowing further analysis to take place. The camera is used to guide a robotic arm in handling hamburger buns. Murhed explains that touching the buns too much or too roughly could damage them. The 3D camera minimises this by locating the centre of the bun for exact placement of the robot arm. ‘At the same time, the buns are inspected for quality,’ says Murhed, with criteria such as the number of seeds on top of the bun being analysed.
Pick-and-place applications, in which vision sensors guide a robotic arm to sort produce as it travels down a conveyor belt, are a common use of machine vision. Cognex, a global manufacturer of vision systems, supplies vision sensors to direct flex-picker robots from ABB, an engineering company based in Zurich, Switzerland, to process Unilever’s sausage snack, known as Bifi, at its plant in Ansbach, Germany. The sausages move down a conveyor belt and four robots continually pick them and lay them into rows of thermoformed cavities for vacuum packaging. Any sausages stuck together, which would prove difficult for the picker to handle, are identified by the vision sensors and left alone, to be recycled back into the system.
Cognex’s vision systems have been used in inspection applications, such as checking the presence and quality of wafers in chocolate biscuits, or ensuring that moulds used in the production of confectionery are empty and properly cleaned.
‘The demand for traceability has led to a huge growth over the past two years in using vision to check labelling,’ says Leigh Jordan, product sales manager at Cognex UK. In the case of best-before dates, or to ensure that the product marries up with its packaging, optical character verification (OCV) is employed and allows for full traceability of the product. This is especially important when it comes to providing allergy advice for specific food products; the advice printed on the packaging must match the food inside. ‘The only way to implement traceability effectively is to use machine vision,’ states Jordan.
‘Products made by God, such as fruits and vegetables, can be more of a challenge for vision systems,’ Jordan says, due to their inherent variations in measurable criteria. However, he adds that: ‘The advanced software tools on the vision systems can be trained to look for inherent product defects under varying manufacturing conditions.’ Therefore, once a measurable property has been identified, such as colour or shape, a vision system can be programmed to carry out checks on it. Jordan also comments: ‘A lot of food products still come down a high volume manufacturing environment and so are regimented to a certain extent.’ It is due to this that automated systems using vision are so effective in food processing.
Food manufacturers require systems that can work at high speed, notes Jordan. For instance, systems checking fill levels on bottling lines can operate at 20 to 30 parts per second. In addition, the equipment used has to meet IP68 standards: it must be housed so that it can be completely washed down with no detrimental effect on its operation. Much of the equipment used in the food and beverage industry is made of stainless steel to maintain a hygienic working environment.
In the past, product checks using machine vision were implemented only at end-of-line inspections, where the items had already been assembled. In this scenario, the end product could potentially undergo a lot of processing before being rejected at the final inspection for a malfunction that occurred at the beginning of the production process. ‘As the cost of machine vision decreases, we are seeing lots of vision inspection systems being used all along the production line,’ explains Jordan, ‘which allows any faulty items to be removed as soon as the defect appears, maximising profitability.’
Jonson of JAI says that the food and beverage industry in general is cautious when it comes to installing automated systems. The system must be reliable and be able to operate 24 hours a day with very few failures. Any stoppages in the system could result in backlogs of produce in the production line, ultimately costing the company money.
‘Automation in general has been slow to meet these criteria,’ suggests Jonson, and it has only been in the last 10 years that machine vision in fully automated systems has begun to penetrate the food processing industry. Jonson cites McDonald’s automated system for inspecting potato quality, which was installed approximately 15 years ago, as one of the first major automated systems employing machine vision.
The capabilities of machine vision are impressive, with systems designed to sort a wide range of foodstuffs. Jonson cites one example of vision systems being used to sort beans as they fall off the end of a conveyor belt. The cameras capture images while the beans are in mid-air, identify items that do not meet the quality standards, and direct air nozzles to pick these out. He has even seen this type of system sort rice, with air nozzles removing black grains, grit and stones, along with any rice grains in the immediate area of the contamination.
Murhed of Sick IVP cites bakery as one area that will see vision systems for quality control being used more frequently. As the technology develops, the use of vision in the food and beverage industry will continue to grow and support inspection and automated processes in a wide range of applications.
Bees are loaded into a detector, and their responses recorded using a camera. Courtesy of Rothamsted Research
The threat of contamination is a big concern to producers and suppliers of food, whether it is from an item of food going bad or from a rogue substance finding its way into the food. Detecting such contamination could become easier, however, if the goals of a small UK-based firm are realised.
Inscentinel, which is based in Hertfordshire, has devised a novel way to detect chemical compounds in very small quantities. Instead of the traditional electronic sensors, Inscentinel uses honey bees and it believes that they can do a much better job.
The idea behind these sensors is straightforward: when a bee smells food it sticks out its proboscis (or tongue). If the bee expects food in response to another scent then it will stick out its proboscis when it smells that compound too.
Bees are very sensitive to smells and highly selective – they can even discriminate between different qualities of wine. They are also quick and easy to train to different compounds. While a sniffer dog might take months to be trained to spot drugs or explosives, for example, bees can be trained to a compound in a matter of minutes. And, because the bees show a reflex reaction to the expectation of food, they won’t become distracted or bored.
The potential for error – a bee sticking out its proboscis without sensing the compound of interest – is further reduced by using several bees within a detection system. A greater number of bees could also enable several compounds to be detected at the same time. For example, 10 bees might be trained to one compound while another 10 are trained to a different compound.
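The benefit of pooling several bees can be quantified with a simple binomial calculation, under the simplifying assumption that each bee errs independently at the same rate. The 10 per cent per-bee error rate below is an invented figure for illustration.

```python
from math import comb

def panel_error_rate(n_bees, per_bee_error):
    """Probability that a majority of n bees gives the wrong answer.

    Assumes independent, identically distributed errors: sum the binomial
    probabilities of more than half the bees being wrong at once.
    """
    return sum(
        comb(n_bees, k) * per_bee_error**k * (1 - per_bee_error)**(n_bees - k)
        for k in range(n_bees // 2 + 1, n_bees + 1)
    )

# A single bee wrong 10% of the time versus a majority of a 10-bee panel
print(panel_error_rate(1, 0.10))
print(panel_error_rate(10, 0.10))  # orders of magnitude smaller
```

Even with this crude model, a ten-bee panel turns a one-in-ten individual error rate into a roughly one-in-several-thousand panel error rate, which is why multiple bees per detector pays off.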
So how does the Inscentinel system work? Core to Inscentinel’s technology is its system for restraining insects in holders. According to the company, the holders do not hurt the bees and after their time in the holders – anything from a day to five days – they are free to fly away and not be bothered again.
The bees are loaded into a detector and their responses are recorded using a video camera within the detector. Image recognition software on a computer then records the bees’ lengths, which change dramatically when their proboscises are extended. According to the company, the bees almost always agree with each other, giving a simple ‘yes’ or ‘no’ result for a particular compound.
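The length-change detection and majority vote described here can be sketched as follows. The length values and extension threshold are invented for illustration; Inscentinel's actual image-recognition software is not public.

```python
def panel_verdict(bee_lengths_mm, baselines_mm, extension_threshold_mm=2.0):
    """Combine individual bee responses into a yes/no verdict.

    A bee 'votes yes' when its measured length exceeds its resting
    baseline by more than the threshold, i.e. its proboscis is extended.
    The verdict is the majority vote across the panel.
    """
    votes = [
        length - baseline > extension_threshold_mm
        for length, baseline in zip(bee_lengths_mm, baselines_mm)
    ]
    return sum(votes) > len(votes) / 2

# Three of four bees extend their proboscis, so the panel says 'yes'
print(panel_verdict([17.5, 17.8, 17.2, 14.9], [14.8, 15.0, 14.7, 14.8]))
```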
The company has already tested its honey bee approach in a number of food applications. One example that the company is working on with researchers at Rothamsted Research in Hertfordshire is to identify the compounds emitted by oranges in the early stages of their infection by medfly. Honey bees are trained to recognise the smell of infected oranges and discriminate these from uninfected oranges. Without this early detection, costly monitoring programmes are required to ensure that end products such as orange juice are not infected with medfly maggots. Inscentinel is also working with the Norwegian research institute Matforsk to develop a honey bee-based sensor to detect tainted meat products. The company expects food sensors to be a major area of its work in the future.