VISION 2024: Imaging expertise for a range of applications

Excelitas at VISION 2024

Excelitas displayed a range of products at the recent VISION 2024 in Stuttgart, Germany

Among the many innovations on the Excelitas booth was a demonstration of Swept-Source Optical Coherence Tomography (SSOCT). This laser-based imaging methodology works in a similar way to ultrasound, but uses infrared light rather than sound waves to provide measurements. The penetration depth is less than ultrasound, but the resolution is much higher.

 

OCT has long been used in ophthalmology, but there are emerging uses in industrial applications. The on-booth demo showed some quality control and defect detection examples, including a printed circuit board (PCB), with the SSOCT able to measure the presence and thickness of the spray-on conformal coating.

 

The system works by using a sweeping laser with a centre wavelength of 1060nm, sweeping 50nm either side of this centre wavelength. Light is sent from the laser source, through the fibre optic to a galvo scanner, which raster-scans the beam across the subject. The backscattered light is then detected and used to generate a cross-sectional image.
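The depth information in SS-OCT comes from interference: as the laser sweeps in wavenumber, a reflector at depth z produces fringes whose frequency is proportional to z, so a Fourier transform of the detector signal yields the depth profile (A-scan). A minimal sketch of that principle, using made-up sweep parameters matching the 1060±50nm figures above and a single simulated reflector:

```python
import numpy as np

# Illustrative SS-OCT depth recovery (not Excelitas' implementation):
# fringes recorded across the wavelength sweep are Fourier-transformed
# to locate a reflector. All numbers are for demonstration only.
c0, bw = 1060e-9, 100e-9            # centre wavelength, total sweep (1060 ±50 nm)
wavelengths = np.linspace(c0 - bw/2, c0 + bw/2, 2048)
k = 2 * np.pi / wavelengths          # wavenumber samples across the sweep

z_true = 200e-6                      # hypothetical reflector 200 µm below reference
fringes = np.cos(2 * k * z_true)     # interference fringes: frequency ∝ depth (double pass)

# Resample uniformly in k (np.interp needs increasing abscissae), then FFT.
k_uniform = np.linspace(k.min(), k.max(), k.size)
fringes_u = np.interp(k_uniform, k[::-1], fringes[::-1])
a_scan = np.abs(np.fft.rfft(fringes_u))

# Convert the FFT peak back to physical depth: fringe = cos(2kz), so z = π·f_k.
dk = k_uniform[1] - k_uniform[0]
depths = np.fft.rfftfreq(k.size, d=dk) * np.pi
z_est = depths[np.argmax(a_scan[1:]) + 1]
print(f"recovered depth: {z_est*1e6:.1f} um")
```

With a ±50nm sweep the depth-bin spacing works out to a few microns, which is why the resolution is so much finer than ultrasound.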

 

Separately, a multispectral VIS-SWIR demonstrator comprised a PCO.Pixelfly 1.3 camera with shortwave infrared (SWIR) sensitivity - from 400nm through to 1,700nm - connected to an Optem Fusion SWIR micro-imaging lens system featuring a 7x variable zoom range. Coupled with an LED ring light capable of illuminating in multiple colours, the system can be used for quality control in applications where the human eye would not always be able to detect differences. This demonstration showed a mixture of rice and plastic rice and, when the process was run, the resulting images showed a clear difference between actual and plastic rice (although in reality, the contaminants might be plastic packaging). The system is currently being deployed in multiple applications, including smart farming, where it’s being used to separate weeds from crops using spectral analysis.
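The underlying idea is that materials indistinguishable in the visible band can have very different SWIR reflectance, so a simple per-pixel band comparison separates them. A toy sketch of that logic (not Excelitas' actual pipeline - the band choices and reflectance values below are invented for illustration):

```python
import numpy as np

# Hypothetical per-pixel spectral classifier: two materials identical at a
# visible band but differing in SWIR reflectance. All spectra are made up.
bands_nm = [650, 1200, 1450]                  # one VIS band, two SWIR bands
rice    = np.array([0.80, 0.55, 0.30])        # assumed reflectance per band
plastic = np.array([0.80, 0.75, 0.70])        # same in VIS, brighter in SWIR

rng = np.random.default_rng(0)
n = 100                                       # simulated pixels per material
pixels = np.vstack([rice + rng.normal(0, 0.02, (n, 3)),
                    plastic + rng.normal(0, 0.02, (n, 3))])
labels = np.array([0] * n + [1] * n)          # 0 = rice, 1 = plastic

# Classify on the SWIR/VIS reflectance ratio: plastic stays bright at 1450 nm.
ratio = pixels[:, 2] / pixels[:, 0]
pred = (ratio > 0.55).astype(int)
accuracy = (pred == labels).mean()
print(f"separation accuracy: {accuracy:.2%}")
```

A real system would calibrate against measured spectra and may use more than two bands, but the ratio-threshold step captures why the camera sees what the eye cannot.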

 

A VIS-NIR AI imaging set-up showed why Excelitas has steered away from smart cameras.

 

Dr Gerhard Holst, senior imaging product and application scientist at Excelitas, says, ‘We’ve found that having a built-in FPGA is not the best approach for many applications, because it creates more problems than it solves. Every time you load a program for the FPGA, you are changing the precise timing sequences and, as a result, you cannot guarantee that the image quality remains. Therefore, we believe that a camera should do what it does best - deliver the best possible images at top speed - and we then connect directly to popular small computers, like the Jetson boards from Nvidia, to process the images.’

 

Another demo, featuring flying alphabet noodles in a Perspex container, showed high-speed AI imaging. The PCO.dimax 3.6 ST high-speed 3.6MPxl streaming camera does not contain any image memory; instead, images are fed directly to the PC, which identifies the letters captured in any given frame. This demonstration was set to alert the operator whenever the letters P, C and O were captured in the same frame, and was complemented with Excelitas’ award-winning, high-resolution Linos d.fine HR-M lens for optimal uniform image quality. This set-up is suited to any high-speed application that requires a fast response to the image intelligence.
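The per-frame alert logic of that demo reduces to a set-membership check: recognise the letters in each streamed frame and fire when the full target set appears together. A hedged sketch, with the character-recognition step stubbed out as simulated detections:

```python
# Illustrative frame-by-frame alert logic (a real system would run
# character recognition on each streamed frame; detections are simulated).
TARGET = {"P", "C", "O"}

def frames_with_target(frame_letters):
    """Return indices of frames containing all target letters at once."""
    return [i for i, letters in enumerate(frame_letters)
            if TARGET <= set(letters)]

stream = ["AXPQ", "PCO", "OCB", "CPOZ", "PC"]   # simulated letters per frame
alerts = frames_with_target(stream)
print(alerts)   # -> [1, 3]: only those frames show P, C and O together
```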

 

The final camera demonstration showed AR/VR micro-LED inspection, featuring a widefield industrial microscope with a closed-loop integrated auto-focus. The sensor works by laser triangulation, with the closed-loop auto-focus continually providing feedback to the actuator to keep the object - in this case a micro-LED - always in focus. It is coupled with a PCO.panda 26, a 26MPxl camera, and an Excelitas X-Cite high-power LED light source. Again, the use case is quality control and defect detection, with micro-LEDs a particularly good example, as the resolution is high enough to detect single LEDs in an array. The lens ensures a wide field of view and super-high resolution of approximately 1µm.
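The closed-loop part is a classic feedback controller: the triangulation sensor measures the remaining defocus each cycle and the actuator corrects a fraction of it, driving the error towards zero. A minimal proportional-control sketch, with gain, cycle count and tolerance values invented for illustration:

```python
# Illustrative closed-loop auto-focus (not the demo's actual controller):
# each cycle the sensor reads the residual defocus and a proportional
# correction is applied. Units are µm; all parameters are assumptions.
def autofocus(error_um, gain=0.5, cycles=20, tol=0.05):
    """Iteratively reduce the measured focus error towards zero."""
    position = 0.0
    for _ in range(cycles):
        measurement = error_um - position     # sensor: remaining defocus
        if abs(measurement) < tol:            # within ±0.05 µm: in focus
            break
        position += gain * measurement        # actuator correction step
    return error_um - position                # residual defocus

residual = autofocus(12.0)                    # object starts 12 µm out of focus
print(f"residual defocus: {residual:.3f} um")
```

Because the loop runs continuously, the object stays in focus even as it drifts - each new drift simply becomes a fresh error for the controller to null.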

 

Also on display was a range of lenses, including the Linos d.fine HR-M lens, designed to deliver very high resolution on large sensors. The 80mm lens covers an image circle of up to 60mm and has an aperture of 2.8, making it a suitable lens to partner with Sony’s newest high-end sensors, such as the IMX811 and IMX411.

 

The brand-new Linos inspec.x L 5.6/106 VIS-NIR lens is colour-corrected from the visible to the NIR - 400 to 1,150nm - without any focus shift. It enables the use of any wavelength in this broad range without refocusing. The demonstration displayed blue, green and red lights, with the images always in focus. More and more inspection tasks require both visible and NIR light to highlight different features, so this single lens can fulfil both requirements without the need to swap out for a different lens.

 

The final demo was a real-time solid-state lidar point cloud demonstrator, using an array of high-power 905nm pulsed laser diodes and a single-photon avalanche diode (SPAD) detector array. The output is a 3D point cloud that is colour-coded to show the distance to each individual point. A typical set-up like this is used for range-finding within automated guided vehicles in factories, or for driver assistance in autonomous cars.
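At the heart of any pulsed lidar is the time-of-flight calculation: each SPAD pixel timestamps the returning laser echo, and the one-way range follows from d = c·t/2. A small sketch of that step, with the echo times invented for illustration:

```python
import numpy as np

# Illustrative pulsed-lidar ranging (not the demonstrator's firmware):
# round-trip echo times per detector pixel are converted to distances,
# then normalised for the colour-coded point-cloud display.
C = 299_792_458.0                         # speed of light, m/s

def range_from_tof(t_ns):
    """Round-trip time (ns) to one-way distance (m): d = c*t/2."""
    return C * (t_ns * 1e-9) / 2.0

# A simulated 3x3 patch of echo times (ns) from the SPAD array...
tof = np.array([[66.7, 66.7, 80.0],
                [66.7, 53.4, 80.0],
                [93.4, 93.4, 93.4]])
dist = range_from_tof(tof)                # distances in metres (~8-14 m here)

# ...normalised 0-1; in the demo this value would index a near-to-far
# colour map for the displayed point cloud.
norm = (dist - dist.min()) / (dist.max() - dist.min())
print(np.round(dist, 1))
```

A 66.7ns round trip corresponds to roughly 10m of range, which sets the nanosecond-scale timing precision these detector arrays need.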
