Greg Blackman reports from Image Sensors Europe in London, where Sony’s dominance of the image sensor market, along with sensing for automotive and virtual reality, was discussed
‘Is Sony the mobile king forever?’ asked Vladimir Koifman, chief technology officer at Analog Value. He was speaking at Image Sensors Europe, a conference held in London on 15 and 16 March covering all aspects of image sensor development and use.
Sony holds around 35-40 per cent of the global image sensor market, according to Koifman, and he posed the question with a view to discussing how the sensor market might change in the future.
In terms of new technology, Koifman said that Sony is open to talks with start-ups and will allow them to use the top layer of its stacked CMOS architecture, leaving the start-up to develop its own logic layer, although he didn’t specify what sort of volumes would be required to take advantage of this. The big question then, said Koifman, is whether there is a place in the market for such a start-up offering a better logic layer.
Koifman added that it’s hard to beat Sony on price, so another option is to develop something slightly different, such as a time-of-flight sensor.
Machine vision makes up a tiny portion of the global image sensor market, so it relies to a certain extent on developments made in CMOS sensors for mobile phones and consumer devices. At the same time, Sony is one of the main suppliers, if not the major one, of CMOS image sensors for industrial cameras, with its Pregius global shutter CMOS sensors.
The danger of being such a small part of the market is that Sony could decide to cease fabrication of a certain product line, as it did for its CCD sensors, a shift in strategy that heavily impacted the machine vision sector. Big changes like that, driven by the fast-moving world of consumer sensor technology, can be at odds with industrial cameras, which have to be supported for lifetimes of 10 years or more.
When trying to answer the question of whether Sony will continue to dominate the mobile image sensor market, Koifman said that knowledge of Sony’s technology is slowly being disseminated as employees leave the company, and that there are huge price pressures in China, where Sony might therefore concede ground to competitors.
Virtual world
Automotive and virtual reality (VR), both areas that could see huge demands for image sensors in the future, were also covered during the conference. Chiao Liu, lead sensor architect at Oculus VR Research, gave a list of image sensor requirements for virtual reality devices, which included: 2-3µm pixels, global shutter, low light sensitivity, wide dynamic range, high frame rate, monolithic RGB and depth sensor for camera consolidation, and low power consumption, among other factors.
Liu said that, in 10 to 15 years, billions of VR devices are expected to be produced, which equates to a lot of image sensors.
Virtual reality uses imaging for tasks like face tracking, which operates at 90-100fps, and gaze tracking, which requires faster sensors running at 240fps or higher, according to Liu, as well as high sensitivity in the near infrared. Eye tracking is needed to minimise latency in the VR experience.
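Those frame rates set the time budget a sensor and its processing pipeline have for each frame. As a rough illustration, here is a minimal sketch in Python using Liu’s quoted rates (the arithmetic itself was not presented at the conference):

```python
# Per-frame time budget implied by the frame rates Liu quoted.
# The fps figures come from the talk; the calculation is illustrative.

def frame_budget_ms(fps: float) -> float:
    """Time available to capture, read out and process one frame, in ms."""
    return 1000.0 / fps

for task, fps in [("face tracking", 90), ("gaze tracking", 240)]:
    print(f"{task}: {fps} fps -> {frame_budget_ms(fps):.2f} ms per frame")
# face tracking: 90 fps -> 11.11 ms per frame
# gaze tracking: 240 fps -> 4.17 ms per frame
```

At 240fps, capture, readout and tracking all have to fit in roughly four milliseconds per frame, which is why gaze tracking drives the sensor speed requirement.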
Filip Geuens, CEO of Xenomatix, a Belgian firm manufacturing CMOS-based solid-state lidar, gave a presentation asking whether lidar will make cameras redundant for autonomous vehicles. Geuens argued that lidar will become increasingly important for autonomous driving, because driving is a 3D activity and therefore requires 3D sensors, but added that cameras and lidar will most likely work together on self-driving cars, as each has its strengths.
Xenomatix’s technology is a multi-beam lidar system, sending out thousands of laser pulses in parallel and detecting the returning light with a CMOS sensor. It offers a 200-metre range and operates both in sunlight and at night. Geuens played videos of the system being used for active suspension, sensing what’s coming up on the road in front of the vehicle and allowing the suspension to adapt accordingly. Lidar gives a direct measurement of features like speed bumps and potholes in the road, which a camera can’t do.
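Geuens didn’t go into the ranging maths, but the pulsed time-of-flight principle a lidar like this rests on is simple: distance is the speed of light multiplied by half the pulse’s round-trip time. A minimal sketch, taking only the 200-metre range figure from the talk (everything else is illustrative, not Xenomatix’s implementation):

```python
# Generic pulsed time-of-flight arithmetic (the principle behind lidar
# ranging, not Xenomatix's actual processing).

C = 299_792_458.0  # speed of light, in m/s

def distance_m(round_trip_s: float) -> float:
    """Target distance from the measured pulse round-trip time."""
    return C * round_trip_s / 2.0

def round_trip_s(dist_m: float) -> float:
    """Round-trip time of a pulse returning from a target dist_m away."""
    return 2.0 * dist_m / C

# The 200 m range quoted in the talk implies a ~1.33 microsecond round
# trip, so the detector has to resolve timing on that scale for every beam.
print(f"{round_trip_s(200.0) * 1e6:.2f} us")  # 1.33 us
print(f"{distance_m(1.33e-6):.1f} m")         # 199.4 m
```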
Geuens showed Xenomatix’s lidar operating through water and along a snowy track. In oily conditions the system can still see the road, but its range is reduced, Geuens said. He added that the 860nm laser penetrates fog better than visible imaging does, though not as well as radar. He also noted that, because the system works with multiple beams and creates a reference pattern, it won’t be confused by light from another autonomous vehicle’s lidar.
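He didn’t describe the matching algorithm, but the general idea of pattern-based interference rejection can be sketched: only detections that line up with positions where the sensor itself projected a beam are accepted. The grid, tolerance and function below are illustrative assumptions, not Xenomatix’s implementation:

```python
# Hedged sketch of pattern-based interference rejection: detections that
# don't match the sensor's own projected beam pattern are discarded as
# foreign light (e.g. another vehicle's lidar).

EXPECTED_SPOTS = {(x, y) for x in range(0, 100, 10) for y in range(0, 100, 10)}
TOLERANCE = 1  # pixels; assumed value

def is_own_return(spot: tuple[int, int]) -> bool:
    """Accept a detected spot only if it falls near a position where
    this sensor actually projected a beam."""
    x, y = spot
    return any(abs(x - ex) <= TOLERANCE and abs(y - ey) <= TOLERANCE
               for ex, ey in EXPECTED_SPOTS)

detections = [(20, 30), (37, 52), (50, 70)]  # mix of own and foreign returns
print([d for d in detections if is_own_return(d)])  # [(20, 30), (50, 70)]
```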