Computer vision technology from Zensors, a firm spun out of Carnegie Mellon University, is being trialled in Pittsburgh, New York City and Mumbai to help enforce social distancing.
Zensors' algorithms analyse feeds from existing CCTV cameras to provide real-time data on the number of people in an area and whether safe distances are maintained between them.
The company says its platform can answer questions such as: how many people are in a facility or public space; whether people are following social-distancing policies; when and where cleaning staff should clean, based on the surfaces people have been using; and how many people passing through a scene are wearing face masks.
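Zensors has not published the details of its models, but the kind of real-time occupancy and distancing measure it describes can be sketched in a few lines of Python. The sketch below assumes person bounding boxes are already supplied by an off-the-shelf detector and that a simple pixels-per-metre calibration is available (in practice this would come from camera calibration or a ground-plane homography); the function and parameter names are illustrative, not Zensors' API.

```python
# Illustrative sketch only: given per-frame person detections (bounding boxes
# in pixels), count people and flag any pair standing closer than a threshold.
from itertools import combinations
from math import hypot

def analyse_frame(boxes, pixels_per_metre, min_distance_m=2.0):
    """boxes: list of (x1, y1, x2, y2) person bounding boxes in pixels."""
    # Approximate each person by the midpoint of the bottom edge of their box,
    # i.e. roughly where they stand on the ground plane.
    feet = [((x1 + x2) / 2, y2) for x1, y1, x2, y2 in boxes]

    violations = []
    for (i, a), (j, b) in combinations(enumerate(feet), 2):
        distance_m = hypot(a[0] - b[0], a[1] - b[1]) / pixels_per_metre
        if distance_m < min_distance_m:
            violations.append((i, j, round(distance_m, 2)))

    return {"people_count": len(boxes), "too_close_pairs": violations}

# Example: three detections, with an assumed calibration of 50 px per metre.
print(analyse_frame([(100, 50, 160, 300), (180, 60, 240, 310), (600, 40, 660, 290)], 50))
```

A real deployment would need per-camera calibration and temporal smoothing, but the principle of counting detections and checking pairwise distances against a threshold is the same.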
In a blog post, Zensors said it is providing its platform for free until 1 June to help fight the Covid-19 pandemic. It is also inviting machine learning researchers and experts to collaborate on using the platform.
In the post, the company said: 'We have an open API and can enable anyone with technical knowledge or unique data sources to build novel systems and solutions. We invite machine learning collaborators to join our team of Carnegie Mellon University computer scientists in designing more robust and helpful deep learning models during this unprecedented crisis.'
The company is also interested in working with geographic information system experts who can help integrate Zensors' data onto ESRI and other mapping platforms.
The technology was developed by the Future Interfaces Group, led by Chris Harrison, an assistant professor in Carnegie Mellon University's Human-Computer Interaction Institute (HCII).
In a Carnegie Mellon University article, Anuraag Jain, an HCII alumnus and inventor of the Zensors technology, said airports and other potential clients contacted the company as the Covid-19 threat grew, seeking assistance in using computer vision to help them manage their facilities.
Other companies are employing computer vision in similar ways. Swisstraffic has modified its traffic- and people-monitoring system to analyse places where it is difficult to maintain a 2m distance. Its platform uses existing cameras and infrastructure to provide video analysis, and can create a heat map of an area showing where people are getting closer than the recommended two metres.
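Swisstraffic has not published its analytics pipeline, but the heat-map idea can be illustrated with a short sketch: assuming the pixel locations of too-close encounters are already available (for example from pairwise distance checks like the earlier sketch), they are binned into a coarse grid so that persistent hotspots stand out over time. The grid dimensions, cell size and coordinates below are assumptions chosen purely for illustration.

```python
# Illustrative sketch, not Swisstraffic's actual system: accumulate the
# locations of too-close encounters into grid cells to build a heat map.
import numpy as np

def update_heatmap(heatmap, violation_points, cell_px=40):
    """violation_points: (x, y) pixel locations where a distance breach occurred."""
    for x, y in violation_points:
        row = min(int(y // cell_px), heatmap.shape[0] - 1)
        col = min(int(x // cell_px), heatmap.shape[1] - 1)
        heatmap[row, col] += 1
    return heatmap

# Example: a 480x640 frame divided into 40 px cells, with three encounters
# observed over time.
heatmap = np.zeros((12, 16))
update_heatmap(heatmap, [(130, 300), (210, 310), (150, 290)])
print(heatmap.max(), int(heatmap.sum()))  # hottest cell count and total encounters
```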
Irida Labs has taken a similar approach with its neural network, which can run on edge devices and also blurs the people in the video to keep the footage anonymous.
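Irida Labs has not released its code; the following is only a minimal sketch of the anonymisation step it describes, assuming person bounding boxes are already supplied by an on-device detector and using OpenCV's standard Gaussian blur. The function name anonymise and the example coordinates are placeholders, not the company's implementation.

```python
# Minimal sketch: blur detected person regions in a frame before the video
# leaves the edge device, so that individuals cannot be identified downstream.
import cv2
import numpy as np

def anonymise(frame, person_boxes, kernel=(51, 51)):
    """frame: BGR image as a numpy array; person_boxes: (x1, y1, x2, y2) in pixels."""
    out = frame.copy()
    for x1, y1, x2, y2 in person_boxes:
        roi = out[y1:y2, x1:x2]
        if roi.size:  # skip empty or out-of-frame boxes
            out[y1:y2, x1:x2] = cv2.GaussianBlur(roi, kernel, 0)
    return out

# Example on a synthetic frame with one assumed detection.
frame = np.zeros((480, 640, 3), dtype=np.uint8)
blurred = anonymise(frame, [(100, 50, 200, 300)])
```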