Pairing thermal sensor data with a visible spectrum camera for non-contact body temperature screening.
While vaccines are rolling out at an impressive pace, and as society slowly reopens, our best defense against the coronavirus continues to be early detection and rapid response (such as self-isolation).
An early symptom of infection is an elevated body temperature, which can easily be measured using contactless methods such as thermal sensors or cameras sensitive to IR radiation.
However, general-purpose cameras still have a role to play in augmenting the thermal data and putting it into better context.
Imagine two cameras – one IR and one standard – observing the entrance to a place of work or indoor public venue. If the image captured by the standard camera feeds a system with face detection software, then the thermal image can be made more meaningful: yes, that heat source is a human face.
Paired thermal and standard optical images can be at the heart of a largely automated human temperature screening process for detecting people infected with COVID-19.
As a supplier of Zynq MPSoC-based embedded prototyping boards, Aldec has risen to the challenge with our brand-new AI-based thermal vision camera demo application.
It leverages the I/O expansion flexibility of our TySOM-3-ZU7EV board, which supports several different sensor interfaces, as well as high-performance UltraScale+ FPGA fabric that has been proven in convolutional neural network (CNN) acceleration and sensor fusion tasks at the edge.
The demo application runtime is shown in figure 1.
Fig. 1: Demo application runtime.
The main idea of the demo application is to create a visual representation of the IR sensor data, locate the position of the human face in the standard camera image stream using a CNN-based face detection algorithm, and calculate body temperature. For the best visual effect, both video streams are merged.
The imagers used in the project are:
- a Melexis MLX90640 IR thermal sensor array (32×24 pixels); and
- a BlueEagle camera (1280×720 native resolution).
A top-level view of the project is shown in figure 2.
Fig. 2: Demo application overview.
The software flow is implemented using the GStreamer Linux media framework – see figure 3.
Standard framework plugins are marked in orange, and custom software such as mlx-grabber and sdxfacedetect is marked in blue. The entire software flow is split into three separate GStreamer pipelines (IR flow, BlueEagle camera flow and legend), each producing an independent 24-bit BGR video data stream; these are then merged into a common HDMI output image using the Video Mixer.
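To give a feel for how the three flows divide the work, the pipelines might be sketched as gst-launch-style description strings like the ones below. The custom element names (mlxgrabber, sdxfacedetect), the kmssink sink, the legend source file and the exact caps are illustrative assumptions, not the demo's actual pipeline definitions:

```python
# Sketch of the three GStreamer pipeline descriptions, as strings one might
# hand to gst-launch-1.0 or Gst.parse_launch(). Element names and caps are
# assumptions for illustration only.

CAPS = "video/x-raw,format=BGR,width=1280,height=720"

def ir_pipeline(plane: int) -> str:
    # IR flow: MLX90640 heatmap, scaled to 720p, onto one mixer plane
    return f"mlxgrabber ! videoscale ! {CAPS} ! kmssink plane-id={plane}"

def camera_pipeline(plane: int) -> str:
    # Camera flow: BlueEagle frames run through CNN face detection
    return (f"v4l2src device=/dev/video0 ! {CAPS} ! "
            f"sdxfacedetect ! kmssink plane-id={plane}")

def legend_pipeline(plane: int) -> str:
    # Legend flow: a static temperature-scale image
    return (f"filesrc location=legend.bmp ! decodebin ! videoconvert ! "
            f"{CAPS} ! kmssink plane-id={plane}")
```

Each pipeline targets its own overlay plane of the Video Mixer (e.g. planes 29, 30 and 31), so the three BGR streams stay independent until the mixer composes them.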
The HDMI Video Mixer is a hard IP block commonly used in video output interfaces to combine several video data streams into one, which is then passed to a video output device. It is configured for four separate overlay BGR layers (planes 29–32), three of which are used to compose the final video stream for the HDMI monitor.
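As a minimal software model of what the mixer does in hardware, each overlay plane can be thought of as a layer stacked bottom to top, with the topmost opaque pixel winning at every position. The None-as-transparent convention below is a simplification for illustration, not how the IP block represents transparency:

```python
def composite(layers):
    """Merge equally sized 2D grids of BGR tuples (or None = transparent),
    ordered bottom to top, into one output frame - a toy model of the
    Video Mixer's plane composition."""
    h, w = len(layers[0]), len(layers[0][0])
    out = [[(0, 0, 0)] * w for _ in range(h)]
    for layer in layers:                     # bottom to top
        for y in range(h):
            for x in range(w):
                if layer[y][x] is not None:  # opaque pixel overrides
                    out[y][x] = layer[y][x]
    return out
```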
The IR flow is based mainly on the mlx-grabber application, which reads the 32×24 array data from the MLX90640 IR sensor via the I2C bus, calculates absolute temperature values for all 768 array elements, and then creates a heatmap in which temperature is represented by a blue/green/red color gradient.
The temperature sensitivity range for the heatmap is set to 28 to 38 °C, so normal human body temperature is shown in green, switching to red to indicate a high temperature. The background will be blue in most cases.
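A mapping along these lines can be sketched as a simple blue→green→red gradient clamped to the 28–38 °C window; the exact gradient mlx-grabber uses may differ:

```python
LO, HI = 28.0, 38.0  # heatmap sensitivity range in degrees C

def temp_to_bgr(t):
    """Map a temperature to a BGR color: blue at/below 28 C, green
    mid-range, red at/above 38 C (linear blue->green->red gradient)."""
    x = min(max((t - LO) / (HI - LO), 0.0), 1.0)
    if x < 0.5:   # blue -> green over the lower half of the range
        return (round(255 * (1 - 2 * x)), round(255 * 2 * x), 0)
    else:         # green -> red over the upper half
        return (0, round(255 * (2 - 2 * x)), round(255 * (2 * x - 1)))

def heatmap(values):
    """Reshape 768 row-major readings into a 24x32 grid of BGR pixels."""
    return [[temp_to_bgr(values[r * 32 + c]) for c in range(32)]
            for r in range(24)]
```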
The resulting heatmap image stream is scaled to 1280×720 to match the BlueEagle camera's native pixel resolution.
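Since 1280/32 = 40 and 720/24 = 30, the upscale is an exact integer replication of each IR cell. A nearest-neighbour sketch (not necessarily the scaler the demo actually uses, which may interpolate for a smoother image):

```python
def upscale(grid, sx=40, sy=30):
    """Nearest-neighbour upscale: replicate each cell of a 2D grid
    sx times horizontally and sy times vertically, so a 32x24 IR
    frame becomes 1280x720."""
    return [[grid[y // sy][x // sx] for x in range(len(grid[0]) * sx)]
            for y in range(len(grid) * sy)]
```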
The camera flow performs face detection CNN model inference on the BlueEagle camera input image, locates the position of the human face, and draws bounding boxes that share the color palette of the IR flow heatmap. The CNN face detection algorithm is accelerated in the PL part of the Zynq MPSoC device using the Deep Learning Processor Unit (DPU) IP and the Xilinx DNNDK inference toolkit.
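One plausible way to derive a body temperature from the two streams – assuming the camera and IR views are aligned, which the demo's calibration may handle differently – is to map the detected face's bounding box onto the 32×24 IR grid and take the hottest reading inside it:

```python
def face_temperature(ir, box, frame_w=1280, frame_h=720):
    """Map a face bounding box (x, y, w, h) in camera pixels onto the
    IR temperature grid (24 rows x 32 cols) and return the hottest
    reading inside it - a sketch, assuming the two views are aligned."""
    rows, cols = len(ir), len(ir[0])
    x, y, w, h = box
    c0 = x * cols // frame_w
    c1 = min(cols, (x + w) * cols // frame_w + 1)
    r0 = y * rows // frame_h
    r1 = min(rows, (y + h) * rows // frame_h + 1)
    return max(ir[r][c] for r in range(r0, r1) for c in range(c0, c1))
```

Taking the maximum rather than the mean keeps the estimate robust when the box also covers cooler background cells around the face.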
IR sensors are extremely useful for non-contact human body temperature measurement in our fight against COVID-19, even as occasional, more vaccine-resistant variants emerge. Moreover, paired optical and thermal imaging has a major role to play in largely automated early-detection screening systems against all viruses (past, present and future) for which an early symptom is a raised body temperature.
In our demo we illustrated how to combine thermal sensor data with a visible spectrum camera image and how AI-based computer vision algorithms can automate and significantly improve the overall temperature screening process.
It was also great to showcase the power of our compact TySOM-3-ZU7EV prototyping board with its Zynq UltraScale+ MPSoC device, which provides 64-bit processor scalability while combining real-time control with soft and hard engines for graphics, video, waveform, and packet processing.