Visibility Measurement for ADAS and Autonomous Vehicles
Advanced Driver Assistance Systems (ADAS) and systems for partial or total delegation of vehicle control will integrate more and more cameras. These cameras capture video, and the images serve as inputs for obstacle detection algorithms, road detection algorithms, pedestrian detection systems, …
However, a camera can “see” only under certain conditions, and the algorithms that exploit the images need a certain level of image quality. Some algorithms may test by themselves whether image quality is sufficient, but in the general case they do not, and it is then prudent to have a qualification system that is independent of the detection systems.
The company NEXYAD has worked for years on atmospheric visibility measurement for military applications, and has developed predictive models of the ability of a human to detect objects. This work can easily be adapted to pass from predicting the performance of human vision to predicting the performance of a machine vision system.
The models consist in comparing the contrast available in the scene with the contrast required for detection and/or pattern recognition.
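The idea of comparing available contrast with required contrast can be sketched as follows. This is an illustration only, not NEXYAD’s actual model: the Weber contrast formula is a standard choice, and the threshold value used here is an arbitrary placeholder.

```python
# Illustrative sketch: predict detectability by comparing the contrast
# available in the scene with a required contrast threshold.
# The threshold below is a hypothetical placeholder, not a NEXYAD value.
REQUIRED_CONTRAST = 0.05  # hypothetical minimum Weber contrast for detection

def weber_contrast(target_luminance: float, background_luminance: float) -> float:
    """Weber contrast of a target against its background (luminances in cd/m^2)."""
    return abs(target_luminance - background_luminance) / background_luminance

def is_detectable(target_luminance: float, background_luminance: float,
                  required_contrast: float = REQUIRED_CONTRAST) -> bool:
    """Predict detection: available contrast must reach the required contrast."""
    return weber_contrast(target_luminance, background_luminance) >= required_contrast

# Example: a dark obstacle (30 cd/m^2) on a lit road surface (100 cd/m^2)
print(is_detectable(30.0, 100.0))   # contrast 0.7, well above the threshold
print(is_detectable(99.0, 100.0))   # contrast 0.01, below the threshold
```

In a real system the required contrast would itself depend on object size, gray-level resolution, and the detection task, which is precisely the compromise discussed below.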
Such a system requires that a compromise be respected among several characteristics of the image:
. the number of different gray levels (for a digital camera, this depends on the bit depth)
. the size of the objects to be detected
. the contrast of the objects against their background
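The three characteristics above can all be measured directly on a digital image. Below is a minimal sketch, assuming a grayscale image and a boolean object mask (the function name and the simple Weber-style contrast are illustrative choices, not a NEXYAD API):

```python
import numpy as np

def image_characteristics(img: np.ndarray, mask: np.ndarray):
    """For a grayscale image and a boolean object mask, compute:
    - the number of distinct gray levels actually present,
    - the object size in pixels,
    - the Weber contrast of the object against its background."""
    gray_levels = len(np.unique(img))
    object_size = int(mask.sum())
    obj_mean = float(img[mask].mean())
    bg_mean = float(img[~mask].mean())
    contrast = abs(obj_mean - bg_mean) / max(bg_mean, 1e-9)  # guard: black background
    return gray_levels, object_size, contrast

# Example: a "star in the night sky" image -- one bright pixel on black.
sky = np.zeros((8, 8), dtype=np.uint8)
sky[4, 4] = 255
levels, size, contrast = image_characteristics(sky, sky > 0)
print(levels, size, contrast)  # two gray levels, tiny object, huge contrast
```

This toy example reproduces the star-detection case discussed next: very few gray levels, a very small object, and an enormous contrast.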
Note for automotive engineers: a performance specification for a camera-based detection system that does not state the minimum contrast, the maximum number of pixels, the number of bits, … does NOT make sense. It is important to know this in order to build applications that work, and applications that know when they work.
For instance, we are all able to detect stars in a dark night sky: the objects are very small, the number of gray levels is very low (pure black and pure white), and the contrast of the objects against the background is huge.
Similarly, we are able to distinguish clouds against a gray sky: the objects are very large, there is no detail even on the edges (no high frequencies / contours), and the number of different gray levels is very large (a gradual gray scale from black to white).
Between these two extremes lie all possible cases, in particular all the traffic scenes, which may vary greatly from one to another:
. sunny day, overcast day, dark night, undergrowth, sunset, night in headlights, fog, rain, etc.
In addition to these technical compromises, there are criteria (e.g. the Johnson criteria) that make it possible to objectify the subjective.
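The Johnson criteria relate the number of resolvable cycles across a target’s critical dimension to the probability of performing a perception task (detection, orientation, recognition, identification). The sketch below uses the classical 50%-probability cycle counts from Johnson’s work; the camera parameters in the example are hypothetical, and the simple pinhole projection with two pixels per cycle is an assumption of this illustration:

```python
# Classical Johnson criteria: resolvable cycles across the target's
# critical dimension needed for a 50% probability of each task.
JOHNSON_CYCLES = {
    "detection": 1.0,
    "orientation": 1.4,
    "recognition": 4.0,
    "identification": 6.4,
}

def cycles_on_target(target_size_m: float, range_m: float,
                     focal_length_m: float, pixel_pitch_m: float) -> float:
    """Cycles resolved across a target by a camera, assuming pinhole
    projection (image size = target_size * f / range) and 2 pixels per cycle."""
    pixels_on_target = (target_size_m * focal_length_m / range_m) / pixel_pitch_m
    return pixels_on_target / 2.0

def achievable_tasks(cycles: float) -> list:
    """Tasks whose 50%-probability cycle requirement is met."""
    return [task for task, needed in JOHNSON_CYCLES.items() if cycles >= needed]

# Hypothetical example: a 1.8 m pedestrian at 50 m, seen through an
# 8 mm lens on a sensor with 4.2 um pixels.
c = cycles_on_target(1.8, 50.0, 0.008, 4.2e-6)
print(round(c, 1), achievable_tasks(c))
```

Such a computation shows how an objective geometric quantity (cycles on target) can be mapped to a subjective-sounding question such as “can the system recognize a pedestrian at this range?”.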
NEXYAD has developed a tool called VisiNex that integrates the models and criteria described above, which led to two products:
. VisiNex Lab: a test bench for visibility measurement. A vehicle is placed under calibrated visibility disturbances (rain machine, fog machine, …), and VisiNex Lab measures the evolution of the available visibility during the disturbance and during the activation of visibility restoration systems (lighting, demisting, wiping, …).
VisiNex Lab is used to tune rain sensors, wiper systems, and lighting systems. VisiNex is a world leader for this type of use: https://nexyad.net/Automotive-Transportation/?page_id=159
. VisiNex Onboard: NEXYAD ported its model into onboard applications in order to measure and qualify road visibility along the route while driving (an important quantity to qualify for road safety applications).
VisiNex Onboard is currently being integrated into RT-MAPS, a framework for developing asynchronous real-time applications, and will soon be part of the NEXYAD vision module pack for ADAS and driving delegation applications.
Figure: standard visibility on a highway scene; degraded visibility when approaching a tunnel.
VisiNex Onboard can be used in automotive applications on the following topics:
. visibility measurement to control visibility restoration systems (wipers, lighting, …)
. qualification of the visibility conditions under which an obstacle detection or road detection system will work properly.
The second point is important because road safety applications require maximizing the reliability of vision systems.
To know more: email@example.com