RoadNex v2.2 Road with markings on both sides

RoadNex v2.2 Road with markings on both sides.
RoadNex detecting the lane and the surface of the road, with strong sunlight shining into the camera at the front of the car…

Nexyad provides modules for ADAS (Advanced Driver Assistance Systems) : some of these modules, such as RoadNex (road detection) or ObstaNex (obstacle detection), compete with the well-known modules from Mobileye.

RoadNex v2.2 Desert Track

RoadNex v2.2 Desert track
RoadNex detecting the lane on a desert track : sand surface, no markings, stones, etc…
Color contrast is very poor, yet detection remains accurate.

Nexyad provides modules for ADAS (Advanced Driver Assistance Systems) : some of these modules, such as RoadNex (road detection) or ObstaNex (obstacle detection), compete with the well-known modules from Mobileye.

USING NEXYAD ADAS MODULES FOR AUTONOMOUS VEHICLE AND SAFETY/RISK ESTIMATION

by NEXYAD


INTRODUCTION

The company NEXYAD has developed software modules for Advanced Driver Assistance Systems :
. RoadNex (road detection) : lane detection, detection of the borderlines of the drivable area in the lane, and detection of the surface of the drivable area in the lane.
Sensor : color camera

. ObstaNex (obstacle detection) : detection of obstacles that have a vertical dimension and/or a movement of their own.
Sensors : camera (black-and-white or color), accelerometers, gyros

. VisiNex Onboard (weather visibility measurement) : visibility measurement (quality and distance).
Sensor : camera

. SafetyNex : onboard road safety / risk estimation.
Sensors : navigation map, GPS, accelerometers or vehicle speed

These modules were designed as building blocks for efficient ADAS.
There are many ways of combining them, depending on the function to be developed.

LANE KEEPING AND AUTOMATIC BRAKING : FOR CAR MANUFACTURERS AND TIER ONE COMPANIES

For this function, modules may be integrated in a rather complex way :
Nexyad Suite 1
Such an application needs to know where it works and where it does not (reliability). VisiNex helps here because it measures weather visibility, and it then becomes possible to know in which contexts artificial vision algorithms are efficient. It is also possible to switch the parameter settings of vision-based algorithms according to the visibility characteristics, in order to expand the range of good performance of the global system (this is robustness).
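As an illustration of this parameter-switching idea, here is a minimal Python sketch. The visibility score range, the thresholds, and the preset values are assumptions made for the example; they are not actual VisiNex outputs or RoadNex settings.

```python
# Hypothetical sketch: selecting a parameter preset for a vision-based
# lane-keeping pipeline according to a weather-visibility score in [0, 1].
# Thresholds and preset values are illustrative assumptions only.
from dataclasses import dataclass
from typing import Optional

@dataclass
class VisionPreset:
    edge_threshold: float      # gradient threshold for border extraction
    min_region_area: int       # minimum blob size kept as "road surface"
    temporal_smoothing: float  # 0..1, higher = more frame-to-frame smoothing

PRESETS = {
    "clear":    VisionPreset(edge_threshold=0.30, min_region_area=400,  temporal_smoothing=0.2),
    "degraded": VisionPreset(edge_threshold=0.18, min_region_area=800,  temporal_smoothing=0.5),
    "poor":     VisionPreset(edge_threshold=0.10, min_region_area=1500, temporal_smoothing=0.8),
}

def select_preset(visibility_score: float) -> Optional[VisionPreset]:
    """Map a visibility score to a preset, or None if the vision-based
    function should be declared unavailable (this is the reliability part)."""
    if visibility_score >= 0.7:
        return PRESETS["clear"]
    if visibility_score >= 0.4:
        return PRESETS["degraded"]
    if visibility_score >= 0.2:
        return PRESETS["poor"]
    return None  # below this, do not trust camera-based lane keeping

if __name__ == "__main__":
    for score in (0.9, 0.5, 0.25, 0.1):
        preset = select_preset(score)
        print(score, preset if preset else "lane keeping declared unavailable")
```

Returning None rather than a "best effort" preset is what turns the visibility measurement into a reliability statement: the global system knows when it should not be trusted.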

NEXYAD applies a validation methodology called AGENDA (see the papers presented at CESA Automotive 2014 in Paris and at SAFETYWEEK 2015 in Aschaffenburg). This methodology is the only approach that makes it possible to know what the system is supposed to do from a functional point of view, with measurable characteristics of road scenes.
NEXYAD of course uses the NEXYAD ADAS validation database : part of this validation database for artificial-vision-based ADAS will soon be available online for free (usable by every researcher and engineer in the world).

Note : the AGENDA methodology also provides a way to measure the similarity between a road scene in the validation database and the current road scene : this is used to estimate a confidence score.
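To make the idea concrete, here is a minimal sketch of a similarity-based confidence score, assuming scenes described by a handful of hypothetical variability factors; the actual AGENDA similarity measure is not reproduced here.

```python
# Illustrative sketch only: confidence of a vision system estimated as the
# similarity between the current road scene and the closest scene present in
# the validation database, both described by the same variability factors.
# Factor names, values, and the similarity measure are assumptions.
FACTORS = ["weather", "brightness", "road_type", "markings", "day_night"]

validation_db = [
    {"weather": "dry", "brightness": "high", "road_type": "highway", "markings": "yes", "day_night": "day"},
    {"weather": "rain", "brightness": "low", "road_type": "country", "markings": "no", "day_night": "day"},
    # ... thousands of labelled life situations in practice
]

def similarity(scene_a, scene_b):
    """Fraction of variability factors on which the two scenes agree."""
    return sum(scene_a[f] == scene_b[f] for f in FACTORS) / len(FACTORS)

def confidence(current_scene, database):
    """Confidence = similarity to the closest validated life situation."""
    return max(similarity(current_scene, s) for s in database)

current = {"weather": "rain", "brightness": "low", "road_type": "country", "markings": "no", "day_night": "night"}
print(f"confidence: {confidence(current, validation_db):.2f}")  # 0.80 with these toy data
```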

SAFETY / RISK ESTIMATION FOR INSURANCE COMPANIES

SafetyNex measures how well the driving is adapted to the characteristics of the road infrastructure.
It then reports a risk if the driver goes too fast when approaching an intersection or a dangerous curve.
Of course, poor visibility should lead the driver to drive more slowly.
In addition, auxiliary inputs can tell SafetyNex whether there are obstacles on the pathway :
Nexyad Suite 2
This scheme is the same as the previous one, but the outputs of RoadNex and ObstaNex are used INSIDE the scheme (they do not appear as outputs of the global scheme).
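To make the principle concrete, here is a toy sketch of a risk indicator driven by the current speed versus an upcoming curve, modulated by visibility and by an auxiliary obstacle flag. The formula, thresholds, and lateral-acceleration limit are assumptions for illustration; this is not the SafetyNex algorithm.

```python
# Toy risk indicator (assumed logic, not SafetyNex): risk grows when the speed
# exceeds a comfortable speed for the next infrastructure event, and is
# tightened by poor visibility or an obstacle reported by auxiliary modules.
import math

def safe_curve_speed(radius_m: float, lateral_acc_limit: float = 2.5) -> float:
    """Comfortable speed (m/s) for a curve of given radius: v = sqrt(a_lat * R)."""
    return math.sqrt(lateral_acc_limit * radius_m)

def risk_indicator(speed_ms: float,
                   next_curve_radius_m: float,
                   visibility_score: float,   # 1.0 = perfect, 0.0 = none
                   obstacle_on_path: bool) -> float:
    """Return a risk value in [0, 1]: 0 = comfortable, 1 = clearly excessive."""
    v_safe = safe_curve_speed(next_curve_radius_m)
    v_safe *= max(visibility_score, 0.3)        # poor visibility lowers the safe speed
    overspeed = max(0.0, speed_ms / v_safe - 1.0)
    risk = min(1.0, overspeed)
    if obstacle_on_path:
        risk = min(1.0, risk + 0.5)             # auxiliary input raises the risk
    return risk

# 90 km/h (25 m/s) approaching a 60 m radius curve with average visibility:
print(risk_indicator(speed_ms=25.0, next_curve_radius_m=60.0,
                     visibility_score=0.6, obstacle_on_path=False))
```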

DEMOS OF NEXYAD MODULES



REFERENCES

Validation of Advanced Driving Assistance Systems by Gérard Yahiaoui & Nicolas Du Lac
CESA Paper by Gérard Yahiaoui & Pierre Da Silva Dias
Road detection for ADAS and autonomous vehicle
Using the NEXYAD road detection (RoadNex) to make obstacles detection more robust
Real Time Onboard Risk Estimation Correlated with Road Accident
Visibility Measurement for ADAS and Autonomous Vehicle

Visibility Measurement for Road Safety

Visibility Measurement for Road Safety by NEXYAD

INTRODUCTION

Visibility is one of the structural elements of road safety. Indeed, sight is the only sense that lets us perceive the future path of the vehicle and therefore act on it : the driver “can see” in front of the vehicle, predicts where the vehicle will go, and can act on the controls (brake, steering wheel, …) in order to control the trajectory.

No other way allows us to anticipate.

If we model the driving task with an automatic control engineering scheme, we notice that vision is involved almost everywhere :

Set Point of Trajectory Modification

Vision plays a critical role in the driving task, and what determines the efficiency of this sense is « visibility ».

Visibility can be affected by many kinds of factors:
. the absence or insufficiency of light (that is why the infrastructure is sometimes illuminated at night, and why vehicles are equipped with lighting)
. rain deposited on the windshield (that is why vehicles are equipped with wipers)
. mist on the windshield (that is why vehicles are equipped with demisting systems)
. humidity, fog or mist suspended in the air in the road scene.

Road infrastructure experts add elements to enhance the visibility of the path :
. lane markings (white lines), reflective elements.

Similarly, automobile experts equip their vehicles with systems enabling them to improve visibility for the driver, but also allowing the vehicle to be more easily seen by other drivers.

We then understand that the measurement of visibility is an important area of potential improvement of road safety via ADAS.

VISIBILITY MEASUREMENT

The founders of NEXYAD have been working on the measurement of visibility since the 1980s, initially for military applications.
Indeed, it is the military who have studied, since the 1960s, which criteria allow the human visual perception system to detect objects against their clutter.

For the military, the constant search for stealth (camouflage, for example) requires modeling the performance of detection by humans, depending on the lighting of a scene in the visible wavelengths.

This work involved tests on panels of thousands of soldiers, and led to predictive models of the human ability to detect objects, depending on image quality.
NEXYAD is one of the very few companies in the world to hold these models and to have more than 20 years of experience in their implementation.
In simplified terms, we can consider that our eyes and brain need a different contrast level depending on the size of the objects to be detected.
We can then compare the contrast available in a scene (e.g. a road scene) with the contrast needed for detection, for each object size.

The comparison results in two scores (a simplified numerical sketch follows this list) :
. the apparent size of the smallest detectable object : as the apparent size of an object decreases with distance, the maximum detection distance can be deduced for a reference object (a car, a truck, a pedestrian). Distances will obviously differ from one object to another because they do not have the same size. The Johnson criteria also let us estimate the maximum distance for object recognition, and the maximum distance for object identification.
. the ease of interpretation of the visual scene. NEXYAD summarizes this in a score computed from the available and needed contrast : the Visual Quality Score (VQS).
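The following Python sketch illustrates this reasoning with an invented needed-contrast curve and the small-angle approximation for apparent size; the real NEXYAD models are not reproduced here.

```python
# Simplified sketch: compare the contrast available in the scene with a
# hypothetical contrast-versus-apparent-size requirement, deduce the smallest
# detectable apparent size, and convert it into a maximum detection distance
# for a reference object. The threshold curve is an illustrative assumption.
def needed_contrast(apparent_size_mrad: float) -> float:
    """Assumed threshold: smaller objects need more contrast than larger ones."""
    return min(1.0, 0.02 + 0.5 / max(apparent_size_mrad, 1e-6))

def smallest_detectable_size(available_contrast: float) -> float:
    """Smallest apparent size (mrad) whose needed contrast is still available."""
    size = 0.1
    while needed_contrast(size) > available_contrast and size < 1000:
        size += 0.1
    return size

def max_detection_distance(object_width_m: float, available_contrast: float) -> float:
    """Small-angle approximation: apparent size ~ width / distance."""
    size_rad = smallest_detectable_size(available_contrast) / 1000.0
    return object_width_m / size_rad

for contrast in (0.6, 0.2, 0.05):
    d_car = max_detection_distance(1.8, contrast)         # ~car width
    d_pedestrian = max_detection_distance(0.5, contrast)  # ~pedestrian width
    print(f"contrast {contrast:.2f}: car <= {d_car:.0f} m, pedestrian <= {d_pedestrian:.0f} m")
```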

This measure of visibility enables automotive applications to objectify the subjective. NEXYAD has developed two product lines from the same technology :

. a visibility test bench : VisiNex Lab https://nexyad.net/Automotive-Transportation/?page_id=159

VisiNex Lab

Place a vehicle on the test bench and VisiNex Lab measures visibility over time. If visibility is disturbed by rain, for example (using the NEXYAD RainNex rain machine, or another rain machine), the scores show the degraded visibility. If the vehicle's visibility restoration systems are then switched on (in the case where the disturbance is rain : the wipers), the performance of the visibility restoration is measured.
VisiNex Lab is used by the automotive industry and is still the only tool for measuring the performance of wipers, demisting systems, lighting systems, …
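As an illustration of the kind of figure such a bench can produce, here is a minimal sketch that extracts a restoration time from a time series of visibility scores; the data values and the recovery metric are assumptions, not VisiNex Lab outputs.

```python
# Minimal sketch (assumed data and metric): time needed, after a visibility
# restoration system is switched on, to recover a given fraction of the
# initial (dry) visibility level measured on the bench.
def restoration_time(times_s, visibility, restoration_start_s, target_ratio=0.9):
    baseline = visibility[0]
    target = target_ratio * baseline
    for t, v in zip(times_s, visibility):
        if t >= restoration_start_s and v >= target:
            return t - restoration_start_s
    return None  # target level never recovered during the test

times = [0, 5, 10, 15, 17, 20, 25]                   # seconds
vqs   = [0.95, 0.95, 0.40, 0.35, 0.60, 0.88, 0.90]   # rain starts at t=5, wipers at t=15
print(restoration_time(times, vqs, restoration_start_s=15))  # -> 5 (seconds)
```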

. an embedded module for ADAS : VisiNex Onboard https://nexyad.net/Automotive-Transportation/?page_id=438
VisiNex Onboard measures the image quality and predicts the detection power of the driver and of onboard artificial vision modules. We thus get a confidence rating for artificial vision systems.
Again, NEXYAD is the only non-military company to have this technology.

Road Scene

CONCLUSION

Every tier-one company or car manufacturer should use the NEXYAD VisiNex modules in order to measure the performance, robustness, and reliability of their wipers, lighting, and camera-based ADAS.
VisiNex Onboard is currently being implemented in the asynchronous real-time framework RT-MAPS.

For more information : sales@nexyad.net

Visibility Measurement for ADAS and Autonomous Vehicle

Visibility measurement for ADAS and Autonomous Vehicle
By NEXYAD

Advanced Driver Assistance Systems (ADAS) and partial or total driving delegation systems will integrate more and more cameras. These cameras are used to capture video, and the images are inputs for obstacle detection algorithms, road detection algorithms, pedestrian detection systems, …

However, a camera can “see” only under certain conditions, and the algorithms that exploit the images need a certain level of image quality. Some algorithms may test by themselves whether the image quality is good enough, but in the general case they do not, and it is then prudent to have a qualification system that is independent of the detection systems.

The company NEXYAD has worked for years on atmospheric visibility measurement for military applications, and was able to develop predictive models of the human ability to detect objects. This work can easily be adapted to pass from a performance prediction for human vision to a performance prediction for a machine vision system.

The models consist of comparing the contrast in the scene with the contrast required for detection and/or pattern recognition.
Such a system requires that a compromise be respected between several characteristics of the image:
. number of different gray levels (for a digital camera, it depends on the number of bits)
. size of the objects to be detected
. contrast of objects from their background

Note for automotive engineers : a performance specification for a camera-based detection system that does not give the minimum contrast, the maximum number of pixels, the number of bits, … does NOT make any sense. It is important to know this fact in order to make applications that work, and applications that know when they work.
For instance, we are all able to detect stars in a dark night sky : the size of the objects is very small, the number of gray levels is very low (pure black and pure white), and the contrast of the objects against the background is huge.

Contrast

Similarly, we are able to distinguish clouds against a gray sky : the size of the objects is very large, even the edges show no detail (no high frequencies / contours), and the number of different gray levels is very large (gradual gray scale from black to white).

Clouds

Between these two extremes lie all possible cases, and in particular all traffic scenes, which may vary greatly from one to another :
. sunny day, overcast day, dark night, undergrowth, sunset, night under headlights, fog, rain, etc.

Visibility measurement

In addition to this technical compromise, there are criteria (e.g. the Johnson criteria) that make it possible to objectify the subjective.

NEXYAD has developed a tool called VisiNex that integrates the models and criteria described above, which led to two products:

. VisiNex Lab : test bench for visibility measurement. A vehicle is placed under calibrated visibility disturbances (rain machine, fog machine, …), and VisiNex Lab measures the evolution of the available visibility during the disturbance and during activation of the visibility restoration systems (lighting, demisting, wiping, …).
VisiNex Lab is used to tune rain sensors, wiper systems, and lighting systems. VisiNex is a world leader for this type of use : https://nexyad.net/Automotive-Transportation/?page_id=159

VisiNex test bench

. VisiNex Onboard : NEXYAD ported its models to onboard applications in order to qualify road visibility along the route while driving (an important qualification for road safety applications).
VisiNex Onboard is currently being integrated into RT-MAPS, the framework for the development of asynchronous real-time applications, and will soon be part of the NEXYAD vision module pack for ADAS and driving delegation applications.

VisiNex Onboard : standard visibility on a highway scene ; degraded visibility when approaching a tunnel

VisiNex Onboard can be used in automotive applications for the following topics :
. visibility measurement to control visibility restoration systems (wipers, lighting, …)
. qualification of the visibility conditions in which an obstacle detection or road detection system will work properly.

The second point is important because road safety applications require maximizing the reliability of vision systems.

To know more :sales@nexyad.net

Validation Database New
Road Detection & Road Safety
NEXYAD tools for ADAS

NEXYAD Automotive & Transportation Newsletter #4, the 7th of September 2015



Validation database for camera-based ADAS

The company NEXYAD has started building a database for the validation of advanced driver assistance systems (ADAS and autonomous cars) using the AGENDA methodology published in the 1990s by Gérard Yahiaoui (a methodology initially developed for the controlled construction of learning and test databases for the implementation of artificial neural networks).
This database has two essential characteristics:

1) Known life situations
Indeed, the AGENDA methodology proposes to describe the potential variations of the incoming signals and images in terms of variability factors and their crossings.
Example, for obstacle detection :
   . weather (dry overcast, sunny weather, rain, fog)
   . overall brightness (low, medium, high)
   . speed of the carrier vehicle (low, moderate, high)
   . type of road (highway, road with markings, road without markings, …)
   . road surface (bitumen 1, bitumen 2, …, cobblestones)
   . day / night (vehicle headlights and infrastructure lighting switched on)
   . season (spring, summer, autumn, winter)
   . etc …

      > type of obstacle :
           – stopped
                      . infrastructure-related: construction site bollards, toll booths, …
                      . user-related: tire on the road, parcel fallen from a truck and lying on the road, motorcyclist down after a road accident, broken-down vehicle stopped on the roadway, standing pedestrian at the roadside edge (concealed / not concealed)
           – moving
                      . truck, car, vulnerable road user (pedestrian, bicycle, motorcycle), each with trajectory types (longitudinal in the direction of travel, longitudinal in the opposite direction, lateral) and positions (ahead, to the right, to the left)
                      . etc.

We see that if we cross these factors, we very quickly obtain a huge number of cases. However, the development of ADAS is complex, and it is necessary to proceed by successive iterations, starting from simple situations and moving on to complicated ones.
Our database allows this, since all records are described in terms of the crossing of the levels of the variability factors. One thus knows exactly which cases were or were not tested by the system.
The “crossing of the levels of variability factors” formalism allows the use of design of experiments, and in particular of orthogonal fractional designs, to sharply reduce the number of cases to be tested while ensuring maximum coverage of the life situations. In this framework one can, for example, develop an ADAS on one orthogonal fractional design and test it on other orthogonal fractional designs (see the sketch below).
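For illustration, here is a generic design-of-experiments sketch: crossing a few of the factors listed above into a full factorial with itertools, then taking a naive fraction of it. This is not the AGENDA tooling; a true orthogonal fractional design would be built from dedicated tables that guarantee balance, which simple subsampling does not.

```python
# Generic illustration of "crossing the variability factors": full factorial
# of life situations, then a naive fraction to reduce the test effort.
from itertools import product

factors = {
    "weather":    ["dry overcast", "sunny", "rain", "fog"],
    "brightness": ["low", "medium", "high"],
    "speed":      ["low", "moderate", "high"],
    "road_type":  ["highway", "marked road", "unmarked road"],
}

full_factorial = list(product(*factors.values()))
print("full factorial:", len(full_factorial), "life situations")  # 4*3*3*3 = 108

# Naive fraction: keep every 4th combination (a real orthogonal fraction would
# guarantee that every factor level remains equally represented).
fraction = full_factorial[::4]
print("fraction tested:", len(fraction), "life situations")
for situation in fraction[:3]:
    print(dict(zip(factors.keys(), situation)))
```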

2) Ground-truth reference
Obstacles and infrastructure elements (markings, roadsides, etc.) are outlined in the images so as to constitute a reference against which system performance is measured.

. Examples of life situations:
Life Situations


1.1, summer, overcast, unmarked road, moderate speed, tire on the roadway, dry weather
1.2, summer, overcast, unmarked road, moderate speed, parcel on the roadway, dry weather
2.1, summer, overcast, unmarked road, moderate speed, standing pedestrian, not concealed, at the edge of the roadway, dry weather
2.2, summer, overcast, unmarked road, moderate speed, person lying on the roadway, dry weather
etc.

It is not certain that you would encounter even these few cases, with one million kilometers driven on open roads.



Our Goal

NEXYAD is starting its collection of images and data:
      . color video (facing the front of the vehicle)
      . accelerometers
      . gyros

The files are synchronized with the RT-MAPS tool from the company INTEMPORA.
They are saved in RT-MAPS format and can be replayed directly with this tool.

NEXYAD is currently looking for contributors to this internal project. Contributors co-fund the work and in return get free access to the database, with no time limit. These contributions will accelerate the collection and labeling work.
NEXYAD intends to release this database before June 2016: a smaller version will be provided for free to the ADAS and autonomous vehicle community, and the complete database will be available for a fee (as subscriptions).
NEXYAD's ambition is to spread its methodological expertise and allow everyone to assess the performance of vision systems for ADAS, whether the systems were developed by NEXYAD or by others.

References
“Methodology for ADAS Validation: Potential Contribution of Other Scientific Fields Which Have Already Answered the Same Questions”, Gérard Yahiaoui, Pierre Da Silva Dias, CESA congress Dec 2014, Paris, proc. Springer Verlag
“Methods and tools for ADAS validation”, Gérard Yahiaoui, Nicolas du Lac, Safetyweek congress, May 2015, Aschaffenburg


Contact
For questions, or if you wish to become a contributor, please contact NEXYAD : +33 139041360


*****



Road detection for ADAS and autonomous vehicle :
NEXYAD module RoadNex V2.1

A useful complement to markings detection

The detection of the road is a key element of driver assistance systems (ADAS) and autonomous vehicles.
Indeed, objects, obstacles, and other road users must be detected, but also positioned relative to the road.
The detection of the entire road, that is to say not only its markings or edges but the whole drivable surface, should enable the embedded intelligence to select the appropriate action.

The company NEXYAD has been working on this issue for over 20 years without interruption, and has accumulated a large number of cases of road types and road surfaces under various atmospheric conditions.
The goal is to detect the drivable area of the road without relying, as a first step, on lane markings.
Indeed, in Europe there are many unmarked roads, and roadworks on a marked road may change the markings and make a « follow the markings » strategy dangerous.

In the images below you can see, on the left, a typical French countryside road with no markings, and on the right, a road where new markings were applied while the former markings remain strongly visible.
Road without markings / Road with old and new markings

These cases are quite common on our European roads, and a driver assistance system, or a driving delegation system, must at least recognize such cases and, if necessary, tell the driver to cope with them himself.

The NEXYAD road detection module, RoadNex V2.1, is a building block for going further in coping with these cases :
RoadNex V2.1

RoadNex V2.1 should be coupled with road sign detection, road marking detection, and obstacle detection in order to build an intelligent perception system. RoadNex is then a key module of such a system.

The NEXYAD road detection module RoadNex V2.1 is available as a component in the asynchronous real-time framework RT-MAPS : See HERE


*****



Road Safety for ADAS and autonomous vehicle :
NEXYAD module SafetyNex running as real-time component
of Framework RT-Maps

SafetyNex (safety level estimation for ADAS)
SafetyNex Onboard is a high-level functional block (software) for safety measurement, taking into account the map and GPS geolocation (shape of the road, intersections ahead, …), speed, accelerations, visibility, grip, distance to obstacles, etc.
SafetyNex measures the adaptation of the driving style to the infrastructure topology, and can flag dangerous situations.
Two main applications :
_ Car industry : intelligent navigation system providing valuable advice to keep the car at a good level of safety, and sending alarms about dangers
_ Insurance : driving style measurement correlated with accidentology (insurance pricing, Pay How You Drive)

SafetyNex is now running in RT-Maps by INTEMPORA
SafetyNex is being merged with Ecogyzer (an eco-driving rating system) : this “package” will be the ultimate tool for combining eco-driving and safe driving.

SafetyNex V2.1

Driving Delegation: key elements

Driving Delegation: key elements for an artificial perception system
Publication of September 2, 2015
Authors : Gérard YAHIAOUI & Pierre DA SILVA DIAS

INTRODUCTION
The automotive industry is starting to offer ADAS, and plans to propose partial or total driving delegation systems in the near future.

Main cases to be processed first may be:
. Highway driving, where the number of events per kilometer is small because the infrastructure has been designed to minimize path irregularities (few or no turns, every car going in the same direction, wide lanes, geometric visibility up to several kilometers, relatively few interactions between vehicles, at least when the traffic is flowing).
. The city, where infrastructure complexity is very high and interactions between road users are very strong, making detection a difficult task, but where the speed of the vehicle is low.
In all cases, these future ADAS require developing advanced systems of perception.

ADVANCED PERCEPTION
Perception consists in detecting objects, clustering them, and possibly tracking them along their own trajectories, using the selected sensors (cameras, radar, lidar, SLAM, ultrasound, …).

It is usually presented as several phases :
. Detection: we perceive that “something” detaches from the background, but we do not yet know what it is. The Johnson criteria for detection give a theoretical limit of one period, or a minimum width of two pixels, to detect a stationary object (see the sketch after this list).
. Segmentation and tracking: when zones are detected as being detached from the background (the landscape for image processing, the clutter for a radar, …), the detections must be agglomerated into objects large enough to have a meaning, which can then be tracked.
. Recognition: recognition is being able to say what the object is. The Johnson criterion for human vision is about 6 periods (for stationary objects), which gives 12 pixels.
. Identification: identification gives, within the recognized class, the precise name of the object.
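For orders of magnitude, here is a small sketch applying the pixel counts quoted above (about 2 pixels on target for detection, about 12 for recognition) to a hypothetical camera; the field of view and resolution are assumptions made for the example.

```python
# Sketch: maximum detection / recognition distances for a car-width target,
# given Johnson-style pixel requirements and an assumed camera geometry.
import math

H_FOV_DEG = 50.0     # assumed horizontal field of view
H_PIXELS  = 1280     # assumed horizontal resolution

def max_distance(object_width_m: float, required_pixels: float) -> float:
    """Largest distance at which the object still covers required_pixels
    (small-angle approximation: pixels ~ width / distance * pixels_per_rad)."""
    pixels_per_rad = H_PIXELS / math.radians(H_FOV_DEG)
    return object_width_m * pixels_per_rad / required_pixels

car_width_m = 1.8
print("detection   (~2 px):", round(max_distance(car_width_m, 2)), "m")
print("recognition (~12 px):", round(max_distance(car_width_m, 12)), "m")
```

These are purely geometric upper bounds; in practice, contrast and visibility conditions (see the previous sections) reduce them further.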

Detection is by far the most complex phase. It is potentially based on several principles:
. hypothesis breaking : we make a number of assumptions about the geometry of the world, chosen so that they are verified for the landscape (or clutter) and not for the objects to be detected. The violation of an assumption corresponds to a detection.
. confrontation with knowledge of the landscape or clutter : comparing the “background” as it is supposed to appear in the absence of additional objects with the same background containing objects leads to the detection of those objects (illustrated in the sketch below).
. knowledge of the shape of the objects to be detected : in this case detection and pattern recognition are one and the same; the system detects an object in its environment because it recognizes it.
Human perception jointly implements these three principles.
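As a toy illustration of the second principle (confrontation with knowledge of the background), here is a minimal frame-differencing sketch in NumPy; real perception systems are far more elaborate (illumination compensation, clustering, tracking, …).

```python
# Toy background-comparison detection: flag pixels of the current frame that
# deviate too much from a stored background image.
import numpy as np

def detect_against_background(frame: np.ndarray,
                              background: np.ndarray,
                              threshold: float = 30.0) -> np.ndarray:
    """Boolean mask of pixels that detach from the known background."""
    diff = np.abs(frame.astype(np.float32) - background.astype(np.float32))
    return diff > threshold

# Synthetic example: flat background with a brighter "object" patch.
background = np.full((120, 160), 100, dtype=np.uint8)
frame = background.copy()
frame[40:60, 70:90] = 180  # an object appears

mask = detect_against_background(frame, background)
print("detected pixels:", int(mask.sum()))  # 20 * 20 = 400
```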

Perception systems incorporate sensors and processing methods, and are generally effective within a certain range of capture conditions, and little or not at all effective outside of it. For example, a camera in the visible wavelengths (and its image processing methods) will generally not be effective at night or in fog, because “you cannot see anything”.

PERFORMANCE, ROBUSTNESS, RELIABILITY
No detection system can operate in every case when dealing with a real problem in the open world.

Designing a detection system then comprises two important phases:
. extending as far as possible the number of cases in which the detection system works;
. having a diagnosis that makes it possible to know when the perception system is, or is not, in a position to be effective.

We speak of performance (efficient detection of all objects of interest), robustness (the number of cases in which the perception system remains effective), and reliability (awareness of the situation one is in, and thus of the confidence that can be placed in the perception system).
These three elements, performance, robustness, and reliability, should be fully known in order to make perception systems cooperate (for example, a camera and a radar).

NEXYAD proposed the AGENDA methodology for characterizing life situations, using the formalism of orthogonal designs of experiments. The recognition of the operating cases can be based on the description of life situations with this methodology. This gives a theoretical and practical framework for estimating robustness and reliability.
Performance is measured with statistical comparison operators: in general, the output of a detection system is considered to be a categorical variable with two categories, “detected” and “not detected”. This variable must be compared with a reference categorical variable that also has two modalities: “presence of an object to be detected” and “absence of objects to be detected”. The comparison cannot be made by calculating a single percentage (although performance is often measured this way); it must use tools such as the contingency table, the chi-squared statistic, the normalized chi-squared, the per-cell chi-squared contributions, etc.
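A short sketch of this comparison: a 2x2 contingency table between the detector output and the ground truth, tested with SciPy's chi-squared function and contrasted with a single accuracy percentage. The table values are invented for the example.

```python
# Contingency-table comparison of a detector against ground truth, instead of
# a single raw percentage (the counts below are illustrative).
from scipy.stats import chi2_contingency

#                     object present   object absent
table = [[ 85,              12 ],   # detector said "detected"
         [ 15,             188 ]]   # detector said "not detected"

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, p = {p_value:.2g}, dof = {dof}")

# A single "accuracy" percentage hides the structure of the errors
# (misses versus false alarms are not distinguished):
accuracy = (table[0][0] + table[1][1]) / sum(sum(row) for row in table)
print(f"naive accuracy = {accuracy:.2%}")
```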
To extend the domain of life situations in which the system detects objects correctly, several detection systems using complementary types of sensors are usually made to cooperate (e.g. in fog, we will trust radar or infrared detection, but not detection by a conventional camera).
A reliable system is one that is able to answer “I do not know” : in the case of driving delegation, a system that detects all objects and is powerful, robust, and reliable 30% of the time has great value.
Delegating driving 30% of the time frees the driver for 30% of the time, which is a real value proposition.

SAFETY OPERATION
Safety engineering is a discipline that encompasses many issues, with the objective of ensuring the proper functioning of the system in all cases.
In particular, we must be vigilant concerning detection systems that require several measurement channels, such as stereovision.
If detection works only when both cameras are available, then safety experts will refuse such a system, because two cameras means twice the probability that one of them fails.
We then see that a perception system must still offer a usable “degraded mode” when sensor failures are simulated. A good design of a perception system for ADAS incorporates all these elements.
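A minimal sketch of such a degraded-mode policy is given below; the mode names and behavior are assumptions used to illustrate the design principle, not an actual automotive implementation.

```python
# Illustrative degraded-mode selection for a stereo camera pair: fall back to a
# mono mode (reduced function, lower confidence) instead of losing perception
# entirely when one camera fails.
from enum import Enum

class PerceptionMode(Enum):
    STEREO = "stereo (full function)"
    MONO_DEGRADED = "mono (degraded: no dense depth, lower confidence)"
    UNAVAILABLE = "perception unavailable (hand over to the driver)"

def select_mode(left_camera_ok: bool, right_camera_ok: bool) -> PerceptionMode:
    if left_camera_ok and right_camera_ok:
        return PerceptionMode.STEREO
    if left_camera_ok or right_camera_ok:
        return PerceptionMode.MONO_DEGRADED
    return PerceptionMode.UNAVAILABLE

for status in [(True, True), (True, False), (False, False)]:
    print(status, "->", select_mode(*status).value)
```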

SYNTHESIS
The race for performance that interests engineers is rarely the real issue in industrial systems. A system that allows driving to be delegated in 30% of cases (e.g. clear or overcast dry days) and “knows” when it is in a case for which it works or does not work can delegate driving and release the driver for 30% of the time.
This is a very high-value proposition for the driver.
A system that works effectively in 99% of cases without knowing precisely when it works is absolutely unusable. No manufacturer will put such a system into operation for road safety applications.
The company NEXYAD has been working on these issues for twenty years, especially on road detection, obstacle detection, measurement of visibility (to describe the cases where detection is reliable, for example), and estimation of road safety (adequacy of the driving style to the infrastructure).

NEXYAD developed:
. efficient and very robust basic bricks: RoadNex, ObstaNex, VisiNex onboard, SafetyNex
. a methodology for characterizing the life situations in which an ADAS is developed and tested: AGENDA (performance improvement, recognition of the cases of good performance, and validation of ADAS);
. know-how in making multiple perception systems collaborate.


RoadNex V2.0 (new film demo on a deep forest road without markings)

NEXYAD is proud to show a demo film of the RoadNex V2.0 (by NEXYAD) module (road detection in front of a vehicle).

This module runs as a component of the framework RT-MAPS (by INTEMPORA), and can recognize the road with or without white lines : even European countryside roads are detected.

RoadNex may be used for developing ADAS (Advanced Driver Assistance Systems) and Autonomous Vehicles.

In this demo, the road has no markings at all and is quite dark. RoadNex still works on this kind of road.


RoadNex (new film demo)

NEXYAD is proud to show a demo film of the RoadNex (by NEXYAD) module (road detection in front of a vehicle).

This module runs as a component of the framework RT-MAPS (by INTEMPORA), and can recognize the road with or without white lines : even European countryside roads are detected.

RoadNex provides 2 outputs :

. The detection of the road sides (lines : you can choose your painting colors) : as you can see in the film below, this works even without road markings (white lines)

. The detection of the road material (painted area : you can choose your colors) : a road is not only a shape with two sides, but also an appearance. If a pedestrian is standing in the middle of the road, the appearance will change.




RoadNex is an advanced module for Advanced Driver Assistance Systems (ADAS) and also for projects of Autonomous Vehicles.

RoadNex is currently being cross-compiled for smartphone environments (Android, iOS, Windows Phone).

NEXYAD vision-based ADAS modules for a demo car of “Université de Haute Alsace”

NEXYAD delivered a camera and two high-tech software modules :
. RoadNex : vision-based road detection in front of a car (for ADAS and Autonomous Vehicles)
. ObstaNex : vision and inertial navigation – based obstacle detection (for ADAS and Autonomous Vehicles)

These two modules are integrated as components into the real-time environment RT-MAPS (Intempora), and communicate with the neural-network-based pattern recognition module Neuro-RBF (GlobalSensing Technologies).

NEXYAD will support Université de Haute Alsace during 2015 to integrate ObstaNex with their inertial navigation unit.

NB : the 3 companies Nexyad, Intempora, GlobalSensing Technologies are members of the Mov’eo Groupement ADAS : http://groupementadas.canalblog.com/

ADAS

NEXYAD joins the ITS Infra business cluster, the SME group for the integration of information technologies into road infrastructures, of the Mov'eo competitiveness cluster (September 23, 2013)

NEXYAD joins the high-tech SME business cluster ITS Infra (Information Technologies for intelligent road infrastructures) :
. Visibility measurement
. Bad weather identification using artificial-vision-based systems (rain, fog, smoke, …)
. Smart-camera-based applications
. Statistics and data analysis on road traffic

This business cluster gathers high-tech SMEs of the Mov'eo competitiveness cluster.