Smart data collection tools help efficiently evaluate LiDAR and multi-sensor fusion sensing system performance
The ground truth data, typically at petabyte scale, includes dynamic information such as obstacle types, speeds and locations, and static information such as lane lines and road boundaries. Labelling quality and generation efficiency are the key factors in ground truth data. The RS-Reference system provides a complete set of ground truth data generation and evaluation solutions, and outputs detection performance and geometric error indicators at a labelling efficiency close to 1:1, with accuracy significantly higher than real-time perception, manual labelling or traditional labelling tools.
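As a rough illustration of what such indicators involve (a minimal sketch, not the RS-Reference implementation; the function name, the 2 m matching threshold and the 2-D centre-point representation are all assumptions for illustration), per-frame detection performance and geometric error can be computed by matching perception outputs to ground truth obstacles:

```python
# Sketch: match predicted obstacle centres to ground truth centres by
# nearest-neighbour distance, then derive detection performance
# (precision/recall) and a geometric error indicator (mean centre error).
import math

def evaluate_frame(gt_boxes, pred_boxes, match_dist=2.0):
    """Match each prediction to the nearest unmatched ground truth centre
    within match_dist metres; return precision, recall, mean error (m)."""
    unmatched_gt = list(range(len(gt_boxes)))
    errors = []
    tp = 0
    for px, py in pred_boxes:
        best, best_d = None, match_dist
        for i in unmatched_gt:
            gx, gy = gt_boxes[i]
            d = math.hypot(px - gx, py - gy)
            if d < best_d:
                best, best_d = i, d
        if best is not None:
            unmatched_gt.remove(best)
            errors.append(best_d)
            tp += 1
    precision = tp / len(pred_boxes) if pred_boxes else 1.0
    recall = tp / len(gt_boxes) if gt_boxes else 1.0
    mean_err = sum(errors) / len(errors) if errors else 0.0
    return precision, recall, mean_err

# Three ground truth obstacles; the perception stack reports two good
# detections and one false positive, and misses one obstacle.
gt = [(0.0, 0.0), (10.0, 5.0), (20.0, -3.0)]
pred = [(0.3, 0.4), (10.5, 5.0), (40.0, 0.0)]
p, r, e = evaluate_frame(gt, pred)
print(f"precision={p:.2f} recall={r:.2f} mean_err={e:.2f}m")
```

Real evaluation pipelines match full 3-D bounding boxes (usually by IoU) rather than 2-D centres, but the matching-then-scoring structure is the same.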
The RS-Reference system contains:
⦁ 128-beam LiDAR RS-Ruby
⦁ Leopard camera
⦁ Continental ARS408 millimetre-wave radar
⦁ GI-6695 RTK
⦁ Two RS-Bpearl LiDARs for near-field blind spots
The RS-Reference system adapts to different vehicle sizes, does not occupy the sensor installation positions of the devices under test (DUTs), and directly evaluates an intelligent driving system whose sensor set is consistent with that of commercial vehicles.
Algorithm-based smart labelling, rather than manual labelling, is responsible for extracting the ground truth data. The RS-Reference system uses a customised, dedicated offline perception algorithm that performs full-lifecycle tracking and identification of each obstacle and extracts all ground truth data from every frame. The system labels speed and acceleration, and accurately delineates the size of the labelling frame by aggregating shape and size information across the whole track. It can also accurately separate obstacles that are close to each other in complex scenes.
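One advantage of offline, full-lifecycle tracking is that motion labels can be derived from the entire position track rather than estimated frame by frame. A toy sketch of the idea (assumed function name and forward-difference scheme; the actual RS-Reference algorithm is not disclosed):

```python
# Sketch: derive per-frame speed and acceleration labels from a tracked
# obstacle's position history by finite differences over the full track.

def label_track(positions, dt=0.1):
    """positions: 1-D position samples (m) taken every dt seconds.
    Returns per-interval speeds (m/s) and accelerations (m/s^2)."""
    speeds = [(b - a) / dt for a, b in zip(positions, positions[1:])]
    accels = [(b - a) / dt for a, b in zip(speeds, speeds[1:])]
    return speeds, accels

# An obstacle moving at a constant 10 m/s, sampled at 10 Hz:
track = [i * 1.0 for i in range(6)]  # 0, 1, 2, ... metres
v, a = label_track(track)
```

In practice the raw differences would be smoothed (e.g. with a fixed-interval Kalman smoother) because offline processing can use both past and future frames, which is precisely what makes offline labels more accurate than real-time perception.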
The toolchain includes data collection tools, sensor calibration tools, visualisation tools, manual verification tools, evaluation tools and more. Version 2.1 upgrades the data management platform and adds a scene semantic labelling function that serves every step of the evaluation process.
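To illustrate how scene semantic labels can serve the evaluation step (a hypothetical example; the scene labels, metric and function name are invented, not RS-Reference's actual schema), per-frame metrics can be broken down by scene so that weaknesses in specific conditions become visible:

```python
# Sketch: group a per-frame metric by scene semantic label so the
# evaluation report can show, e.g., recall on highways vs. in tunnels.
from collections import defaultdict

def recall_by_scene(frames):
    """frames: list of (scene_label, recall) pairs.
    Returns mean recall per scene label."""
    buckets = defaultdict(list)
    for scene, recall in frames:
        buckets[scene].append(recall)
    return {scene: sum(v) / len(v) for scene, v in buckets.items()}

frames = [("highway", 0.95), ("highway", 0.97),
          ("urban", 0.88), ("tunnel", 0.80)]
stats = recall_by_scene(frames)
```

The same grouping applies to any indicator the evaluation tools produce, which is why scene labelling can serve every step of the process rather than being a standalone feature.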
The RS-Reference system not only evaluates the perception fusion results of intelligent driving systems, but also provides targeted solutions based on the characteristics of different sensor types such as LiDAR, millimetre-wave radar and cameras. Dedicated or customised tool modules can be developed according to customer needs for further in-depth analysis of sensing system performance.
The RS-Reference system also offers substantial extended application value: it can support planning and control algorithm development, generate massive ground truth data for building simulation scenes, and evaluate road-side perception systems.