
In the previous parts of this series, we covered lidar-based obstacle detection and the integration of lidar and lane detection in the ADAS system.
If you are a design engineer or a newcomer, understanding the comprehensive design of autonomous electric vehicles (EVs), drones, and ADAS systems is crucial.
(For a comprehensive overview, refer to Part 1 and Part 2: Anti-Collision System articles published in the June and July issues.)
Building on the earlier ADAS system design, we now implement lidar scanning within the ROS (Robot Operating System) environment. The scanning process generates real-time maps of the surroundings using simultaneous localization and mapping (SLAM).
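As a rough illustration of what this looks like from the ROS side, the sketch below subscribes to a lidar scan topic and to the occupancy-grid map that a SLAM node publishes. The topic names /scan and /map, and the assumption that a SLAM package such as hector_slam or gmapping is already running, are placeholders; adjust them to match your own setup.

```python
#!/usr/bin/env python
# Minimal sketch: monitor the lidar scan and the SLAM-generated map in ROS.
# Assumes a lidar driver publishing sensor_msgs/LaserScan on /scan and a SLAM
# node (e.g. hector_slam or gmapping) publishing nav_msgs/OccupancyGrid on /map.
import rospy
from sensor_msgs.msg import LaserScan
from nav_msgs.msg import OccupancyGrid

def scan_callback(scan):
    # Report the nearest valid obstacle seen by the lidar in this sweep
    valid = [r for r in scan.ranges if scan.range_min < r < scan.range_max]
    if valid:
        rospy.loginfo("Closest obstacle: %.2f m", min(valid))

def map_callback(grid):
    # Occupancy grid produced by SLAM: width x height cells at 'resolution' m/cell
    rospy.loginfo("Map update: %dx%d cells, %.3f m/cell",
                  grid.info.width, grid.info.height, grid.info.resolution)

if __name__ == "__main__":
    rospy.init_node("slam_monitor")
    rospy.Subscriber("/scan", LaserScan, scan_callback)
    rospy.Subscriber("/map", OccupancyGrid, map_callback)
    rospy.spin()
```

Running a node like this alongside the SLAM launch file is a quick way to confirm that lidar scans are arriving and that the map keeps growing as the vehicle moves.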
Beyond mapping, the focus is on localizing the vehicle within the map without relying on GPS data. In the ADAS and autonomous-driving landscape, such a system improves localization accuracy and aids path planning even where GPS connectivity is compromised, such as in tunnels or subways.
It also aids navigation through dead reckoning, a method of calculating an object's current position from a previously established point using its known speed, heading, and elapsed time.
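The arithmetic behind dead reckoning is straightforward, as the minimal sketch below shows: a new position is estimated from the last known point using speed, heading, and elapsed time. The function name and the example numbers are illustrative only; a real system would fuse wheel odometry, IMU heading, and lidar scan matching to keep the estimate from drifting.

```python
import math

def dead_reckon(x, y, heading_deg, speed_mps, dt_s):
    """Advance a previously known position (x, y) by speed and heading.

    Simplified sketch for illustration; units are metres, degrees, and seconds.
    """
    heading = math.radians(heading_deg)
    x_new = x + speed_mps * dt_s * math.cos(heading)
    y_new = y + speed_mps * dt_s * math.sin(heading)
    return x_new, y_new

# Example: starting from the last known fix at (0, 0), travelling at 10 m/s
# on a 30-degree heading for 2 seconds
print(dead_reckon(0.0, 0.0, 30.0, 10.0, 2.0))  # approx. (17.32, 10.00)
```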

Fig. 1 shows a vehicle equipped with lidar. Fig. 2 shows the ROS environment mapping the surroundings and generating paths (indicated by green lines) on the map through real-time tracking of the vehicle's movement.
Together, these capabilities enable the ADAS system to navigate and drive the vehicle without collisions by efficiently avoiding obstacles.

To facilitate this advancement, a list of components required for the project is outlined in Table 1. If you have previously constructed the ADAS system as per this series of articles, you likely possess these components.