Wednesday, December 7, 2022

Self Driving Cars Platform and Their Trends (Part 1 of 2)

V.P. Sampath is a senior member of IEEE and a member of the Institution of Engineers (India). He is currently working as a technical architect at AdeptChips, Bengaluru. He is a regular contributor to national newspapers and the IEEE-MAS section, and has published international papers on VLSI and networks.


Vehicle metadata provides information about the vehicle that is valid for the entire path. This includes vehicle type information and vehicle reference point. All absolute positions (longitude/latitude) that are reported to the sensor data ingestion interface are expected to be at the centre of the vehicle. All offsets that are reported are expected to be offsets from this centre point of the vehicle.
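Since all absolute positions are reported at the vehicle's centre, a consumer often needs the reverse view: how far a detected object lies from that reference point. The following is a minimal sketch of that conversion using a local equirectangular approximation; the function name and the approach are illustrative, not part of any specific interface.

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius

def offset_from_centre(centre_lat, centre_lon, obj_lat, obj_lon):
    """Approximate east/north offset (metres) of an object from the
    vehicle reference point (vehicle centre), using a local
    equirectangular projection. Adequate at the short ranges typical
    of onboard sensors."""
    d_lat = math.radians(obj_lat - centre_lat)
    d_lon = math.radians(obj_lon - centre_lon)
    north = d_lat * EARTH_RADIUS_M
    east = d_lon * EARTH_RADIUS_M * math.cos(math.radians(centre_lat))
    return east, north
```

At these scales the flat-Earth approximation is accurate to well under a centimetre, which is why short sensor offsets can be treated as simple metric displacements from the reference point.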

Altitude that is reported to the interface is expected to be the altitude on ground (not altitude of the location of GPS antenna). Instead of providing the altitude on ground, it is possible to report a different altitude with a constant offset. This offset from the ground must be provided through the vehicle metadata.
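Recovering the on-ground altitude from such a report is a single subtraction, sketched below; the parameter names are illustrative, with the constant offset assumed to come from the vehicle metadata as described above.

```python
def ground_altitude(reported_altitude_m, metadata_altitude_offset_m=0.0):
    """Recover the on-ground altitude from a reported altitude.
    If the vehicle reports, for example, the altitude of the GPS
    antenna, the constant antenna-to-ground offset must be taken
    from the vehicle metadata and subtracted here."""
    return reported_altitude_m - metadata_altitude_offset_m
```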

Vehicle dynamics are measurements beyond the position of a vehicle. Typically, vehicle dynamics information is measured by the vehicle using onboard sensors at a higher frequency than positions (for example, 5Hz or 10Hz). Depending on the set of sensors in the vehicle, different values could be provided. In order to keep the complexity at a manageable level, these raw measurements must be converted into meaningful values and, hence, are a result of calculations either in the vehicle or in the OEM or system vendor backend.
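One simple example of such a conversion, assuming raw speed samples arrive at a fixed rate such as 10Hz, is deriving longitudinal acceleration by finite differences. This sketch is illustrative; a production pipeline would also filter the samples.

```python
def longitudinal_acceleration(speeds_mps, sample_rate_hz=10.0):
    """Derive acceleration (m/s^2) from consecutive speed samples
    taken at a fixed rate: a typical conversion of raw onboard
    measurements into a meaningful vehicle dynamics value."""
    dt = 1.0 / sample_rate_hz
    return [(b - a) / dt for a, b in zip(speeds_mps, speeds_mps[1:])]
```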

Path events
Fig. 2: Path events

The fundamental data element for rich sensor data submission is a path: a list of position estimates ordered starting with the oldest estimate. A path can be very short, for example for near-real-time events that are transmitted immediately after they occur. It can also be very long, for example an entire drive over many hours that records the vehicle trace and events for later submission.
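The path structure described above can be sketched as a list that enforces its oldest-first ordering. The class and field names below are illustrative, not taken from any specific interface definition.

```python
from dataclasses import dataclass, field

@dataclass
class PositionEstimate:
    timestamp: float   # seconds since epoch
    latitude: float    # degrees, at the vehicle reference point
    longitude: float   # degrees

@dataclass
class Path:
    """A path: position estimates ordered starting with the oldest."""
    positions: list = field(default_factory=list)

    def append(self, estimate: PositionEstimate):
        # preserve the oldest-first ordering invariant
        if self.positions and estimate.timestamp < self.positions[-1].timestamp:
            raise ValueError("estimates must be appended in chronological order")
        self.positions.append(estimate)
```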

Vehicles may have different collection policies for different types of sensor data. Depending on the priority of the sensor data, a message could be compiled and sent out the moment a specific sensor reading is detected, or events could be accumulated into one message and submitted after a given amount of time.

For example, accumulation can reduce computational load in critical moments while driving, reduce transmission volume over the mobile connection, or allow all detected path events to be referenced onto one single path. The latter may be needed if multiple path events must be attributed to one vehicle and the use of a transient vehicle ID is not supported or not desired.
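The two collection policies above can be sketched as a small buffer: high-priority events are sent the moment they are detected, while others accumulate and are flushed as a single message after a time window. All names and the window length are illustrative assumptions.

```python
import time

class EventCollector:
    """Sketch of two collection policies: immediate transmission for
    high-priority sensor events, and accumulation into one message
    for everything else."""

    def __init__(self, window_s=60.0, send=print):
        self.window_s = window_s
        self.send = send          # transport callback, e.g. an uplink
        self.buffer = []
        self.window_start = time.monotonic()

    def record(self, event, high_priority=False):
        if high_priority:
            self.send([event])    # sent the moment it is detected
            return
        self.buffer.append(event)
        if time.monotonic() - self.window_start >= self.window_s:
            self.flush()

    def flush(self):
        if self.buffer:
            self.send(self.buffer)   # one message, one path
            self.buffer = []
        self.window_start = time.monotonic()
```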

Self-driving cars

A self-driving car is capable of sensing its environment and navigating without human input. To accomplish this task, each vehicle is outfitted with a GPS unit, an inertial navigation system and a range of sensors including laser range-finders, radar and video cameras. The vehicle uses positional information from the GPS and inertial navigation system to localise itself, and sensor data to refine its position estimate as well as to build a 3D image of its environment.

Data from each sensor is filtered to remove noise and is often fused with other data sources to augment the original image. The vehicle subsequently uses this data to make navigation decisions, determined by its control system.
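A minimal sketch of such fusion, assuming one noisy GPS fix and one position predicted from inertial data, is a complementary filter that blends the two. Real systems typically use a Kalman filter; the weight here is purely illustrative.

```python
def fuse(gps_position, predicted_position, gps_weight=0.2):
    """Blend a noisy GPS measurement with a position predicted from
    inertial data. A small gps_weight trusts the smooth prediction
    more, while still correcting long-term GPS drift."""
    return (gps_weight * gps_position
            + (1.0 - gps_weight) * predicted_position)
```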

The majority of self-driving vehicle control systems implement a deliberative architecture, meaning that they make intelligent decisions by maintaining an internal map of their world and by using that map to find, from a set of possible paths, an optimal path to their destination that avoids obstacles (for example, road structures, pedestrians and other vehicles). Once the vehicle determines the best path to take, the decision is dissected into commands, which are fed to the vehicle’s actuators.

These actuators control the vehicle’s steering, braking and throttle. This process of localisation, mapping, obstacle avoidance and path planning is repeated multiple times each second on powerful onboard processors until the vehicle reaches its destination.
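The repeated cycle described above can be sketched as a single control iteration; every callable here is a stand-in for a real subsystem, and the names are illustrative assumptions.

```python
def control_cycle(localise, build_map, plan_path, to_commands, actuate):
    """One iteration of the deliberative loop: localisation, mapping,
    path planning, then dissecting the chosen path into actuator
    commands (steering, braking, throttle). In a real vehicle this
    runs many times per second until the destination is reached."""
    pose = localise()
    world_map = build_map(pose)
    path = plan_path(pose, world_map)
    for command in to_commands(path):
        actuate(command)
    return path
```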

Mapping and localisation

Prior to making any navigation decisions, the vehicle must first build a map of its environment and precisely localise itself within that map. The most frequently used sensors for map building are laser range-finders and cameras. Laser range-finders scan the environment using swaths of laser beams and calculate the distance to nearby objects by measuring the time it takes for each laser beam to travel to the object and back.
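The time-of-flight calculation is straightforward: the pulse travels out and back, so the range is the speed of light times the round-trip time, divided by two.

```python
SPEED_OF_LIGHT_MPS = 299_792_458

def range_from_time_of_flight(round_trip_s):
    """Distance to an object from a laser pulse's round-trip time.
    The beam covers the distance twice (out and back), hence the
    division by two."""
    return SPEED_OF_LIGHT_MPS * round_trip_s / 2.0
```

For instance, a round trip of roughly 667 nanoseconds corresponds to an object about 100 metres away, near the reliable limit discussed below.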

Whereas video from cameras is ideal for extracting scene colour, laser range-finders make depth information readily available to the vehicle for building a 3D map. Because laser beams diverge as they travel through space, it is difficult to obtain accurate distance readings from more than 100 metres away even with the most state-of-the-art laser range-finders, which limits the amount of reliable data that can be captured in the map. The vehicle filters and discretises data collected from each sensor and often aggregates the information to create a comprehensive map, which can then be used for path planning.
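The effect of beam divergence can be quantified with a small-angle approximation: the illuminated spot grows linearly with range, spreading the return energy over a larger area. The divergence value in the example is illustrative, not the specification of any particular sensor.

```python
def beam_spot_diameter_m(range_m, divergence_mrad, exit_diameter_m=0.0):
    """Approximate laser spot diameter at a given range for a beam
    with the given full-angle divergence, using the small-angle
    approximation. A larger spot means less precise, weaker returns."""
    return exit_diameter_m + range_m * divergence_mrad * 1e-3
```

With a typical divergence on the order of 1 milliradian, the spot is already about 10 centimetres wide at 100 metres, which helps explain why range accuracy degrades beyond that distance.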


