As we move towards smart, connected, and autonomous mobility, the interaction between the vehicle and the human has also evolved. The HMI solutions of today not only enhance in-vehicle experience and convenience but also deliver personalised experiences. Only the information that is critical at a particular moment is communicated to the driver.
Vehicles are transforming from a means of transportation into mobile living spaces. The future of mobility is driven by innovations that make vehicles safer, seamlessly connected to mobile ecosystems, and more convenient for drivers and passengers.
Safety, connectivity, and convenience are the critical parameters consumers evaluate in their buying decisions today. With autonomous and smart mobility no longer a discussion for tomorrow but a reality of today, drivers will soon become co-drivers or passengers. Hence, one of the major differentiators is the user experience we can provide.
The human-machine interface (HMI) delivers this user experience through design excellence, technology, and intelligent information management, giving occupants something to see, feel, and hear. It has made in-vehicle experiences more connected, informed, and integrated. Further, according to the Future Automotive Industry Structure (FAST) 2030 report, HMI is one of the seven trends that will reshape the automotive industry by 2030.
HMI technologies that will enhance safety and in-vehicle experience
Not long ago, the radio, warning chimes, and, of course, the passengers were the only sounds heard in the vehicle; now the vehicle itself is becoming a smart communication partner. Vehicles have more to say than ever before. An ever-increasing number of sensors, and the vehicle's place in the Internet of Everything, create a flood of information that needs to be filtered and communicated to the driver. Moreover, the driver expects the vehicle to be a digital companion. As a result, solutions are required that ensure intuitive, easy, and, above all, safe interaction between driver and vehicle.
The smart voice assistant learns and adapts to user behaviour, preferences, and context. It provides fluent, intuitive natural-language interaction with context awareness and seamless transitions between functions. For example, the driver might ask for route planning for navigation, then ask about free parking spaces at the destination, and even dictate a text message to book a table at a restaurant near the destination. The system ensures coherent communication: it sends the relevant data from the navigation system to the parking-space assistant, matches the Internet search for restaurants to the location of the recommended parking garage, and, finally, transfers data from the navigation system (estimated time of arrival) and the restaurant search (address and e-mail contact) to the e-mail and voice-recorder program in order to reserve a table.
The intelligent assistant even understands the request “search for restaurants there” and correctly interprets “there” as the previously selected destination. It also detects meaningful connections without the driver having to issue rigid standard commands at every step, and it can handle multiple questions or two tasks communicated in a single sentence. If the driver says, “I would like to get to the mall as quickly as possible and eat Chinese food somewhere nearby,” for example, the assistant will calculate the route and also search for Chinese restaurants near the destination.
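The context carry-over described above can be sketched as a small dialogue manager that stores slots between turns. This is a minimal sketch; the class, command patterns, and action strings are illustrative inventions, not Continental's actual assistant interface:

```python
class DialogueContext:
    """Toy dialogue manager that remembers one slot (the last
    destination) across turns, so follow-up requests can omit it."""

    def __init__(self):
        self.destination = None

    def handle(self, utterance: str) -> str:
        text = utterance.lower()
        if text.startswith("navigate to "):
            # A navigation request sets the destination slot.
            self.destination = text.removeprefix("navigate to ")
            return f"plan_route({self.destination})"
        if "parking" in text:
            # "there" (or an omitted place) resolves to the stored slot.
            place = self.destination or "current location"
            return f"find_parking({place})"
        if "restaurant" in text:
            place = self.destination or "current location"
            return f"search_restaurants(near={place})"
        return "clarify_request()"
```

Any later request that refers to the place only as "there", or omits it entirely, falls back to the stored destination, which is the essence of the coherent, context-aware communication described above.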
As we move towards autonomous and connected driving, the amount of information to be displayed on screens also increases. The conventional 2D display is no longer sufficient, yet the usual 3D displays require additional gear that hampers the in-vehicle experience for the driver and passengers. The new Natural 3D Lightfield display is an evolutionary step in the design of human-machine interaction in vehicles, bringing a new level of safety and comfort by generating an unprecedented 3D experience without special eyewear or head-tracking cameras.
With this innovative technology, information can be safely conveyed to the driver in real time, making the interaction between driver and vehicle more intuitive and comfortable. Safety is enhanced because the Natural 3D Lightfield display offers superior usability: information can be perceived faster, as the visual experience is closer to the real world.
This 3D experience is not limited to the driver; passengers can also experience it on pillar-to-pillar displays. The 3D image produced by the Lightfield display consists of a total of eight perspectives of the same object that vary subtly according to the point of view.
The display uses parallax barriers, slanted slats that divide the image for the viewer. As when looking at real objects, two slightly offset views reach the right and left eye, resulting in a three-dimensional image.
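As a rough sketch of how such a multi-view panel works, the following interleaves eight rendered perspectives column by column; the parallax barrier then steers each column toward a different viewing direction. Real lightfield displays use slanted, sub-pixel-accurate patterns, so this straight vertical layout is a deliberate simplification:

```python
import numpy as np

def interleave_views(views: np.ndarray) -> np.ndarray:
    """Interleave N rendered perspectives column-wise into one panel image.

    views: array of shape (n_views, height, width); here n_views = 8,
    matching the eight perspectives of the lightfield display.
    Column x of the panel shows view (x mod n_views), so adjacent
    columns carry neighbouring perspectives for the barrier to separate.
    """
    n, h, w = views.shape
    panel = np.empty((h, w), dtype=views.dtype)
    for x in range(w):
        panel[:, x] = views[x % n, :, x]
    return panel
```

With eight views and a viewer whose eyes fall on different column groups, the left and right eye receive two slightly offset perspectives, which is what produces the stereoscopic effect described above.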
Not just the displays; the audio systems are also transforming. The speakerless audio system based on Ac2ated Sound enhances the vehicle's interior with immersive sound quality. It replaces conventional loudspeaker technology with actuators that create sound by vibrating certain surfaces in the vehicle. The result is a revolutionary system in which 3D sound reproduction surrounds the passengers in a detailed and vivid soundscape, maximising their in-car audio experience.
This system not only reduces weight and space requirements by up to 90 percent compared with a conventional audio system but also enhances the audio quality significantly.
It could be an ideal system for electric vehicles as well, where saving space and weight are top priorities. With this system, many components become unnecessary because surfaces in the vehicle vibrate just like speaker diaphragms: actuators cause components such as the A-pillar trim, door trim, roof lining, and rear shelf to vibrate so that they emit sound in different frequency ranges.
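The idea of assigning frequency ranges to surfaces can be illustrated with a simple crossover table. The surface names follow the article, but the cut-off frequencies below are assumptions for illustration, not Continental specifications:

```python
# Illustrative crossover map: which vibrating surface reproduces
# which frequency band. Cut-off frequencies are invented examples.
CROSSOVER_HZ = [
    (20, 200, "rear shelf"),         # low frequencies need large surfaces
    (200, 2000, "door trim"),        # midrange
    (2000, 8000, "roof lining"),     # upper midrange
    (8000, 20000, "A-pillar trim"),  # highs from small, stiff panels
]

def route_band(freq_hz: float) -> str:
    """Return the actuator surface responsible for a given frequency."""
    for lo, hi, surface in CROSSOVER_HZ:
        if lo <= freq_hz < hi:
            return surface
    return "out of audible range"
```

In a real system the bands would overlap and be shaped by crossover filters rather than hard cut-offs, but the principle of dividing the spectrum across differently sized surfaces is the same.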
However, these HMI functions would not be possible without a change in vehicle architecture. The cockpit high-performance computer (HPC) integrates all cockpit domains, such as the instrument cluster, infotainment, and cameras, into a single powerful computer that becomes the centrepiece of the user experience in the vehicle. The shift from numerous individual control units to a few high-performance computers reduces cost and complexity for vehicle manufacturers over the whole vehicle lifecycle.
The cockpit HPC can drive multiple displays in the vehicle for the driver and passengers, and users can distribute content across them. For example, using gesture control, the driver can drag navigation content from the central display onto the driver display and arrange elements in the display areas they prefer, providing a customised user experience. In autonomous driving mode, the driver can immerse themselves in the connected entertainment world as the cockpit HPC merges the multi-display content into one big screen offering all entertainment services and apps.
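A toy model of this content distribution might look as follows. The class and method names are hypothetical, not the actual cockpit HPC software interface:

```python
class CockpitDisplayManager:
    """Toy model of one computer driving several cockpit displays."""

    def __init__(self, displays):
        # Each display keeps an ordered list of the widgets it shows.
        self.content = {d: [] for d in displays}

    def show(self, display, widget):
        self.content[display].append(widget)

    def drag(self, widget, src, dst):
        """Gesture-style move of a widget between displays."""
        self.content[src].remove(widget)
        self.content[dst].append(widget)

    def merge_for_autonomous_mode(self):
        """Combine all display content into one large virtual screen."""
        return [w for widgets in self.content.values() for w in widgets]
```

Because one computer owns all the display state, moving content between screens, or merging everything into a single large screen in autonomous mode, is a local operation rather than a negotiation between separate control units.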
Holistic HMI enabled future
Holistic human-machine solutions will lay the foundation for autonomous, connected, electric, and shared mobility by providing an end-to-end experience for users, rather than being limited to booking a ride and transporting passengers to their destination. The concept also helps drivers plan the next leg of their journey and provides relevant information about social events. Information is collected, prioritised, and presented quickly in a focused and clear way. The flow of information is reduced by taking the driving situation and the driver's condition into account: only the relevant information and appropriate actions are offered in each situation.
Besides, the driver receives intuitive and reliable confirmation for every action initiated. This strengthens trust in the vehicle's functions and clearly indicates that the driver is, and will remain, in constant control of the situation. Such trust is particularly important for future technologies such as automated driving. Autonomous and connected driving increases the complexity of the mobility world and offers many features to the user; a simple, intuitive, holistic HMI is what makes these features accessible.
Following the concept of a holistic HMI, the entire vehicle can be turned into a digital companion using artificial intelligence. Based on deep machine-learning algorithms, the digital companion remembers and interprets the user's behaviour, adapts navigation and infotainment offers, and even anticipates the driver's wishes. A natural dialogue between driver and vehicle is realised using cloud-based voice services.
The assistant is linked to several vehicle functions; it supports, for example, an interactive user manual that immediately explains the meaning of warnings or error messages to the driver. Depending on the warning and the driver's usage patterns, the digital companion offers smart suggestions on how to proceed.
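Such an interactive manual can be pictured as a lookup from warning code to an explanation and a suggested next step. The codes, texts, and suggestions below are invented examples, not any manufacturer's actual warning catalogue:

```python
# Hypothetical interactive-manual entries: warning code ->
# (plain-language meaning, smart suggestion for the driver).
MANUAL = {
    "TPMS": ("Tyre pressure below threshold",
             "Route to the nearest service station?"),
    "WASHER": ("Washer fluid low",
               "Add washer fluid at the next stop?"),
}

def explain(warning_code: str) -> str:
    """Turn a raw warning code into an explanation plus suggestion."""
    meaning, suggestion = MANUAL.get(
        warning_code, ("Unknown warning", "Consult the full manual."))
    return f"{meaning}. {suggestion}"
```

A real digital companion would additionally rank suggestions using the driver's past behaviour, but the core mechanism is still a mapping from machine-level warnings to human-level explanations and actions.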
Summing up HMI’s role
As we move towards smart, connected, and autonomous mobility, the interaction between vehicle and human continues to evolve. Today's HMI solutions enhance in-vehicle convenience, deliver personalised experiences, and communicate only the information that is critical at a particular moment. This minimises distraction, increases vehicle safety, and contributes to the industry's shared Vision Zero: zero fatalities, zero injuries, and zero crashes. HMI makes the processing and management of information easy, intuitive, and reliable.
Rosemary Joshy is head of Engineering, Human Machine Interface Business Unit, Continental Automotive India