Imagine a scenario in which the ever-increasing number of vehicular accidents drops drastically. The reason is not drivers suddenly becoming more considerate of fellow road users, but the deployment of technology, specifically IoT and smart concepts, in automobiles on the road.
“India certainly needs smart cars in the long run, despite the current market not being favourable,” states Nisarg Nirmalkumar, CTO at Skillfinity Technologies. Nisarg also moderated a panel discussion at the recent India Electronics Week (IEW 2017) in Bangalore, focusing on the role of UX in distraction-free driving and the trending technologies in multimodal interaction.
Nisarg also threw light on the various smart concepts that could be key drivers of future automobiles, even more than fuel. “Smart concepts will enable drivers to control the infotainment system of a car through eye gestures, without getting distracted,” he stated during a friendly telephonic interaction with Rahul R of Electronics For You.
Q. It is obvious that smart concepts for automotive are the need of the hour in India. Why do you think there has not been any adoption?
A. Firstly, as per the current scenario, there is very little to no market in India for smart automobiles. People are simply not willing to invest, as the cost of smart solutions is really high. Secondly, Indian road infrastructure needs improvement to accommodate smart automobiles.
However, smart concepts would undoubtedly make it big once properly applied and promoted.
Q. How smart could future cars get through the incorporation of IoT and sensors? Which types of sensors could specifically be developed and used within cars?
A. The degree of smartness depends on the level of programming of the Electronic Control Units (ECUs) and the sensors used. A modern car can have as many as 50 ECUs (Engine Control Module, Body Control Module, Battery Management and the Advanced Driver Assistance System (ADAS), to name a few) along with several sensors. The sensors are connected to the ECUs, which collect data from them and take suitable action to control the operations of the vehicle.
This can also be programmed to implement Adaptive Cruise Control, which automatically slows the car down when a slower vehicle suddenly appears in front.
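The decision logic Nisarg describes can be sketched in a few lines. This is purely an illustrative example; the function name, thresholds and the 5 km/h margin are hypothetical and not taken from any real ECU implementation.

```python
# Illustrative sketch of adaptive cruise control (ACC) decision logic.
# All names and thresholds here are hypothetical, for illustration only.

SAFE_GAP_M = 30.0        # assumed minimum following distance, in metres
CRUISE_SPEED_KMH = 80.0  # speed set by the driver

def acc_target_speed(set_speed_kmh, lead_distance_m, lead_speed_kmh):
    """Return the speed the ECU should command.

    If the radar reports a vehicle closer than the safe gap, slow to
    just below its speed; otherwise resume the driver's cruise speed.
    """
    if lead_distance_m < SAFE_GAP_M:
        # Too close: drop below the lead vehicle's speed to open the gap
        return min(set_speed_kmh, max(lead_speed_kmh - 5.0, 0.0))
    return set_speed_kmh

# A slower vehicle cuts in 20 m ahead at 50 km/h
print(acc_target_speed(CRUISE_SPEED_KMH, 20.0, 50.0))   # 45.0
# Road clear again: resume cruise speed
print(acc_target_speed(CRUISE_SPEED_KMH, 120.0, 0.0))   # 80.0
```

In a real car this loop would run continuously inside an ECU, fed by radar or lidar data, with far more careful control theory behind it.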
Other sensors have already been implemented in many countries, and are available in the market in a few top-end models from BMW and other manufacturers. Specific examples include eye-gaze and head-up display technology for smart operation of infotainment units, which avoids distracting the driver. This is in turn directly connected with a reduction in the number of accidents.
Q. How are the eye gesture sensors programmed to function within cars?
A. Basically, the sensors act as distraction-reducing tools. The underlying sensors track eyeball movements and use these as input for performing essential functions such as controlling the infotainment system: switching channels, changing music, and so on. Essential eye events such as blinks are also tracked, and this information too is used for infotainment control.
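At its core, this is an event-to-action mapping: the eye tracker emits gesture events, and the infotainment unit translates them into commands. The sketch below illustrates the idea; the gesture names and actions are entirely hypothetical, not from any real tracker API.

```python
# Illustrative mapping of eye-tracker events to infotainment actions.
# Gesture names and action strings are hypothetical.

GESTURE_ACTIONS = {
    "gaze_left":    "previous_channel",
    "gaze_right":   "next_channel",
    "gaze_up":      "volume_up",
    "gaze_down":    "volume_down",
    "double_blink": "play_pause",
}

def handle_eye_event(event):
    """Translate a tracked eye gesture into an infotainment command,
    or return None so that unmapped events are silently ignored."""
    return GESTURE_ACTIONS.get(event)

print(handle_eye_event("gaze_right"))    # next_channel
print(handle_eye_event("single_blink"))  # None (unmapped, ignored)
```

Ignoring unmapped events matters here: stray eye movements are constant while driving, so only deliberate, registered gestures should trigger anything.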
Now, since the eye-gesture aspect has been brought up, smart cars also use a technology called the head-up display (HUD), which projects the entire infotainment interface virtually onto the windshield of the car. Drivers can then control the system just with their eyes.
The name ‘head-up’ comes from the fact that the driver need not look down at a screen on the dashboard. The driver keeps looking at the road while the display is projected onto the windshield. An important point to note is that the projections do not cause any obstruction to the view of the road ahead.
Q. Eye gesture sounds interesting, as it reduces distraction compared to physically operating infotainment systems. Are there other smart sensors that help with enhanced safety measures?
A. The long-term solution here would be Natural Language Processing (NLP). Intensive research in this sector is currently underway across the world.
This technology would bring immense changes in the way humans interact with cars (or any device, for that matter). We do have speech-command technology that can recognise only particular commands and perform the corresponding actions. But with NLP, interacting with your car would be like natural conversation.
Q. So, once cars have become smart, will human intervention be completely eliminated?
A. In today’s scenario, the biggest challenge is bringing an autonomous car and human driven car together on the road. Autonomous cars will be a success if the environment (roads, infrastructure, rules, regulations) is made compatible.
Having said that, the computers that make cars autonomous can never replicate gut feeling or the human touch. Remember that smart cars can only assist the driver; they cannot serve as a replacement. This is why we do not yet see autonomous cars on the open road.
Q. Finally, are there any car brands that you can think of that have adopted smart solutions?
A. Even Indian brands like Tata and Mahindra have incorporated smartness into their cars. Look at cars like the Hexa and the XUV500. They already have features like hill-hold control, hill-descent control, rain sensors, ESP and corner stability control, along with great infotainment systems that take voice commands.
What India lacks is not the talent to create futuristic technology, but initiative in research and development. All our focus and effort goes into developing cost-effective mass-market systems. There is no urgency or craze for smart systems in India.
As far as geographical suitability for smart cars is concerned, Dubai provides great infrastructure for autonomous vehicles. Japan is also doing great work in bringing out electric, hybrid and hydrogen-powered vehicles. The Germans, too, have dished out some good offerings over the years, and I feel they will continue to do so.
However, it is technology giants such as Google, Apple and, not to forget, Harman that are adding increased smartness to cars in terms of infotainment. Nonetheless, credit for the autonomous vehicle should go to Google.