Technology, like geology, unfolds in eras.
The ’90s were the era of the PC, and 2007 ushered in the era of the smartphone. Today, we are at the beginning of the third era in end-user devices: the connected car. In some ways, this shift could be even more significant than the previous ones because it combines the digital and physical worlds in a way we haven’t seen before.
As cars evolve into computers on wheels, the biggest business opportunities will be less about “metal and rubber” and more about services. McKinsey estimates that connected car data could be worth $1.5 trillion a year by 2030. All this data has the market revved up (pun intended) because it will lead to new business models that are not only lucrative, but that also transform the driving experience. However, all that data requires infrastructure to process it, and artificial intelligence (AI) is the key to unlocking the full potential of the connected car.
Over the past few years, machine learning and AI have pushed forward the capacity of computers to recognize images, understand context, and make decisions. These capabilities will be at the core of the connected car. A report from IHS Technology projects that the number of AI systems in vehicles will jump from 7 million in 2015 to 122 million by 2025, bringing new opportunities to enhance the capabilities of connected cars as more data becomes available.
AI will change the ways drivers interact with their cars and cars interact with their drivers. One way is through infotainment and smarter interactions, via features such as voice and gesture recognition, driver monitoring, virtual assistants, and natural language understanding. Drivers will be able to give their cars directions, and cars will be able to anticipate their drivers’ needs.
In addition, AI will push advanced driver assistance systems (ADAS) into the mainstream. ADAS must be able to detect and recognize objects, predict actions, adapt to new road conditions, and more. For that, they need AI, which powers the camera-based machine vision systems, radar-based detection units, driver condition evaluation, and sensor-fusion engine control units (ECUs) that make autonomous vehicles work.
The possibilities are clear, but the road to fully autonomous cars is a long one. 2017 will be a pivotal year in this journey, as the industry hits important milestones in building critical data collection infrastructure. Right now, there are two options for achieving this goal, and the beginnings of a third.
The first option is to deploy cars armed with instruments that take images and record positions of static objects. This “millimeter precision” is needed to identify precise lane information and directions. However, this option is expensive and time-intensive. It also requires continuous updating to maintain the most current data.
The second option is to use semi-autonomous vehicles to collect data, which requires a new generation of cars with advanced sensors. Even the most bullish predictions show that fewer than 0.1 percent of all cars are equipped with these advanced sensors.
The third option, which we will see this year, is to use new technology to collect data from the other 99.9 percent of cars that are already on the road. For example, if sensors detect abrupt steering changes from a multitude of cars at a specific location, that could indicate an obstacle. Or, identifying when windshield wipers are on could be used to deliver notifications about micro-weather.
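To make the idea concrete, the crowd-sourced approach can be sketched as a simple aggregation: bin telemetry reports by location, count how many independent vehicles reported an abrupt steering event in the same cell, and flag cells with enough corroboration. The data, thresholds, and coordinates below are illustrative assumptions, not real fleet telemetry or any particular vendor’s pipeline.

```python
from collections import Counter

# Hypothetical fleet telemetry: (latitude, longitude, steering rate in deg/s).
# All values here are invented for illustration.
TELEMETRY = [
    (47.6101, -122.2015, 5.0),
    (47.6101, -122.2015, 48.0),   # abrupt swerve
    (47.6101, -122.2015, 52.0),   # abrupt swerve
    (47.6101, -122.2015, 45.0),   # abrupt swerve
    (47.6200, -122.3490, 3.0),
    (47.6200, -122.3490, 46.0),   # a lone swerve: likely driver error, not an obstacle
]

ABRUPT_THRESHOLD = 40.0   # steering rate considered "abrupt" (assumed value)
MIN_REPORTS = 3           # corroborating vehicles required before flagging


def bin_location(lat, lon, precision=3):
    """Round coordinates so nearby reports fall into the same grid cell."""
    return (round(lat, precision), round(lon, precision))


def flag_obstacles(telemetry, threshold=ABRUPT_THRESHOLD, min_reports=MIN_REPORTS):
    """Return grid cells where enough vehicles reported abrupt steering."""
    counts = Counter(
        bin_location(lat, lon)
        for lat, lon, rate in telemetry
        if rate >= threshold
    )
    return {cell for cell, n in counts.items() if n >= min_reports}


print(flag_obstacles(TELEMETRY))  # only the corroborated location is flagged
```

Requiring multiple corroborating reports is what separates a genuine road hazard from one driver’s mistake, which is why this approach only becomes viable once data flows in from a large share of the cars already on the road.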
The power and promise of AI is that all of this data will push the next generation of cars forward by enhancing ADAS. Data infrastructure technology may not be the most buzzed-about advancement, but it’s the foundation for the connected car of the future.
John Cordell is the chief product officer of Xevo AI, a leader in data-driven user experiences for the automotive industry.