"The path through car autonomy is certainly ongoing, with the regular addition of new functions for car autonomy. It started in the 2010s with basic functionalities such as automatic cruise control (ACC) and advanced emergency breaking (AEB), it is currently still in progress with the addition of functionalities such as Highway Pilot, and we expect it will continue in the next years with the addition of functions such as City Pilot, where a car can be fully autonomous in a specific area,” said Adrien Sanchez, technology and market analyst, computing and software, at Yole Intelligence, part of Yole Group.

This continuous implementation of new functions directly implies a need for more sensors of greater diversity. More advanced driver-assistance system (ADAS) cameras are needed to accurately detect and classify objects all around the car, as well as to detect lanes and traffic signs. Radars are key to ensuring reliable detection of any object around the car, and LiDARs are increasingly added to improve the positioning precision of detected objects and the accuracy of real-time mapping.

More Sensors, More Data and More Software Have a Direct Consequence: More Centralization

“On top of this growing number of sensors, which also tend to have higher and higher resolution, the software complexity is increasing sharply. Autonomous driving in an open world is a very difficult problem, and reaching a level of safety high enough to convince people to put their lives in the hands of a machine is incredibly complex,” said Pierrick Boulay, senior technology and market analyst in the photonics and sensing division at Yole Intelligence.

As we’ve seen, this has led to a multiplication of sensors and a growing number of software layers, both for a more accurate understanding of the environment and to introduce redundancy that guards against crashes caused by system failure. To handle this growing amount of data and pipeline complexity, the required computing power has increased dramatically. This has a direct impact on car architecture, driving a shift from a decentralized architecture with many small MCUs to a centralized architecture with a few powerful processors in an ADAS domain controller. Centralization is clearly the next step, given the need for sensor fusion, and with the growing number of sensors, nobody wants to keep a model with one processor per sensor. So, the only question remaining is...
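To make the centralization argument concrete, here is a minimal sketch of what a domain controller's sensor-fusion step might look like: object lists arriving from camera, radar, and LiDAR are associated into fused objects by a greedy nearest-neighbour gate. The `Detection` class, the sensor names, the coordinate frame, and the one-metre gate are all hypothetical illustrations, not any OEM's actual pipeline.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str   # hypothetical source tag: "camera", "radar", or "lidar"
    x: float      # longitudinal position in metres (assumed common frame)
    y: float      # lateral position in metres
    label: str    # object class reported by the sensor

def fuse(detections, gate=1.0):
    """Greedy nearest-neighbour association: detections from different
    sensors that fall within `gate` metres of an existing fused object
    are merged into it; otherwise they start a new fused object."""
    fused = []
    for d in detections:
        for f in fused:
            if ((f["x"] - d.x) ** 2 + (f["y"] - d.y) ** 2) ** 0.5 < gate:
                f["sensors"].add(d.sensor)   # redundancy: multiple sensors confirm one object
                break
        else:
            fused.append({"x": d.x, "y": d.y, "label": d.label,
                          "sensors": {d.sensor}})
    return fused

# Three sensors report the same car ahead, plus one radar-only return.
dets = [
    Detection("camera", 20.0, 0.1, "car"),
    Detection("radar", 20.3, 0.0, "car"),
    Detection("lidar", 20.1, 0.2, "car"),
    Detection("radar", 55.0, 3.5, "unknown"),
]
objects = fuse(dets)
```

The point of the sketch is architectural: this association step needs every sensor's output in one place, which is exactly why a single domain controller is preferred over one processor per sensor.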

What Will Be the Pace of the Transformation?

Video creation using smartphones is at an all-time high due to the short-video craze. The emergence of TikTok, the favored social media platform of the younger generation, has been quickly copied by large incumbents, resulting in YouTube Shorts and Facebook Reels. This demand for high-quality video hardware was temporarily over-met during the emergence from COVID-19 lockdowns in 2021, and, therefore, the first three quarters of 2022 saw slightly less demand. We have seen similar but even more dramatic patterns with laptops and tablets, in which cameras played a central role during remote work and school teleconferencing.

Another market experiencing explosive growth right now is automotive CIS. The COVID-19 era signaled a turning point in consumer behavior, with demand switching to connected, autonomous, shared, and electric (CASE) vehicles loaded with semiconductor-based features. Overall, the appetite for cameras remains high, but the dominance of the weakened smartphone market translates into the disappointing -0.7 percent CIS growth expected for 2022.

An Evolving Mission, Big Consequences

“At present, radars are used as intelligent sensors with a processing capability to output a classified object list, though one limited in the number of targets. This approach enables basic ADAS functionalities such as AEB and ACC to be deployed. As the use cases grow in complexity (think about automatic lane change), as do the car rating scenarios, the mission for radar sensors is evolving,” said Cédric Malaquin, team lead analyst of the RF activity within the power and wireless division at Yole Intelligence.

This is no longer about providing range and velocity for a small number of objects. Radar sensors are evolving to perceive the entire scene around the car. The goal is free-space mapping by radar alone, for obvious reasons of redundancy. With such a sensor, OEMs will have access to path planning at any time, in any driving scenario. Centralization seems an obvious choice to bridge the gap, as it resonates with resource optimization. But it is also a massive change in architecture, raising multiple questions, such as the partitioning of radar signal modulation, data processing, data transport, and even data fusion. Meanwhile, as these questions are clarified, edge processing has room to evolve beyond its current capabilities. In any case, the importance of software in radar sensing is growing, and multiple industry players are positioning themselves for one approach or the other. It will be interesting to track how this industry evolves in the next few years.
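The shift described above, from a short classified object list to free-space mapping, can be illustrated with a toy sketch: instead of reporting a handful of targets, the radar's point returns are binned into angular sectors, and each sector is marked free out to the nearest return. The sector count, cell size, maximum range, and the sample point cloud are all illustrative assumptions, not a real radar interface.

```python
def free_space_grid(returns, cells=8, max_range=40.0):
    """Build a toy 1-D polar free-space map from raw radar point returns.

    returns   : list of (range_m, azimuth_deg) detections, azimuth in
                [-180, 180) with 0 deg pointing ahead of the car
    cells     : number of equal angular sectors around the car
    max_range : sectors with no return are considered free to this range

    Each sector reports the distance to its nearest return, i.e. how far
    the space is free in that direction.
    """
    nearest = [max_range] * cells
    for rng, az in returns:
        sector = int((az + 180.0) / 360.0 * cells) % cells
        nearest[sector] = min(nearest[sector], rng)
    return nearest

# Hypothetical point cloud: two obstacles roughly ahead, one off to the side.
points = [(12.0, 0.0), (18.0, 10.0), (30.0, -45.0)]
grid = free_space_grid(points)
```

Note that this consumes raw point returns rather than a pre-classified object list, which is the crux of the partitioning question raised above: producing such a map centrally requires shipping far more data off the sensor than today's edge-processed object lists do.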