
However, many vehicles on the road today already sport intelligent systems that either automate certain functions previously performed by drivers or enhance the safety of both drivers and pedestrians. These include blind-spot monitoring, forward-collision and lane-departure warning systems, as well as active safety systems that provide pedestrian detection and adaptive cruise control.
In the near future, these capabilities will be greatly extended and enhanced until — finally — the capabilities of the automated vehicle will equal or surpass those of the human driver.
A common set of definitions
With the goal of providing a common terminology for those working to develop Advanced Driver-Assistance Systems, SAE International has produced a common set of definitions which were adopted by the US Transportation Department last September. The SAE standard J3016 delivers a classification system and supporting definitions that identify six levels of driving automation from “no automation” to “full automation”.
The main distinction between the six levels of automation defined by the SAE relates to those systems (in blue) that are employed in a vehicle while a human driver monitors the driving environment, and those automated driving systems (in green) where the system itself monitors the environment and can either alert a driver to take action or take action itself. In the fully automated scenario, the automobile performs the driving function completely autonomously.
Meeting the technical challenges of developing a system capable of high levels of automation requires co-operation between auto manufacturers, OEMs and new entrants into the automotive market with expertise in radar, vision, image processing and artificial intelligence. This is evidenced by the strategies of many automakers, which are now forming alliances with other suppliers to develop autonomous driving systems in partnership.
A multitude of technologies needed
To develop the ADAS subsystems for adaptive cruise control, driver monitoring, lane departure warning and collision avoidance, automotive engineers will need to employ an array of cameras, LiDAR, radar and ultrasound sensors. In addition, they will need to develop sophisticated processing systems to fuse and interpret the data from those sensors, together with control systems that drive electromechanical actuators in the vehicle to perform automated steering and braking.
To develop the ADAS subsystems for adaptive cruise control, driver monitoring, lane departure warning and collision avoidance, automotive engineers will need to employ an array of cameras, LiDAR, radar and ultrasound sensors. Image courtesy Cadence.
Data processing approach
According to Steve Roddy, Senior Group Director, Tensilica marketing in the IP Group at Cadence, data from an ADAS subsystem will be processed in one of two ways – either in a distributed or a centralized manner. In a distributed system, the processing of the data from the sensor would be executed locally by dedicated System on Chip (SoC) devices close to each respective sensor or camera. In a centralized approach, the processing would be carried out by a single SoC device, or several devices, interfaced to each of the subsystems over a high-speed automotive network.
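The difference between the two approaches can be sketched in a few lines of code. This is purely an illustration with invented function names, not a real automotive API: in the distributed model each sensor's local SoC reduces its own data and only summaries travel; in the centralized model the raw frames cross the network and one SoC processes them together.

```python
def local_preprocess(raw):
    """Stands in for the work a dedicated SoC does next to one sensor."""
    return {"object_count": len(raw)}

def distributed_pipeline(sensor_frames):
    # Each sensor's SoC reduces its own data locally; only small
    # summaries are sent onward for combination.
    summaries = [local_preprocess(frame) for frame in sensor_frames]
    return sum(s["object_count"] for s in summaries)

def centralized_pipeline(sensor_frames):
    # All raw frames cross the high-speed network to one central SoC,
    # which can fuse them jointly before reducing.
    merged = [obj for frame in sensor_frames for obj in frame]
    return local_preprocess(merged)["object_count"]

frames = [["car", "pedestrian"], ["car"], ["cyclist"]]
assert distributed_pipeline(frames) == centralized_pipeline(frames) == 4
```

For a simple reduction like counting, the two topologies give the same answer; the centralized model earns its extra complexity when the fusion step needs to see all the raw data at once.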
Automakers will undoubtedly adopt a variety of ADAS systems with disparate functionality into their vehicles over the next few years. While the data from those systems might be processed in a distributed fashion in the short term, the processing functions performed by the dedicated SoCs on the various sensor streams will prove increasingly complementary. As that becomes evident, designers will develop more centralized SoC processing platforms to take advantage of the synergy they can achieve.
According to Shreyas Derashri, Senior Product Marketing Manager at ARM, in a centralized SoC implementation data will be acquired from all the sensors on the vehicle over a high-speed network and handled in real time, opening the door to new and more sophisticated ADAS applications. An ADAS system based on a single central electronic unit, however, would naturally need to sport a more complex SoC device than one based on a number of SoC devices performing distributed processing throughout the vehicle.
ADAS systems based on a single central electronic unit would need to sport a more complex SoC device than one based on a number of SoC devices performing distributed processing throughout the vehicle. Image courtesy ARM.
Due to the heavy computational task involved, the single SoC-based electronic control unit would need to sport one or more multi-core general-purpose processors as well as an array of dedicated digital signal processors and image processors in order to fuse and process the data acquired by the cameras, LiDAR, radar and proximity sensors. In addition, such an SoC would need to support a number of high-speed I/O and peripheral interfaces to transmit signals to the many electronic and electromechanical actuator subsystems in the vehicle.
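A minimal sketch of the kind of fusion step such a processor array would run: combining independent distance estimates from radar and camera, weighting each by its confidence (here, inverse variance). The sensor noise figures are invented for illustration.

```python
def fuse(estimates):
    """Inverse-variance weighted fusion.

    estimates: list of (measurement, variance) pairs from
    different sensors observing the same quantity.
    """
    weights = [1.0 / var for _, var in estimates]
    fused = sum(m * w for (m, _), w in zip(estimates, weights)) / sum(weights)
    return fused

radar = (25.0, 0.5)    # metres; radar ranges accurately
camera = (26.0, 2.0)   # metres; camera depth is noisier
fused = fuse([radar, camera])
assert abs(fused - 25.2) < 1e-9   # radar carries 4x the weight
```

The lower-noise radar dominates the fused estimate, which is exactly why a central SoC that sees all sensor streams can outperform any single sensor subsystem.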
Building an SoC for an ADAS system requires a developer to integrate many IP blocks together onto a single SoC device using open standard, on-chip interconnects. Image courtesy Cadence.
Building such a system will require many IP blocks to be integrated onto a single SoC device. Fortunately, open-standard, on-chip interconnect specifications for connecting and managing the functional blocks in an SoC, such as the AMBA protocol from ARM, are well established. These enable the development of multi-processor designs with large numbers of processors and peripherals, assuring designers that IP from disparate suppliers can be integrated and verified to ensure low-latency, high-bandwidth communication between the IP blocks.
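One core job of such an interconnect fabric is address decoding: each IP block claims a window of the address map, and the fabric routes reads and writes to the matching block. The toy model below captures that idea in the spirit of a memory-mapped AMBA-style bus; the block names, base addresses and register layout are invented for illustration.

```python
class IPBlock:
    """A functional block exposing memory-mapped registers."""
    def __init__(self, name, base, size):
        self.name, self.base, self.size = name, base, size
        self.regs = {}

    def write(self, offset, value):
        self.regs[offset] = value

    def read(self, offset):
        return self.regs.get(offset, 0)   # unwritten registers read as 0

class Interconnect:
    """Routes bus transactions to the block whose window matches."""
    def __init__(self, blocks):
        self.blocks = blocks

    def _decode(self, addr):
        for b in self.blocks:
            if b.base <= addr < b.base + b.size:
                return b, addr - b.base
        raise ValueError(f"unmapped address {addr:#x}")

    def write(self, addr, value):
        block, offset = self._decode(addr)
        block.write(offset, value)

    def read(self, addr):
        block, offset = self._decode(addr)
        return block.read(offset)

bus = Interconnect([IPBlock("isp", 0x40000000, 0x1000),
                    IPBlock("dsp", 0x40001000, 0x1000)])
bus.write(0x40000004, 0xAB)            # lands in the ISP's window
assert bus.read(0x40000004) == 0xAB
```

Real interconnects add the parts a sketch cannot: arbitration between masters, burst transfers, protection attributes and quality-of-service, all of which the protocol specification standardizes so third-party IP can interoperate.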
IP building blocks
Aside from the wide range of IP available from ARM, designers of ADAS systems can choose IP building blocks for an SoC from many other well-established IP suppliers. Vendors such as Cadence’s Tensilica group and Imagination Technologies have developed IP specifically for scalar and vector DSP processing as well as image processing. Although much of the IP available in the market might still be considered general purpose, an increasing number of vendors are developing IP specifically targeted at the ADAS market, as exemplified by ARM’s recent announcement of its Mali-C71 image signal processor (ISP), which is capable of handling 24 stops of dynamic range.
ARM’s latest ISP, the Mali-C71, is capable of handling inputs from four cameras and processing the data at 1.2 Gpixels/sec. The ISP also provides 24 stops of dynamic range, enabling every detail in a scene to be captured, from bright sunlight to deep shadow. Image courtesy ARM.
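To put those headline figures in perspective, a quick back-of-envelope calculation: each photographic stop doubles the brightness ratio, so 24 stops correspond to a 2^24 contrast ratio, and the quoted pixel rate can be converted to a per-camera frame rate (1080p is an assumed resolution here, not one the source states).

```python
import math

# 24 stops = a 2**24 ratio between the darkest and brightest
# detail the ISP can represent in one scene.
stops = 24
ratio = 2 ** stops
db = 20 * math.log10(ratio)            # same range expressed in dB
assert round(db) == 144

# Throughput shared across four cameras, assuming 1080p streams.
gpixels_per_s = 1.2e9
cameras = 4
pixels_per_frame = 1920 * 1080         # ~2.07 Mpixels per 1080p frame
fps_per_camera = gpixels_per_s / cameras / pixels_per_frame
assert 140 < fps_per_camera < 150      # roughly 145 frames/s per camera
```

In other words, the quoted rate leaves comfortable headroom for four cameras even at frame rates well above the 30-60 fps typical of automotive imaging.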
Specialized design skills
Developing such highly integrated electronic control circuits and their accompanying electronic subsystems, however, has not traditionally been an area of expertise for many Tier 1 automotive suppliers, who have primarily supplied less sophisticated mechanical or electromechanical components to automakers. While some companies are now looking to build up such SoC design skills, others are working with dedicated teams at companies such as KritiKal Solutions, who have prior experience in computer vision, image processing and embedded systems.
With such a high level of integration comes a high level of responsibility. The SoC devices that are built for ADAS control systems around multiple IP blocks will need to implement an unprecedented level of safety mechanisms at the IP level, in the interconnect hardware that links the IP blocks on the device, and in the software that is executed on them.
According to ARM’s Derashri, IP employed in ADAS systems will need to be resistant to random faults, which occur unpredictably during the lifetime of any given piece of hardware, and must also detect and act upon systematic faults that could otherwise lead to erroneous results being passed on to a control system. To handle random faults, IP will need to employ dedicated fault-detection circuits and built-in self-test modes, while for systematic faults the whole design and verification/validation process needs to follow strict automotive guidelines to ensure that the system incorporating the IP meets Automotive Safety Integrity Level (ASIL) D. Developing IP to meet this standard provides the highest level of assurance against a malfunction causing severe, life-threatening or fatal injury.
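One classic mechanism for catching random faults is lockstep redundancy: run the same computation on two channels and compare the results, entering a safe state on any divergence. Real ASIL-D hardware implements this with duplicated cores and comparator logic; the sketch below only mimics the pattern in software, with invented names, to show the idea.

```python
class LockstepFault(Exception):
    """Raised when the redundant channels disagree."""

def lockstep(compute, inputs, inject_fault=False):
    primary = compute(inputs)
    shadow = compute(inputs)           # redundant channel
    if inject_fault:
        shadow += 1                    # simulate a bit flip in the shadow
    if primary != shadow:
        # A real system would signal the safety controller here
        # and transition the vehicle to a safe state.
        raise LockstepFault("channel divergence detected")
    return primary

brake_demand = lambda speed: speed * 2
assert lockstep(brake_demand, 30) == 60       # channels agree

try:
    lockstep(brake_demand, 30, inject_fault=True)
    assert False, "fault should have been detected"
except LockstepFault:
    pass                                      # fault caught, as intended
```

Lockstep catches random faults but not systematic ones: a bug present in both channels produces identical wrong answers, which is why the design and verification process itself must also follow the automotive guidelines described above.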
Simplifying and shortening the software development cycle
The sheer complexity of next-generation SoC designs will continue to represent a challenge for SoC designers, who must also perform both simulation and verification to ensure that the SoC will meet the requirements of any given application. Recognizing that supplying the IP alone solves only part of the design problem, many IP vendors are now supplying full reference software to control their IP, as well as sets of tuning and calibration tools to ease the system integration process. Such tools should help to simplify and shorten the overall software development cycle and deliver savings in performance, memory bandwidth and power consumption.
Taking it one step further, many semiconductor suppliers such as Texas Instruments not only develop purpose-built IP but also incorporate that IP into full ADAS SoCs. According to Joe Folkens, Product Marketing & Business Development for ADAS Automotive Processors at Texas Instruments, the TI SoCs address the difficult safety and security requirements and are complemented with a common set of tools and software that enables designers to program in one environment. Texas Instruments’ ADAS processors are scalable and support both distributed and centralized processing, anticipating the multiple architectures that might be requested in autonomous driving systems of the future.
Written by Dave Wilson, Senior Editor, Novus Light Technologies Today