Autonomous cars are evolving from futuristic dream to modern reality, and as the technology matures, personal and public transportation will be forever transformed. Eventually, driverless cars will take human motorists out of the equation entirely, banishing dangerous drowsy, impaired, and distracted drivers from roadways. Nearly 40,000 people in the United States died on the roads in 2017, and according to the National Highway Traffic Safety Administration (NHTSA)₁, about 90 percent of those accidents were due to human error₂.
But what’s behind the technology, how exactly is a driverless car safer, and what will it take to get to a day when you can commute to work without needing to watch the road?
Artificial Intelligence Drives Autonomous Cars
For an automobile to be autonomous, it needs to be continuously aware of its surroundings—first by perceiving (identifying and classifying information) and then by acting on that information through autonomous, computer control of the vehicle. Autonomous vehicles require safe, secure, and highly responsive solutions that can make split-second decisions based on a detailed understanding of the driving environment. That understanding requires an enormous amount of data to be captured by the many sensors across the car and then processed by the vehicle’s autonomous driving computer system.
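As a rough illustration of the perceive-then-act loop described above, here is a minimal Python sketch. The `Detection` class, the pre-labelled frames, and the 30-meter braking threshold are simplified assumptions that stand in for a real vehicle's sensor-fusion and planning stacks.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Detection:
    label: str          # e.g. "pedestrian", "vehicle", "traffic_light"
    distance_m: float   # estimated distance from the ego vehicle

def perceive(raw_frames: List[dict]) -> List[Detection]:
    """Stand-in for the sensor-fusion / classification stage:
    each raw frame is assumed to already carry labelled objects."""
    detections = []
    for frame in raw_frames:
        for obj in frame.get("objects", []):
            detections.append(Detection(obj["label"], obj["distance_m"]))
    return detections

def decide(detections: List[Detection]) -> str:
    """Stand-in for the planning stage: brake for the nearest close hazard."""
    hazards = [d for d in detections if d.label in ("pedestrian", "vehicle")]
    nearest = min(hazards, key=lambda d: d.distance_m, default=None)
    if nearest and nearest.distance_m < 30.0:   # assumed braking threshold
        return "brake"
    return "maintain_speed"

# One iteration of the perceive -> decide loop on a fabricated frame.
frames = [{"objects": [{"label": "pedestrian", "distance_m": 12.5}]}]
print(decide(perceive(frames)))   # -> "brake"
```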
For the vehicle to be truly capable of driving without user control, an extensive amount of initial training is required for the artificial intelligence (AI) network to learn how to see, understand what it is seeing, and make the right decisions in any imaginable traffic situation. The compute performance of an autonomous car is on par with some of the highest-performance computing platforms that became possible only a few years ago.
The autonomous vehicle is projected to contain more lines of code than any other software platform created to date. By 2020, the typical vehicle is expected to contain more than 300 million lines of code, carry more than 1 TB (terabyte) of storage, and require more than 1 TB per second of memory bandwidth to support the compute performance necessary for autonomous driving platforms.
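To give a feel for where a figure like 1 TB per second comes from, the back-of-envelope sketch below sums assumed, purely illustrative sensor data rates (the camera, lidar, and radar counts and rates are not quoted from any specific platform) and then applies an assumed reuse factor for the extra memory traffic that neural-network processing generates on top of the raw input.

```python
# Back-of-envelope estimate with illustrative (assumed) sensor counts and rates.
cameras = 8 * 2 * 1920 * 1080 * 30 / 1e9   # 8 cameras, 2 B/pixel, 1080p @ 30 fps -> GB/s
lidar   = 2 * 0.07                         # ~70 MB/s per lidar, 2 units -> GB/s
radar   = 6 * 0.01                         # ~10 MB/s per radar, 6 units -> GB/s
raw_sensor_gb_per_s = cameras + lidar + radar
print(f"Raw sensor input: ~{raw_sensor_gb_per_s:.1f} GB/s")

# The neural networks re-read weights and activations many times per frame,
# which is what pushes *memory* bandwidth toward the terabyte-per-second range.
reuse_factor = 1000   # assumed ratio of internal memory traffic to raw input
print(f"Implied memory traffic: ~{raw_sensor_gb_per_s * reuse_factor / 1000:.1f} TB/s")
```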
A self-driving car’s AI system requires a continuous, uninterrupted stream of data and instructions in order to make real-time decisions based on complex data sets. Successful self-driving vehicles exist on the road today; however, the success of many of these early vehicles is the result of repeatedly driving the same route over many days, learning every detail of the route and generating high-resolution maps that are then used as a key part of the self-navigation system.
With less need to recognize the route itself, the autonomous computer can devote its attention to traffic, pedestrians, and other potential real-time hazards. This deliberately restricted range of operation is referred to as geo-fencing, and it reflects the approach many early deployments of truly driverless vehicles are taking. While geo-fencing can produce a solution that works well over a limited route, an autonomous vehicle that relies heavily on geo-fencing in one part of the world may not function as well in another.
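A minimal sketch of a geo-fence check, assuming a simple rectangular latitude/longitude box and a made-up service area; production systems use detailed mapped polygons and operational-design-domain rules, so this only illustrates the idea.

```python
from dataclasses import dataclass

@dataclass
class GeoFence:
    """Axis-aligned lat/lon box; real deployments use mapped polygons."""
    lat_min: float
    lat_max: float
    lon_min: float
    lon_max: float

    def contains(self, lat: float, lon: float) -> bool:
        return (self.lat_min <= lat <= self.lat_max
                and self.lon_min <= lon <= self.lon_max)

# Hypothetical service area for an early driverless deployment.
service_area = GeoFence(37.30, 37.45, -122.10, -121.90)

def autonomy_allowed(lat: float, lon: float) -> bool:
    """Full self-driving is only offered while the vehicle stays inside
    the area it has been mapped and validated for."""
    return service_area.contains(lat, lon)

print(autonomy_allowed(37.40, -122.00))   # True  (inside the mapped area)
print(autonomy_allowed(40.71, -74.00))    # False (outside; driver takes over)
```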
Memory, the Unsung Hero in Autonomous Driving
Whether it’s the memory subsystem behind sensor-fusion processing and path planning or the storage subsystem behind a black-box data recorder, a wide range of memory and storage devices, from solid-state drives (SSDs) and NAND flash to NOR flash, low-power DRAM, and GDDR6, plays an essential role in getting us closer to a future where we can reply to emails, take a video call, or watch our favorite shows while our autonomous vehicles navigate the best route to get us safely to our destination.
According to Robert Bielby, the senior director responsible for automotive system architecture in Micron’s embedded business unit, high-performance computers based on artificial intelligence employ deep neural network algorithms that enable autonomous cars to drive better than human drivers₄.
“You’ve got a host of different sensors that work together to see the entire environment in 360 degrees, 24/7, at a greater distance and with a higher accuracy than humans can,” Bielby says. “Combine that with the extreme compute performance that today can be deployed in a car, and you have a situation where it is possible for cars to do a far better job of driving down the road with greater safety than we can.”
Imagine a scenario where a car slams on its brakes on a busy freeway. Through the introduction of vehicle-to-vehicle and vehicle-to-infrastructure (collectively called V2X) communication, this single event could be wirelessly transmitted to every car following the lead vehicle, allowing them to understand the situation at hand and proactively slow down and brake to avoid an accident.
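A simplified sketch of how such a hazard broadcast might be structured. The message fields and the JSON encoding here are illustrative assumptions; real V2X deployments use standardized message sets (such as SAE J2735) over dedicated short-range or cellular V2X radios.

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class V2XHazardMessage:
    """Simplified stand-in for a V2X event notification."""
    sender_id: str
    event: str          # e.g. "hard_braking"
    lat: float
    lon: float
    speed_mps: float
    timestamp: float

def broadcast(msg: V2XHazardMessage) -> bytes:
    """Serialize the event for transmission to nearby vehicles and infrastructure."""
    return json.dumps(asdict(msg)).encode("utf-8")

def on_receive(payload: bytes, same_lane_and_heading: bool) -> str:
    """Following vehicles decide how to react to the broadcast event."""
    msg = json.loads(payload.decode("utf-8"))
    if msg["event"] == "hard_braking" and same_lane_and_heading:
        return "pre-charge brakes and increase following distance"
    return "no action"

lead = V2XHazardMessage("veh-123", "hard_braking", 37.40, -122.00, 2.0, time.time())
print(on_receive(broadcast(lead), same_lane_and_heading=True))
```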
High-Speed Memory Is an Essential Component of Autonomous Driving
Remember the statistic that about 90 percent of U.S. fatal vehicle accidents in 2017 were due to human error? Humans are easily distracted, though we can make snap decisions when faced with unexpected hazards. Computers, on the other hand, don’t get distracted by the things that pull a human’s attention away from the road, like a flashy billboard or a favorite song on the radio that can’t help but be danced to. Computers can also react more consistently and, in time, even faster than human drivers.
Understandably, safety is of utmost concern with autonomous vehicles. The attention to safety goes well beyond the redundancies designed into the hardware systems to minimize errant decisions and includes an associated infrastructure to enable vehicles to communicate with one another and their surrounding environment. This wirelessly interconnected computing subsystem with hardware redundancies is governed by legislation intended to mandate the level of required safety in direct correlation to the level of autonomy.
To oversee the development and deployment of autonomous driving technologies, the NHTSA has established a series of levels that identify how much control a person versus a computer has over the vehicle. These range from Level 0 (no automation) to Level 1 (driver assistance), Level 2 (partial automation, requiring the driver to keep a hand on the wheel), Level 3 (conditional automation, where the driver may be required to take over at any time), Level 4 (high automation), and finally Level 5 (full automation). The majority of advanced driver-assistance system (ADAS) solutions shipping today are Level 2 capable and are based on computer hardware using memory devices that are relatively mature and low bandwidth.
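The levels can be summarized in code; the enum below simply encodes the Level 0 through Level 5 descriptions above, and the supervision check reflects that up through Level 3 a human must still be ready to take control.

```python
from enum import IntEnum

class AutomationLevel(IntEnum):
    """Driving-automation levels as summarized in the text (Levels 0-5)."""
    NO_AUTOMATION      = 0   # human does all the driving
    DRIVER_ASSISTANCE  = 1   # single assistance feature, e.g. adaptive cruise
    PARTIAL_AUTOMATION = 2   # combined assistance; hands stay on the wheel
    CONDITIONAL        = 3   # system drives; driver must be ready to take over
    HIGH_AUTOMATION    = 4   # no driver needed within a defined domain
    FULL_AUTOMATION    = 5   # no driver needed anywhere

def driver_supervision_required(level: AutomationLevel) -> bool:
    """Levels 0-3 still depend on a human being able to take control."""
    return level <= AutomationLevel.CONDITIONAL

print(driver_supervision_required(AutomationLevel.PARTIAL_AUTOMATION))  # True
print(driver_supervision_required(AutomationLevel.HIGH_AUTOMATION))     # False
```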
As driverless cars reach increasing levels of autonomy, the importance of memory technologies, from both a safety and a performance perspective, moves memory from the back seat to the front seat of the car. Where the personal computer was historically recognized as the driver of memory technology, the automotive industry is now set to become the leading driver of future memory technologies. Today, some of the leading autonomous platforms already illustrate this point.
Nvidia’s recently announced state-of-the-art Pegasus computing platform₃, developed specifically for autonomous driving, is based on the industry’s highest-performance, leading-edge DRAM technologies. In aggregate, the Pegasus platform provides more than 1 TB per second of memory bandwidth in order to deliver Level 5 performance.
The Importance of GDDR6 in the Future of Autonomous Driving
Micron is an industry leader in both automotive memory solutions and graphics memory solutions such as GDDR5X and GDDR6. The bandwidth of GDDR6 memory enables higher levels of autonomy to be realized in a practical footprint that is viable for deployment in the automobile. An autonomous compute platform that’s rich in memory bandwidth will allow for continued evolution and refinement of autonomous driving algorithms. “What you’ll see is that there will be improvements in algorithms that will occur over time,” says Bielby. “But those will be deployed as software upgrades, similar to the way your smartphone receives a regular update to an application or operating system.”
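For a rough sense of scale, a single GDDR6 device’s bandwidth can be estimated from its per-pin data rate and I/O width. The 14 Gb/s and x32 figures below are typical published values assumed for illustration, not the specification of any particular part.

```python
# Rough per-device GDDR6 bandwidth using commonly published figures
# (assumed for illustration; actual speed grades vary by part).
pin_rate_gbps = 14    # per-pin data rate in Gb/s
device_width  = 32    # I/O width of a single GDDR6 device in bits

gb_per_s_per_device = pin_rate_gbps * device_width / 8
print(f"One GDDR6 device: ~{gb_per_s_per_device:.0f} GB/s")

# Aggregate bandwidth scales with the number of devices on the compute platform.
devices_needed = 1000 / gb_per_s_per_device   # to approach ~1 TB/s total
print(f"Devices to reach ~1 TB/s: ~{devices_needed:.0f}")
```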
The continued evolution of autonomous vehicles involves many iterations of varying capabilities over the next decade. This requires careful management of human-machine engagement, ensuring that drivers clearly understand what level of autonomy is available at any particular time and what the responsibilities are for “hands-on” and “eyes-on” operation.
GDDR6 is a fundamental technology that provides the essential memory bandwidth fueling the artificial intelligence compute engines that underpin the ability of autonomous vehicles to act responsibly and enhance safety in accordance with the industry safety standards governed by the NHTSA. GDDR6 is a high-performance memory technology that is available today and is qualified to operate in the high temperatures and harsh conditions associated with the automobile.
AI is a critical technology for realizing autonomous driving. The extreme compute performance required of an AI-based autonomous vehicle demands an innovative memory and storage system to process and hold the vast amount of data a computer needs to make decisions like a human. As autonomous vehicles demand ever more speed from memory, Micron’s more than 25-year commitment to the automotive industry will allow it to continue to drive ahead of the pack, delivering the right level of performance needed to win the race.
Sources and References:
1. National Highway Traffic Safety Administration. “NCSA Publications & Data Requests.” 2017, crashstats.nhtsa.dot.gov/#/.
2. National Safety Council. “Distracted Driving.” Injury Facts, 2018, injuryfacts.nsc.org/motor-vehicle/motor-vehicle-safety-issues/distracted-driving/.
3. Nvidia. “NVIDIA Announces World's First AI Computer to Make Robotaxis a Reality.” NVIDIA Newsroom, 10 Oct. 2017, nvidianews.nvidia.com/news/nvidia-announces-world-s-first-ai-computer-to-make-robotaxis-a-reality.
4. Bielby, R. (2018, February 28). Personal interview.