Imagine the worst: an earthquake, hurricane, fire, flood, or other natural disaster strikes the city or town where you live. Everything is in shambles. People are running every which way; confusion reigns. Where do you go for safety? How do you locate your loved ones? Or—how will rescuers find you?
Or: You’re an emergency responder, and lives are at stake. Buildings are crumbling; you hear shrieks and cries for help, but visibility is so poor that you can’t see where the victims are located or how to reach them safely. When you do find someone in need of help, you must check their vital signs and diagnose their condition, using valuable time while others cry for help. But—you can’t be everywhere at once.
Thanks to digital technologies including artificial intelligence (AI), you soon won’t need to be.
Imagine these same scenarios with AI in the mix. Your mobile phone uses Pokémon™ Go-style augmented-reality overlays to guide you to the nearest shelter, alerting you to take precautions along the way, such as covering your nose and mouth as smoke closes in or heading for higher ground as flood waters approach.
Drones zoom through the air and into the water, in and out of rubble piles, burning buildings, and other treacherous areas. Data from infrared sensors, sonar, and cameras help the drones search for victims. When they find someone, these intelligent aircraft send alerts to first responders with the vital signs and condition of the trapped or injured, and the safest routes for reaching them.
Robots, too, make their way through the devastation to administer first aid and even rescue trapped or injured people and animals—lifting and clearing away heavy debris, extinguishing fires, bandaging wounds, and making split-second, life-or-death decisions. Human medical responders move among the robots and drones, able to virtually see through walls using data and augmented reality, and to assist with the most difficult and complex cases needing more or different care than the medi-bots can provide.
A more treacherous world
Natural and human-caused disasters are on the rise. In 2018, the United States alone experienced 14 natural disasters that each caused at least $1 billion in damages. The year 2017 was even worse, with $306 billion in damages from natural disasters—the most expensive year on record, according to the National Oceanic and Atmospheric Administration.
As catastrophes become more devastating and commonplace around the globe, search-and-rescue personnel, firefighters, and emergency medical technicians (EMTs) are facing enormous challenges to fulfilling their mission of saving lives. Fortunately, advances in digital technologies—including a special form of artificial intelligence that mimics the human brain—promise to revolutionize the speed, efficiency, and effectiveness of emergency response and emergency medical care.
What makes this exciting technology possible? Interestingly, it’s the same enabler that drives human thought and processing: memory, without which neither we nor our machines could learn.
Mimicking the human brain
For neural networks to function like our brains, they must be able to analyze the environment quickly, recognize contextual clues, and determine the best action. That means taking in data, processing it by moving from one decision-making algorithm to the next, and then settling on the best course of action.
Our brains complete this circuit using neurons, cells designed to transmit information. Electrical signals pass through multiple layers of these cells, each layer refining the message and activating the right neurons to recognize shapes, patterns, and features of the physical world and respond accordingly. From the time we see a problem—a burning building—to the moment we respond—running inside or away—this process is nearly instantaneous, taking mere milliseconds for trained emergency responders.
Mimicking that structure, computer neural networks are made of nodes arranged, like neurons, into layers:
- Input layers, which collect data
- Hidden layers, which process data
- Output layers, which transfer the processed information to the outside world
Artificial neural networks take in sensory data from cameras, radar, lidar, gyroscopes, accelerometers, and other sources, and filter it to their hidden layers, a series of coded algorithms, for sorting and analysis before transmittal to the output layer, where the machine makes a decision and takes action.
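To make that layer structure concrete, here is a minimal sketch in Python (NumPy) of a tiny feedforward network with one input layer, one hidden layer, and one output layer. The layer sizes, the random (untrained) weights, and the example "sensor reading" are invented purely to illustrate the data flow; they are not taken from any system described in this article.

```python
import numpy as np

# A tiny feedforward network: 4 inputs -> 8 hidden units -> 3 outputs.
# Weights are random and untrained, purely to illustrate the data flow.
rng = np.random.default_rng(0)
W_hidden = rng.normal(size=(4, 8))   # input layer -> hidden layer weights
W_output = rng.normal(size=(8, 3))   # hidden layer -> output layer weights

def relu(x):
    return np.maximum(0.0, x)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def forward(sensor_reading):
    """Input layer collects data, the hidden layer processes it,
    and the output layer produces a decision (here, class probabilities)."""
    hidden = relu(sensor_reading @ W_hidden)   # hidden layer
    return softmax(hidden @ W_output)          # output layer

# Example: four normalized sensor values (e.g. temperature, smoke, sound, motion).
reading = np.array([0.9, 0.7, 0.2, 0.1])
print(forward(reading))  # e.g. probabilities for ["safe", "caution", "evacuate"]
```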
The challenge for AI lies in making the right decisions as quickly as possible—in the instant, as humans can do. In pursuit of that “holy grail,” researchers are mapping neurons and applying what they learn about how we process information to artificial neural networks. At the same time, Micron is making faster and denser memory technologies to achieve data processing at the literal speed of thought—and beyond.
AI to the rescue
Already these advances show great promise for saving lives and property in an increasingly treacherous world:
- For emergency building evacuation, computer neural networks can choose the best escape route for multiple people, guiding each to safety according to their unique location and abilities (a minimal route-selection sketch follows this list).
- A neural-network-based AI platform in Copenhagen, Denmark, listens to emergency callers, analyzing not only what they say but how they say it, to detect critical conditions such as heart attack or stroke in real time. The platform, Corti, reportedly helps dispatchers and emergency responders assist callers more quickly and effectively.
- In the United States, AUDREY (the Assistant for Understanding Data through Reasoning, Extraction and sYnthesis), developed for firefighters by NASA and the Department of Homeland Security, gauges temperature and danger to guide firefighters and other responders through burning buildings to save lives.
- AI can also deliver real-time flood forecasts to help communities and emergency responders “better anticipate the severity of a flood,” enabling dispatchers to optimize resource deployment.
- AI-equipped drones are saving lives by helping rescuers find missing people, assess damage in places where humans cannot go (or, due to danger, should not go), and deliver supplies.
- Neural networks are even providing disaster assessment and climate change impact analysis to entire cities including Los Angeles and San Francisco, with one platform making predictions with an accuracy of 85 percent within 15 minutes after disaster strikes.
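At its core, the escape-route example above is a shortest-path search over a building graph whose edge weights reflect both distance and hazard. The sketch below applies Dijkstra's algorithm to a made-up floor plan; the graph, node names, and hazard penalties are hypothetical and are not drawn from any of the systems mentioned in this list. In a real deployment, the edge weights would be updated continuously from sensor data so that routes shift as hazards spread.

```python
import heapq

def safest_route(graph, start, exit_node):
    """Dijkstra's algorithm over a floor-plan graph whose edge weights
    combine walking distance with a hazard penalty (higher = riskier)."""
    dist = {start: 0.0}
    prev = {}
    queue = [(0.0, start)]
    visited = set()
    while queue:
        cost, node = heapq.heappop(queue)
        if node in visited:
            continue
        visited.add(node)
        if node == exit_node:
            break
        for neighbor, weight in graph.get(node, []):
            new_cost = cost + weight
            if new_cost < dist.get(neighbor, float("inf")):
                dist[neighbor] = new_cost
                prev[neighbor] = node
                heapq.heappush(queue, (new_cost, neighbor))
    # Walk back from the exit to reconstruct the chosen route.
    path, node = [], exit_node
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return list(reversed(path))

# Hypothetical floor plan: edge weight = meters walked + hazard penalty.
floor_plan = {
    "room_301": [("hall_3", 5.0), ("stairs_B", 40.0)],   # stairs_B is smoky
    "hall_3":   [("stairs_A", 8.0), ("stairs_B", 30.0)],
    "stairs_A": [("exit_north", 12.0)],
    "stairs_B": [("exit_south", 12.0)],
}
print(safest_route(floor_plan, "room_301", "exit_north"))
# -> ['room_301', 'hall_3', 'stairs_A', 'exit_north']
```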
A revolution is coming
If this all sounds too good to be true—think again. Neural networks are on the cusp of revolutionizing not just emergency response but every aspect of daily life. Experts "liken the AI neural network revolution to the turn of the century when electricity was introduced," says Micron Cloud Segment Customer Program Manager Gregg Wolff.
In any emergency situation, response time is the most critical factor for saving lives. In digital technologies including AI, speed depends on memory—not only how much data is available, but also how quickly it can be accessed.
A neural network’s decision-making algorithms require intensive mathematical processing and data analysis, both of which increase the need for faster memory and storage. This need is especially acute in the cloud at hyperscale data centers, where memory products like Micron’s Ultra-Bandwidth Solutions play key roles in GPU- and FPGA-based big data acceleration and processing, and where Micron’s portfolio of DRAM and SSDs speeds the flow of data throughout the data center.
“There's just a huge volume of data that's flowing through the data center, and it's very difficult for humans to define all the features and code and then pass all that data back and forth,” Wolff said. “A lot of information can flow through scalable neural networks with high-performance hardware, allowing folks to extract value from that information as close to real time as possible.”
Just as your brain reads the signal from your hand that says the stove is hot and causes you to pull your hand back, a neural network can read the mass of data points from a camera and determine exactly where a robot should move debris to assist an emergency response. With wearable sensors, data from multiple sources can be collected and correlated in a central data center, giving a much better understanding of the entire environment and allowing robots to be dispatched to the direst tasks.
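As a toy illustration of that last step, the sketch below correlates a few hypothetical wearable-sensor readings and uses a priority queue to dispatch help to the most severe case first. The severity scoring, thresholds, and field names are assumptions made only for this example.

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Incident:
    # Negative severity so the most severe incident pops first (max-heap behavior).
    priority: float
    victim_id: str = field(compare=False)
    location: tuple = field(compare=False)

def severity(reading):
    """Toy severity score from a wearable-sensor reading (higher = more urgent)."""
    score = 0.0
    if reading["heart_rate"] > 130 or reading["heart_rate"] < 40:
        score += 2.0
    if reading["spo2"] < 90:
        score += 3.0
    if not reading["moving"]:
        score += 1.0
    return score

def build_dispatch_queue(readings):
    queue = []
    for r in readings:
        heapq.heappush(queue, Incident(-severity(r), r["id"], r["location"]))
    return queue

# Hypothetical readings correlated in a central data center.
readings = [
    {"id": "v1", "location": (3, 7), "heart_rate": 145, "spo2": 88, "moving": False},
    {"id": "v2", "location": (1, 2), "heart_rate": 82,  "spo2": 97, "moving": True},
]
queue = build_dispatch_queue(readings)
worst = heapq.heappop(queue)
print(f"Dispatch robot to {worst.victim_id} at {worst.location}")
```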
Micron memory makes it possible
If you want to speed up your brain, you need to improve your memory recall. Endless apps, Sudoku workbooks, and other tools are available for people who want to sharpen their recall and reaction speed.
Similarly, neural network capability improves with better data processing performance. Micron pushes the performance envelope, continually developing faster DRAM, NAND flash, and Ultra-Bandwidth Solutions. Graphics memory like Micron’s GDDR6X is the technology of choice for GPU-based graphics cards used in neural network inference. Training a neural network is the more intensive process, however, requiring the highest possible system bandwidth, which makes Micron’s HBM2E perhaps the best choice for that workload.
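As a rough, back-of-envelope illustration of why training demands so much more bandwidth than inference, the sketch below estimates per-step data movement for a hypothetical 100-million-parameter model. Every figure here (parameter count, precisions, batch sizes, the roughly 4x training multiplier) is an assumption chosen only to show the arithmetic; none of these numbers describes a specific Micron product or a measured workload.

```python
# Back-of-envelope traffic estimate for a hypothetical 100M-parameter model.
# All figures are illustrative assumptions, not measurements.
PARAMS = 100e6

def inference_bytes(bytes_per_weight=1):      # e.g. INT8 inference
    # Roughly one pass over the weights per inference step.
    return PARAMS * bytes_per_weight

def training_bytes(bytes_per_weight=2):       # e.g. FP16 training
    # Forward pass + backward pass over the weights, plus gradient and
    # optimizer-state updates: very roughly ~4x the weight traffic.
    return PARAMS * bytes_per_weight * 4

inf = inference_bytes()
trn = training_bytes()
print(f"Inference step: ~{inf / 1e6:.0f} MB of weight traffic")
print(f"Training step:  ~{trn / 1e6:.0f} MB of weight traffic")
print(f"Training moves roughly {trn / inf:.0f}x more data per step,")
print("which is why it benefits from the highest available memory bandwidth.")
```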
“Over time, everyone’s going to try to find that right hardware footprint, and memory is a very important piece of this process,” says Wolff. “Micron is committed to finding value-added solutions that serve the particularly high-bandwidth needs of neural network training and inference deployments.”
The future is in neural networks. As computers begin to respond more like humans, they will form the underlying intelligent fabric of our lives at high levels of speed and efficiency. All the while, Micron will power this revolution, enabling those networks for good.
It all begins, of course, with the power and design of the amazing human brain. What they say really is true: Great minds do think alike.