The Fourth Industrial Revolution is upon us, as digital technologies connect virtually every aspect of life and work. From cars to refrigerators to manufacturing equipment, an increasing array of inanimate objects is becoming “smart,” using data generated by cameras, sensors, software, and other technologies not only to assist us with tasks but even to perform them in our stead, learning with each transaction how to do the job better and making decisions autonomously using artificial intelligence (AI).
Of all the technologies ushering in this revolutionary new era, AI is arguably the most critical—and the most complex.
AI enables the “autonomy” in autonomous vehicles, teaching them to carry passengers and goods from point A to point B safely and efficiently, independently executing the many required actions, reactions, and decisions.
AI helps drones working on farms to decide when crops need water, fertilizer, or pesticides and to apply them in appropriate amounts, as well as to harvest crops when they are ripe.
AI allows robots to work collaboratively with humans as well as to function independently, someday turning factories into cyber-physical systems that run themselves, managing inventory, making repairs, and adjusting production schedules according to changes in demand without human intervention.
For AI to work, however, it needs “fast” data—data processed and analyzed not a day or even an hour after it is generated and collected, but instantly, in real time, as human intelligence is capable of doing. Since Industry 4.0 hinges on AI, we can even say that fast data is the engine driving the new revolution. Fueling that engine: processing power and vast repositories of memory.
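To make the contrast concrete, here is a minimal Python sketch of the two approaches. The sensor readings, the 100-reading window, and the alert threshold are all hypothetical; the point is simply that streaming (“fast data”) code decides on each reading as it arrives, while batch code analyzes a stored pile long after the fact.

```python
from collections import deque

def batch_process(readings):
    """Batch approach: analyze a stored dataset long after collection."""
    return sum(readings) / len(readings)    # e.g., last month's average

def stream_process(reading, window, threshold=80.0):
    """'Fast data' approach: act on each reading the moment it arrives."""
    window.append(reading)                  # keep only a rolling window
    return "ALERT" if reading > threshold else "ok"

window = deque(maxlen=100)                  # bounded memory for live data
for reading in (72.0, 75.5, 83.2):          # hypothetical live sensor feed
    print(stream_process(reading, window))  # decision made in real time
```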
The First Three Revolutions
For most of human history, everything was handmade from materials gathered or, at the dawn of the agricultural age about 10,000 years ago, cultivated. The domestication and harnessing of animals was itself revolutionary, but industrialization really began with the introduction of machines into production processes.
According to Micron IT Director Tim Long, the first three industrial revolutions can be broken down this way:
- Mechanization. The first industrial revolution, occurring over about 100 years from the mid-18th to mid-19th centuries, began with the use of water and steam power to mechanize manufacturing processes.
- Electrification. In the late 19th and early 20th centuries, electric power came to factories, enabling the assembly line and mass production.
- Automation. Digital technologies including robotics came to the manufacturing process in the mid-20th century, automating many tasks that humans had previously performed. The proliferation of computers in the second half of the century and the rise of the internet in the 1990s enabled companies to globalize their operations—and consumers to shop around the world with the click of a mouse.

Each of these revolutions brought about a monumental shift in the way businesses produce goods and services—faster, more cheaply, and in greater quantities than was possible before. Each also resulted in expanded markets and potential profits, as products became more widely available at lower prices.
Industry 4.0, now underway, promises another major upheaval in business models and practices. Connection is the hallmark of this era. Everything, from cars to computers to robots to toasters, will be linked in the Connected Age, communicating with one another and with us, and adapting to customize users’ experiences and automate menial tasks in every industry. Freed from the humdrum, we can focus on more complex, mission-critical jobs—many of which will involve technology.
Instead of making widgets, people will design factories to run themselves. Instead of driving trucks, people will program them to travel in fleets, and troubleshoot problems. Instead of plowing, planting, and harvesting, farmers will manage an array of technologies doing those jobs for them, and spend their time maximizing their crop yields.
And all of it, the entire interdependent, interlinked digital ecosystem, will rely on data to inform every step in every process.
The Data Explosion
Of all the data that exists in the world, some 90 percent was created in the last two years alone. Every day our devices generate 2.5 quintillion bytes of data, or 2.5 exabytes, a number that grows as the number of phones, tablets, computers, and other connected devices increases exponentially year by year.
The number of internet users is expected to top 4 billion in 2018, more than half the world’s 7.6 billion population. The number of connected devices on the “internet of things,” the network of objects connected to one another and to us via sensors, cameras, apps, the internet, Bluetooth, and other forms of digital communication, is expected to reach 23.14 billion in 2018 and 74.4 billion by 2025.
That’s a lot of data—a veritable gold mine of data for any business wanting to gain insights into its customers’ wants, needs, and purchases to refine its product offerings and marketing approaches, for instance. Many companies use data in just this way, processing it in batches that may be weeks or months old and analyzing the results to decide where and how to fine-tune. Vast repositories of “big data” wait in untapped “lakes” to be sifted and scrutinized or, often, never viewed at all.
Today, neglecting data seems a terrible waste. The longer those data lakes sit, the more stagnant and useless they become.
The Fourth Industrial Revolution, however, makes the lake itself a waste. Data “at rest” will soon be obsolete. Fast is where it’s at.
Speed Is of the Essence
If you wake up to billowing smoke and alarms screeching, do you sit and ponder before deciding to leave the house? Of course not: you process instantly what is happening—a fire—and run out the door.
Likewise, artificial intelligence, to be truly intelligent, must be able to process information—data—and adjust its behaviors in real time, or as near to it as possible.
In a factory, this might mean that detecting a flawed or broken component causes a robot to replace the part with a different one before proceeding. The machines cannot sit idle for weeks or months while software processes the information. Nor should the robot ignore the broken part and continue assembly, leaving the information to be scrutinized later, when the manufactured item fails.
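A minimal Python sketch of that decision loop follows; inspect(), next_spare(), and assemble() are hypothetical stand-ins for real machine-vision and robot-control APIs, not anything from a specific factory system.

```python
import random

def inspect(component):
    """Hypothetical vision check; True means the part passes."""
    return random.random() > 0.1            # pretend ~10% of parts are flawed

def next_spare():
    return "spare-part"                     # hypothetical replacement source

def assemble(component):
    print(f"assembling with {component}")

for component in ("part-1", "part-2", "part-3"):
    # Decide at the moment of detection rather than logging the flaw
    # for batch analysis weeks later.
    if not inspect(component):
        component = next_spare()            # swap in a good part and proceed
    assemble(component)
```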
To compete in the Connected Age, businesses need “fast data.” There can be no dilly-dallying, no hemming and hawing, no indecision. On the factory floor, pauses can be disastrous; slowdowns can cause shutdowns, which cost big money—a reported $22,000 per minute in the automobile industry, or more than $1.3 million per hour.
Clearly, backward-looking “batch” data processing, although great for spotting trends and making reflective decisions, will be insufficient for businesses to compete in Industry 4.0. The computers embedded in “smart” devices, including robots, drones, and self-driving cars, will need to process data in a meaningful way as soon as it is generated—much as the human brain does.
Processing at the Speed of Thought
How do our brains work? Information enters through our five senses and our minds process and analyze it—often instantly. When we touch a hot skillet, we don’t have to think about what to do next. When we smell a fragrant flower, there is no lag time between nose and “rose.”
Our minds can process data—thoughts, sensations, emotions—incredibly quickly. Computers can calculate complex math equations much faster than we can, and maybe they are better at chess—but for taking in information and choosing from an array of possible responses, nothing beats the human brain for speed and smarts. For now.
The Fourth Industrial Revolution promises to bring computers closer than ever before to human capabilities. For an autonomous car to avoid a pileup, it will need to detect the accident and choose a way around it in the twinkling of an eye. A factory robot should be able to detect and repair a worn-out part before a costly shutdown occurs. A drone must stop spraying a pesticide the instant a child or animal runs into the field.
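One way to picture these requirements is as a sense-decide-act loop that must finish within a hard deadline. The Python sketch below is illustrative only: the 100-millisecond budget and the sense() and decide() helpers are assumptions for the example, not figures or APIs from this article.

```python
import time

DEADLINE_S = 0.1                            # assumed 100 ms reaction budget

def sense():
    return {"obstacle_ahead": True}         # stand-in for camera/lidar input

def decide(observation):
    return "brake" if observation["obstacle_ahead"] else "cruise"

start = time.monotonic()
action = decide(sense())                    # the whole loop must beat the clock
elapsed = time.monotonic() - start
print(f"action={action}, decided in {elapsed * 1000:.3f} ms")
assert elapsed < DEADLINE_S, "missed the real-time deadline"
```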
A machine’s ability to make these calculations rapidly depends, as with our minds, on processing speed and memory. Industry 4.0 needs computers that can take in, sort, and analyze vast quantities of data not in seconds or even milliseconds, but in nanoseconds. Micron is ramping up its research and development of ever-faster and increasingly efficient memory solutions to fully enable AI.
“We see lots of opportunities for memory to play an increasingly important role in the AI application space,” says Mark Helm, senior fellow at Micron. “Data is the key currency enabling AI.”
Compare the way computing systems perform transactional tasks such as checking email or browsing the internet—“historical” workloads—against the way artificial intelligence works, and you’ll quickly see that data is a much more important factor for AI, Helm says.
“That raises an opportunity for Micron to become the keeper of the data,” Helm says. “Be it our GDDR6 graphics memory capable of feeding data to graphics processing units (GPUs) at extreme speeds—much as our brains can do—or our DRAM and NAND memory products that stream data within a system, Micron products are the gatekeepers of AI data.”
The future of artificial intelligence depends on data that moves at the speed of thought. Computers will become great enablers, moving civilization a giant leap forward, once they can process data at least as quickly as the human mind can. To lead humanity into the new frontier, Micron knows that our enterprise must work even faster—“always,” Helm says, “ahead of that curve.”