“Where is everybody?” That was the question Enrico Fermi asked over lunch in Los Alamos, 1950. Fermi, a renowned physicist, was referring to the “maddening silence” of space, given the apparent likelihood of life existing elsewhere in the universe – a concept now known as Fermi’s Paradox. In this blog, we look at the parallels between Fermi’s ideas and the data challenges in manufacturing.
Biologists have found life thriving in every corner of the planet, from hydrothermal vents in the deep ocean to the sulfurous fringes of active volcanoes. Life, organisms interacting and proliferating, seems to be the rule rather than the exception. Yet when we turn our eyes and ears to the rest of the universe, we find nothing but silence.
There are many theories explaining Fermi’s paradox. One is that “silence” is not the right word. The International Space Station (ISS) contains around 350,000 sensors just for monitoring the health and safety of the crew. That’s not including those relating to scientific experiments or research. Outside the station, the exterior can hold a further 20 payloads of research equipment – including Earth monitoring, materials science and particle physics experiments (see NASA’s facts and figures on the ISS).
Back in 2017, NASA was generating 12.1 TB of data every day from thousands of sensors around the world and in space. They predicted that number would double in 2018 with the installation of new optical lasers. They expected the total amount of data generated by 2022 to reach 50 PB (a petabyte is 10¹⁵ bytes). (Keep these figures in mind, because they reveal something astonishing later.)
Perhaps the silence of space is really a deafening roar, and the information we’re looking for has been there all along, hidden within it, waiting to be discovered.
The data challenges in manufacturing
We don’t have to stretch our imagination too far to see how Fermi’s paradox has a striking parallel to the digital transformation of manufacturing.
Despite generating vast amounts of data, we often struggle to derive meaningful insights or achieve improvements in efficiency and innovation. This is especially true when data is fragmented, isolated in silos, or we are simply blinded by its sheer volume. A report commissioned by Amazon Web Services in 2020 reveals something astounding: manufacturing companies generate a staggering 1,800 PB of data each year. In that same year, 2020, a single oil platform typically generated 2 TB per day. And data scientists spend 80% of their time just preparing data for analysis instead of analysing it.
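To get a feel for those figures, here is a minimal back-of-the-envelope calculation using the numbers above. It assumes decimal units (1 PB = 1,000 TB) and is purely illustrative, not part of the AWS report itself:

```python
# Illustrative scale check on the source figures (decimal units assumed).
industry_pb_per_year = 1800       # manufacturing data output, PB/year (AWS report)
platform_tb_per_day = 2           # single oil platform, TB/day

platform_tb_per_year = platform_tb_per_day * 365
industry_tb_per_year = industry_pb_per_year * 1000

# How many such platforms would it take to match the whole industry's output?
equivalent_platforms = industry_tb_per_year / platform_tb_per_year
print(f"One platform: {platform_tb_per_year} TB/year")
print(f"Industry output equals roughly {equivalent_platforms:,.0f} platforms' worth")
```

In other words, the industry’s annual output is on the order of a couple of thousand oil platforms running flat out, all year.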
Instead of the silence of the universe, we have the silence of data. But the silence does not mean we are on our own, either in the universe or the manufacturing space. It tells us that we have challenges to overcome. It tells us that our operations are fragmented, that we have created data silos and it reinforces the need for integrated systems.
For an individual astronomer to map the night sky with a telescope, it would be an enormous challenge. When many observers participate in a joined-up way, sharing information, they can combine their observations to build up a more complete picture of the universe. The same is true for manufacturers. With integrated data management and collaborative systems, we can ensure that every link in the value chain benefits from a continuous and unbroken data flow.
With the power of automation and artificial intelligence, the challenge suddenly becomes less terrifying. We can now analyse inconceivably large data sets in decreasing time periods.
How long does it take to process a petabyte of data?
Say, for the sake of argument, it takes 1 second to process 1 gigabyte of data. If we take 1PB to be roughly equal to 1 million GB, then we’re talking about 1 million seconds or 11 and a half days.
For a human being to complete such a task, even a team of the brightest minds, it would take years. It might even be an impossible challenge. The most sophisticated AI with the fastest computational power might be able to reduce this to a matter of hours. These numbers may seem amazing to us, today, but future generations will laugh at the thought of taking several days to process 1PB of data. After all, it wasn’t that long ago that the 1.4 MB floppy disk was a marvel of information technology.
Now consider the graph below. This graph, based on data from Ray Kurzweil’s book “The Singularity Is Nearer”, shows the exponential growth of computing power. For reference, it had a value of 1 around 1963 and 500 around 1988.
Figure. The exponential growth in computing power
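Just from the two reference points quoted above (a relative value of 1 around 1963 and 500 around 1988), we can estimate the implied growth rate. This is a simple two-point fit for illustration, not Kurzweil’s own methodology:

```python
import math

# Two reference points read off the graph (relative computing power).
t0, v0 = 1963, 1
t1, v1 = 1988, 500

# Assume steady exponential growth between the two points.
annual_growth = (v1 / v0) ** (1 / (t1 - t0))
doubling_years = math.log(2) / math.log(annual_growth)
print(f"About {annual_growth:.2f}x per year, doubling every {doubling_years:.1f} years")
```

A 500-fold increase over 25 years works out to roughly 28% growth per year, with computing power doubling every three years or so.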
In the past, technology was restricted to those who knew how to operate it. Data was restricted to those who could access and analyse it. That’s not the case anymore.
As the curve gets steeper in the coming years, we can expect it to influence our lives in never-before-imagined ways. It will profoundly affect how we work, produce goods, innovate and stay healthy. We are already utilising generative AI for engineering, medicine and social science. We are seeing links across all languages and cultures through advanced language models which can already see beyond the constructs of any particular human language.
AI opens the door to a new era of manufacturing
AI has opened the door to a new era of manufacturing. By lifting these restrictions, it enables unprecedented levels of insight and collaboration. Advanced AI systems can find and integrate diverse data from many different sources regardless of the original format or language.
Already, AI is the only technology capable of processing all the data available to us. In the future, this situation will only become more pronounced. AI will continue to advance and will soon have quantum computing at its disposal, compounding its role as the essential tool for managing the ever-increasing volumes of data. This will lead to even greater efficiencies, innovation and collaboration, ultimately transforming how we design, produce and deliver products to market.
As AI continues to evolve, it will drive collaboration and innovation, and ensure that manufacturers remain competitive in an ever-changing market. Just as Fermi’s Paradox invites us to explore the unknown, the data challenges in manufacturing lead us to innovate and adapt. Harnessing the power of AI has become an imperative. The early movers are likely to have a strong advantage. They will be the first to turn vast amounts of data into valuable insights.
The opportunity here is not only to improve real-time processes and business outcomes, which itself has immense value, but to investigate long-range outcomes. We can achieve this by correlating massive data sets from across the product lifecycle, from design and production to field usage patterns and maintenance, and then feeding all of these back into improving future products and processes and driving innovation.
Ultimately, the silence of data, just like the silence of the universe, is not an obstacle. It’s not proof we are alone. It’s an opportunity to seek better ways of collaborating and more efficient ways of working.