Reflecting on the technology of the modern Elizabethan era

Since the passing of Queen Elizabeth II, much of the news coverage has understandably focused on the remarkable length of her reign. It spanned 70 years, from 1952 to 2022, making her Britain’s longest-reigning monarch, and in that time she witnessed immense transformation, much of it driven by technological innovation.

Going back a little further, the Victorian era is now often viewed through the lens of the Industrial Revolution, even though that world-changing epoch is generally agreed to have begun some 70 years before Victoria came to the throne.

The concept of grouping periods of time into named ‘eras’ is of course imprecise, but it neatly summarises what humans were doing at the time. During the Bronze and Iron Ages, for example, we were busy transforming materials into tools to keep society moving forward. During the Industrial Revolution, people used those materials and tools to build machines that converted stored energy into power, driving the first factories.

Elizabeth II became Queen five years after the start of what is commonly called the Information Age. With the benefit of hindsight, we can now see that machines had begun to create data, which needed to be stored. The first data lakes were filling up.

Dawn of the digital age: the 1940s

The starting point can be traced to the invention of the transistor in 1947 at Bell Labs in the US, which became one of the key building blocks of modern electronics.


From then on, the rate of technological transformation never let up. Transistors enabled the development of room-sized mainframe computers in the mid-to-late 1950s, in the first years of a reign that had begun with Winston Churchill as Prime Minister. Around that time, silicon-based transistors were developed, offering better thermal tolerance and lower production costs.

Rapid technological development continued, leading to the first space missions. On 4 October 1957, the Soviet Union launched the first artificial satellite, Sputnik 1, an 83 kg metal sphere.

Moore’s Law, the observation that the number of transistors in a dense integrated circuit (IC) doubles about every two years, was first articulated by Gordon Moore in 1965 and has (nearly) held true ever since. During that decade, computers cost tens of thousands of dollars and could only be afforded by larger organisations. But the continuing democratisation of computing power naturally led to the concept of a global computer network, which would eventually become the internet we know today.
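To put that doubling in perspective, here is a minimal Python sketch (not from the original article) of the compounding growth Moore’s Law describes. The 1965 baseline of roughly 64 components per chip is an illustrative assumption, as is the idealised, uninterrupted two-year doubling:

# Illustrative sketch of Moore's Law: transistor counts doubling
# roughly every two years. The 1965 baseline of ~64 components is
# an assumption for illustration, not a figure from the article.

def projected_transistors(year, base_year=1965, base_count=64, period=2):
    """Project the per-chip transistor count under an idealised doubling."""
    doublings = (year - base_year) / period
    return base_count * 2 ** doublings

for year in (1965, 1975, 1995, 2022):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors per chip")

Even from such a tiny starting point, the idealised curve reaches roughly 2,000 transistors by 1975, around 2 million by 1995 and tens of billions by 2022, the same order of magnitude as today’s largest chips, which is why the ‘nearly’ above is doing only modest work.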

The genesis of the internet dates back to the early 1960s and MIT computer scientist J.C.R. Licklider, but it seems inconceivable that in those early years he and his colleagues could have imagined the impact their work would have on the world.


In 1969, man walked on the moon for the first time. This was a towering achievement in human space exploration, itself an enormously influential crucible of technological innovation in sectors beyond aerospace.

By the mid-1970s, the first personal computers (PCs) were being manufactured, transforming individuals’ ability to innovate and beginning the accumulation of the staggering volumes of data we have today. Computers became smaller and more powerful, continually entering new spaces and delivering new insights.


During the 1970s, many of the major global players in consumer computing, such as Microsoft and Apple, released their first products, but it was during the 1980s and 1990s, with home computing and then the wide adoption of the internet, that the concept of connectivity really took off. As computers (and then a wave of other devices) became ever more connected, their potential power multiplied.

Entering the age of the metaverse?

We are now standing at the edge of a new era, in which all the data created by connected devices is used far more efficiently and powerfully. This shift will accelerate the development of more collaborative, immersive and autonomous solutions as software powered by Artificial Intelligence and Machine Learning takes much of the decision-making burden out of our mobility, work, manufacturing and home lives.

The past 70 years have witnessed immense change, which makes it all the more incredible that, for many millions of people, it all happened under the reign of a single monarch.

How the innovators of the coming decades take up the baton for future generations remains to be seen, but there is no doubt that intelligent, immersive use of data will be a central part of tomorrow’s landscape.

Author

  • Richard Scott

    With more than a decade of experience editing B2B publications, Richard joined Hexagon in 2021 as Global Content Programmes Lead. Based in the UK, Richard has written for and edited a wide range of journals covering subject areas such as electrical engineering and the chemicals industry.
