by Oliver Rochford, Senior Director of Content Marketing
Did you know that the earliest mention of the steam engine was in ancient Greece? Heron of Alexandria’s “aeolipile” was described as early as the 1st century AD, with possible references even a century earlier. My own hunch is that the basics of steam power have been rediscovered many, many times over in history. Anecdotes from across the late Middle Ages and Renaissance show that diverse thinkers saw the potential quite early, with experimental turbines and pistons used to power water wheels or even raise weights. Yet it took until the end of the 17th century (1698) and the determination of Thomas Savery to develop the first commercial steam-powered device, a water pump designed to keep mines dry.
1800 years is a long time to learn about something, yet not translate that knowledge into something useful. By all accounts, ancient humans were no less intelligent than modern ones, so smarts cannot have been the problem. People also must have had a need for steam power during that time – mines must have flooded, ships got stranded in the doldrums, and transporting anything of significant mass or volume over land was not just unaffordable, but practically impossible. The inhibitor was obviously not need or demand.
The simple explanation is that while the knowledge of steam power was available, most of the other knowledge and infrastructure required to practically implement and maintain it – what we would today call the supply chain – was not. You need good steel, the ability to cast it at scale, a steady supply of high-quality coal, and the list goes on. Yet when the time was right and all prerequisites were met, steam took off like, well, steam.
We experience this sort of thing in tech all the time of course. Voice and facial recognition are good examples of technologies that always seem to have been tried just too early to be effective, and many of the machine-learning algorithms everyone is marketing as AI have been around for decades as well. But constraints in processing power, the cost of storage, and the availability of data all conspired to produce what have been termed “AI winters”, phases where interest and investment in the topic cooled down due to overpromising and underdelivering – the dreaded “Trough of Disillusionment” in Gartner’s Hype Cycle parlance.
Cybersecurity is no exception in this regard. We have always had innovative technologies that seem to be just a little too early, or dependent on other technologies that are not quite there yet.
Early SIEMs struggled to scale due to underlying architectural limitations. Early attempts at automated response were stymied by a lack of sufficiently smart analytics. Analytics were limited by the scarcity of data. Data availability was hampered by cost and bandwidth constraints. The list of dependencies and technical prerequisites is considerable. We have had to fight on the frontline against adversaries that are nothing if not inventive, and so many of us have been open to experimentation.
But now an inflection point seems imminent. With improved cloud and data architectures we can do better analytics and machine learning, and we can do it bigger than before.
And with detection approaches such as XDR providing unprecedented visibility across the entire attack surface and feeding high-fidelity data into advanced analytics, we can get closer to realizing the dream of precise, targeted automation. And we can do this at massive enterprise and cloud scale.
That is what made me jump at the opportunity to join Securonix. To be involved in all of these trends and technologies coming together seems like having the rare privilege to be there at the right time, in the right place, when a paradigm shift occurs.
So I hope you are as excited as I am, and you’ll join me in exploring this new age of digital steam!