If history indeed repeats itself first as tragedy and then as farce, then the current frenzy around Artificial Intelligence might well represent technology’s encore performance of cloud computing’s initial spectacle—this time, complete with fireworks, fanfare, and an equally baffling lack of audience comprehension.
Some years ago, in the decidedly less glamorous phase of my professional life (a period defined by ill-fitting suits and an unfortunate preference for instant coffee), I found myself at a conference on the very cutting edge of technological innovation—cloud computing. It was, at least according to the poster in the hotel lobby, “The Future—Today!” My employer had seen fit to send me, presumably under the misapprehension that proximity to brilliance might induce it by osmosis. (If that were true, my seat placement next to the exit sign was an ill omen indeed.)
Presenters, resplendent in their youthful enthusiasm and mastery of arcane acronyms, spoke fervently about scalability, elasticity, and disruptive technologies, phrases tossed about with the reckless abandon of confectionery hurled from carnival floats. Yet, despite their vigour, or perhaps because of it, the substance of their discourse evaporated swiftly, leaving behind only the sticky residue of confusion. I found myself diligently scribbling notes that looked suspiciously like the ramblings of a philosopher whose flask was empty—distributed infrastructure (?), virtualisation (???), and the eternally baffling Platform-as-a-Whatnow.
At coffee breaks, desperate attempts at casual networking ensued, conversations that invariably dissolved into nervous laughter as we collectively realised none of us had any real idea of what we were discussing. “Is your company already cloud-native?” someone asked earnestly, as though the question itself wasn’t proof of our collective ignorance. I nodded sagely, hoping my expression conveyed depth rather than the truth—that my mental image of a cloud was stubbornly confined to cumulonimbus.
Now, years later, the buzzword carousel spins again, with AI taking its turn in the spotlight, albeit with marginally less fog and slightly more dread. We are daily inundated with AI’s boundless promise—its ability to write essays, pilot aeroplanes, diagnose illness, perhaps even compose poetry. (Frankly, I’m still waiting for the day it can successfully order a takeaway without adding unwanted mushrooms to my pizza.) Yet, beneath this parade of marvels, one cannot help but recall the cloud’s earlier extravagant entrance: grand pronouncements of revolutionary change, closely followed by a discreet shuffling of feet when reality proved slightly more pedestrian.
My scepticism, it must be said, stems not from any profound technological insight—of which I have demonstrably little—but rather from a long-standing, uneasy acquaintance with the human capacity for enthusiastic misunderstanding. AI, like cloud computing before it, promises liberation from the mundane. It offers us a digital valet—impeccably mannered, tirelessly efficient, and devoid of any inclination toward unionisation.
Yet, is this not the same siren song we heard before? Cloud computing was to transform our digital lives utterly, making storage devices as obsolete as my 1970s flared trousers (although the trousers, to my embarrassment, enjoyed a brief but ill-advised resurgence). Today, however, we still find ourselves grappling with temperamental servers, inexplicable software updates, and customer support calls whose scripts seem borrowed from Beckett.
Furthermore, there remains the uneasy suspicion that beneath the sleek exterior of the latest AI marvels lies nothing more extraordinary than elaborate guesswork dressed up in algorithmic finery. My interactions with AI chatbots, which regularly misunderstand simple requests while confidently assuring me of their mastery, do little to dispel this suspicion. There’s something oddly comforting in knowing the robot overlords of our imagined future might be just as clueless as the average human.
It seems to me, therefore—although given my history, my foresight should be taken with generous quantities of salt—that AI might represent cloud computing’s slightly more ambitious, flamboyant sibling, eager to leap from grand promise directly into the chasm of inevitable disappointment. Yet, even as I express these doubts, I must admit to a sneaking admiration for humanity’s relentless optimism, our incurable susceptibility to novelty, our boundless enthusiasm for believing this time it will be different. And perhaps it will.
Or so I thought, as I confidently attempted to conclude this reflection, only for my computer to inform me politely—too politely—that my AI-assisted grammar checker had “improved” my prose into near incomprehensibility. It was, as usual, an inauspicious sign.
Andrew McLean is the Studio Director at Disruptive Live, a Compare the Cloud brand. He is an experienced leader in the technology industry, with a background in delivering innovative and engaging live events. Andrew has a wealth of experience in producing compelling content, from live shows and webinars to roundtables and panel discussions. He has a passion for helping businesses understand the latest trends and technologies, and how they can be applied to drive growth and innovation.