We're Building AI Tools to Fix AI Tools That Were Supposed to Fix Everything Else
Remember when AI was supposed to simplify our lives? Those halcyon days of 2023 when we thought ChatGPT would just handle our emails and call it a...
4 min read
Writing Team
Jun 4, 2025 8:00:00 AM
Remember when tech conferences actually discussed technology instead of just breathlessly evangelizing about artificial general intelligence? SXSW London's inaugural week reminded us how far we've fallen down the AI hype rabbit hole, where even a music and arts festival can't resist turning into another ChatGPT worship session.
The scene in Shoreditch was peak 2025 tech conference absurdity: murals plastering every surface, $24 branded socks in the merchandise tent, and Google DeepMind's Demis Hassabis delivering the same AGI sermon we've been hearing since 2022. Meanwhile, attendees queued for half an hour just to watch him repeat that artificial general intelligence will be "bigger than the Industrial Revolution"—a prediction that sounds increasingly hollow as we enter year three of ChatGPT's plateau.
Here's what nobody talks about: we're drowning in AI conferences. A quick survey reveals dozens of AI-focused events scheduled for 2025 alone—SuperAI in Singapore ($299-$2999), The AI Conference in San Francisco, AI Everything Summit in Dubai, and countless others promising to reveal "the future of artificial intelligence" while recycling the same speakers, same talking points, and same breathless predictions.
Generative AI was a relatively small field just last year and is now projected to become a $1.3 trillion industry within ten years, organizers tell us, as if market projections automatically validate the technology's transformative potential. But projections aren't performance, and conference attendance isn't adoption.
The real question nobody's asking: when did "innovation" become synonymous with listening to the same handful of AI executives deliver variations of the same presentation? Thomas Wolf discussing open-source AI models, Hassabis warning about responsible development, startup founders promising their latest LLM will change everything—it's Groundhog Day, but with more expensive merchandise.
Hassabis's prediction that AGI would be "bigger than the Industrial Revolution" perfectly encapsulates the conference circuit's fundamental delusion. Gary Marcus, one of AI's most prominent skeptics, argues the opposite: generative AI, the predictive technology that churns out seemingly human-level content, is simply too flawed to be transformative.
Yet conference organizers continue booking speakers who treat AGI as inevitable rather than speculative. Not all offerings will deliver meaningful impact; we anticipate that maybe 5% to 10% of the solutions will have real, measurable value, notes a recent healthcare industry analysis about AI solutions, but you wouldn't know that from conference programming that treats every incremental improvement as revolutionary.
The cognitive dissonance is staggering. We're supposed to believe that technology struggling with basic arithmetic and factual accuracy will somehow achieve superintelligence within years. Critics often highlight AI's surprising errors in seemingly simple tasks, using these failures to question its reliability for critical applications, yet conferences persist in promoting AGI timelines that make venture capitalists salivate and engineers cringe.
What's particularly infuriating about events like SXSW London is how they've transformed substantive technology discussions into marketing theater. The format itself—keynotes, panel discussions, networking sessions—prioritizes soundbites over substance, promotional promises over practical implementation.
There is another school of belief here — AI is fake and sucks — and it goes something like this. Large language models built with transformers are not technically capable of creating superintelligence, because they are predictive in nature and do not understand concepts in the way that human beings do, writes tech journalist Casey Newton, summarizing the skeptical perspective that conference organizers studiously ignore.
The disconnect between conference promises and workplace reality is profound. Forty-six percent of leaders identify skill gaps in their workforces as a significant barrier to AI adoption, and employees in the public sector, aerospace and defense, and the semiconductor industry remain largely skeptical about AI's future. Yet conferences continue promoting AI as universally transformative while ignoring implementation challenges and adoption barriers.
The most telling detail about SXSW London wasn't Hassabis's AGI predictions—it was the branded merchandise pricing. Twenty-four dollars for socks emblazoned with conference logos captures perfectly how AI conferences have become premium-priced lifestyle brands rather than technical education events.
There's a worryingly large amount of reporters who write with the immediate acceptance that A.I. will be artificial general intelligence, or A.I. will be good, or that this stuff is already proven and already powerful, because there's so much money behind it, observes AI critic Ed Zitron, highlighting how financial investment drives narrative rather than evidence.
This creates a feedback loop where conferences validate AI hype, media coverage amplifies conference messaging, and venture funding flows toward increasingly speculative claims. Meanwhile, practical applications remain limited to coding assistance and content generation—useful but hardly revolutionary.
The tragedy of AI-saturated conferences isn't that they're overselling technology—it's that they're crowding out more substantive discussions about what AI can realistically accomplish and how to deploy it responsibly. Bender and Hanna show you how to spot AI hype, how to deconstruct it, and how to expose the power grabs it aims to hide, write researchers challenging the AI narrative, but their voices rarely make conference keynote slots.
What would genuinely useful AI conferences look like? They'd focus on specific implementation challenges, discuss failure modes openly, present realistic timelines for capabilities development, and prioritize practical deployment over speculative futures. They'd feature more practitioners and fewer executives, more debugging sessions and fewer vision presentations.
Instead, we get SXSW London: AI everywhere, substance nowhere, and $24 socks that somehow symbolize an entire industry's misplaced priorities.
Conference saturation reveals something important about the current AI moment: when everyone's talking about transformation, nobody's discussing implementation. Smart marketing leaders should be asking harder questions about what AI can actually deliver rather than getting swept up in conference excitement.
The proliferation of AI events creates an illusion of progress while obscuring fundamental limitations. The optimistic case holds that performance metrics show steady improvement in reliability, and that the technology is proving useful even in high-stakes domains like healthcare, so imperfect reliability isn't an insurmountable barrier to transformative impact. But "useful" isn't "revolutionary," and "improving" isn't "ready for universal adoption."
The real insight from SXSW London wasn't Hassabis's AGI timeline—it was how easily an arts and music festival transformed into another AI marketing opportunity. When everything becomes about artificial intelligence, maybe the problem isn't the technology—it's our inability to think critically about what we're actually building and why.
Stop chasing conference hype and start building realistic AI strategies. Contact Winsome Marketing's growth experts to separate AI substance from Silicon Valley spectacle—because your growth strategy deserves better than $24 socks and empty AGI promises.