Disney's AI Lawsuit Is Corporate Theater
Here's a thought that might sound radical in 2025: artists deserve to get paid when their work powers billion-dollar AI companies. Revolutionary, I know.
Disney's $300 million lawsuit against Midjourney reads like a masterclass in corporate theater. The entertainment giant, clutching its Mickey Mouse pearls, is suddenly appalled at AI's "bottomless pit of plagiarism"—apparently forgetting its own aggressive embrace of the very same technology. It's rather like Jeff Bezos complaining about monopolies: technically accurate, morally vacuous.
The House of Mouse's Memory Hole
Let's establish some inconvenient truths. Disney formed a dedicated AI task force in early 2023, then escalated to create an Office of Technology Enablement in November 2024—a 100-person unit explicitly designed to "coordinate the company's use of emerging technologies such as artificial intelligence." Bob Iger himself has been evangelizing AI as offering "some pretty interesting opportunities" and "substantial benefits," while admitting Disney is "already starting to use AI to create some efficiencies."
Disney Research is actively collaborating with Google DeepMind and Nvidia to develop Newton, an AI physics engine for robotics, while using AI for everything from bringing back young Luke Skywalker in The Mandalorian to making "fire and water come alive" in Pixar's Elemental. The company that built its empire on a hand-drawn cartoon mouse is now crying foul over digital reproduction.
Here's where the story turns truly nauseating. While Disney deploys battalions of lawyers against Midjourney, individual artists have been screaming into the AI void for years with barely a whisper of support from legacy media.
Visual artists like Karla Ortiz (a concept artist on Marvel's Doctor Strange), Sarah Andersen, and Kelly McKernan have been battling AI companies in landmark lawsuits since January 2023—often scraping together legal fees while their life's work gets processed through algorithmic meat grinders. The LAION dataset alone contained more than 5 billion scraped images, representing countless hours of human creativity reduced to training data.
These artists aren't seeking $300 million windfalls. They're fighting for basic recognition that their work has value beyond its utility as machine learning fodder. Their case is "one of the furthest along the road to a showdown that could shape the industry's future," yet it took Disney's corporate grievance to generate meaningful media attention.
Disney's lawsuit reveals the brutal mathematics of modern IP protection. Midjourney generated $300 million in revenue last year using what Disney characterizes as a "bootlegging business model"—but when individual artists make similar claims, they're told to embrace "innovation" and "disruption."
The difference? Disney can afford to spend millions on litigation. Most artists cannot. OpenAI, Meta, and other tech giants warn that paying copyright holders for content "could cripple the burgeoning U.S. AI industry"—a threat that rings hollow when applied to corporate titans but becomes existential when aimed at freelance illustrators.
Disney's legal filing describes Midjourney as facilitating "calculated and willful" copyright infringement—language that could easily apply to Disney's own AI initiatives. The company's "hundreds" of employees working on AI post-production and visual effects aren't creating ex nihilo; they're building on the same foundation of scraped internet content that powers every major AI system.
The cruel irony? The U.S. Copyright Office has been rejecting copyright applications for AI-generated art, ruling that creative works must have human authors—meaning the very content Disney uses AI to create may not even be protectable under the laws Disney now invokes.
Disney's lawsuit isn't wrong—it's just embarrassingly late and selectively applied. Every argument Disney makes against Midjourney could be weaponized against virtually every major AI company, including Disney itself. The entertainment giant is essentially arguing that AI training constitutes theft while simultaneously scaling its own AI operations.
We've created a bifurcated intellectual property regime where corporate scale determines enforcement reality. Artists with recognizable IP portfolios worth hundreds of millions get court injunctions. Artists whose work appears in training datasets get told that resistance is futile.
The bottom line: Disney's lawsuit represents corporate self-interest masquerading as principled copyright enforcement. If they truly believed AI training constituted wholesale theft, they'd be divesting from their own AI initiatives instead of expanding them. Instead, we get the spectacle of one AI adopter suing another while millions of individual creators remain collateral damage in the algorithmic gold rush.
Perhaps it's time we stopped pretending this is about protecting artists and started calling it what it is: a turf war between tech titans, with actual creators left picking up the pieces.
Ready to cut through the AI hype and focus on what actually drives growth? Contact our team to discover how authentic, human-centered marketing strategies can deliver results that last beyond the next algorithm update.