Adobe—the company that built its empire on tools for human designers, photographers, and illustrators—just told Fortune that job candidates who submit AI-generated work samples tend to stand out in the hiring process. Stacy Martinet, Adobe's chief communications officer, framed it as a search for people who "combine creativity with technical skills," noting that "as AI reshapes how we communicate, market, and create, those who pair creative skills with AI fluency will have a competitive edge."
Let's pause on the historical irony here. Five years ago, the creative professional community viewed AI image generation as an existential threat. Designers, illustrators, and artists organized campaigns against AI training on copyrighted work, filed class action lawsuits against Stability AI and Midjourney, and publicly worried that generative models would commoditize skills they'd spent decades developing. Adobe itself faced internal and external pressure over its AI strategy, eventually implementing content credentials and model provenance systems to address creator concerns about training data and attribution.
Now, the same company is signaling that AI fluency is not just acceptable in creative hiring—it's preferred. Candidates who demonstrate they can use Firefly, DALL-E, Midjourney, or other generative tools to produce compelling work samples have an advantage over those who rely solely on traditional skills. The creative industry isn't resisting automation anymore. It's selecting for it.
This shift is happening faster and more thoroughly than almost anyone predicted. And it raises uncomfortable questions about what "creativity" means, what professional expertise looks like, and whether the skills that defined creative work for generations still matter in a world where AI can generate publication-ready designs in seconds.
Adobe's position, as articulated by Martinet, rests on a familiar framework: AI is a tool that amplifies human creativity rather than replacing it. The company is looking for candidates who can pair creative judgment with technical fluency—people who know what to create and can use AI to execute efficiently.
This is a defensible position. Adobe's own tools have always been about augmenting capability. Photoshop didn't eliminate photography—it expanded what photographers could achieve. Illustrator didn't replace illustration—it changed how illustrations were produced. By this logic, Firefly and AI-powered features in Creative Cloud are just the latest evolution in a long history of tools that make creative work more accessible and efficient.
The argument goes: a designer with AI fluency can explore more iterations, test more concepts, and deliver client work faster than one working purely manually. In competitive client services, that efficiency translates to business value. Agencies pitching against each other need teams that can turn around mood boards, mockups, and finished designs at the speed clients now expect—which is increasingly "same day" rather than "next week."
From Adobe's hiring perspective, candidates who demonstrate AI proficiency signal several valuable traits: adaptability to new tools, comfort iterating quickly, and the ability to deliver at the pace clients now expect.
These are legitimate professional qualities. If Adobe's business strategy involves integrating AI throughout Creative Cloud—which it clearly does, given Firefly's rapid feature expansion—then hiring people who already understand and use these tools makes operational sense.
But this pragmatic argument sidesteps deeper questions about what's being lost in the transition.
The concern isn't that AI tools exist—it's what happens when AI fluency becomes more valued than traditional creative fundamentals. If hiring managers prioritize candidates who can prompt Firefly effectively over those who can sketch, understand color theory, or execute complex illustration techniques manually, the incentive structure shifts dramatically.
Why spend thousands of hours developing drawing skills if employers prefer someone who can describe what they want to an AI model? Why master typography if generative tools can produce "good enough" layouts instantly? Why study composition, lighting, and visual storytelling if clients just want output that matches a reference image?
This isn't hypothetical anxiety—it's already happening in adjacent fields. Consultants who leaned heavily on AI tools showed productivity gains but, over time, a declining ability to perform the same tasks without AI assistance. The tools created dependency, not augmentation. Skills atrophied because the incentive was to use AI for everything rather than to develop deeper expertise.
The risk for creative professions is similar. If entry-level designers learn to rely on AI from day one, they may never develop the foundational skills that enable sophisticated creative judgment later in their careers. They'll know what they want a design to look like, but not why certain choices work and others don't—because they never did the manual iteration that builds tacit understanding of visual principles.
Experienced designers have decades of pattern recognition developed through hands-on work. They can look at a layout and immediately identify balance issues, hierarchy problems, or typeface mismatches. That judgment doesn't come from theoretical knowledge—it comes from making thousands of mistakes and learning what works through deliberate practice.
If the path to employment now runs through AI-generated portfolios rather than traditional skill development, we're selecting for a different capability profile: prompt engineering and curation over execution and craft.
Adobe's endorsement of AI-generated work samples introduces a thorny attribution question: when a candidate submits AI-generated designs, what are they actually demonstrating?
If someone uses Firefly to generate ten variations of a poster design, then selects and refines the best one, their contribution is primarily curatorial and editorial. That's valuable—taste matters, and knowing what works is a genuine skill. But it's fundamentally different from the skill of making a poster from scratch using traditional tools.
The problem is that portfolios have historically served as proof of technical capability and creative execution. When you look at a designer's portfolio full of hand-crafted work, you know they can execute at that level. When you look at a portfolio of AI-generated work, you know they can prompt and curate—but not necessarily whether they could produce that quality manually if AI weren't available.
This creates asymmetric information in hiring. Employers can't easily distinguish between candidates who use AI to amplify strong foundational skills and those who use AI to compensate for lack of foundational skills. Both might submit similarly impressive portfolios, but their actual capabilities differ dramatically.
Adobe's position implies that this distinction doesn't matter—that what employers should care about is output quality and AI fluency, not whether candidates can draw. But that assumes AI tools will always be available, always function as expected, and never run into cases where manual skills become necessary.
What makes Adobe's stance particularly significant is the company's market position. Adobe isn't a scrappy startup disrupting the creative industry from outside—it's the incumbent platform that defines professional creative workflows. When Adobe signals that AI fluency is advantageous in hiring, the entire creative ecosystem adjusts.
Design schools will shift curricula to emphasize AI tools over traditional techniques. Bootcamps will teach prompt engineering alongside composition principles. Junior designers will prioritize learning Firefly over mastering Illustrator's pen tool. Portfolios will increasingly feature AI-generated work because that's what Adobe—and by extension, the industry—is selecting for.
This is market adaptation in real time. And it's happening with remarkable speed considering how recently the creative community was organized in opposition to generative AI.
Workers in arts, design, and media occupations report some of the highest workplace exposure to AI of any occupational group: 62% say AI is used in their workplaces, up from 31% in 2023. The adoption curve is steep, and Adobe's hiring practices both reflect and accelerate that trend.
But "adaptation" and "capitulation" can look similar from the outside. Is the creative industry evolving to integrate powerful new tools, or is it surrendering core professional identity to economic pressure? The answer probably depends on whether traditional skills remain valued alongside AI fluency, or whether they're systematically devalued as less efficient than AI-first workflows.
There's a historical parallel worth examining: what happened to photography when digital tools became dominant.
Thirty years ago, professional photography required mastery of film chemistry, darkroom techniques, exposure calculation, and physical craft. The barrier to entry was high—equipment was expensive, mistakes were costly, and developing expertise took years. That scarcity created professional value.
Digital photography collapsed those barriers. Suddenly, anyone with a DSLR could take thousands of photos at no marginal cost, review results instantly, and edit in Lightroom rather than spending hours in a darkroom. The technical skills that defined professional photography became less relevant. What mattered was composition, lighting, subject interaction, and post-processing—skills that were always important but previously gated behind technical execution barriers.
The result? Professional photography commoditized. The number of working photographers increased dramatically, but average earnings declined. According to Bureau of Labor Statistics data, median photographer income adjusted for inflation has been essentially flat or declining for two decades despite massive growth in image consumption.
The profession didn't disappear—it just became much harder to make a living doing work that clients could increasingly approximate themselves or source cheaply from massive stock libraries and amateur photographers.
Generative AI is doing to design and illustration what digital photography did to film photography: dramatically lowering execution barriers while shifting professional value toward judgment, taste, and strategic creative direction. That's not necessarily bad for everyone—clients get cheaper access to good-enough creative work, and the most talented creatives can focus on higher-value strategic work rather than execution. But it's clearly bad for the middle tier of professionals whose value proposition was competent execution of standard design work.
Adobe's hiring preference for AI fluency accelerates this transition. It signals that even at Adobe—the company that built the tools traditional designers use—execution skill is less valuable than strategic creative judgment plus AI proficiency.
There's a case that the creative community's anxiety about AI is overblown, and Adobe's position is simply realistic about where the industry is heading.
Argument 1: Tools have always changed creative work. Brushes replaced fingers. Printing presses replaced hand-copying. Photoshop replaced darkrooms. AI is just the latest iteration. Professionals adapt or get left behind—that's not new.
Argument 2: Craft for craft's sake is gatekeeping. Insisting that designers must master manual illustration before using AI tools is like insisting pianists must build their own pianos. What matters is the final output and whether it serves client needs.
Argument 3: AI democratizes creativity. Lowering barriers to entry means more people can participate in creative work, more diverse voices get heard, and economic opportunity expands beyond those who could afford expensive education and equipment.
Argument 4: High-end work remains human. Clients who want truly original, strategically sophisticated creative direction will still hire senior designers with deep expertise. AI commoditizes mediocre work, not excellence.
These arguments have merit. The creative industry has survived every previous tool transition, and there's no reason to think this one is categorically different. The designers who thrive will be those who use AI to amplify their strategic and conceptual capabilities rather than viewing it as a replacement for skill development.
But there's a difference between "the industry will survive" and "individual practitioners will thrive." The transition may be fine for Adobe, for clients getting cheaper creative work, and for the top 10% of designers who can position themselves as strategic directors. It may be much worse for the middle 60% of professionals whose primary value was competent execution—work that AI can now approximate at near-zero marginal cost.
Adobe's endorsement of AI-generated portfolios clarifies the company's strategic priorities: efficiency and tool adoption over craft preservation. That's a rational business decision. Adobe makes money when more people use Creative Cloud, and AI features make the tools more accessible to non-experts. Expanding the addressable market is good for revenue, even if it commoditizes professional expertise.
But it also reveals a broader industry shift. The creative professions that once defined themselves through mastery of craft are redefining around mastery of tools and strategic judgment. Execution is becoming commodified; strategy and taste are becoming premium.
This mirrors what happened in other knowledge work domains. Accounting software didn't eliminate accountants—it eliminated bookkeepers and pushed accountants upmarket toward strategic tax planning and CFO services. Legal software didn't eliminate lawyers—it eliminated paralegals and pushed lawyers toward complex litigation and strategic counsel.
AI is doing the same to creative work: pushing professionals upmarket toward strategic creative direction, client relationships, and high-touch customization, while commoditizing execution-focused roles.
That's not inherently bad—it's structural economic change that creates winners and losers. The winners are clients who get better creative work for less money, platforms like Adobe that sell tools to more users, and senior creatives who can leverage AI to deliver more strategic value. The losers are mid-level professionals whose primary skill was competent execution, and junior practitioners who can't develop foundational skills because entry-level work gets automated.
For creative professionals navigating this transition, Adobe's hiring preferences offer a clear signal: hybrid skills are now table stakes. You need traditional creative judgment and AI fluency, not one or the other.
That means keeping the foundational skills that inform taste and judgment, learning the generative tools well enough to direct them, and demonstrating both in your portfolio.
The professionals who thrive will be those who treat AI as amplification for deep expertise, not as replacement for skill development. But that requires deliberate effort to maintain traditional capabilities in an environment that increasingly rewards AI-first workflows.
Adobe's position may feel like validation if you're already AI-proficient. It may feel like betrayal if you spent decades mastering traditional tools and techniques. Either way, it's a market signal worth heeding: the creative industry is selecting for AI fluency, and traditional craft alone is no longer sufficient for competitive advantage.
Whether that's progress or decline depends largely on where you sit in the professional hierarchy and whether you believe creative work is fundamentally about strategic judgment or skilled execution. Adobe has made its position clear: judgment matters, execution is increasingly automated, and the future belongs to those who can do both.
The irony is that the company that built its fortune empowering human creativity through tools is now accelerating the transition toward AI-augmented creativity, where the "human" part is optional for large categories of work. That's market evolution, not malice. But it's worth acknowledging what's being lost in the transition: the craft skills that defined creative professions for generations are becoming optional nice-to-haves rather than core professional requirements.
We're watching an industry rewrite its own rules in real time. Adobe is just reflecting and accelerating a change that was already underway. The question now is whether the creative professionals who built their careers on mastery of craft can adapt fast enough—or whether they'll be left behind by a generation that learned to curate AI outputs instead.
If you're navigating the integration of AI into creative workflows and need strategic guidance on building teams that balance AI fluency with foundational expertise, we're here. Let's talk about the skills that still matter.