4 min read
Writing Team · Nov 5, 2025 7:59:59 AM
Adobe Max 2025 just wrapped, and the through-line is unmistakable: Adobe doesn't want to sell you tools anymore. It wants to be the creative team. The company unveiled AI assistants for Express and Photoshop that edit entire projects from text prompts, generative audio tools that add soundtracks and voiceovers to videos, and Project Moonlight—an AI agent that acts as a "centralized creative director" for social media campaigns. According to The Verge's coverage, Adobe is planning to eventually bring AI assistants to all of its design apps.
These aren't incremental feature updates. This is Adobe redefining what "creative software" means. The tools don't just assist anymore—they execute. You describe what you want, the AI builds it, and you approve or iterate. The creative professional is becoming a creative director. Or, depending on how you look at it, a quality control manager for AI output.
The question Adobe isn't asking: What happens to the humans who used to do this work?
The flagship announcement is Adobe's new AI Assistant in Adobe Express, launching in public beta. It's a conversational interface that replaces the usual tool menus with a chatbot-style text box. You describe what you need—"fall-themed wedding invitation" or "retro-inspired poster for school science fair"—and the AI generates options. You can then edit with natural language prompts: "make the text bolder," "change the background to sunset colors," "add a vintage filter."
Adobe describes it as empowering "people of every skill level" to create visual content "without having to understand specific design terms or creative tools." Translation: you don't need to know how to use Photoshop anymore. You just need to know what you want.
The live demo was telling. According to The Verge's reporting, the host asked the AI to "change this raccoon to be wearing a vampire Halloween costume with fangs, holding a pumpkin, and remove his sunglasses." The AI did all that—and also turned the photograph into a cartoon, which wasn't requested. The host laughed. The audience laughed. But the subtext is uncomfortable: the AI made a creative decision the user didn't ask for, and the workflow assumes you'll just roll with it or iterate until it's right.
This is the new creative process: describe, generate, evaluate, refine. Repeat until acceptable. Not "design," but directed generation.
Adobe's new generative audio tools—Generate Soundtrack and Generate Speech—are launching in the redesigned Firefly AI app. Generate Soundtrack analyzes uploaded video footage and creates instrumental audio clips that "automatically synchronize" to the footage. You can select style presets (lofi, hip-hop, classical, EDM) or describe the vibe in a text prompt: "more sentimental," "aggressive," "uplifting."
Generate Speech adds AI voiceovers. Upload a script, select a voice, and the tool generates narration that syncs to your video timeline. According to the announcement, the tool will "suggest a prompt based on the uploaded video footage that can be used as a starting point."
This eliminates two entire creative disciplines: composers and voice actors. Not because the AI is better—it's not. But because it's instant, cheap, and good enough for most use cases. If you're making a corporate training video or a YouTube explainer, you don't need a professional composer. You need 90 seconds of inoffensive background music that doesn't distract from the content. Adobe just made that free.
One of Adobe's experimental "sneaks"—Project Frame Forward—might be the most technically impressive and creatively terrifying tool announced at Max. It lets video editors make changes to the first frame of a video, and the AI automatically applies those changes across the entire footage. No masking. No keyframing. No frame-by-frame editing.
The demo showed an editor removing a person from the first frame of a video. Frame Forward identified, selected, and replaced her with a natural-looking background—then applied that removal across every frame automatically. According to The Verge, it works "similar to Photoshop tools like Content-Aware Fill or Remove Background," but for video.
This is the kind of tool that collapses hours of tedious rotoscoping work into a few clicks. Which is great if you're the editor under deadline. Less great if you're the freelancer whose business model is "I do the tedious work agencies don't have time for."
Adobe is solving for speed and cost, not creativity. And speed and cost are what most clients actually care about.
The most audacious announcement is Project Moonlight, an AI agent built on Firefly that acts as a "centralized creative director for social media campaigns." It integrates with Adobe's creative apps, pulls from your existing social media channels, and brainstorms content "consistent with your personal style and voice."
You describe your vision to the chatbot, and it processes those ideas through Adobe's AI editing tools to create "personalized images, videos, and social posts." The pitch is efficiency: one AI agent coordinating across multiple tools to produce an entire campaign. The reality is Adobe is building a creative department in software form.
This isn't a tool anymore. It's a colleague. Or a replacement, depending on where you sit in the org chart.
For SMBs and solo creators, this is transformative. You get creative director-level output without hiring a creative director. For agencies and in-house creative teams? This is compression. Why pay five people when one person with Project Moonlight can do 80% of the work?
Adobe's messaging at Max was relentlessly optimistic: these tools "empower creators," "democratize design," and "free up time for strategic work." What they didn't say is just as telling.
Adobe is selling this as democratization. But democratization of tools always means commoditization of labor. When everyone can do the work, the work becomes less valuable. That's not a bug. That's economics.
If you're a designer, video editor, or content creator, Adobe Max 2025 is your three-year warning. The tools you spent years mastering are being abstracted into conversational interfaces. The technical skills that differentiated you—layer masking, color grading, motion tracking—are being automated. Your competitive advantage can't be execution anymore. It has to be taste, strategy, and creative judgment the AI can't replicate.
And if you're a business leader evaluating creative resources, here's your reminder: the cost structure of creative work just changed. Adobe is betting you'll choose speed and cost over craft. And for most use cases, you probably will.
The companies that win won't be the ones with the best AI tools. They'll be the ones who figure out how to use AI for leverage without sacrificing the judgment that makes work actually good. Because "good enough" scales. But "great" still requires humans.
Want to build a creative strategy that uses AI without replacing judgment? Let's talk. Because tools like these are powerful—but they're only as good as the strategy behind them.