Adobe is launching its Firefly AI Assistant in public beta, an assistant the company says can use multiple Adobe Creative Cloud applications to complete tasks. The announcement builds on an earlier preview shown last October under Adobe's "Project Moonlight" moniker, and it positions Firefly as an agentic layer that can move between tools like Photoshop, Premiere, Lightroom, Express, and Illustrator.
For creators and developers watching the shift toward AI agents, the key technical story is not just that Firefly can generate images or video. Adobe is describing an assistant that can orchestrate actions across apps, offer interactive controls, and package multi-step capabilities into reusable “skills.” That combination—cross-application task execution plus user-controllable prompts and UI elements—suggests a specific approach to how agentic systems could fit inside established creative software workflows.
From Project Moonlight to Firefly AI Assistant
According to TechCrunch, Adobe previewed the assistant last October under the "Project Moonlight" name. That preview described an assistant that could complete tasks by tapping into different Adobe apps such as Acrobat, Photoshop, and Express. Adobe is now launching the same concept as Firefly AI Assistant.
Adobe says the assistant will be available in a public beta in the coming weeks. The company did not specify whether the assistant will be priced differently from Firefly's credit-based subscription tiers, a detail that matters because agentic features typically consume more compute per task than single-shot generation and can change how usage is metered.
Technically, this is an evolution of the “describe what you want” interaction model already familiar in creative AI tools: TechCrunch reports that the Firefly AI Assistant works like other creative tools by letting users describe what they want it to create, and then “it will handle the rest.” But Adobe’s differentiator, as described in the report, is that the assistant can operate across multiple Creative Cloud apps rather than staying within a single generation interface.
Cross-app orchestration and interactive controls
Adobe says Firefly AI Assistant can work across Firefly, Photoshop, Premiere, Lightroom, Express, Illustrator, and its other apps. In practice, that implies the assistant is designed to translate a user's intent into a sequence of actions that may span different tools, an approach aligned with "agentic workflows," where an AI system coordinates steps rather than producing a single output.
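The report stops at this behavioral description, so any implementation detail is speculation. Still, a minimal sketch can make the orchestration pattern concrete. The Python below imagines a hypothetical planner that maps a user goal to an ordered list of actions, each tagged with the app that would execute it; every name here (Action, plan_workflow, the operation strings) is illustrative, not an Adobe API.

```python
from dataclasses import dataclass, field

@dataclass
class Action:
    """One step in a cross-app workflow: the app that runs it and what it does."""
    app: str
    operation: str
    params: dict = field(default_factory=dict)

def plan_workflow(goal: str) -> list[Action]:
    """Hypothetical planner: translate a user's stated goal into an ordered,
    cross-app sequence of actions. A real agent would derive this plan from
    a model; it is hard-coded here purely for illustration."""
    if "product photo" in goal:
        return [
            Action("Lightroom", "auto_tone", {"strength": 0.6}),
            Action("Photoshop", "remove_background"),
            Action("Express", "resize", {"preset": "instagram_post"}),
        ]
    return []

# Each action carries enough context for a user to inspect or veto it
for step in plan_workflow("clean up this product photo for Instagram"):
    print(f"{step.app}: {step.operation} {step.params}")
```

The open engineering question, which the report does not address, is how such a plan gets validated and surfaced to the user before anything executes.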
TechCrunch also reports that users can control outputs with text prompts as well as buttons and sliders. The assistant can suggest actions, orchestrate between actions and apps, and execute workflows, while leaving "room for users to interject at any time."
One example from Adobe illustrates how the assistant may adapt its UI controls to the project context: if a user is editing a product photo set in a forest, the assistant might provide a simple slider to “increase or reduce the amount of trees and foliage.” Adobe says the assistant will “learn more about your creative preferences over time” and suggest actions accordingly.
From a product-engineering standpoint, this combination of free-form prompts and structured controls points to a hybrid interaction model. The assistant can propose and run steps, but the user can steer via explicit controls. That design could reduce the friction often associated with fully autonomous agents, which typically give users little visibility into what the system is doing or how to correct it.
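To make the hybrid model concrete, here is a hedged sketch, again with invented names rather than anything Adobe has published: a proposed step carries both a free-form description and a typed control, so the user can adjust a parameter such as foliage density before the step runs.

```python
from dataclasses import dataclass

@dataclass
class SliderControl:
    """A structured control the assistant can surface next to a proposed step."""
    label: str
    minimum: float
    maximum: float
    value: float

    def set(self, value: float) -> None:
        # Clamp user input so a correction can never push the step out of range
        self.value = max(self.minimum, min(self.maximum, value))

@dataclass
class ProposedStep:
    description: str        # free-form, prompt-derived intent
    control: SliderControl  # structured knob the user can steer

step = ProposedStep(
    description="Adjust the amount of trees and foliage in the background",
    control=SliderControl("Foliage density", 0.0, 1.0, 0.5),
)

# The user interjects before execution, steering instead of re-prompting
step.control.set(0.8)
print(step.description, "->", step.control.value)
```

Exposing a clamped, typed control instead of requiring a fresh prompt is one plausible way to implement the "interject at any time" behavior the report describes.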
Skills: packaging multi-step workflows
Beyond app-to-app orchestration, Adobe is also releasing skills for the assistant. TechCrunch describes skills as “multiple steps” bundled into capabilities. The report gives a concrete example: the “social media assets” skill can adapt images to different platforms by cropping or expanding, optimizing file sizes, and storing the outputs.
This matters because it frames how agentic systems could scale inside creative suites: rather than requiring users to repeatedly prompt for the same workflow logic, skills can standardize common multi-step tasks. Even though the report does not detail the internal implementation of skills, the functional description suggests a move toward reusable workflow units that the assistant can invoke when a user’s goal matches a known pattern.
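The "social media assets" example maps naturally onto that idea. Here is a minimal sketch, assuming a skill is simply an ordered pipeline of steps over a shared context; the function names are hypothetical stand-ins, not Adobe's implementation.

```python
from typing import Callable

# Each step transforms a shared context dict; all names are illustrative only
def crop_or_expand(ctx: dict) -> dict:
    ctx["dimensions"] = ctx["platform_specs"][ctx["platform"]]
    return ctx

def optimize_file_size(ctx: dict) -> dict:
    ctx["max_bytes"] = 500_000  # assumed budget, not an Adobe default
    return ctx

def store_output(ctx: dict) -> dict:
    ctx["saved_to"] = f"assets/{ctx['platform']}.jpg"
    return ctx

# A "skill" bundles the steps into one reusable, invokable unit
social_media_assets: list[Callable[[dict], dict]] = [
    crop_or_expand, optimize_file_size, store_output,
]

def run_skill(steps, ctx: dict) -> dict:
    for step in steps:
        ctx = step(ctx)  # the assistant could pause here for user input
    return ctx

result = run_skill(social_media_assets, {
    "platform": "instagram",
    "platform_specs": {"instagram": (1080, 1080)},
})
print(result)
```

Structuring a skill as a plain sequence also leaves a natural seam between steps where the assistant could pause for user input, consistent with the interjection behavior described above.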
TechCrunch also notes that Adobe is exploring improvements to how its assistants work with third-party large language models, stating that the company is “exploring having these assistants work better” with such models. Adobe has been working steadily to launch AI-powered assistants for Photoshop, Express, and Acrobat, and this exploration implies an effort to broaden the assistant’s underlying language understanding without replacing the creative tool surface area.
Competitive pressure and expanding Firefly capabilities
Adobe is not operating in a vacuum. TechCrunch points out that competitors like Canva and Figma are also working on agentic workflows. Adobe’s response, according to the report, is that its strength is in unifying its existing and popular tools. That claim is consistent with the Firefly AI Assistant’s described ability to coordinate across multiple Creative Cloud apps.
In addition to the assistant itself, Adobe is adding new features to Firefly. TechCrunch reports that the AI video editor is gaining options to reduce noise in speech and adjust reverb and music, along with a color adjustment tool. The editor also integrates with Adobe's stock library.
Adobe is also expanding Firefly’s library of third-party AI models by adding Kling 3.0 and Kling 3.0 Omni. While the report does not explain how these models are used inside the assistant, the inclusion indicates that Firefly’s ecosystem is being widened at the model layer, not only at the workflow layer.
Looking ahead, observers may watch for how Adobe operationalizes its cross-app agentic approach in beta. The report does not specify technical constraints, supported task types, or how pricing will work, but it does establish the product direction: an assistant that can orchestrate between actions and apps, provide interactive controls, and bundle multi-step processes into skills. If that approach performs well in real workflows, it could influence how creative platforms think about the balance between AI autonomy and user steering—especially as other tools pursue similar “agentic” capabilities.
Why this matters for creative software
Adobe’s Firefly AI Assistant is positioned as a bridge between AI generation and established creative software operations. By describing an assistant that can move across Photoshop, Premiere, Lightroom, Express, and Illustrator, Adobe is effectively treating the creative suite as an environment the AI can work inside, not just a set of separate tools.
Even with the details Adobe has shared, the implications remain largely conditional: the report suggests a framework for future assistants, one where tasks are executed as workflows, users can intervene at any time, and the system adapts its controls to the project. For tech enthusiasts and product teams tracking agentic systems, the beta will be a practical test of whether cross-app orchestration and skill-based workflows translate into usable, controllable creative productivity.
Source: TechCrunch