Behind the design: Adobe Stock Customize
How we solved a “last mile” challenge—helping buyers find the right asset and make it perfect—with generative AI
Finding the right stock asset is only the first step; in the past, making it perfect meant jumping between tools to rework an image. Recently, the Adobe Stock design team set out to simplify that process by tackling a “last mile” challenge: helping people find the right asset and make it perfect without ever leaving the platform.
The result is Customize, a new feature that pairs generative AI editing capabilities with what Stock customers actually need. It lets them quickly find assets and make precise AI edits, while ensuring contributors are compensated whenever their assets are licensed.
Group design manager Sunny Yang, who leads an eight-member team, shares a look at the considerations behind designing a generative feature that benefits both stock buyers and providers.
What was the primary design goal when you set out to design Customize?
Sunny Yang: Stock customers aren’t necessarily typical Adobe Firefly users who are willing to spend time perfecting prompts to generate the perfect image. They want to do a quick search, choose an asset that meets their visual and licensing needs, buy it, and go. One of the biggest challenges with an audience focused on a singular need and used to a specific way of working is that they don’t use new features unless they’re obvious and easy to access. We knew our designs would have to keep the Edit panel visible while also keeping it out of the way.
Stock customers also tend to have less complex editing needs, so we didn’t want to crowd the panel with unnecessary generative capability. We identified four key features—Apply Style, Apply Composition, Change Background, and Expand Image—that would cater to the essential editing needs of Stock users. Since we’d be asking them to make quick edits, our designs needed to make the process as simple as possible (preferably only requiring a single click).

From past research, we’d also noticed that Stock customers repeatedly use the same keywords, and that they often lose track of when and where they’ve seen an asset (whether on Stock, on a competitor's site, or in their own files). Drawing inspiration from other generative AI tools, we recognized the value of saving our customers’ generation and search histories to create a “timeline” that allows users to view and refine images searched within a specific session.

What user insights did you leverage to help inform the design solution?
Sunny: The new Customize feature was shaped by multiple user research studies conducted throughout 2024 by our talented researcher, Jenna Melnyk. She spent a year exploring customers’ perceptions of generative AI, understanding how they were using the technology, and gathering their input on the features we were considering.
In one iteration of our design, all search results would change with the Change Background feature. When Jenna’s research revealed that users found it unsettling when every image changed (and sometimes moved), we revised our timeline design to incorporate a selection mode. Changes are applied only to the selected assets; unselected assets remain static. Additionally, once edits are made, a new timeline block of edited images appears above the original, clearly indicating which images have changed.

What was the most unique aspect of the design process?
Sunny: Many design teams do vision work to guide the design process. Whether it’s per quarter or per year, a north star vision can be extremely useful. We worked on several different Stock and generative AI visions, and even launched one previously, but in the end we limited ourselves to technology that was mature enough for stable use.
We evaluated all the current generative AI technology available to us, considered what was possible and practical, and combined it with our knowledge of actual user needs and workflows. By merging all these elements, we developed a solution that was both achievable and useful to our customers, while preserving the livelihood of Stock contributors. It was incorporating those technical constraints as part of the design process that led us to the final version of Customize.
What was the biggest hurdle in designing Customize?
Sunny: Our biggest hurdle wasn’t design related. It was finding ways to use existing generative AI technology to support Stock's contributors and customers. Instead of designing for a future that we couldn’t implement immediately, we centered our vision work on existing technology that could improve user workflows now.
How did the solution improve the in-product experience?
Sunny: Our solution focused on the "last mile" problem: helping our customers find the right asset and make it perfect. Previously, buyers had to download multiple assets, preview them, and make edits in Adobe Photoshop. With Customize, they can find and refine the right asset without leaving the platform, and generative edits mean even non-professionals can turn a Stock asset into exactly the image they need. The process preserves the value of the original assets while delivering a polished final product, making the workflow faster and easier for our customers.

The basis of Customize is still search, but rather than spending a lot of time editing and curating prompts to get exactly what they want, buyers can search Stock, view results, and quickly select images that are “almost right.” It’s the perfect combination of professional, high-quality, curated images and video with a level of personalization that adds value to the stock images that are chosen—while also ensuring stock contributors are compensated for their expertise, skill, and quality.
What did you learn from this design process?
Sunny: Typically, design visions don't account for technical constraints until later stages, but recognizing technical limitations early in the design process proved crucial. The pressure to pivot while an entire industry was shifting made it essential to consider user needs and how we could best apply existing technology to support them.
Another significant learning was understanding how different groups of people are using generative technology. What works for our Adobe Firefly customers ended up not being useful to our Stock customers. In retrospect, as designers, we need to pause and ask how our designs and uses of technology can address the needs of our customers and make their workflows more efficient, rather than completely transforming them.
What’s next for Adobe Stock?
Sunny: As generative AI continues to improve and evolve, contributors to Adobe Stock will continue to play an important role in our business. Our focus will be on finding more ways to implement technology in our purchasing process to enhance buyer workflows, while also supporting the livelihoods of professional stock contributors.