INTRODUCTION
The business team at Adobe Stock, an e-commerce marketplace for stock photos and other digital assets, wanted to introduce a new purchasing option to an existing business model. Expectations for the project were:
Minimize expected negative impact to recurring revenue
Market a new, more flexible purchasing mechanism
Build trust with customers who, based on industry norms, were suspicious of business motives
BEFORE & AFTER REDESIGN
TOP-LINE RESULTS
Additionally, this was a successful introduction of the new purchasing mechanism to target customers (without confusing non-target customers), and helped our business leadership build empathy for our customers.
PROJECT OVERVIEW
PROCESS
In summary, the process for this project consisted of:
An initial competitive landscape
Wireframes and user testing
Iterative prototyping based on parallel user testing
Building a case with our business team through ongoing meetings and share-outs
Implementation and QA
TIMELINE
The total length of this project was approximately 8 months, which met Adobe’s internal expectations:
2 months initial research
3 months case-building with the business team (protracted due to sensitivity — more on this later)
3 months implementation
In another organization, I believe this timeline could have been shorter had we been working within a more robust data tracking platform.
CONSTRAINTS
Our system was not mature enough to test iterative rollouts effectively or to target particular customers; as a result, our stakeholders found themselves relying on other information, like the qualitative user research I conducted. While a healthy appetite for user testing is something to be celebrated, in a perfect world we would have been equipped to move with more agility and precision.
Evidence of constraints
As a small personal anecdote on this point, our business leadership almost pulled the redesign just 3 days after going live: our data tracking was not instrumented correctly, resulting in incorrect reporting. Luckily, our analytics team sorted this out, and we were able to see the amazing results!
DISCOVER
COMPETITIVE LANDSCAPE
Learning #1
Don’t chase a local maximum
I captured this quote from a user testing participant who was reviewing a competitor, iStock.com.
Originally, our stakeholders believed this pricing page to be an industry-leader in terms of design and performance; my early research insights debunked that hypothesis and exposed the crushing cognitive load that such a design placed on users’ decision-making processes.
We shifted from trying to “copy-cat” a competitor to charting our own design and financial destiny.
Learning #2
Own the complexity of the pricing structure
Stakeholders initially thought we were taking steps to be more in line with the competitors shown at the top; however, the visualization of our structure shed light on the detailed information we were asking customers to understand.
DEFINE
WIREFRAMES & USER TESTING
LEARNING
The wireframes demonstrated to business stakeholders that, while customers were capable of understanding the complex system when the design was clear, there were negative perceptions of the business model itself due to its complexity.
CONTINUED TESTING
Multiple rounds of wireframes resulted in a strong signal of similar feedback: design clarity could help people understand a complex business model, but it couldn’t strip away transparency without stripping away trust.
STRATEGY
MORE ITERATIVE DESIGN
FINAL DESIGN
RESULTS
Along with a significant increase in quarterly gross ARR, the design accomplished the following:
UX - BUSINESS IMPACT BREAKDOWN
Details of which UX choices influenced these strong business outcomes are below.