Mar 15, 2022
Dual Agile & A/B Test
Building the PDCA Cycle
In practice, websites often accumulate multiple issues that seem to call for a major redesign. However, it is difficult to validate every hypothesis and address all of those issues through A/B testing at once. Even when following the PDCA (Plan-Do-Check-Act) cycle, limited user data restricts how fully we can answer the complex questions a large-scale redesign raises. To overcome this, we break the problems down into smaller, measurable objectives and use data-driven approaches to optimize business outcomes. This idea of breaking work down resonates with the concept of sprints.

Design Sprint and Development Sprint
Bringing designers' ways of working into agile development has long been explored through approaches such as Lean UX. Design thinking and design sprints provide practical techniques that enterprises can apply:
Starting is more important than being correct.
Diverging with multiple designs before validation.
Discussing concrete designs or executions rather than debating in the abstract.
Iterating on designs as often as needed.
Agile development exhibits several characteristics:
Prioritizing individuals and interactions over processes and tools.
Prioritizing working software over comprehensive documentation.
Prioritizing customer collaboration over contract negotiation.
Prioritizing responding to change over following a plan.
While these approaches have their own distinct iterative cycles, they share similarities and can be effectively combined.
The Design Process for Continuous A/B Testing
We adopted the Design Sprint process with the following steps: Understand, Diverge, Decide, and Deliver. Google Optimize serves as our tool; it lets us split improvements into changes that require engineering resources and design changes that designers can release on their own. We also hold seminars to explore user behavior and psychology.
The process begins by framing the problem with the "How Might We" approach. A brainstorming workshop follows, where we write down every idea we can think of. We then score each idea with a PIE Score, evaluating CVR confidence, the degree of experience improvement, and development difficulty. Ideas with low development difficulty are planned as A/B tests, while those with high difficulty are planned as larger enhancements.

Understand
We use behavioral data and heatmaps to understand where users click. During workshops, we write down questions tailored to different page contexts, and these become our "How Might We" design goals.
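To make this concrete, here is a minimal sketch of the kind of aggregation behind a click heatmap. The event shape and cell size are hypothetical illustrations, not the schema of our analytics tooling: clicks are bucketed into a coarse grid so that dense regions stand out.

```typescript
// Hypothetical click event; real analytics payloads vary by tool.
interface ClickEvent { x: number; y: number } // viewport coordinates in px

// Bucket clicks into grid cells so frequently clicked regions stand out.
function buildHeatmap(clicks: ClickEvent[], cellSize = 50): Map<string, number> {
  const grid = new Map<string, number>();
  for (const { x, y } of clicks) {
    const key = `${Math.floor(x / cellSize)},${Math.floor(y / cellSize)}`;
    grid.set(key, (grid.get(key) ?? 0) + 1);
  }
  return grid;
}
```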

Diverge
In workshops, we use paper prototyping to generate as many candidate solutions as possible.

Decide
We evaluate the ideas using PIE Scores, covering CVR confidence, the degree of experience improvement, and development difficulty. People in different roles, including product managers, engineers, and designers, score each idea. The outcome is a prioritized list of A/B tests and enhancement work.
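As a rough sketch of how this triage could be mechanized (the interface, scoring, and threshold below are illustrative assumptions, not our production tooling), each idea gets a score and the backlog is split by development ease:

```typescript
// Hypothetical PIE-style scoring; field names and weights are illustrative.
interface Idea {
  name: string;
  cvrConfidence: number;    // 1-10: confidence the change lifts CVR
  experienceImpact: number; // 1-10: how much the experience improves
  developmentEase: number;  // 1-10: 10 = trivial change, 1 = major rebuild
}

// Average the three dimensions into a single priority score.
function pieScore(idea: Idea): number {
  return (idea.cvrConfidence + idea.experienceImpact + idea.developmentEase) / 3;
}

// Split the backlog: easy ideas become A/B tests, hard ones enhancement work.
function triage(ideas: Idea[], easeThreshold = 6) {
  const ranked = [...ideas].sort((a, b) => pieScore(b) - pieScore(a));
  return {
    abTests: ranked.filter((i) => i.developmentEase >= easeThreshold),
    enhancements: ranked.filter((i) => i.developmentEase < easeThreshold),
  };
}
```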

Empowerment and Speed
In pursuit of faster A/B testing, we streamlined the traditional process of experiment planning, wireframing, development, QA, and analysis. Leveraging Google Optimize, we trained our designers to plan experiments on their own and quickly deploy simple tests through the visual editor. This design-thinking approach improves efficiency and empowers designers to make data-driven decisions with ease.
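Google Optimize handles variant assignment and targeting for us; purely to illustrate the idea behind such tools (this is a generic sketch, not Google Optimize's API), a stable hash of the visitor ID can keep each user in the same variant across visits:

```typescript
// Generic illustration of deterministic A/B bucketing, not the Google Optimize API.
function hashString(s: string): number {
  let h = 0;
  for (let i = 0; i < s.length; i++) {
    h = (h * 31 + s.charCodeAt(i)) >>> 0; // simple 32-bit rolling hash
  }
  return h;
}

// The same visitor always lands in the same variant for a given experiment.
function assignVariant(
  visitorId: string,
  experimentId: string,
  variants: string[] = ["control", "treatment"],
): string {
  const bucket = hashString(`${experimentId}:${visitorId}`) % variants.length;
  return variants[bucket];
}

// e.g. assignVariant("visitor-123", "coupon-color-test") -> "control" or "treatment"
```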

Dual Agile & A/B Test
This is our implementation of the Dual Agile development process. Designers apply divergent and convergent thinking, incorporate market feedback into their designs, and iterate continuously. The output then becomes the development documentation. By breaking development into smaller units, much as agile development does, we can make steady progress toward business objectives.
Through testing, we have made intriguing discoveries. Examples include users' color preferences for coupons, which marketing copy is most appealing, and how users read product descriptions on different devices. We have also identified the specific content users need at each stage of the purchase. I believe this testing approach applies to any internet product: it reduces reliance on speculative discussion and enables the rapid generation of ideas and optimization plans. Ultimately, we improved CVR, reduced bounce rates, and ran twice as many tests as before.
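When reading a result such as a CVR lift, we still have to ask whether the difference could be noise. A standard check (shown here as a general statistical sketch with made-up numbers, not a description of our internal tooling) is a two-proportion z-test:

```typescript
// Two-proportion z-test comparing conversion rates of two variants.
function cvrZScore(convA: number, visitorsA: number,
                   convB: number, visitorsB: number): number {
  const pA = convA / visitorsA;
  const pB = convB / visitorsB;
  const pooled = (convA + convB) / (visitorsA + visitorsB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / visitorsA + 1 / visitorsB));
  return (pB - pA) / se;
}

// Hypothetical counts: 480/10000 conversions vs 540/10000.
// |z| > 1.96 roughly corresponds to 95% confidence that the lift is real.
const z = cvrZScore(480, 10000, 540, 10000);
console.log(z.toFixed(2), Math.abs(z) > 1.96 ? "significant" : "inconclusive");
```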

