The Problem

Organizations don’t have the resources or capabilities to make educated marketing decisions that will acquire new donors and increase their donations.

The Team

The team changed throughout this project. For this particular feature, there were two main designers (a design lead and myself), and a design shadow allocated to help out when needed while learning about the design process. My role was to lead this feature by initiating the discovery activities, ideating solutions, reviewing progress with the client, and designing the final screens and prototype. I delegated tasks to the design shadow and mentored her throughout the process. When it came time for implementation, both the design lead and the design shadow rolled off the project, and I remained to support and collaborate with the engineers and product owners.


Empower the Organization

Right now, organizations have to resort to expensive tools like Optimizely or Google Optimize, and they often have to get IT involved to do custom coding. With a built-in A/B testing feature, an organization's marketer can feel empowered to optimize their giving forms without reaching for so many external resources.


The purpose of A/B testing is to learn. The practice of A/B testing prompts the members of an organization to document the results of their experiments so that knowledge can be transferred. 


Through designing this feature, we followed the Design Thinking framework. We first empathized through research, conducting interviews with other individuals in the digital fundraising space. We also did a competitive assessment to understand the needs and goals of people who heavily utilize A/B testing. We then defined by creating user flows and schematics, and ideated with sketching and wireframes. Finally, we prototyped and set the development team up for success, making improvements and tweaks along the way.

Over the course of this engagement, the team developed a great relationship with the stakeholders at iDonate. Through standup, design reviews, workshops, and demos, we truly became one team with one objective: making this feature an amazing tool for non-profit organizations to utilize and do more good in the world!

Competitive Assessment

Designing an A/B testing feature was a new opportunity for the entire team. While a few of us had experience using these tools, we wanted to dig deeper to develop an understanding of all the functionality that is necessary for a useful testing tool. We identified four different products that utilize A/B testing to help users identify patterns, gain insights, and make data-backed decisions. Because of the strict time constraint for this project, we had to focus on just a few features to study within these competitors. We knew the basic functionality this A/B testing feature needed, but were curious about whether some of the more advanced functionality would be reasonable for MVP.

We chose to look into four capabilities: a visual editor, the ability to run multiple experiments on the same page, heat maps, and real-time statistics. This exercise really informed our ideation and design decisions, because we learned which features were an absolute must and which could be put on the back burner. Our hypotheses were validated: the visual editor and real-time statistics were not just commonly included capabilities, but main features of these products. We were surprised that heat maps were not utilized as often, or seen as valuable, as we initially thought.



After establishing the user flow, the team set out to envision the comprehensive array of screens that would be required in this process. By creating a schematic, we were able to contemplate how this feature would integrate seamlessly within the existing iDonate platform's design framework.

Strategizing the optimal point of entry and interaction for this feature within the platform became paramount. This required a thoughtful deliberation on how to effectively communicate its significance to our organizations and empower them with the necessary guidance, ensuring they comprehended not just how to employ this feature, but also why it's pivotal to their mission.

Recognizing the predominantly marketing-oriented nature of this feature, a deliberate choice was made to introduce an innovative product section: Strategy. Although users retain the ability to initiate A/B tests within individual giving forms and campaigns, the creation of a dedicated space solely for refining marketing strategies was invaluable.

This strategic decision opens the gateway to accommodate forthcoming marketing-centric functionalities, setting the stage for the progressive introduction of future features that are poised to elevate the product's capabilities.

After thoroughly exploring competing solutions and delving into the mechanics of other A/B testing tools, the natural progression was to conceptualize the adaptation of this feature for iDonate's non-profit clientele. This task presented several intricacies that required careful consideration.

For instance, how might a user respond when Test A and Test B yield statistically insignificant differences? Similarly, how do we address situations where the variant performs worse than the original form? Delving into these scenarios prompted us to envision the wide spectrum of actions a user might need to undertake to optimize their forms for their specific audience.
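To make "statistically insignificant" concrete: a common textbook way to compare two form variants is a two-proportion z-test on their conversion rates. This is an illustrative sketch with made-up numbers, not the statistical method iDonate actually uses:

```python
import math

def two_proportion_z(conversions_a, visits_a, conversions_b, visits_b):
    """z-statistic for the difference between two conversion rates,
    using the pooled-proportion standard error."""
    p_a = conversions_a / visits_a
    p_b = conversions_b / visits_b
    pooled = (conversions_a + conversions_b) / (visits_a + visits_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / visits_a + 1 / visits_b))
    return (p_b - p_a) / se

# |z| < 1.96 means the difference is not significant at the 95% level --
# exactly the scenario where the user must decide what to do next.
z = two_proportion_z(48, 1000, 56, 1000)
print(abs(z) < 1.96)  # a small gap on modest traffic: True
```

A result like this is the design scenario above: the data alone doesn't pick a winner, so the interface has to support the user's judgment call.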

This process of thinking through these steps and dissecting the myriad choices a user might face played a pivotal role in outlining the necessary screens and complexities of design.


Once we got a grasp on the user flow, entry points, and how exactly this new and exciting feature would fit into the design of the current product, it was time to start taking those plans to paper. Even though we were a remote team, I wanted to make sure we still gave ourselves time to collaborate and ideate together, as if we were in person! There were a few rules I established for the team to make this process as innovative as possible.

1. Cultivate 5-10 Iterations for Each Main Screen. It's so easy to get stuck in the same idea and not allow yourself to think outside the box. Forcing multiple drawings that are required to be different allows each team member to push past those initial ideas and reach for something new. 
2. Embrace the Bold Stroke. Our team only used thick pen or marker. When you use a pencil, it's almost impossible to push past the temptation of using the eraser or getting too into the details. We want big picture ideas here, the details will come later! 
3. Unveil All: the Good and the Bad! The cradle of brilliant ideas often rests in the fertile ground of perceived "bad" ones. As a team, we engage in candid conversations around every sketch, including those that diverge from the fitting path. After all, it's through this process that we pave the way to pinpoint the ultimate solution.

We uploaded the JPGs of our sketches into Figma and discussed each iteration of the main screens online. We starred the ideas we loved and were able to move forward with an amazing starting point!

Applied Designs

Having dedicated months to refining the design of the iDonate platform, the stage was set for the A/B testing feature's journey to begin. Thankfully, the bedrock of a robust design system had already been laid, offering a seamless trajectory for the applied design of A/B Testing.

Countless cycles of internal collaboration, client reviews, and meticulous iteration resulted in a resounding achievement - our feature's transformation into a state ready for development! I stayed on the team solo for a few more months to collaborate with and support the engineers, while working on other new features for iDonate's platform. It was an honor and a joy to see this feature come to fruition!


Once the applied designs were established, it was time to put it all into an interactive prototype. This prototype was used to validate the designs, discover minor interactions, review with stakeholders, and use as a marketing tool for iDonate's Customer Advisory Board. 


A user first heads over to the Strategy tab to learn about the marketing capabilities iDonate offers them. There, they see a little blurb about A/B Testing and the benefits of implementing testing on their giving forms. Then, when a user decides to create an A/B test, they are brought to a page that showcases all of their giving forms to filter and choose from.


Once a user names their test, they are brought into Step 1 of the creation process: determining their hypothesis and metrics. This step can be skipped, but since the whole purpose of A/B testing is to learn, we wanted to create a space for the user to determine what they will be changing and their prediction - just like any other scientific experiment! 
One thing our team went back and forth on was determining significance. It was difficult to predict when a good time to end a test would be, because that would most likely look different depending on the organization. 

Some orgs might get 1,000 giving form visits in a day, while other, smaller organizations might never reach that kind of number! Therefore, we wanted to provide a way for each organization to define significance for themselves. A user can choose to be notified after reaching a marketing metric (form visits, number of donations, or conversion rate) or after an amount of time (days, weeks, or months).
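The notification triggers described above can be sketched as a simple stop-rule check. This is a hypothetical illustration - the dict shape, field names, and thresholds are my own assumptions, not iDonate's actual data model:

```python
from datetime import datetime, timedelta

def should_notify(test, now=None):
    """Return True once an A/B test hits its user-chosen trigger:
    either a marketing-metric threshold (form visits, number of
    donations, conversion rate) or an amount of elapsed time."""
    now = now or datetime.now()
    trigger = test["trigger"]

    if "metric" in trigger:
        # Sum the metric across both variants, so significance is
        # defined by total traffic to the test as a whole.
        total = sum(v[trigger["metric"]] for v in test["variants"].values())
        return total >= trigger["threshold"]

    # Otherwise the trigger is time-based, e.g. {"duration_days": 14}.
    return now - test["started_at"] >= timedelta(days=trigger["duration_days"])

# A larger org can trigger on traffic; a smaller org might prefer time.
test = {
    "variants": {"A": {"form_visits": 620}, "B": {"form_visits": 540}},
    "trigger": {"metric": "form_visits", "threshold": 1000},
    "started_at": datetime(2023, 1, 1),
}
print(should_notify(test))  # 620 + 540 = 1160 >= 1000, so True
```

Letting the organization pick the trigger type is what keeps the same rule fair for a 1,000-visits-a-day org and a much smaller one.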


The next step is the fun part: Editing Form B! In this step, a user is brought to the Form Editor - a tool they are extremely familiar with. They see a duplicate of their giving form where they can make changes to test. In this example, a user changes the name of their header to be shorter and more impactful, as well as the overall color scheme of the form. Users are able to preview their forms side-by-side to ensure they're happy with the changes implemented and the overall test. 


Last in the creation process is a simple review. We want to give the user an opportunity to review the predictions and metrics they set and the changes they made. It's uncertain how long a user might wait before actually executing the A/B test, so it is feasible they would forget the parameters and modifications they established. Once they've reviewed both forms and are satisfied, it is finally time to publish! A user clicks publish and gets a quick confirmation that their new A/B test has launched.


After some time has passed, a user might get a notification that their test has reached its metric of significance, or they might simply want to check on the results unprompted. From the Strategy tab, a user selects "View A/B Tests" to see all the tests they've conducted and their varying statuses. From there, they can go to an overview page to take a closer look at how a given test is performing.

The overview shows valuable information, like the data for each form (total donations, average donation, conversion rate, and form visits). Users can easily see the delta for total donations and number of transactions, two key metrics for determining performance. A user can take notes on what they observe during the test so they can remember and communicate findings with teammates within their organization.

Finally, when the results feel significant enough to determine a path forward, the user can end their test. In this process, they choose which form to move forward with. Once they select the form they feel is performing better, that form is published from then on, replacing all instances of the underperforming one.


Unfortunately, in this fast-paced feature project, we did not have the time or funds to do usability testing. However, we did have multiple reviews with the stakeholders and Subject Matter Experts to review the designs and prototype. We received great feedback from them, both positive and constructive. Throughout these reviews, we made a few tweaks to ensure an amazing, usable product. 


Originally, the A/B test cards visualized a little glimpse of Form A and Form B in a way where one was in the foreground, and the other was in the background. In one of our review sessions, we discussed how this might suggest that one form is better or more significant than the other. We want these to portray equal hierarchy, as the point of the test is to determine which form was higher performing. Therefore, we made a design change that showcased both forms in equal placement - and with a bit more color and excitement! 
Another change we made to this card was how we communicated the delta. In the first design, we only included the delta percentage, but without both forms' metrics it was unclear how that number was calculated. When a user saw an 8% increase in donations, they did not know which form was driving that increase. Therefore, in the redesign, we included the metrics for both Form A and Form B, allowing the user to get a quick understanding of both forms' performance.
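As a quick sketch of the card's math (with hypothetical numbers, not real donor data), the delta shown could be computed relative to Form A like this:

```python
def delta_percent(metric_a, metric_b):
    """Percentage change of Form B's metric relative to Form A's.
    Returns None when Form A's metric is zero (delta is undefined)."""
    if metric_a == 0:
        return None
    return (metric_b - metric_a) / metric_a * 100

# Showing both numbers alongside the delta is what makes an
# "8% increase" unambiguous about which form drove it.
donations_a, donations_b = 250, 270
print(f"Form A: {donations_a}, Form B: {donations_b}, "
      f"delta: {delta_percent(donations_a, donations_b):+.0f}%")
```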


As we continued our review sessions, we became concerned that the process of editing Form B was unclear. After a user submits their hypothesis and metric of significance, they are taken to the Form Editor - an interface they are very familiar with. Originally, however, there was nothing on this screen to indicate an experience different from the one they were used to. We were concerned that the user would not understand that they were editing a duplicate of the form - not the original. To eliminate this possible confusion, we incorporated two solutions:
1. A Transition Modal: I designed a modal to appear after a user fills out the settings for their A/B test. This modal communicates to the user what is happening behind the scenes: the giving form is being duplicated!

2. Form B Label: I added a discreet visual element to educate the user that they are editing Form B - not the original. I did not want this to appear as a warning, but it still needed to draw enough of the user's attention. The design was finalized with a thick orange outline around the form, along with a small but legible tag at the top center labeling Form B.


When it was finally time to start development of the A/B testing feature, I was the only designer to remain on the team. It was the first time I had ever been the sole design representative for engineering support, and I learned so much about cross-team collaboration! We followed an agile process, working in two-week sprints. After a couple of sprints, we made a great adjustment: we started doing internal kick-offs and PR reviews. Whenever an engineer was kicking off or reviewing a story they worked on, they messaged in Slack to see who was available. Since I was the only design representative, I was present in almost every kick-off and story review! Every member of the team was detail-oriented and receptive to feedback, and that collaboration really showed in the final product!