Designing for Growth

Handy is an online two-sided marketplace for cleaning and home services. One of its biggest revenue drivers is recurring home cleaning, offered to customers as 3-, 6-, and 12-month subscription plans. As the lead designer for the cleaning growth team, I helped design and optimize opportunities for new customers to commit to cleaning subscription plans.

My role

As a product designer on the growth team, I worked closely with product managers, engineers, and marketers to define what we would test and which metrics we wanted to improve, and then to analyze the results and decide how best to move forward.

My job was to figure out what those tests would look like, ranging from something as simple as a few copy changes to something as "meaty" as a new user flow. I was also responsible for user research (customer calls, surveys, and competitor analyses) and regularly presented key insights to my team. My research both informed potential solutions to test and helped us contextualize results.

The Ask
Design Challenge

How can we get more customers to sign up for cleaning subscription plans without changing our pricing model?

The process
What did we do? How did we get there?

As a team, we brainstormed strategies for improving conversion (subscription sign-ups), some of which would live within the product itself (e.g. changing UI elements in the checkout flow) and some outside the product (e.g. automated bounce emails to customers).

Understanding the customer journey, that is, how customers engage with Handy across all touchpoints, was key to identifying both online and offline opportunities to draw customers further down the purchase funnel.

The customer journey

Most customers come to Handy’s website from a Google search, which lands them on what we call the “When” page. After entering basic job details, customers click ‘Get a Quote’, which takes them to the “Payment” page. Here, the price of the service is revealed for the first time.

As you can imagine, most first-time customers who land on the Payment page bounce (especially on mobile web). This could be for various reasons, like wanting to shop around for a better deal or not having a credit card on hand.

Identifying drop-off points and contextualizing the data with qualitative user research (e.g. customer calls) helped us find opportunities to engage customers more effectively through each phase of the purchase funnel.
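As a rough illustration of the analysis behind this, here is a minimal sketch of a stage-to-stage drop-off calculation in Python. The stage names and counts are hypothetical, not Handy’s actual numbers.

```python
# Minimal sketch: stage-to-stage drop-off in a booking funnel.
# Stage names and counts are hypothetical, for illustration only.
funnel = [
    ("when_page", 10_000),    # landed on the "When" page
    ("payment_page", 3_000),  # clicked "Get a Quote"
    ("booking", 600),         # completed a booking
]

for (stage, count), (next_stage, next_count) in zip(funnel, funnel[1:]):
    drop_off = 1 - next_count / count
    print(f"{stage} -> {next_stage}: {drop_off:.0%} drop-off")
```

Segmenting rates like these by device type (desktop vs. mobile web) is how gaps such as the ones described below would surface.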
The customer mindset

Whenever I could, I reached out to customers at different phases of the purchase funnel (from pre- to post-booking) to listen and understand their perspective. I also looked at CX tickets, read reviews on social channels, and regularly sent out email questionnaires to identify reasons customers either bounced or canceled their subscriptions early.

Across each feedback channel, common pain points emerged. I looked at competitors to understand how they tackle some of these pain points and created an affinity map to help me visualize patterns and opportunities.


Strategy
Opportunities for testing

After breaking down the customer journey and understanding the customer mindset, the team distilled three main opportunities for engaging customers more effectively:

  1. Optimize Payment Page

    Given that… (the data)
70% of users who reach the Payment page from the When page drop off, and drop-off is even greater on mobile web than on desktop.

    How can we...(the challenge)
    Optimize the payment page (especially on mobile) to encourage new customers to make a booking?

  2. Personalize Return Experience

    Given that… (the data)
    People who reach the payment page (without completing a booking) are 50% more likely to return at least once within 14 days.

    How can we...(the challenge)
    Personalize the return experience so we can make the process from payment to completed booking more seamless?

  3. Prep customers for first cleaning

    Given that… (the data)
    People who book cleaning subscription plans are 40% more likely to cancel before their first cleaning.

    How can we...(the challenge)
    Get new customers excited about their first cleaning and set them up for success?



What we tested, why, and the outcomes


Product and Marketing built a shared roadmap of experiments, which we tracked together. Below I outline the main tests I helped design.

TEST 1: UI changes on Payment Page



Hypothesis: If we add visual cues to help reduce anxiety around purchase commitment, more customers will book.

Research to support: Feedback from customer calls about purchase anxiety.

Test overview: We A/B tested a range of options, from improving the receipt UI (see image above) to trying different CTA button copy.

Outcome: Win for payment call-outs

Overall, most simple UI changes didn’t move the needle. Letting customers know we wouldn’t collect payment until 24 hours before their service was the biggest win for When-to-Payment conversion, with an even greater lift on mobile web than on desktop. This told us we were on the right track in reducing customer anxiety around payment.

TEST 2: Personalization



Hypothesis: If we personalize the return experience (for logged-out users), new customers will feel more at ease about booking and will be more likely to commit.

Research to support: a) The majority of customers return to the site at least twice before making a booking. b) Customers are accustomed to personalization.

Test overview: Normally, returning customers have to re-enter their service details on the When page in order to complete a booking. We wanted to personalize the return experience for logged-out customers to make completing a booking more seamless. The team used Google Optimize to test personalized copy changes and avoid unnecessary development work (note: the designs were limited by the constraints of the Google Optimize software).

We set up a simple A/B test: the experimental group of returning customers was brought to a personalized, auto-filled “When” page, while the control group was brought to the normal “When” page. The key metrics we measured were click rate (When to Payment) and conversion rate (completed bookings).
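For context, consistent bucketing is what makes a test like this readable: a returning visitor has to see the same variant every time. Google Optimize handled assignment for us; the sketch below just illustrates the general principle of deterministic, hash-based variant assignment (the identifiers and experiment name are made up).

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str,
                   variants=("control", "personalized")) -> str:
    """Deterministically bucket a visitor into a variant.

    Hashing a stable identifier (e.g. a cookie ID) keeps a returning,
    logged-out visitor in the same bucket across sessions.
    """
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# Hypothetical usage: the same cookie always lands in the same bucket.
print(assign_variant("cookie-abc123", "personalized-when-page"))
```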

Outcome: Inconclusive / Paused

Click rate from When to Payment was higher for the experimental group (as expected), meaning more returning customers were going through the checkout flow when landing on a personalized page. However, in terms of conversion (actually completing the booking) we did not see a significant increase.
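To unpack what “not significant” means here: given conversion counts per variant, significance can be checked with a standard two-proportion z-test. Below is a minimal sketch with made-up counts, not our actual data.

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p_a, p_b, z, p_value

# Hypothetical counts: control vs. personalized "When" page.
p_a, p_b, z, p = two_proportion_z_test(conv_a=120, n_a=4000, conv_b=138, n_b=4000)
print(f"control {p_a:.2%} vs. variant {p_b:.2%}  (z = {z:.2f}, p = {p:.3f})")
```

Numbers like these illustrate how an apparent lift can still fail to clear significance.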

The test taught us that sometimes customers need more convincing than simply a reminder that they have already visited the site. It also taught us that personalization extends beyond “welcome back” copy.

TEST 3: Book now, Pay later



Hypothesis: If we remove payment from the checkout flow, more customers will book. The more customers who complete a booking, the more opportunities we have to acquire paying customers.

Research to support: a) Price is the number-one reason customers hesitate (due to lack of perceived value or inability to pay the full amount upfront). b) Customers expect to be able to pay after a service is complete (a common expectation in the service industry). c) Customers desire instant gratification. d) Flexible payment options (e.g. installment payments) are becoming a new norm.

Test overview: This was our riskiest test and the most intensive from a development standpoint, because it required reworking legacy code. Our primary metric was completed conversion (paid bookings), which would determine which variant “wins”. We also measured booking conversion (When to Payment) and monitored the two metrics for comparison.

Outcome: Temporarily paused

During testing we saw some surprising data: a big win for the experiment in terms of bookings, followed by major losses in completed bookings (collected payments). On one hand, this was to be expected: by lowering the commitment required to make a booking, we would capture more low-intent customers. However, we later learned there was a bug in the test that prevented customers from entering payment details.

Due to development constraints, we decided to pause the test.

TEST 4: Reposition Cleaning subscriptions



Hypothesis: If we reposition cleaning plans as fixed bundles (e.g. 3-, 5-, or 10-cleaning packs), more customers will feel comfortable committing because the offer seems more flexible.

Research to support: a) Customers who cancel say they are looking for a more flexible plan; they want to be able to schedule on their own terms. b) Customers are fearful of commitment plans.

Test overview: We wanted to test interest in cleaning packs as a potential alternative to subscription plans without cannibalizing subscription conversion.

So, we set up a quick test to gauge customer interest in packs. I built a landing page using Unbounce, and we set up a PDP on Handy’s site to collect payment for cleaning packs. New customers who had visited Handy were sent an email with a link to the product offerings. We measured click rate from the Unbounce page to the PDP on the Handy website, as well as conversion from the PDP to payment.

Outcome: Temporarily paused

Click rate from the Unbounce page to the PDP was roughly equivalent to the When-to-Payment click rate. This suggested that customers are interested in packs and that packs could be a valuable offering alongside cleaning plans. However, completed bookings were not high enough to justify offering packs as an alternative to subscription plans at this time.

Key Learnings
Closing thoughts

Designing for growth is really a psychology game. We were challenged to put ourselves in the shoes of the customer and come up with creative ways to influence their purchase behavior. However, to truly effect change we would need to solve the chicken-and-egg problem: create a system where professionals on the platform are incentivized to deliver the best cleaning experience for customers, which, in turn, would translate into happier customers and more bookings.