Certain fields on DriveTime's financing form were driving customer exits, but we didn't know which ones were causing the most friction. I designed a multi-step version of the form to test; it increased completions by 1.3% and helped us identify the problem fields.
My role: User Research • Competitive Research • UI/UX Design
Intimidating form: One of DriveTime's core product offerings is real online approval for vehicle financing. To deliver this, we had a daunting lead form: two pages, multiple questions, and requests for sensitive information (including Social Security number). As you can imagine, this led to high rates of form abandonment: 60% of users dropped off on the first page, and another 20% on the second.
Going into this experiment, our goal was twofold: increase form completions, and pinpoint which fields were causing the most friction.
Hypothesis: If we broke the lead form into individual steps and made it feel more conversational, more users would complete it and get approved for financing. We measured form completions as our primary success metric, and tracked progress through each step to identify where people were still dropping off.
What are best practices? Before digging into user or competitive research, I surveyed the current landscape of conversion-form best practices. I reviewed case studies and feedback from the UX community to pull out the top recommendations.
What is the competitive landscape? I looked at banking apps, other vehicle financing flows (like LendingTree), and other used-car retailers (Carvana, CarMax). I was specifically focused on how they talked to their users, how they indicated progress, and how they broke their forms into steps.
What do our customers want? All form questions were necessary for credit approval, so we needed to understand which specific questions were creating the most friction. Our data and user research revealed three key insights:
Reordering the questions: Our research showed that reordering the form fields could reduce early abandonment, so my first step was to define the new question order. Since ZIP code is familiar and low-risk, I made it the first question. I moved high-friction fields toward the end of the form, after users had already invested time in the process.
Multivariate experiment: Based on best practices and our competitive research, we wanted to test the verbiage, progress bar, and navigation. We decided to run a multivariate experiment to find the best combination of elements.
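To make the "multivariate" part concrete: unlike an A/B test, a multivariate test assigns traffic across every combination of the elements under test. A minimal sketch, where the specific variant values are hypothetical and only the three elements (verbiage, progress bar, navigation) come from our plan:

```python
from itertools import product

# Hypothetical variant values for each element under test; only the
# element names themselves come from the experiment design.
progress_bar = ["percentage", "step count"]
verbiage = ["headline + subhead", "minimal"]
navigation = ["forward-only", "back button"]

# Every combination becomes a variant, each shown to a slice of traffic.
variants = list(product(progress_bar, verbiage, navigation))
print(len(variants))  # 8 combinations
```

This is why multivariate tests need more traffic than A/B tests: each added element multiplies the number of variants splitting the sample.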
We ultimately decided to test the following:
The redesigned form required full completion to capture lead data, while the original form captured leads from partial completions. This meant we needed to lift completion rates by more than 3.3% to offset the lost partial leads. While we did see a 1.3% increase in completions, it did not clear that threshold, so the new form wasn't declared a winner.
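The breakeven check can be verified in a few lines, assuming (as the figures below suggest) that the 1.3% number is a relative lift:

```python
# Figures from the experiment; the 1.3% lift is relative (44.5% -> 45.1%).
baseline_completion = 0.445
variant_completion = 0.451
breakeven_lift = 0.033  # lift needed to offset lost partial-completion leads

observed_lift = (variant_completion - baseline_completion) / baseline_completion
print(round(observed_lift, 3))              # 0.013, i.e. ~1.3% relative lift
print(observed_lift > breakeven_lift)       # False: below the 3.3% breakeven
```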
1.3% relative increase in form completions (44.5% ➡ 45.1%)
The highest-performing combination of elements included a percentage-based progress bar, explanatory verbiage (short headline with subhead), and forward-only navigation.
The experiment revealed specific drop-off points that informed our next test: a single-page form designed to eliminate step-based abandonment entirely.