Data-Driven Redesign: A Quantitative Study on a New Mobile App Navigation

Project Overview

  • My Role: Lead UX Researcher

  • Timeline: 4 Weeks

  • Methods: Quantitative Usability Testing, Between-Subjects Study, First-Click Testing

  • Participants: 374

The Challenge

The design team had proposed a new persistent bottom navigation for the mobile app, aiming to modernize the UI and improve access to key features. However, redesigning an app's core navigation is inherently risky. A wrong move could frustrate loyal users and decrease engagement. The business needed quantitative data to answer a critical question: Was the new design actually better than the current one, or just different?

My Approach

To get a definitive, data-backed answer, I designed and ran a large-scale, unmoderated between-subjects study. A total of 374 participants were randomly shown either the current navigation or the new design and asked to complete the same set of 7 core tasks. This method allowed us to rigorously compare the performance of each design on key metrics:

  • Task Success Rate: What percentage of users could complete the task?

  • Time on Task: How long did it take them?

  • First-Click Success: Did they click in the right place first?

  • User Confidence & Ease of Use: How did they rate the experience?
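Each of these metrics reduces to a simple aggregation over per-attempt records exported from the unmoderated testing tool. A minimal sketch of that aggregation, using illustrative field names rather than the actual study schema:

```python
from statistics import mean

# Hypothetical per-attempt records; field names are illustrative,
# not the actual export schema from the study.
attempts = [
    {"variant": "new", "success": True, "seconds": 41.0, "first_click_ok": True},
    {"variant": "new", "success": False, "seconds": 88.0, "first_click_ok": False},
    {"variant": "current", "success": True, "seconds": 52.0, "first_click_ok": True},
]

def metrics(variant):
    """Aggregate success rate, mean time on task, and first-click
    rate for one design variant (booleans average to proportions)."""
    rows = [a for a in attempts if a["variant"] == variant]
    return {
        "success_rate": mean(a["success"] for a in rows),
        "mean_time_s": mean(a["seconds"] for a in rows),
        "first_click_rate": mean(a["first_click_ok"] for a in rows),
    }

print(metrics("new"))
print(metrics("current"))
```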

The Data-Driven Findings

The results were not a simple "win" for either design. My analysis uncovered a more nuanced story that was crucial for the business to understand.

The Good News: Core Tasks Improved

The data showed the new navigation was a clear winner for the app's primary tasks. Users were more successful at booking a rental (success rate up 2 percentage points) and faster at finding their loyalty information (time on task down 3 seconds). This validated the core hypothesis of the redesign.

The Red Flag: Support Tasks Suffered

However, the data also revealed a significant downside. The new design made it much harder for users to complete secondary but important "support" tasks. For example, the success rate for changing a password dropped from 96% to 70%, and finding specials/deals dropped from 87% to 60%. First-click success on these tasks plummeted, indicating the new information architecture was hiding these critical items.
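A drop this large is easy to confirm as statistically meaningful. A sketch of a two-sided two-proportion z-test on the reported password-change success rates, assuming an even random split of the 374 participants (roughly 187 per arm; the actual per-arm counts are an assumption here):

```python
from math import sqrt, erfc

def two_prop_ztest(p1, n1, p2, n2):
    """Two-sided two-proportion z-test on success rates,
    using a pooled standard error."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = erfc(abs(z) / sqrt(2))  # two-sided p from the normal CDF
    return z, p_value

# 96% (current) vs 70% (new) success on the password-change task,
# n ≈ 187 per group (assumed even split).
z, p = two_prop_ztest(0.96, 187, 0.70, 187)
print(f"z = {z:.2f}, p = {p:.2g}")
```

With these inputs the difference is many standard errors wide, so the regression on support tasks is far outside what random assignment noise could explain.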

The Impact

My findings stopped the company from launching a flawed experience that could have frustrated users and hurt engagement.

  • Prevented a Negative User Experience: I successfully demonstrated with hard data that while the new design had merits, launching it as-is would have frustrated a significant number of users trying to manage their accounts or find deals.

  • Provided a Clear, Nuanced Path Forward: Instead of a simple pass/fail, I delivered a strategic recommendation: "Proceed with the persistent navigation concept for core tasks, but iterate on the 'More' menu to improve the discoverability of support and account management features before launch."

  • Established a Benchmark: The quantitative data from this study now serves as a performance benchmark for all future iterations of the app's navigation.