UX/UI Case Study

Enterprise Survey Logic & Analytics Reimagined

Role: Product Designer
Team: 1 PM, 2 Developers, 1 Designer
Timeline: 8 Weeks
Overview

The 2-Hour Survey Tax

Skill Constellation

Primary

Interaction Design · User Research · Information Architecture

Supporting

Inline Logic Design · Design Systems

Emerging

Enterprise UX · Self-Serve Platform Design

John Deere's 250+ products all need user feedback. But building a single survey took ~120 minutes of back-and-forth between a Product Manager and an engineer — filing tickets, reviewing builds, requesting changes. For an organisation that needs feedback constantly, this wasn't just inefficient. It was a bottleneck slowing down product decisions across the entire company.

I was asked to fix it. Before touching a pixel, I set three baseline metrics with the product owner so we'd know whether we actually succeeded:

Time to build a survey

Baseline: ~120 mins
Target: < 48 mins

User satisfaction (CSAT)

Baseline: 3.8 / 5
Target: > 4.25 / 5

Survey completion rate

Baseline: 50%
Target: > 62.5%
Skill Spotlight

Metric-Driven Design

Set explicit success criteria with the product owner before starting design. Baseline metrics became the benchmark for every design decision.

Evidence: 3 measurable targets defined pre-design; all exceeded post-launch.

The Craft

How UX Research Reshaped the Entire Workflow

I started with 45-minute semi-structured interviews across product owners, engineers, and UX researchers — everyone who touched a survey. The finding that changed the entire project came from observing users building branching logic: they had to save their question, switch to a separate "Logic Rules" tab, manually look up question IDs, and write conditional rules by hand. This cognitively heavy process led to a 60% drop-off rate when building complex surveys.

Skill Spotlight

Contextual User Research

Conducted 45-min semi-structured interviews AND observed real survey-building sessions. Observation uncovered the branching logic pain point that interview questions alone missed.

Evidence: 60% drop-off rate at branching logic identified through task observation.

The Solution: I fundamentally restructured the mental model. By introducing an Inline Branching UI directly on the Question Card, users could define rules contextually without ever leaving the builder canvas.

Before (Legacy Workflow)
1. Create Question 1
2. Save and switch to the 'Logic' tab (context switch)
3. Look up the question ID (Q1) (high cognitive load)
4. Write the rule by hand: If Q1 = Yes, go to Q4

Fragmented cognitive load: Users had to memorize question IDs and navigate across multiple disconnected tabs to link logic paths.

After (Redesigned Workflow)
1. Create Question 1
2. Click 'Add Branch' inline (zero context switch)
3. Select the target (Q4) from a dropdown (visual mapping)

Contextual execution: Branching rules are defined directly inline with the question, eliminating the context switch and sharply reducing cognitive load.

Skill Spotlight

Inline Logic Design

Replaced disconnected tab-switching workflow with contextual inline branching directly on the question card, eliminating the need to memorize question IDs.

Evidence: Hesitation around branching dropped to zero in Round 2 testing.

This single design decision eliminated the biggest usability barrier. Round 2 testing confirmed that hesitation around branching dropped to zero — users no longer had to hold question IDs in their head while switching between disconnected tabs.
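The inline model described above can be sketched as a small data structure: each question card carries its own branch rules, so there is no separate logic tab and no ID to memorize. This is an illustrative sketch only; the type and function names are my own, not the platform's actual schema.

```typescript
// Hypothetical data model for inline branching (names are illustrative).
type QuestionId = string;

interface BranchRule {
  whenAnswerEquals: string; // e.g. "Yes"
  goTo: QuestionId;         // target question, picked from a dropdown
}

interface QuestionCard {
  id: QuestionId;
  prompt: string;
  branches: BranchRule[];   // rules live on the card itself, not in a separate tab
}

// Resolve the next question for a given answer; fall back to sequential order.
function nextQuestion(
  card: QuestionCard,
  answer: string,
  fallback: QuestionId
): QuestionId {
  const rule = card.branches.find(r => r.whenAnswerEquals === answer);
  return rule ? rule.goTo : fallback;
}
```

Because the rule is co-located with the question, the builder UI can render and validate branches in place, which is what removed the ID-lookup step from the workflow.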

The Interface

Designing a Self-Serve Platform

Research revealed four core capabilities the platform needed: a drag-and-drop question builder, inline branching logic, real-time preview with John Deere branding, and an analytics dashboard that eliminated the manual CSV-to-Excel pipeline entirely.

Solving the Analytics Dead Zone

The old process: export a CSV, open Excel, build a chart, share it in an email. By the time insights reached decision makers, the data was stale. I designed an expandable data table — expanding a survey row reveals interactive analytics right where PMs track their surveys. No separate reports page. No context switch.

Skill Spotlight

Analytics-in-Context Design

Eliminated the CSV→Excel→Email pipeline by embedding interactive analytics directly within the survey management table.

Evidence: Insights accessible in one click, not a multi-step export workflow.

[Dashboard mock omitted: expandable survey row revealing a "Surveys received and abandons over time" chart.]
The Evidence & Growth

Every Baseline Target Was Exceeded

The pilot launched with a 50-person PM cohort across four product lines, tracked over 4 weeks.

-60% Creation Time

Survey build time dropped from ~120 minutes to just 48 minutes.

+25% Completion Rate

Survey completion jumped from 50% to 62.5% due to better logic and branding.

+16% User CSAT

Platform satisfaction score hit 4.4 / 5, easily beating the target.

100% Self-Serve

Completely eliminated engineering dependency for routine surveys.
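The headline percentages follow directly from the baseline and pilot numbers reported in this case study; a quick sketch confirms the arithmetic (the helper function is mine, not part of any platform code):

```typescript
// Relative change, rounded to the nearest whole percent.
const pctChange = (before: number, after: number): number =>
  Math.round(((after - before) / before) * 100);

const creationTime = pctChange(120, 48);  // build time: -60
const completion   = pctChange(50, 62.5); // completion rate: +25
const csat         = pctChange(3.8, 4.4); // satisfaction: +16
```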

"I can spin up a survey in under five minutes — no coding needed."

Product Manager, Pilot Participant
Skill Spotlight

Enterprise UX & Self-Serve Design

Transformed a tool requiring engineering support into a fully self-serve platform, eliminating dependency entirely.

Evidence: 100% self-serve adoption; pilot success led to funding for advanced features.

What Came Next

The pilot's success led directly to funding for Multi-Language Support, Advanced Cross-Tabulation Analytics, a permanent Template Library, and full CRM integrations. A tool meant to solve a 2-hour bottleneck became the foundation for how John Deere listens to its users.

This case study documents design and research work conducted at John Deere. Metrics and details have been shared with permission. Visual assets have been omitted to respect internal confidentiality.