Enterprise Platform Transformation

Live Event Technology

Driving UX Strategy Through Field Research & Complex Feature Redesign

ROLE
User Experience Designer, Research & Strategy Lead
TIMELINE
2021 - December 2022
TEAM
Product Management, Engineering, Event Operations, Client Success
CLIENTS
NFL, Disney, Live Nation, major sports & entertainment venues

Impact at a Glance

  • High-profile event research at NFL, Disney, and Live Nation venues
  • Presented to senior leadership multiple times on strategic findings
  • 82% mobile usage discovery driving a mobile-first strategy shift
  • Cognitive overload reduction through the Automations tool redesign
  • Training time improvement for internal power users
  • Design system foundation (Admin v3) supporting product scalability

Business Challenge

Strategic Context

Leap (then Patron) Technology provided event management software for the world's largest venues and events—NFL games, Disney experiences, major concerts. The platform handled ticketing, registration, check-in, loyalty programs, and on-site activations for millions of attendees annually.

The complexity:

  • Diverse clients with wildly different needs (NFL vs. Disney vs. music festivals)
  • High-stakes environments where downtime = revenue loss and reputation damage
  • Power user tools so complex they required weeks of training
  • Mobile-first reality (82% of end users on mobile) vs. desktop-first platform
  • Rapid event cycles with no room for iteration during live events

My Challenge

The company was growing through acquisition and facing pressure to:

  • Modernize the platform to compete with newer event tech companies
  • Reduce the training burden for clients (onboarding took 4-6 weeks)
  • Improve the end-user experience at events (attendee-facing touchpoints)
  • Scale product capabilities without fragmenting the experience

As one of the few designers at the company, I needed to:

  • Establish research as strategic input (not just validation)
  • Build the case for UX investment in an engineering-led culture
  • Redesign complex power user tools without alienating existing users
  • Drive mobile-first thinking across the organization
  • Balance immediate fixes with long-term platform strategy

STRATEGIC APPROACH

Research as Organizational Change

When I joined, 'user research' meant occasional client calls. There was no systematic approach to understanding real user needs or field validation. What I established:

  • On-site field research program at high-profile events
  • Ethnographic observation of event staff during live operations
  • End-user testing of attendee-facing features in real environments
  • Stakeholder interviews with clients (event managers, operations directors)
  • Analytics integration (introduced Pendo) for data-driven discovery

HOW I WORK

Field Research at Scale

High-Profile Event Studies

Over 2 years, I traveled to 3 major events to conduct immersive research:

  • NFL games: Observing check-in flows, credential validation, real-time operations
  • Disney events: Understanding multi-day event complexity and attendee engagement
  • Live Nation venues: Studying high-volume, time-sensitive operations

What I Was Investigating

End-to-end user experiences:

  • How do event staff use our tools under pressure?
  • Where do workflows break down during live events?
  • What workarounds have teams created?
  • How do attendees experience our front-end interfaces?

Feature-specific validation:

  • Testing new releases in real conditions (not lab usability)
  • Identifying performance issues before they became critical
  • Understanding context-specific usage patterns

Research Artifacts & Communication

After each event, I created comprehensive readouts:

  • Consolidated findings with video evidence and quotes
  • Journey maps showing end-to-end event staff workflows
  • Pain point prioritization tied to business impact
  • Opportunity areas for product improvements

Sitemap Traffic Notes

  • The initial crowd rushed in toward the Draft Theater / standing stage
  • Pain point: miscommunication as attendees grabbed standing-stage space even though the event wasn't open yet
  • Once the problem was identified, attendees dispersed back into the event areas, approaching the closest kiosks
  • Subway staff stood in the crowd trying to direct attendees toward their station for an Autograph event scheduled in the SC after 10am. Most people did not realize it was happening, highlighting that attendees had no way to learn about an event activity in the moment or connect it to the published schedule.

Key Insights That Drove Strategy

1. Mobile-first reality

82% of end users on mobile devices. This single finding shifted organizational strategy. We were building desktop-first experiences when users were overwhelmingly on phones and tablets.

Impact: Drove investment in mobile-responsive redesigns and tablet-optimized interfaces for on-site staff.

2. Power user tools create training bottlenecks

The Automations feature was so complex it required dedicated training sessions. Clients couldn't self-serve, creating dependency on our support team.

Impact: Justified the Automations redesign project (detailed below).

3. Real-time performance is non-negotiable

During live events, even 2-3 second delays cascade into major operational problems. Performance wasn't a 'nice to have'—it was existential.

Impact: Established performance budgets as design requirements.

FEATURE REDESIGN

The Problem: Cognitive Overload at Scale

The Automations tool was powerful but nearly unusable. It allowed event managers to create complex rules for attendee engagement. Example use case: 'Query all registered attendees → Filter for those who interacted with gamification → Award top 50 with prizes.'
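The example rule above can be sketched as a simple query → filter → award pipeline. This is a hypothetical model for illustration only; the class and function names are assumptions, not the platform's actual data model or API:

```python
# Hypothetical sketch of the automation rule described above:
# query all registered attendees, filter for gamification
# interaction, then award the top N with prizes.

from dataclasses import dataclass, field

@dataclass
class Attendee:
    name: str
    registered: bool
    gamification_points: int = 0
    prizes: list = field(default_factory=list)

def run_automation(attendees, top_n=50):
    # Query: all registered attendees
    audience = [a for a in attendees if a.registered]
    # Filter: only those who interacted with gamification
    engaged = [a for a in audience if a.gamification_points > 0]
    # Action: award the top N by points
    winners = sorted(engaged, key=lambda a: a.gamification_points,
                     reverse=True)[:top_n]
    for winner in winners:
        winner.prizes.append("grand-prize")
    return winners
```

The old interface exposed every field of every stage at once; the redesign's insight was that users think about a rule in exactly this staged order.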

Current State Problems

  • Overwhelming interface: 40+ input fields visible simultaneously
  • No progressive disclosure: All configuration options exposed at once
  • Unclear flow: Users didn't know where to start or what order to complete steps
  • Cryptic error messages: Failed validations with no clear guidance
  • Minimal contextual help: Users relied on external documentation

Business Impact

  • 4-6 week training requirement for new clients
  • High support burden: Repeated questions about the same workflows
  • Client frustration: 'Powerful but impossible to use'
  • Limited adoption: Many clients avoided the feature entirely

Redesign vs Rebuild

OPTION A

Incremental improvements to existing interface

  • Lower risk, faster to ship
  • Wouldn't solve fundamental cognitive overload

OPTION B

Complete rebuild with new data model

  • Clean slate, could optimize everything
  • 12+ month timeline, high technical risk

OPTION C (SELECTED)

Stepwise wizard interface

  • Drastically reduces cognitive load through chunking
  • Reuses existing data model (lower eng investment)
  • Progressive disclosure guides users through complexity
  • Requires changing user mental model

Why I advocated for Option C: The core problem wasn't the data model, it was information architecture and presentation. A wizard could deliver 80% of the value with 30% of the engineering cost.

STEP 1

Chunking through stepped progression

Broke the monolithic form into discrete steps:

  • Step 1: Define audience (who are you targeting?)
  • Step 2: Set conditions (what criteria must they meet?)
  • Step 3: Configure action (what happens to them?)
  • Step 4: Review & schedule (when does this run?)
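The four steps above can be sketched as a minimal wizard state machine, where each step holds only its own chunk of fields. This is an illustrative sketch; the names are assumptions, not the shipped implementation:

```python
# Minimal sketch of a stepped wizard: one chunk of fields per
# step, and the user advances one step at a time.

STEPS = ["audience", "conditions", "action", "review"]

class AutomationWizard:
    def __init__(self):
        self.step_index = 0
        self.data = {}

    @property
    def current_step(self):
        return STEPS[self.step_index]

    def complete_step(self, values):
        # Store only this step's chunk of fields, then advance.
        self.data[self.current_step] = values
        if self.step_index < len(STEPS) - 1:
            self.step_index += 1
        return self.current_step
```

Chunking the 40-field form this way means a user sees only the handful of fields relevant to the question they are currently answering.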

STEP 2

Contextual help at every step

  • Inline tooltips explaining complex concepts
  • Example use cases showing common patterns
  • Visual previews of what will happen

STEP 3

Intelligent error handling

Real-time validation preventing bad states

  • Clear error messages with specific fixes
  • Visual indicators on which step has issues
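As a sketch, real-time validation with actionable messages pairs each check with a concrete fix rather than a generic "invalid configuration" error. The step and field names here are hypothetical:

```python
# Each validation rule returns a specific, actionable message
# tied to the step where the problem occurred.

def validate_step(step, values):
    errors = []
    if step == "audience" and not values.get("segment"):
        errors.append("Select an audience segment before continuing.")
    if step == "action" and values.get("award_count", 0) <= 0:
        errors.append("Award count must be at least 1.")
    return errors
```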

STEP 4

Progressive disclosure

Show only relevant fields based on previous choices

  • Advanced options hidden behind "Show more" patterns
  • Simplified defaults for common use cases

MEASURABLE OUTCOMES

  • Reduced training time from 4-6 weeks to an estimated 2-3 weeks
  • Improved user confidence in post-launch surveys
  • Reduced support tickets for the Automations feature
  • Increased adoption of advanced automation capabilities
  • Positive client feedback on ease of use

Reflections & Learnings

What I Learned

1. Field research creates organizational buy-in

Showing video from real events was more persuasive than any deck. Stakeholders who had resisted UX investment shifted when they saw actual users struggling.

2. Complex doesn't have to mean complicated

The Automations tool was inherently complex, but information architecture and progressive disclosure could make it manageable. Chunking and wizard patterns turned a 40-field form into a guided flow.

3. Research is a product

Research readouts, journey maps, and synthesis materials became strategic assets. Making them discoverable and reusable helped align product, engineering, and client success.

4. Mobile-first is a mindset, not just responsive design

82% mobile usage meant we had to rethink workflows for touch, glanceability, and offline constraints—not just scale down desktop layouts.

5. Change management requires storytelling

Connecting user problems to business outcomes (training cost, support burden, adoption) was how I justified UX investment to leadership. Evidence + narrative drove decisions.

What I'd Do Differently

Push for more quantitative baseline metrics earlier so we could measure impact (e.g., support ticket volume, time-to-first-automation) before and after the redesign.

Create a research repository so insights from NFL, Disney, and Live Nation studies were discoverable and reusable across the org.

Document design decisions more systematically so future teams could understand the rationale behind the wizard approach and tradeoffs we made.

Impact on My Career

This role helped me:

  • Build research programs from scratch in organizations without research maturity
  • Drive strategic influence through evidence and storytelling
  • Redesign complex enterprise tools while managing stakeholder expectations
  • Present to executives and frame UX work in business terms
  • Work in high-pressure, high-stakes environments (live events with zero room for error)

Live Event Technology at Leap was a turning point in how I approach research, complexity, and organizational change.