Case Study: UX Strategy

How I pitched and led the production of a custom interactive Wealth Management Onboarding Experience, making our client and their employees happier.

Opportunity

Our client (one of the largest financial institutions in Canada) was ready to walk. They had an aging HTML experience used to onboard top-tier employees into their exclusive wealth management program. It had performed well initially but was outdated and no longer meeting user expectations. Engagement was dropping, and the client was at risk of churning.

User impact:

High performers are busy people. They need to absorb complex information and understand their takeaways and next steps quickly, even when they’re joining a wealth management platform. Uninformed decisions could irrevocably affect how they access and pay taxes on their bonuses for 5-10 years or more.

Strategy

I proposed a complete creative refresh—revamping the narrative voice, visual experience, and user flow to better align with what users need at the moment they qualify for the program.

Our approach:

  • User feedback review to understand pain points in the legacy experience

  • Script and storyboard reimagining to improve clarity and emotional engagement

  • Experience redesign optimized for desktop (where most users accessed it), balancing professionalism with personalization

  • Voiceover coaching and QA, ensuring tone aligned with the experience goals

  • SCORM integration with analytics to track progress and completion within the client’s LMS

Execution:

I wore many hats…

  • Pitched the full creative strategy to win the refresh

  • Scripted and storyboarded the new experience end-to-end

  • Built the interactive experience in eLearning authoring software, working with a hand-picked team of freelancers and SCORM specialists

  • Directed voiceover talent for tone and consistency

  • Coordinated with developers to QA SCORM compliance and analytics before launch
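The SCORM compliance work above came down to making sure the experience reported progress and completion correctly to the client's LMS. As a rough illustration (not the project's actual code), here is what SCORM 1.2 run-time tracking calls typically look like: the LMS injects an `API` object into a parent frame, and the content finds it and writes to the `cmi.core` data model. The `findAPI` and `reportCompletion` function names are my own for this sketch.

```javascript
// Sketch of SCORM 1.2 completion tracking. The LMS exposes an API object
// (LMSInitialize, LMSSetValue, etc., per the SCORM 1.2 Run-Time spec) on a
// parent frame of the content window.

// Walk up the frame hierarchy to locate the LMS-provided API object.
function findAPI(win) {
  let attempts = 0;
  while (!win.API && win.parent && win.parent !== win && attempts < 10) {
    win = win.parent;
    attempts++;
  }
  return win.API || null;
}

// Report a score and mark the lesson complete, then commit and close the
// session. SCORM 1.2 calls return the strings "true"/"false".
function reportCompletion(api, score) {
  if (!api || api.LMSInitialize("") === "false") return false;
  api.LMSSetValue("cmi.core.score.raw", String(score));
  api.LMSSetValue("cmi.core.lesson_status", "completed");
  api.LMSCommit("");
  api.LMSFinish("");
  return true;
}
```

During QA, we verified exactly this kind of round trip: that status and score values written by the content actually landed in the LMS analytics.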

Results

We tracked sustained quarter-over-quarter growth in unique user engagement with the tool across 3 quarters of analytics, along with a higher-than-average completion rate.

[Shortened demo]

Completion rates

For multi-question decision support websites, a “good” conversion rate typically falls between 10% and 30%. However, this can vary depending on several factors, including the industry, target audience, and the complexity or length of the experience.

It can be difficult to put these numbers in context, since users arrive at decision-support tools with different motivations. Forrester Research and McKinsey have explored these factors and suggest benchmarks for experiences like ours:

  • Excellent: 25-30% (we are here)

  • Good: 15-25%

  • Average: 10-15%

98% of users reported improved understanding and readiness to act.


+13% improvement in perceived helpfulness vs. the legacy version.

Outcome: We retained our client's business and helped their employees make informed choices.