In July 2022, I joined a Vancouver-based indie game studio as the Lead UX/UI Designer for Battle Rally, our first mobile title. The project was funded by the Canada Media Fund, and if we didn't hit a 28% Day-1 (D1) retention rate, our funding would be cut.
The team was talented but fragmented, and our Q1 2023 soft launch showed only 13% D1 retention. Players were dropping off early, and no one really knew why. That’s when I was brought in to fix the onboarding experience and get new players to stick around.
I led the end-to-end design process, from research and in-house testing to the redesign of the mission flow and reward systems. The result: a 17% increase in D1 retention, a 66% uplift in onboarding completion, and a lasting shift toward continuous user testing within the team.
We faced limited funding and an extremely tight timeline. No formal testing had been conducted before I joined, leaving us with only marketing's user acquisition (UA) data to work with.
While UA data provides large-scale metrics, those numbers raise questions rather than answer them. Metrics tell us “30% drop-off at step 3 of onboarding” but don’t explain why.
The lack of a UX testing budget created a disconnect between design and users. My role was reduced to UI polishing, with no way to validate ideas with real players. Most team members assumed players were dropping off because they simply didn’t like the gameplay.
With a tight deadline and a hard funding cutoff looming at the end of Q2, continuing to build without testing was a dangerous gamble.
I proposed a pilot-first approach: instead of full-scale research, we would answer one narrow question:
Why are players dropping off in Race 3 of onboarding?
I repurposed internal resources and recruited 7 colleagues from other teams who had never played the game. Using two spare devices, I set up a dual-camera test rig: one camera recording their faces, the other their screen interactions.
I tracked behaviour manually with timestamps and created a highlight reel showing players’ visible confusion. The insight was clear:
The unclear Mission Objective screen
Result screen when players win
Result screen when players lose
I packaged the findings into a single-page behavior map and showed a 3-minute video reel at an all-hands meeting. It sparked real empathy, especially from leadership. For the first time, the team saw players genuinely struggling.
I proposed a small external test using PlaytestCloud. With approval in hand, I designed the test script and task flows, rolling it out within 3 days. This marked the beginning of our first formalized UX testing loop.
Presenting the insights from user testing and facilitating a workshop to determine design priorities
With insights from both internal and external playtesting, I led a redesign of our onboarding flow, especially the "Start a Race" and "End of Race" screens. We restructured mission prompts, added visual clarity, and introduced animated feedback to reward success and explain failure.
Here’s what happened post-redesign:
But more importantly, UX testing became a required step in every major release, and PlaytestCloud became our standard tool for external research. I created the team’s first UX testing framework, including scripts, feedback templates, and documentation, so testing could scale beyond me.
💡 Use the slider to compare each screen before and after the redesign.
Enhanced Mission Objective Visualization
Revamped End Result Interface
Revamped Rewards Presentation Interface
Let's work together and create something amazing!