Interview with Lydia Harper, VP of Product Strategy, Odyssey Interactive

Lydia Harper brings a decade of experience leading customer-centric product teams at the intersection of gaming and media. Her teams at Odyssey Interactive grew average player session length by 34% while reducing monthly churn from 9% to 6% between 2022 and 2023 (internal metrics). We spoke to Lydia about the specific customer interview techniques that drive retention-focused product decisions.


Why are customer interviews still relevant for retention, given the scale of quantitative analytics available in gaming?

Lydia Harper:
Behavioral analytics are powerful—you see exactly where players drop, what keeps them engaged, and how long they stay. But numbers rarely tell you why someone churned. Especially at scale, silent motivators—emotional friction, social disconnect, post-purchase regret—are invisible in dashboards.

A 2024 Forrester report found that 68% of media-entertainment consumers cite "emotional connection" as the reason they stick with a platform, yet less than 14% of companies systematically capture that in their product research. Structured interviews, done well, surface those non-obvious drivers that move the loyalty needle. That’s how you outperform the market on NPS and stickiness.


What are the first practical steps an executive should take to optimize customer interviews for retention strategy?

Lydia Harper:
Start by aligning interviews to your churn data. If level 18 is where new players drop off, your interviews should focus on users who bounced there, not random power users. Second—recruit customers who left, not just those who stay. That sounds obvious, but most studios fixate on their "fans," ignoring those with painful dropoff stories.

Third, prep your interviewers for active listening and retention-specific questions. I recommend scenario-based prompts: "Walk us through the moment you considered quitting," instead of "What do you like/dislike?" It’s more actionable. And always end by quantifying: "On a 1-10, how likely were you to recommend us just before you left?" That gives you a comparative metric for your churn modeling.
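
That closing 1-10 question can be rolled up into a single comparable score per churn segment. A minimal sketch: the bucketing below follows the standard NPS convention (9-10 promoters, 0-6 detractors) as an assumption, and the sample responses are invented for illustration.

```python
# Hypothetical sketch: turning the closing 1-10 question into a per-segment
# score. Bucketing follows the standard NPS convention (an assumption here);
# the response lists are invented examples, not real data.

def nps(scores):
    """Return a net promoter score (-100 to 100) from a list of ratings."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# Illustrative responses: a churned-at-level-18 segment vs. retained players.
churned = [3, 4, 6, 7, 5, 2, 8, 4]
retained = [9, 8, 10, 7, 9, 6, 10, 9]

print(f"churned segment: {nps(churned):.0f}, retained: {nps(retained):.0f}")
```

Comparing the score for a churned segment against retained players gives a baseline that can feed into the churn modeling Lydia mentions.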


Can you share a specific example where customer interviews drove a measurable retention improvement?

Lydia Harper:
Last year, players who joined via TikTok ads accounted for 18% of our first-month churn. Analytics showed they played fewer than two sessions before uninstalling. Interviews revealed that these players expected quick-fire, snackable content, which clashed with our game’s 10-minute onboarding.

We shortened the onboarding sequence, following up with an in-game story event that matched the ad’s vibe. Post-implementation, 30-day retention for this cohort jumped from 12% to 23%. We wouldn’t have found this without targeted interviews; the analytics alone suggested only “low engagement,” missing the ad-channel mismatch.


What methods do you use to structure interviews for deeper retention insights?

Lydia Harper:
Mix qualitative depth with quantifiable anchors. We use a modified Jobs-To-Be-Done (JTBD) framework but tailor prompts to the entertainment context. Instead of asking, "What job did this game perform for you?", we explore, "When you chose our game instead of Netflix or Discord, what were you hoping for?"

At least 30% of questions should focus on social retention levers: co-play, guild experiences, event-driven play. For media-entertainment gaming, social context is disproportionately predictive of retention. In every interview, ask respondents to map their last five game sessions: where, with whom, and why. That unlocks actionable triggers.

We also triangulate with follow-up surveys using Zigpoll or Sprig to validate themes at scale. If we hear in interviews that in-game chat is “toxic,” we run a quick-pulse poll to 4,000 users for quant signal. The combination of depth and breadth is critical.
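
A quick way to sanity-check whether a pulse poll of that size actually confirms an interview theme is a confidence interval on the agreeing share. A rough sketch, assuming a normal approximation and invented poll numbers:

```python
import math

# Hypothetical sketch: does a quick-pulse poll give enough "quant signal"
# to confirm an interview theme? Poll numbers below are invented.

def proportion_ci(successes, n, z=1.96):
    """95% normal-approximation confidence interval for a poll proportion."""
    p = successes / n
    margin = z * math.sqrt(p * (1 - p) / n)
    return p - margin, p + margin

# Say 1,080 of 4,000 polled players agree that in-game chat feels toxic.
low, high = proportion_ci(1080, 4000)
print(f"Share agreeing: between {low:.1%} and {high:.1%}")
```

With 4,000 responses the interval is tight (a couple of percentage points), which is why a pulse of that scale is enough to validate or kill a theme.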


How do you select and incentivize interview participants so you get honest, high-signal insights?

Lydia Harper:
Churned users are the hardest to reach, but their feedback is gold. We pull email lists from recent cancellations or inactive accounts, then offer strong incentives: Amazon codes, in-game currency, or even credits for unrelated apps. The response rate jumps from sub-5% to over 15% when the reward is immediate and relevant.

We oversample under-represented cohorts: older players, non-English speakers, and those who tried but didn’t pay. This ensures retention strategies aren’t skewed toward power users.

To minimize bias, we never pair interviewers with players they’ve interacted with in community channels. And we always frame invitations as “help us improve for others like you,” which reduces the chance of “polite” feedback.


Is there a preferred cadence or scale for interviews when optimizing for retention versus feature innovation?

Lydia Harper:
Retention-focused interviews require smaller, more frequent batches. Doing 100 interviews in a sprint is overkill; you want 8-12 per churn segment, run monthly or after updates. That way, you see how sentiment evolves with product change.

For example, after a controversial patch in 2023, we ran eight targeted interviews weekly for a month, triangulating with a Zigpoll pulse to 3,500 users. This cadence let us spot early warning signs—like increased mentions of “grind” and “tiredness”—before MAU dipped. Compare that to feature interviews, which can be quarterly and broader in scope.


How do you translate qualitative interview data into board-level retention metrics and ROI for executive decision making?

Lydia Harper:
The C-suite wants directional clarity, not anecdotal noise. We bake interview findings into two key metrics: net promoter score delta (pre- and post-feature), and predicted reduction in 30/90-day churn (modeled via regression with survey data as inputs).
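
The regression step can be sketched with a toy model. This is a hypothetical, dependency-free illustration, not Odyssey’s actual model: a one-feature logistic regression linking a survey answer (say, a frustration score scaled to 0-1) to observed churn.

```python
import math

# Hypothetical sketch: logistic regression from survey inputs to churn.
# Feature and data are invented; a real model would use many survey fields.

def train_logreg(X, y, lr=0.1, epochs=2000):
    """Fit weights/bias by stochastic gradient descent on log loss."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1 / (1 + math.exp(-z))
            err = p - yi
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, xi):
    """Predicted churn probability for one respondent."""
    z = sum(wj * xj for wj, xj in zip(w, xi)) + b
    return 1 / (1 + math.exp(-z))

# Toy data: frustration score (0-1) vs. whether the player churned.
X = [[0.1], [0.2], [0.3], [0.7], [0.8], [0.9]]
y = [0, 0, 0, 1, 1, 1]
w, b = train_logreg(X, y)
```

In practice you would reach for a library such as scikit-learn and feed in many survey features at once, but the mechanics are the same: survey answers in, predicted churn probability out.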

For instance, after surfacing “frustration with competitive matchmaking” in interviews, we estimated—using 1,200 survey responses—that smoothing the experience would cut churn by 1.7 percentage points per month. We then A/B tested a revised algorithm; actual churn dropped from 7.8% to 6.2% in the target group. The financial ROI was clear: We avoided ~$400k in projected lost LTV that quarter, based on average player spend (2023 Q4 internal data).
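
The LTV translation is simple arithmetic once you fix a cohort size and an average value per retained player. A sketch with placeholder inputs: only the churn rates come from the example above; the cohort size and per-player LTV are invented and will not reproduce Odyssey’s $400k figure.

```python
# Hypothetical sketch: translating a churn reduction into preserved LTV.
# Cohort size and avg LTV below are illustrative placeholders.

def avoided_ltv(cohort_size, churn_before, churn_after, avg_ltv, months=3):
    """Estimate revenue preserved by a churn reduction over a period."""
    players_saved_per_month = cohort_size * (churn_before - churn_after)
    return players_saved_per_month * months * avg_ltv

estimate = avoided_ltv(
    cohort_size=50_000,   # hypothetical active players in the target group
    churn_before=0.078,   # 7.8% monthly churn pre-change (from the A/B test)
    churn_after=0.062,    # 6.2% post-change
    avg_ltv=55.0,         # hypothetical average remaining LTV per saved player
)
print(f"${estimate:,.0f} in projected LTV preserved over the quarter")
```

Swapping in your own cohort size and LTV turns an interview finding into the board-level dollar figure Lydia describes.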

These story-to-metric translations make interview programs defensible at board level, showing direct uplift from customer empathy.


Are there tools or platforms you recommend for streamlining the interview process and connecting it to retention analytics?

Lydia Harper:
We use UserTesting for live interviews and Sprig for in-app intercepts. Zigpoll is extremely useful for rapid, opt-in feedback—especially after users churn or decline to subscribe. Our stack integrates these with Amplitude, so we trace qualitative insights back to behavioral cohorts.

Here’s a comparison of commonly used tools:

Tool        | Best For              | Limitation        | Cost Estimate
UserTesting | Deep interviews       | Lower scale       | $$$
Sprig       | In-app quick hits     | Less depth        | $$
Zigpoll     | Churn/funnel feedback | Limited branching | $

The shared limitation: no tool automates insight synthesis. Human analysis is still essential for high-stakes retention bets.


What are common mistakes executives make in applying interview insights to retention strategy?

Lydia Harper:
A classic misstep is over-indexing on the loudest voices—vocal fans or angry churners—without validating against user segments. Another: treating anecdotal pain points as universal, when the silent majority might not care. That’s why triangulation with scalable surveys is critical.

Executives also fall into the “fix everything” trap. Not every pain point is worth a sprint; some friction is deliberate (think: skill mastery loops). Prioritize themes that move retention metrics, not just satisfaction.

Finally, a warning: interview programs only pay off if you close the loop. Failure to act—or even to communicate action—destroys trust, reducing future response rates. We saw this in 2022: after a poorly handled event, interview opt-ins dropped by 70% for the next two quarters.


How should retention-focused interview findings inform competitive strategy and differentiation?

Lydia Harper:
If everyone is optimizing for dopamine hits, loyalty becomes a price war. Interviews reveal the “moats” competitors can’t see—unique community rituals, or emotionally sticky content loops. For example, we learned that our Friday guild trivia nights drove a “fear of missing out” that kept 18% of at-risk users engaged, even when tempted by new titles.

These findings inform not just features, but go-to-market and messaging strategy. If your power users cite “friendship drama” as a churn risk, spinning up more structured co-op play can be a differentiator. The result: lower price sensitivity and higher LTV, because you’re solving for belonging, not just entertainment.


What’s one counterintuitive tip you’d offer executives optimizing interviews for retention?

Lydia Harper:
Interview your biggest spenders about when they almost quit. Most VIP programs focus on “what do you love?” instead of “what was nearly a dealbreaker?” We did this last year and surfaced subtle, recurring issues—like event timing and reward fatigue—that weren’t showing up in churners’ interviews, because VIPs tolerate more but still feel pain. Addressing these drove a 7% uptick in post-event engagement.


Any caveats or situations where customer interviews don’t help retention?

Lydia Harper:
If your player base is extremely transient—think hypercasual or ad-driven portals—interviews rarely yield ROI. Those users churn for cost-of-attention reasons outside your control. Also, if your churn is driven by externalities (copyright take-downs, payment fraud), there’s little for interviews to fix.

And be wary of “design by committee.” Overreaction to qualitative input can dilute your brand or slow decision velocity. Use interviews as a compass, not a blueprint.


For C-suite executives, what are the top three actions to implement now?

Lydia Harper:
First, allocate budget for monthly churn-segment interviews and tie findings to quarterly retention KPIs—don’t treat interviews as a one-time UX check. Second, insist on triangulation: every insight should be validated with a tool like Zigpoll, Sprig, or in-app feedback at scale. Third, commit to transparency. Share what you heard and what you did about it, internally and (where appropriate) with your community.

It’s not about more data, but better signals. The studios winning on retention are those who stay close to the “why” behind every departure—not just the numbers on the dashboard.
