Interview with Dr. Jordan Mills, UX Research Lead at ApexPrep

Dr. Jordan Mills has over a decade of experience leading UX research at test-prep companies, with a specialty in K12 education platforms. At ApexPrep, a mid-sized player competing heavily on digital experience, Jordan recently led a targeted revamp of leadership development aimed at rapid competitive response. We explore how senior UX researchers can optimize such programs, specifically for teams using Webflow as their design and development environment.


Why focus leadership development efforts on competitive response in K12 test-prep?

Jordan Mills: In K12 test-prep, competition isn’t just about pedagogy or content—it’s heavily driven by user experience. Platforms that reduce friction in onboarding, clarify practice pathways, or personalize feedback outperform others. Leadership development programs (LDPs) that sharpen responsiveness to competitor moves create an organizational muscle for faster pivoting.

To ground this in data: a 2024 EdTech Insights report found 73% of K12 test-prep companies consider UX agility a top or critical factor in maintaining market share. Agile leaders who can decode competitor feature rollouts or UI tweaks and swiftly translate them into strategic initiatives reduce time-to-adaptation by an average of 35%.


How do you tailor leadership programs to the Webflow ecosystem?

Jordan Mills: Webflow is unique as a no-code platform that blurs lines between design and development. Leadership in such teams must understand both the technical constraints and UX subtleties Webflow introduces. For example, Webflow’s CMS and interaction tools enable rapid prototyping, but can also lead to legacy complexity if not managed well.

Our approach was twofold:

  1. Technical fluency with Webflow’s affordances: Leaders learned to assess what rapid UI changes mean for backend data consistency and page load speed—a frequent competitor differentiator.
  2. UX strategy translation: We trained leaders to convert competitor insights (say, a new onboarding flow by a rival) into Webflow mockups that balance speed and fidelity.

One cohort increased feature rollout speed by 22% after integrating Webflow-focused technical training into their leadership track.


What competitive moves should leaders prioritize observing, and how?

Jordan Mills: Not all competitor updates are equal. Leaders must categorize moves by potential impact on user conversion and retention.

For instance, a minor text change in a practice test explanation might have low priority, but a new personalized study plan feature is high-impact. We use a simple triage matrix that weighs:

  • Visibility to end-users
  • Expected engagement effect
  • Feasibility of rapid iteration in our stack
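The triage matrix above can be sketched as a simple weighted score. The factor names mirror the interview; the 1-5 rating scale and the weights are illustrative assumptions, not ApexPrep's actual model.

```python
def triage_score(visibility, engagement_effect, iteration_feasibility,
                 weights=(0.4, 0.4, 0.2)):
    """Score a competitor move; each factor is rated 1-5.

    Returns a weighted score in [1, 5]; higher means respond sooner.
    Weights are hypothetical and should be tuned per team.
    """
    factors = (visibility, engagement_effect, iteration_feasibility)
    if not all(1 <= f <= 5 for f in factors):
        raise ValueError("each factor must be rated 1-5")
    return sum(f * w for f, w in zip(factors, weights))

# Example: a new personalized study plan feature (highly visible,
# high expected engagement, moderate feasibility in our stack).
score = triage_score(5, 5, 3)  # -> 4.6
```

A minor copy change in a practice-test explanation would score near the bottom of the range, which matches the low-priority call described above.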

On observation tools: surveys and feedback loops are essential. We rely on Zigpoll alongside traditional tools like Qualtrics and UserTesting to capture real-time user sentiment on competitor updates. In one case, Zigpoll revealed a competitor’s new feature increased perceived value by 18%, signaling a must-match move.


How do you balance speed with quality in leadership-driven competitive response?

Jordan Mills: There’s a tradeoff — acting quickly before competitors solidify gains versus risking a poorly vetted feature. Our leadership development emphasizes "incremental validation":

  • Leaders prioritize MVP-style rollouts in Webflow, focusing on core UX improvements rather than full feature sets.
  • We embed rapid A/B testing cycles coordinated by UX researchers and product leaders.

One team moved from a typical 8-week feature rollout to a 4-week MVP cycle, improving conversion on a critical student sign-up page from 2% to 11%. The downside: some features required quick follow-on fixes, which leadership had to manage while minimizing user frustration.
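A rapid A/B cycle of the kind described above typically closes with a significance check before an MVP change is kept. A minimal sketch using a two-proportion z-test follows; the sample sizes and conversion counts are illustrative, not ApexPrep's actual data.

```python
from math import sqrt, erf

def z_test_two_proportions(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates.

    conv_a/n_a is the control rate, conv_b/n_b the variant rate.
    Returns (z statistic, two-sided p-value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal CDF via erf: Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts: control converts 20 of 1,000 visitors (2%),
# the MVP variant converts 110 of 1,000 (11%).
z, p = z_test_two_proportions(20, 1000, 110, 1000)
```

With a lift this large the test is decisive; in practice the value of the check is catching the smaller, noisier lifts that quick MVP cycles tend to produce.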


What edge cases should senior UX leads watch for when responding competitively?

Jordan Mills: Two angles often overlooked:

  1. Platform differences: Competitor features that work well on native apps might not translate directly to Webflow’s web-first environment. Leaders need to anticipate these translation gaps.

  2. User segment variability: A competitor might launch a feature highly effective for affluent, tech-savvy students but irrelevant or confusing to underserved populations. Leadership programs should include strategies for segment-specific UX adaptation and testing.

Ignoring these can lead to wasted effort or, worse, alienation of core user bases.


How do you position leadership development internally to maximize impact on competitive response?

Jordan Mills: Positioning is key. Make clear that leadership development is not just a “nice-to-have” but a strategic weapon in the competitive landscape. At ApexPrep, we tied leadership metrics to business KPIs:

  • Speed of competitor insight to prototype
  • Number of competitive moves effectively matched or surpassed
  • UX-driven improvements in conversion and retention

By framing LDP outcomes in these terms, senior leaders were more invested in resourcing and championing these programs.


Can you share an example of a leadership development intervention that shifted competitive positioning?

Jordan Mills: Absolutely. About a year ago, we noticed a nimble competitor launching personalized study dashboards that users rated 25% more engaging (via Zigpoll data). Our leadership team initiated a rapid response LDP module focused on “UX competitive intelligence,” combining ethnographic competitor research with Webflow rapid prototyping.

The results:

  • Within 3 months, two Webflow-based prototypes we tested improved engagement by 17%.
  • Leadership teams reported a 40% boost in confidence when identifying competitor UX gaps.
  • Our conversion rates on practice subscription upsells improved 8% in the subsequent quarter.

This program was not perfect—the rapid prototyping occasionally introduced bugs—but it accelerated our responsiveness significantly.


What tools and frameworks do you recommend embedding in leadership training for UX researchers in test-prep firms?

Jordan Mills: A few essentials:

  • Competitive UX audits: Teach leaders how to systematically analyze competitor flows, focusing on bottlenecks, language, and behavioral cues. Tools like Hotjar for heatmaps combined with direct user feedback from Zigpoll create richer insights.
  • Design sprint facilitation: Leadership should be adept at running compressed cycles for ideation and validation. Webflow’s intuitive design system makes this more feasible.
  • Data-driven decision frameworks: Leaders must use quantitative KPIs alongside qualitative inputs. Combining NPS surveys, Zigpoll feedback, and conversion funnel analytics provides a multifaceted view.


What pitfalls do you often see in leadership development programs aimed at competitive response?

Jordan Mills: Overemphasis on speed without strategic filtering. Some leaders dive into copying competitor moves wholesale without considering fit or user context. This leads to resource drain and user confusion.

Another common problem: inadequate cross-functional collaboration. Competitive response requires input from product managers, marketers, and engineers. LDPs that silo UX leadership miss out on critical perspectives, slowing down execution.

Finally, ignoring platform constraints—especially with Webflow’s tradeoffs—can mean features never fully integrate, forcing costly later rewrites.


How should senior UX researchers measure the success of these leadership development initiatives?

Jordan Mills: Look beyond training completion rates or immediate participant satisfaction. Focus on outcome-oriented metrics like:

  • Reduction in time between competitor feature launch and internal response
  • Improvements in key UX KPIs, e.g., conversion rates, engagement metrics, and NPS, segmented by cohorts exposed to leadership-influenced changes
  • Feedback from internal stakeholders on leadership’s strategic value

Surveys using tools such as Zigpoll to capture team sentiment post-LDP sessions are valuable but should be paired with hard data.
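The first of these metrics, the lag between a competitor's launch and the internal response, can be computed from a simple release log. The dates and log layout below are hypothetical, just to show the shape of the calculation.

```python
from datetime import date
from statistics import median

# Hypothetical log: (competitor feature launched, our response shipped)
events = [
    (date(2024, 1, 10), date(2024, 2, 21)),
    (date(2024, 3, 5),  date(2024, 4, 2)),
    (date(2024, 5, 20), date(2024, 6, 10)),
]

days_to_respond = [(shipped - launched).days for launched, shipped in events]
print(median(days_to_respond))  # median response lag in days -> 28
```

Tracking the median rather than the mean keeps one slow outlier project from masking an overall improvement across cohorts.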


Final actionable advice for senior UX researchers leading leadership development focused on competitive response?

Jordan Mills: Prioritize specificity—build leadership capabilities around concrete competitor signals relevant to your K12 test-prep niche. Train leaders to ruthlessly prioritize based on feasibility within Webflow’s environment. Embed rapid, iterative validation cycles to temper speed with quality.

Don’t forget edge cases—platform limits and diverse user segments can derail mimicry strategies. Use multidimensional data streams, including Zigpoll and heatmaps, to inform decisions.

Finally, tie program goals explicitly to business KPIs and ensure cross-functional alignment. Leadership development is an investment with returns only visible when it drives real competitive advantage.


This nuanced approach helped ApexPrep turn leadership development from a routine HR exercise into a vector for tangible market gains. For senior UX researchers, the challenge is not simply what to build, but how to enable leaders who can sense and respond to competitor moves with precision and speed.
