Interview with Dr. Elena Marchand: Smart Design Thinking Workshop Strategies for Executive Customer-Support Leaders

Can you clarify what often gets misunderstood about design thinking workshops, especially in large automotive electronics companies?

Many executives assume that design thinking workshops are primarily creative brainstorming sessions driven by intuition. That’s a misconception. In complex automotive electronics environments, relying solely on subjective input creates risk. These workshops must integrate rigorous data analysis and evidence-based decision-making throughout, or they drift into wishful thinking.

This means before any ideation starts, customer support leaders need granular, validated data about client pain points, product failure patterns, and support process bottlenecks. For instance, an automotive supplier producing ADAS (Advanced Driver-Assistance Systems) modules can’t base design improvements only on anecdotal complaints. They require quantitative feedback, call center analytics, and failure mode data linked to vehicle telematics.

Ignoring this results in solutions that might sound innovative but don’t measurably improve key metrics like first-contact resolution rate or reduce warranty claims. A 2024 Forrester report revealed that organizations embedding analytics into design workshops raised their customer satisfaction scores by up to 15% within one fiscal year — compared to a 3% lift for those that did not.

What should an executive prioritize when structuring these workshops around data-driven decision-making?

Start by framing the workshop’s purpose with clear, measurable goals aligned to strategic KPIs, such as reducing electronics-related downtime or decreasing call escalations for infotainment faults.

Next, curate relevant data sets ahead of time. This includes:

  • Call volume and resolution trends from your CRM
  • Warranty claim and repair data from partners and OEMs
  • Customer sentiment analysis from surveys and social media monitoring tools
  • Telemetry data indicating real-time system faults

Surveys using platforms like Zigpoll or Medallia can add structured qualitative data to complement quantitative sources.
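As a minimal illustration of this curation step, the sketch below joins per-module metrics from the sources listed above into a single workshop-ready view. The record shapes, field names, and figures are hypothetical stand-ins for what a real CRM, warranty database, and telematics platform would export.

```python
from collections import defaultdict

# Hypothetical, simplified exports; in practice these come from your
# CRM, warranty database, and telematics platform respectively.
crm_calls = [
    {"module": "infotainment", "tickets": 420, "avg_handle_min": 11.2},
    {"module": "adas_camera", "tickets": 310, "avg_handle_min": 18.5},
]
warranty_claims = [
    {"module": "infotainment", "claims": 96},
    {"module": "adas_camera", "claims": 41},
]
telemetry_faults = [
    {"module": "infotainment", "fault_events": 1520},
    {"module": "adas_camera", "fault_events": 230},
]

def build_workshop_dataset(*sources):
    """Merge per-module metrics from several sources into one view keyed by module."""
    merged = defaultdict(dict)
    for source in sources:
        for record in source:
            module = record["module"]
            merged[module].update(
                {k: v for k, v in record.items() if k != "module"}
            )
    return dict(merged)

dataset = build_workshop_dataset(crm_calls, warranty_claims, telemetry_faults)

# Rank modules by support tickets per telemetry fault to spot
# disproportionate support load before the workshop begins.
for module, m in sorted(
    dataset.items(),
    key=lambda kv: kv[1]["tickets"] / kv[1]["fault_events"],
    reverse=True,
):
    print(module, round(m["tickets"] / m["fault_events"], 2))
```

A real pipeline would pull from live systems and handle schema mismatches, but even a consolidation this simple gives workshop participants one shared view instead of three departmental ones.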

Inviting cross-functional stakeholders — product engineers, field techs, customer support analysts, and supply chain leaders — ensures diverse perspectives in interpreting this data. Use data visualization tools during workshops to keep discussions anchored in facts, not assumptions.

Finally, design experiments to test hypotheses generated during the session. For example, if a workshop surfaces that a particular sensor’s calibration errors cause 20% of support tickets, prototype a software patch and measure its impact on ticket volume before full rollout.
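A hypothesis test like that can be kept very lightweight. The sketch below (with invented pilot numbers) compares ticket rates between a patched test fleet and an unpatched control fleet using a standard two-proportion z-statistic, so the team can judge whether an observed drop is likely real before committing to full rollout.

```python
import math

# Hypothetical pilot figures: tickets observed over the same period
# for a patched test fleet and an unpatched control fleet.
patched = {"vehicles": 500, "tickets": 40}
control = {"vehicles": 500, "tickets": 100}

def ticket_rate(group):
    """Tickets per vehicle over the observation window."""
    return group["tickets"] / group["vehicles"]

def two_proportion_z(a, b):
    """Approximate z-statistic comparing two tickets-per-vehicle rates."""
    p1, p2 = ticket_rate(a), ticket_rate(b)
    pooled = (a["tickets"] + b["tickets"]) / (a["vehicles"] + b["vehicles"])
    se = math.sqrt(pooled * (1 - pooled) * (1 / a["vehicles"] + 1 / b["vehicles"]))
    return (p2 - p1) / se

reduction = 1 - ticket_rate(patched) / ticket_rate(control)
z = two_proportion_z(patched, control)
# |z| above ~1.96 means the difference is unlikely to be chance at the 5% level.
print(f"relative reduction: {reduction:.0%}, z = {z:.2f}")
```

The point is not statistical sophistication but discipline: the workshop's hypothesis becomes a number the team can confirm or reject before scaling the fix.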

Are there particular challenges or trade-offs when running these workshops in large enterprises with thousands of employees?

Large enterprises enjoy deep troves of data but suffer from organizational fragmentation, making data access and alignment difficult. Too often, data lives in departmental silos — warranty teams, support centers, and product quality groups don’t share insights routinely. This fragmentation clouds the full picture.

A second challenge is scale. Workshops must balance inclusivity with focus. Inviting 30 people risks unproductive debates; having only five risks missing critical insights. One automotive electronics firm we worked with capped workshop participants at 12, split into two sessions to manage scale while maintaining effectiveness.

Another trade-off is time versus depth. Executives want quick insights but meaningful data analysis requires time. In one example, an OEM’s support team spent three weeks preparing data dashboards before their design thinking sessions. This upfront investment yielded a 40% improvement in identifying root causes compared to previous ad hoc workshops.

This process won’t work well for smaller companies lacking data infrastructure or where customer support data is too sparse for meaningful analysis. However, even modest data collection improvements before workshop planning can help.

Can you share an example where a data-driven design thinking approach significantly impacted customer support in automotive electronics?

A Tier-1 supplier of cockpit electronics, supporting vehicles from multiple OEMs, faced rising customer complaints about intermittent display failures. Their support team held traditional brainstorming workshops focused on hardware redesigns, but call volumes continued climbing.

They shifted to a data-driven workshop approach, integrating customer call logs, repair shop diagnostics, and in-field failure telemetry. The data showed 60% of failures correlated with software glitches triggered by firmware updates. During the workshop, they hypothesized an over-the-air update protocol issue.

Post-workshop, they designed controlled A/B experiments deploying revised firmware to a test fleet. The result: a 75% reduction in support calls related to display errors within three months, directly attributable to insights from the data-anchored workshop.

This example shows how emphasizing evidence over intuition enabled the support team to pinpoint a non-obvious root cause and justify targeted resource allocation. ROI was clear: reduced support costs, improved customer satisfaction, and stronger OEM relationships.

How can executives measure the ROI of these workshops and connect outcomes to board-level metrics?

Quantifying impact starts by defining KPIs linked to strategic business goals before workshops begin. For automotive electronics customer support, relevant metrics include:

  • Reduction in support ticket volume and repeat calls
  • Decrease in average handling time per call
  • Improvement in first-contact resolution (FCR) rate
  • Lowered warranty claim rates and repair costs
  • Enhanced customer satisfaction (CSAT) and Net Promoter Scores (NPS)

Track these metrics in periods before and after implementing workshop-derived solutions. For example, a company that decreased infotainment-related warranty claims from 3.2% to 1.8% over 12 months could translate that into millions saved in repair and recall expenses.
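The arithmetic behind that kind of board-level translation is simple to make explicit. The sketch below uses hypothetical unit volumes and per-claim costs (the interview gives only the claim rates) to turn a rate reduction into an estimated dollar figure.

```python
# Hypothetical figures: translate a warranty-claim-rate drop into savings.
# Only the 3.2% -> 1.8% rates come from the example; volume and cost are assumptions.
units_shipped = 2_000_000        # infotainment units in the field
claim_rate_before = 0.032        # 3.2% claim rate before the workshop-derived fix
claim_rate_after = 0.018         # 1.8% claim rate after
avg_cost_per_claim = 250.0       # repair + handling cost per claim, in USD

claims_avoided = units_shipped * (claim_rate_before - claim_rate_after)
savings = claims_avoided * avg_cost_per_claim
print(f"claims avoided: {claims_avoided:,.0f}, estimated savings: ${savings:,.0f}")
```

With these assumed inputs, a 1.4-point rate drop translates into tens of thousands of avoided claims and a seven-figure saving — the kind of single number a board slide needs.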

Data from post-implementation surveys using tools like Zigpoll or Qualtrics can validate improvements in perceived support quality.

Communicating ROI to boards requires linking these operational improvements directly to strategic themes like brand protection, regulatory compliance, and cost efficiency. Framing insights as “through data-driven workshops, we shortened the mean time to resolution by 22%, reducing downtime impact on vehicle fleets and improving our Tier-1 partner standing” resonates with executive audiences.

What advice would you give to an executive customer-support leader planning their first data-driven design thinking workshop?

Start small but prepare meticulously. Don’t underestimate the effort needed to gather and clean relevant data beforehand. Partner closely with analytics teams to ensure data quality.

Choose a narrowly defined, high-impact problem area. For example, focus on support issues related to a single electronics module or vehicle platform rather than wide-ranging challenges. This drives sharper insights and more actionable outcomes.

During the workshop, enforce discipline to ground discussions in data and avoid drifting into abstract ideation. Use live polling tools like Zigpoll or Mentimeter to collect participant feedback instantly and keep alignment.

Plan post-workshop follow-ups that translate ideas into experiments, pilot programs, and measurable results. Assign clear accountability for data tracking and communicate progress regularly with leadership.

Remember, this approach requires culture change. Executives must champion the value of evidence-based decision-making and foster collaboration between data analysts, product teams, and customer support.

How do you see data-driven design thinking evolving in the automotive electronics customer support space in the next five years?

The integration of real-time vehicle telematics and AI analytics will deepen. Executives will increasingly rely on continuous data streams from connected cars rather than periodic surveys or support logs alone.

Workshops will evolve to become more dynamic, incorporating live data dashboards and simulation models to test hypotheses instantly. This shift will reduce the traditional time lag between issue discovery and resolution.

However, complexity will increase. Executives will need to balance data privacy, cybersecurity concerns, and the growing volume of IoT-generated data. Automated tools to curate and prioritize data for workshops will become essential.

Finally, a stronger emphasis on predictive analytics will allow customer support teams not only to resolve issues faster but also to anticipate and prevent failures before customers notice. This proactive approach will redefine competitive advantage in automotive electronics support.


This interview highlights how executive customer-support leaders in automotive electronics can drive measurable business value by rethinking design thinking workshops through a data-driven lens. By anchoring ideation in analytics, experimentation, and evidence, these workshops become a strategic lever to improve product quality, optimize support processes, and ultimately strengthen their position in a hyper-competitive market.
