Selectra's customer space


Web App / Energy Monitoring Dashboard / Continuous Product Development

Turning an overlooked dashboard into a product users actually engage with

Role:

Product Designer within a Loyalty Squad (PM + Engineers)

Stack:

Figma, Hotjar, Google Analytics

Outcome:

+60% interactions on the analysis block / -30% bounce rate / 3 min average session time post-launch

The Problem

Selectra's Customer Space is a multi-feature platform built for energy clients to monitor consumption, receive personalized advice, compare deals, manage contracts, and access a personal advisor. On paper, a compelling product. In practice, one feature was quietly failing: the energy consumption monitoring dashboard.

Discovery activities told a clear story: users were landing on the dashboard and leaving. They didn't understand what the data meant for them, they couldn't act on it, and the interface wasn't helping. The perceived value of the monitoring feature was so low that it was actively hurting retention: users who didn't engage with it were less likely to return to the platform at all.

My Contribution

As the sole designer embedded in the Loyalty Squad alongside a PM and engineers, I owned the full design scope for this cycle, from discovery through delivery, operating within a collaborative, cross-functional team where every decision was tested against product, technical, and business constraints.


Discovery & diagnosis


I contributed to a multi-method discovery phase: user surveys, qualitative interviews, behavioral analytics via Hotjar, and quantitative data from Google Analytics. The convergence point across all sources was unambiguous: users weren't finding value in the consumption analysis, not because the data wasn't there, but because the interface wasn't translating it into something meaningful or actionable.


Planning & prioritization


Working with the PM, I helped structure the cycle: reviewing outcomes from previously shipped features, assessing risks through combined quant/qual data, and breaking down the work into prioritized tasks. The goal was to make the dashboard feel less like a data dump and more like a personal energy advisor.


Design & iteration


I reworked the full dashboard across all breakpoints (mobile, tablet, desktop) with a consistent north star: simplicity and convenience of use. I produced lo-fi wireframes, refined components, built hi-fi mockups, and assembled an interactive prototype for usability testing before the delivery sprint. The feedback loop from testing let us catch and fix friction points before they reached production.

Key Decisions

Reframing the dashboard around insight, not data


The original dashboard displayed consumption figures. What users needed was interpretation (What does my consumption mean? Am I spending too much? What can I do about it?). I redesigned the analysis block to lead with contextual analysis and actionable advice rather than raw numbers. The shift was conceptual before it was visual: from data display to decision support.


Usability testing before the delivery sprint


At handoff stage, there was pressure to ship: we had solid data, a coherent design, and a clear direction. I pushed for usability testing anyway, because data tells you what's happening, not why, and a prototype test tells you what you missed. It surfaced friction points that would otherwise have reached production.

Impact

The results came in the week following deployment, and they confirmed the diagnosis was right and the intervention was on target.


+60% interactions on the analysis block: the feature users were previously ignoring became the most actively used part of the dashboard. That's not a marginal improvement; it's a behavioral reversal.


-30% bounce rate: users who landed on the dashboard stayed significantly longer and explored further, indicating that the redesigned experience was delivering enough value to warrant continued engagement.


3 minutes average session time: a meaningful benchmark in a product category where users historically checked in and left immediately. Three minutes signals that users were reading, reflecting, and acting on what they saw, not just glancing and bouncing.


Collectively, these metrics validated the core hypothesis: the problem was never that users didn't care about their energy consumption. It was that the product wasn't giving them a reason to. Fix the interface, fix the engagement.


Stack & Process at a glance


User Surveys / Qualitative Interviews / Hotjar Analysis / Google Analytics / Cycle Planning / Wireframing (All Breakpoints) / Component Design / Hi-fi UI / Usability Testing / Delivery Sprint / Post-launch Metrics