Orion Energy Technology is an energy platform provider. The platform is licensed to retailers to manage billing, payments, direct debits, energy usage, and home moves. Its current client is OVO Energy, one of the biggest energy companies in the UK. This project was about helping OVO Energy customers understand their energy usage. The Energy Usage page shows users their yearly, monthly, and daily energy consumption for electricity and gas. We also show comparisons so users can compare their past energy consumption with the current period.
The team was formed in May 2020, during the Coronavirus pandemic. We saw that the Energy Usage customer satisfaction (CSAT) score was very low at 3.50 against a target of 4.00. The challenge was to find out why the CSAT had dropped and how we could fix it. (The CSAT figures quoted here are indicative, not actual company data.)
Increased the customer satisfaction score from 3.50 to 4.05 in four months. (These figures are indicative and not company data, to protect confidentiality.)
The solution was to present complex data in a simple and concise manner to web and mobile customers.
I led the team to enhance the user experience of the energy usage area of the SaaS platform. I also brought a discovery-led approach to the team by setting up and conducting regular research and testing sessions.
In May 2020 the customer satisfaction score (CSAT) for the Energy Usage section was very low at 3.50. The challenge was to find out why the CSAT had dropped and how we could fix it. The team was formed during the Coronavirus pandemic, and company policy was not to disturb our customers with any additional research requests during these challenging times.
The OKR for the team was to increase the CSAT to 4.00. (These figures are indicative and not company data, to protect confidentiality.)
The team relied on user surveys, data analytics tools, and screen-recording tools to give us insight into user behaviour. Instead of interviewing customers, we interviewed customer care agents to uncover customer problems.
This was the team's first remote ideation session, and everyone used a different medium to express their ideas.
As there were many ideas, many business and user considerations, and a range of device sizes, I decided to experiment to find the best way to show usage. Below are a few different designs that we considered:
We ran interviews followed by usability testing with 5 web users and 5 mobile users.
By this time, the company had allowed designers to contact our customers for interviews and testing. We tested the designs below:
Based on the findings from testing, we made some changes to the mobile designs.
Design 1 for mobile has a floating button at the bottom of the experience. When the user taps kWh, a drop-down menu for comparison appears at the top right. The intention behind the floating button was that it would stay visible no matter where the user is in the experience; the intent behind the drop-down's position was that it would be easy to spot and would only appear once the user taps kWh, making it clear that comparison applies only to kWh. Design 2 for mobile has a fixed CTA button at the top right, with the comparison button at the bottom of the charts.
On further consideration, we understood that green can look the same as orange to colour-blind users. As we use orange in our charts to show electricity, we decided to replace the green with another colour from the brand palette, magenta, which would read clearly for all kinds of colour blindness.
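As a side note on how a palette choice like this can be sanity-checked programmatically: below is a minimal sketch, assuming Python with the colorspacious library, that simulates deuteranomaly (the most common red-green colour-vision deficiency) and compares perceptual distances between chart colours. The RGB values are illustrative placeholders, not the actual brand palette.

```python
# Minimal sketch: check how candidate chart colours hold up under
# simulated colour-vision deficiency, using the colorspacious library.
# The RGB values below are illustrative placeholders, not the brand palette.
import numpy as np
from colorspacious import cspace_convert, deltaE

colours = {
    "orange (electricity)": [255, 140, 0],   # placeholder
    "green (original gas)": [60, 170, 80],   # placeholder
    "magenta (new gas)":    [200, 30, 140],  # placeholder
}

# Simulate severe deuteranomaly, the most common red-green deficiency.
cvd_space = {"name": "sRGB1+CVD", "cvd_type": "deuteranomaly", "severity": 100}

def as_seen(rgb255):
    """Convert an sRGB 0-255 colour to its simulated CVD appearance (sRGB 0-1)."""
    return cspace_convert(np.asarray(rgb255) / 255.0, cvd_space, "sRGB1")

orange  = as_seen(colours["orange (electricity)"])
green   = as_seen(colours["green (original gas)"])
magenta = as_seen(colours["magenta (new gas)"])

# Perceptual distance (CAM02-UCS delta E); larger = easier to tell apart.
print("orange vs green:  ", deltaE(orange, green, input_space="sRGB1"))
print("orange vs magenta:", deltaE(orange, magenta, input_space="sRGB1"))
```

In a check like this, a noticeably larger delta E for orange vs magenta than for orange vs green would support the palette change.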
The final designs for desktop have more accurate copy, and the comparison feature sits higher in the information hierarchy. We also went back to the line chart for comparison, but made the line bolder and more prominent so the comparison is clear. I also designed the hover bubble with short but clear copy that supported the line chart with the right information. This tested well with users: 4/5 said it was clear and easy to spot.
The final designs for mobile have a fixed CTA button with a dropdown bar under it. This tested well with users: 5/5 could find the CTA button and the comparison.
We decided to launch one feature at a time and measure the CSAT after each release. As we worked in two-week sprints, we could measure the CSAT every two weeks, or every four weeks if a purely technical sprint fell in between. This helped us understand what value each change was bringing and whether we were on the right path. These changes were made only for the desktop version.
With the change to the year/month/day top navigation, there was also a 32% increase in page views on the day page (previously this page was hard to find) in the first 30 days after launch. (These figures are indicative and not company data, to protect confidentiality.)