The core problem with the platform, and this work initiative, predated my joining M2030. Feedback from SCOs and Suppliers showed that they were unaware of their progress, of the minimum they needed to do within the platform, and of what each party wanted from the other. The lack of perceived value was a sensitive issue, and churn/inactivity was inevitable.
The idea of a redesigned landing page came out of the larger initiative to increase engagement with the reduction plan feature. The reduction plan allowed Suppliers to create or choose climate emission reduction projects to action within their facility or organisation, and also showed their intentions and progress to their SCOs.
From general user interviews with Suppliers and analysis of Mixpanel data, users:
- were not aware of the feature
- were not aware of the need/benefits to them
- did not find it valuable for their organisation for many reasons.
Although user personas had already been defined for the whole platform, the 1:1 interviews highlighted that recurrent end users fell into two main types, and these were who I concentrated on.
Further research into Customer Support emails/tickets indicated that users did not know what was being asked of them: was paying the fee enough? Was submitting measurements data enough to meet compliance?
Job titles were collected via the platform, giving insight into users' prior understanding of what the platform was for.
The assumption was that platform users were heads of departments, mostly those in charge of sustainability.
However, users whose job titles contained 'head'-related keywords turned out to be a small minority, and only a small section held a specific 'sustainability' role.
Instead, there was a significant, unexpected group of users from admin or sales/marketing backgrounds, who would not know the basic principles of a carbon emissions data sharing platform.
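The job-title grouping described above can be sketched as simple keyword matching. This is a minimal illustration only: the titles, keyword lists, and group names are hypothetical, not the real M2030 data or logic.

```python
from collections import Counter

def classify_title(title: str) -> str:
    """Bucket a free-text job title into a rough user group (illustrative keywords)."""
    t = title.lower()
    if any(k in t for k in ("head", "director", "chief")):
        return "head"
    if "sustainab" in t:  # catches "sustainability", "sustainable", etc.
        return "sustainability"
    if any(k in t for k in ("admin", "sales", "marketing")):
        return "admin/sales/marketing"
    return "other"

# Hypothetical sample of collected job titles
titles = [
    "Head of Operations",
    "Sustainability Manager",
    "Sales Administrator",
    "Marketing Executive",
    "Production Engineer",
]

counts = Counter(classify_title(t) for t in titles)
print(counts)
```

A real analysis would need a much richer keyword list and manual review of unmatched titles, but even this rough bucketing is enough to surface an unexpected group like admin/sales/marketing.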
As learnt from the 1:1 Supplier calls, the dedicated main user had picked up knowledge of the platform via colleagues, rather than through the multiple learning resources provided by M2030.
This meant that features of the platform were missed or misused.
Mixpanel data showed that views of the action plan or action library did not convert into viewing more information about an action or shortlisting it.
For the Suppliers we were going to interview, I analysed their events relating to actions and reduction plans, to better understand their sequential journey through the platform and to detect patterns in their login dates and activity.
Although the platform prescribed the step-by-step process of adding facility measurements first, then adding reduction plans/actions, the users here did the opposite.
This indicated that different users potentially had different assigned responsibilities, or that they found adding actions easier and more accessible than adding measurements (a time-consuming task).
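The sequence check behind this finding can be sketched as ordering each Supplier's events by timestamp and testing whether their first action pre-dates their first measurement. The event names, IDs, and timestamps below are hypothetical stand-ins for a Mixpanel export, not real data.

```python
from datetime import datetime

# Hypothetical event log: (user_id, event_name, ISO timestamp)
events = [
    ("supplier_a", "shortlist_action", "2023-03-01T09:00"),
    ("supplier_a", "add_measurement", "2023-03-08T14:00"),
    ("supplier_b", "add_measurement", "2023-03-02T10:00"),
    ("supplier_b", "shortlist_action", "2023-03-02T11:00"),
]

def first_event_time(user, name):
    """Earliest timestamp of a given event for a given user, or None."""
    times = [datetime.fromisoformat(ts) for u, n, ts in events
             if u == user and n == name]
    return min(times) if times else None

def actions_before_measurements(user):
    """True if the user's first action pre-dates their first measurement."""
    a = first_event_time(user, "shortlist_action")
    m = first_event_time(user, "add_measurement")
    return a is not None and m is not None and a < m

for user in ("supplier_a", "supplier_b"):
    print(user, actions_before_measurements(user))
```

Run per interviewee, a check like this separates users who followed the prescribed measurements-first flow from those who went straight to actions.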
Each section was being treated as equally important as the others.
Although AI-recommended Actions appeared on the landing page and were the easiest to access, they were not the most common route by which users shortlisted an action. Feedback from calls indicated that users appreciated them but did not always find them suitable; the AWS algorithm was not set up to be customisable by industry type.
Adjusting this was deemed out of scope by the PM and devs.
The flow from viewing an action to shortlisting it was convoluted, involving viewing the current plan, adding multiple filters, and returning to the landing page. If the landing page (named the dashboard) was a page users returned to repeatedly, how could I make the AI-recommended actions more obvious and relevant?
The main JTBD for Suppliers is to enter their facilities' emissions measurements. With this entered, a carbon footprint is calculated and a viable reduction plan can be formed. Looking again at Mixpanel data, the majority did not convert through the desired flow of viewing measurements and, at minimum, saving the data; the final event is submitting the data.
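The measurements funnel above (view, then save, then submit) can be expressed as step-to-step conversion rates. The step names and user sets here are hypothetical placeholders for Mixpanel funnel data.

```python
# Hypothetical sets of user IDs observed at each funnel step
viewed = {"s1", "s2", "s3", "s4", "s5"}
saved = {"s1", "s2", "s3"}
submitted = {"s1"}

def conversion(step_from: set, step_to: set) -> float:
    """Share of users at one funnel step who reached the next."""
    reached = step_from & step_to  # only count users seen at both steps
    return len(reached) / len(step_from) if step_from else 0.0

print(f"view -> save:   {conversion(viewed, saved):.0%}")
print(f"save -> submit: {conversion(saved, submitted):.0%}")
```

Tracking each step pair separately, rather than only end-to-end conversion, shows where in the flow users drop off.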
How could I improve this conversion, to unlock the value of creating and inputting a reduction plan?
From an Unknowns and Knowns matrix, I could ascertain the main weaknesses and strengths, to focus on an MVP scope.
From this initial research, the PM and I put together our assumptions, to better decide what to do and how to prioritise.
What were the risks in our assumptions and going forth with the decisions we had made?
There were many options for the user within the main top navigation bar; it was not feasible to display all of these features on the landing page.
As a designer, I used general design principles to analyse the current landing page.
The onboarding experience was, at this time, limited to an information box. (I did later improve the onboarding experience - please see this project)
The cards were in no particular order and were cognitively overwhelming for a new user. The excess of text was difficult to translate and users were not reading it, as was evident from the volume of support tickets still being received.
Alongside the cards sat multiple, dominant recommended actions, which were not always relevant to the sustainability maturity of the organisation or the user, and which took up a large part of the screen.
Importantly, the layout was not scalable: with upcoming features such as PCF, a dedicated Help Center, a Knowledge Hub, target settings... there was a risk that continuing this format, with all the additions needed, would overwhelm users and force them to scroll to find what they needed.
The main carbon footprint report drew user feedback that:
- it was difficult to understand at first glance.
- it didn't provide the consolidated CO2e figure that was needed.
- the default timeline view (with months displayed) wasn't needed as an overview, only as an export.
- users had not input any savings data, so that part of the report was not understood; the same applied to the NACs.
- as the report sat on a separate page, within the 'reports' tree, it was rarely accessed, if at all.
Within my design iterations, I started with sketches of different types of data in differing layouts, as well as deciding how to rework the footprint chart.
There was a need to reuse components and stay within the NG Zorro design system.
The wireframe underwent user testing with several Suppliers, with feedback gathered in a spreadsheet. Questions were designed by myself and the PM. The goals of the testing were to:
- acquire general open-ended feedback
- measure time on task for the prompt 'where would you click to go to start, or to look at what actions have been shortlisted?' (note the deliberate avoidance of direct UX copy)
- understand whether the data visualisation was comprehensible and useful to them
- understand whether the UX copy made sense and gave enough instruction
- see whether the RAG status was motivational enough to drive users to explore further
Feedback was requested via a Microsoft form sent to selected Suppliers, as well as through a banner implemented via Appcues, a piece of software that works on top of the existing UI.
The volume of feedback was small, and it was mostly positive. Negative feedback was followed up for further clarification, but no responses were received.
After the design had been live for a few months, I returned to Mixpanel to analyse its impact. Had page views of the features supporting the main JTBD increased? Were the intended conversions happening? Below are conversion graphs from the main JTBD links on the landing page to the desired event.
It is hard to compare like for like, as the original data covered the total user base over an annual period.
At this point the initiative ended as the product changed direction, and, being a small team, we were reprioritised to other initiatives.