Please note: the screenshots are intentionally low quality, as this employer does not permit sharing of high-quality assets.
———
1. Background
At the start of 2023, I was approached by the Loyalty & Rewards team with a request to explore a potential design solution to a very specific problem:
The exclusive events we have for our Premier customers were lacking a digital presence and therefore, scalability.
With the focus and attention the Premier tier was getting from the business, this seemed like a perfect opportunity to apply my expertise as a design lead: rally my new stakeholders around the most optimal solution, solidify the role of Design in the process, and evangelise methodologies like Design Thinking.
The clear backing from above also meant a chance to introduce the concept of experimentation, which resonated very positively with the working group and stakeholders: it meant less pressure to get things "perfect" on the first try, and more space to test ideas, iterate and learn.
———
2. Design sprint
The Product Owners admitted they had not done anything like this before, which naturally prompted me to step into a leading role. I set up and ran the sprint over roughly three weeks, with approximately two sessions per week, to respect people's busy diaries and leave enough time to digest feedback and plan next steps.
I put a lot of emphasis on the stakeholders' understanding of every step of the process, and invited them to contribute both during the sessions and offline.
Running the sprint remotely meant dealing with the lack of face-to-face engagement, so I planned it to be interactive and collaborative, using our virtual whiteboarding tool, FigJam.
The success of the sprint relied heavily on collaboration. In my facilitation I ensured we had enough input and agreement to progress quickly and keep momentum and focus, while allowing enough time for the main anchoring points (the problem statement, success metrics, design principles and personas) to be formalised and fully agreed.
Setting up the testing sessions with the researcher meant we could act on customer feedback quickly, and agree a delivery plan with a provisional live date in under two weeks.
Planning and facilitation
Inspired by the Google Sprint philosophy, I allocated time in each session to a specific stage of the Design Thinking methodology, steering the conversation from distilling the problem statement and ideating solutions, to prioritising high-value ideas to then draft, prototype and test.
Problem statement
"If I had an hour to solve a problem I'd spend 55 minutes thinking about the problem and five minutes thinking about solutions."
- Albert Einstein
The first session was naturally the most important. I had the attendance and attention of people new to me, some of whom were new to Design Thinking, sprints, and even to working with designers. So the right introduction to the process, and buy-in from all stakeholders, were extremely important.
I made it clear that attendance from the core working group (design, POs) was required, while the other disciplines (who we called "experts") were encouraged to attend.
I found rallying the team around the solution much more efficient once I had clearly explained my role and, more importantly, their roles as contributors.
Our distilled problem statement ended up as:
"We currently lack a CENTRAL PLACE for customers to FIND experiences/events that are VALUABLE, RELEVANT (PERSONALISED) and EXCITING enough to ENGAGE them, via a FLEXIBLE/SCALABLE implementation, to drive NPS/SATISFACTION levels up"
Ideation
So what can be done?
To encourage the group to share ideas, I used a few techniques: "rose, bud and thorn", affinity diagramming and the impact vs effort matrix.
Voting as a group helped identify the agreement level needed to prioritise the ideas with the most consensus.
The high-impact/low-effort ideas were then selected collectively, to be used in the design drafts and the prototype testing.
Stakeholders
It was important to have as many people from different departments attending the sessions as possible. Having everyone at the table from the get-go meant fewer surprises later and smoother progress overall. This was my soft push towards "agile", so I emphasised the benefits of all the stakeholders connecting, in an endeavour to challenge the "waterfall" processes they were used to.
We had a very good attendance from multiple teams, as seen on the left.
In addition to letting every participant in the sprint introduce themselves by their name and role, I asked them to also provide a short summary of the expected outcome of the sprint, from their discipline perspective.
This resulted in a deeper subconscious connection to the project, as well as a feeling of personal responsibility for its success, with their names pinned on the board and visible to all.
Proto-personas
One of the sessions was dedicated to the people we envisioned using the product. It was important to consider every relevant attribute, be that demographics, interests, NPS, digital engagement levels or more.
We used the term "proto-persona" to emphasise their speculative nature, and the fact they would be validated with user testing.
The result of the session was not only a deeper understanding of how our proposed solution would meet each proto-persona's needs and goals, but also increased empathy, thanks to the names and the user stories written for each one.
Experience mapping
The next step was to overlay the customer's actions and goals onto a linear progression through the journey. This helped us understand the exact stages of the experience and the touchpoints with specific systems, channels and communications. I always find this mapping exercise useful for building further empathy with the user while envisioning the journey and identifying potential points of friction to address.
This resulted in a clearer vision for the next step: drafting the designs.
Wireframing
I’m a proponent of a somewhat unorthodox approach: drafting/wireframing designs together with the Product Owner/Manager. Throughout my career I’ve found that this method further solidifies their sense of involvement, increases empathy with the user and prevents surprises later on, in the higher-fidelity design stages.
After putting the screen flow together and reviewing it with all the stakeholders, I could then “add the skin to the bones” with our UI component library.
UI
My main goal while designing the journey in high fidelity was to ensure consistency with the rest of the app, and to clarify each required component with our Brand team and the developers, to prevent blockers in the build. Having a predefined Figma library of course made this phase easier.
User testing
"Pay attention to what users do, not what they say."
- Jakob Nielsen, Co-founder of Nielsen Norman Group
Before engaging the Research team to facilitate a moderated user testing session for our prototype, I ran another session in which I asked the group to collaborate on the testing plan.
The Researcher appreciated having a clearer idea of what we wanted to find out, which helped him complete the facilitation script faster.
The core group then attended the sessions to ensure we captured user behaviour, as well as users' thoughts and feelings.
Outcomes
To help the stakeholders understand the Design Thinking method, I've aligned the sprint sessions to the Double Diamond framework, which was much more widely promoted in the business. This resulted in a clearer focus on the objective in each phase.
Within a couple of weeks we had a fully interactive prototype, which we tested externally and iterated on accordingly, following the insights and broader feedback.
———
Technical delivery
From the sprint's output we had about two months to delivery. We were introduced to the dev team, the copywriter, the Insights team and the third-party provider.
In these two months we held daily stand-ups with actions to overcome various technical challenges.
For me it meant constantly supporting development, raising and discussing solutions to potential risks (design- and security-related), tweaking copy and designs, and ensuring the design files were up to date at every step.
I have to note that another factor in the project's success was everyone's genuine interest and input in each other's fields of expertise. It meant no team member was left to work in isolation, simply producing their own output.
There was no time for a "waterfall" development, so we had to be agile!
It was a very respectful and democratic process throughout, and thanks to the prior customer testing we could rely on facts, data and insights as the anchor for discussions. We all learned a lot about each other's jobs!
———
Tech challenges
Challenge: proposed experience not achievable without app release.
Solution: hardcoded CSS and releases to a .co.uk demo micro-site.
Result: delivery was possible via the alternative route, and with support of a third party.
Challenge: customer data capture and transfer back seemed impossible!
Solution: use of cookies and an approved agency platform to capture the data.
Result: the Insights platform, combined with customers entering an identifier code, enabled us to map the data back and enrich it through the agency, so we could learn more from the experiment.
Design challenges
Challenge: the delivery date for the new brand kept being pushed back.
Solution: translate the designs into the legacy design language.
Result: consistency with the current visual language was ensured when the feature went live.
Challenge: there was no prior template, as this was a brand-new feature.
Solution: design the experience from scratch, taking into account the multiple channels it spanned and maximising each platform's interaction capabilities.
Result: once the flow was validated by testing, we had a new blueprint for similar experiences we'd consider in the future, including the strategic long-term solution.
———
Final designs
The approved design naturally grew into a plethora of screens, ensuring we had thought through every step of the journey, happy and unhappy paths included.
Per the challenge above, I had to help the team pivot from one design language to another, to align with the look and feel of the app at the time of launch. This is where my Figma skills came into play, along with my ability to delegate work between myself and the designers I manage, to maximise efficiency.
———
Quantitative analysis
The first thing I did post-launch was prompt the team to find us an Analytics partner, so we could extract the traffic data coming through.
I wanted to visualise the step-by-step journey with traffic numbers, emphasising the drop-out figures, so we could understand where customers hit pain points or friction that prevented them from completing the journey and signing up to an event.
This quantitative analysis proved very useful and contributed to our overall learnings, alongside the qualitative research (a user survey) that followed.
Another positive outcome was the collaboration with the Analytics team, who now understood the reasoning behind my request and the impact their data had on the success of our experiment.
———
Qualitative analysis
The second step was to highlight the need to complete the picture with customer insights and verbatim feedback, via a survey we'd send out for customers to complete.
My rationale was that this would answer the question "why?" alongside the question "what?", which we had already answered with the traffic funnel analysis.
Working with the research lead, we came up with a survey that gathered user sentiment at every step of the journey, using both open and closed questions.
The result of this analysis was a clearer set of focus areas where customers scored the on-screen interaction/information lower, and confidence in the areas where sentiment was positive.
———
Performance metrics
Out of the 10K customers we aimed for, we reached 7,824.
1,282 of the 1,351 impressions on the entry point led to an interaction with the first call to action to start the journey.
846 out of 1,000 customers submitted their interests, a vital step in the journey (the input required to indicate a potential conversion).
The engagement only fluctuated by 22% from week 1 to week 4 post-launch.
CTR% increased from 4% on week 1 to 64% on week 3!
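As a quick sanity check on the figures above, the step conversion rates can be recomputed from the reported counts. A minimal Python sketch (the step labels are illustrative descriptions, not the product's actual analytics event names):

```python
# Funnel steps as (label, entered, completed), using counts reported in this case study.
funnel = [
    ("Reach (target 10,000)", 10_000, 7_824),
    ("Entry-point CTA interaction", 1_351, 1_282),
    ("Interests submitted", 1_000, 846),
]

for label, entered, completed in funnel:
    rate = completed / entered * 100
    # Print the conversion rate and the corresponding drop-off for each step.
    print(f"{label}: {completed}/{entered} = {rate:.1f}% (drop-off {100 - rate:.1f}%)")
```

This is the same drop-off view we built with the Analytics partner, reduced to the three steps with published counts.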
Digital events will have a positive impact on NPS, even for customers who don't win freebies/high value assets.
The survey highlighted a need to scale up to more diverse and unique content, which is what I'm currently doing for the next iteration of the product.
———
Hypotheses validation
“Premier customers will have the appetite to view their experiences (events) in the app.”
17.3% clicked through to “find out more”, which elevated engagement for the component it’s placed on from 4% to 6% during the experiment.
Conclusion: there’s a high level of interest in the feature, and the entry point placement is optimal.
“Customers will be willing to share their preferences in order to access relevant content.”
Of the 1,351 customers who clicked the Insight card, 968 (72%) progressed to “Choose your experience” and 846 (87%) went on to complete and submit their preferences.
Conclusion: 87% of customers are willing to share this information, and active engagement can optimise the funnel further.
“Customers will find it useful to register for events that are ‘coming soon’.”
40% of customers went down the ‘coming soon’ route for the Football category; 34% of customers went down the ‘coming soon’ route for the Tennis category.
Conclusion: yes. Customers showed a willingness to register for events ahead of time, provided it meant a more personal experience.
“Customers will return to access the events on a weekly basis.”
Inconclusive: it’s likely, but more testing is required on different engagement mechanisms.
———
Outcome and key findings
Phase 1 Results
The results from Phase 1 were extremely promising in terms of customer interest, willingness to share hobby information, and willingness to engage repeatedly.
We saw a 17% CTR on the insight (vs the 4-6% predicted). Of those who clicked through, 72% converted to choosing preferences, and of those, 87% went on to complete the journey with preferences submitted and events selected. 15-35% returned weekly to explore refreshed content, and over 30% of customers willingly completed the journey for experiences labelled 'coming soon'. Football and tennis were the most popular interest areas (40% and 34% respectively) of the options included.
There are a couple of reasons I'm proud to be a part of this project:
The business will not be afraid to experiment going forward. The benefit of continuous iterative deployment is clear: there is less risk when a smaller number of customers see a change, and it creates a safer space for testing ideas once the pressure of having one chance to get it right is removed. This might be the remedy to our historic issue of long development processes and of shelving good projects due to complexity.
I'm hoping for this to be used as a case study for how a design(er)-led project benefits the business, dispelling any remaining doubts and proving we're more than just a UI resource.
One team, one goal: we worked as a single unit, challenged each other, and delivered designs and change iteratively and rapidly.
I believe I've managed to establish the I&D and our pod as a go-to team to kick-start new ideas and projects, with now established expectations of the process.
I've never seen such a fast delivery in my 3.5 years at Barclays (even if it was for a smaller cohort of customers). The impact on our ways of working is already evident, and the fact that Matt Hand and Paul Marriott-Clark were actively involved makes me feel I've put myself, our pod and the I&D team in the spotlight in a very positive way.
By bringing new teams on board (Analytics, Sponsorship, Marketing and others), finding solutions to the technical constraints and resource gaps together, and implementing customer feedback, we ensured every possible approach was taken to make the user experience the best it could be.
———
Next steps
Based on our learnings and the now established working pattern, I am now proposing further design improvements, e.g.:
video instead of imagery at the entry point, to further improve engagement
more diverse and appealing content, based on more granular interest selection
push notifications for continuous engagement and updates on the digital events going live
Roadmap and responsibilities
The next phase of delivery requires an updated vision, which prompted me to run a workshop with the team to refine our hypotheses based on what we've learnt post-launch.
I then marked each required piece of work with the name of the team responsible.
This is now helping us have clearer expectations of each team's involvement, to ensure an even more efficient delivery.
———
Thanks for reading :)