It’s four years since I wrote How to Do a UX Review on 24 ways. Since then I have done many more UX reviews for large organisations, growing companies and startups, and I’ve learnt an awful lot. It’s time to share that knowledge.
Contents: Questions to ask before you begin the UX review | Inputs before you start the UX Review | UX Review Workshop | UX Review Final Report | Doing the Review | Download Templates
Updated for April 2021: added questions to ask before you begin.
There’s lots of detail here so grab yourself a nice cup of tea and let’s get started.
What is a UX Review?
A UX review is where an experienced UX practitioner takes data, insights and business metrics, assesses the quality of the user experience of a website, app or other digital product, and makes evidence-based recommendations for improvement.
Why conduct a UX review?
The best way to improve a user experience is to conduct research with users. However, time and budget constraints mean that is not always possible. Here are some reasons why you’d conduct a UX review rather than user research (taken from my 2015 article):
- Quick results: user research and analysis takes at least three weeks.
- Limited budget: the £6–10,000 cost to run user research is about twice the cost of a UX review.
- Users are hard to reach: in the business-to-business world, reaching users is difficult, especially if your users hold senior positions in their organisations. Working with consumers is much easier as there are often more of them.
How to do a UX review
We look at three things overall.
- Data: What is / isn’t working (From Analytics, Business Metrics, A|B Tests, Email & Social campaigns)
- Insight: Why something is / isn’t working (From User Personas)
- Recommendation: How to improve the things that aren’t working (Quick wins: usability, Long term: user experience, Actions)
I’ve seen plenty of reviews that miss out one or more of these. That’s a sure-fire way to mess up. Stay tuned to find out why.
Questions to ask before you begin
Rather than having the right answers, it’s better to have the right questions. Here’s my list of go-to questions. Once you know the answers, the path to improvement becomes much more obvious.
Build from
We want to learn who your users are and what they like, and use this to generate new ideas and focus on what works now. Many websites don’t do this basic thing.
- Who are your customers?
- What do they love about you?
- What are you most proud of?
- What is the key metric you use to judge the success of your website?
- Describe your sales pipeline.
Fix and improve
Here we look at what isn’t working in, say, a SaaS sales pipeline. Once you know the pain points you can improve them; it’s surprising how many websites don’t take this approach.
- Where are your sales pain points?
- What do you wish you knew about your website visitors?
Technical challenges
If your website isn’t easy to update, then you are already at a disadvantage. Typically this is because the site is hand-crafted in some esoteric (and very cool) framework.
- Tell me about your website platform. Does it make it easy (i.e. without a developer) to:
a. Update copy frequently
b. Understand user behaviour
c. Publish articles and blog posts
d. Run A|B tests
If you can’t do these things easily then you won’t succeed.
Acquisition channels
Once we know which channels bring users in, we can optimise for them and use the learnings from the successful channels to improve the weaker ones.
- Tell me about acquisition channels:
i. SEO
ii. Paid search
iii. Display / Social
iv. Email
v. Content marketing
- Which of these channels work well for you? Why?
Future gaze
We need to know the future plans so we can help get there.
- In a perfect world, how would your website work?
- What’s stopping you from doing this right now?
Then we help do THAT!
From the answers to these questions you’ll get an idea of what we need to do to fix the UX. Now we need to look at further data to understand what is happening now and what we can fix.
Inputs before you start the UX Review
Inputs help us frame the eventual outcomes if our recommendations are successful. We need to know what metrics to improve, that’s both in terms of data analytics but also business drivers.
To do the UX review right you need data about what is working and what’s not. (That’s right, numbers. That scared me the first time – I only got a C in maths at school – but actually it’s not that challenging.)
We’ll look at understanding why something is or isn’t working later on in the review.
Once we know what isn’t working and why, we can make recommendations to fix it.
Analytics Data
Data helps show where things are going well and what needs improvement.
If you’re not sure what analytics package is being used, BuiltWith.com can tell you.
Google Analytics
If all this talk of data scares you, grab a copy of Researching UX (I’m the series editor, so I know it’s a great book!)
If you are using paid advertising, look at the Google or Facebook reports. Look for the best performing keywords / ads. These show user intent – that is, your users’ goals.
There are two things to look for in Google Analytics to help with the UX review.
a. Landing pages and search terms
Landing pages are the pages users see first when they visit a website – more often than not via a Google search. Landing pages reveal user goals. If a user landed on a page called ‘Yellow shoes’ their goal may well be to find out about or buy some yellow shoes.
The thing to look for is high-traffic landing pages with a high bounce rate. Bounce rate is the percentage of visitors to a website who navigate away from the site after viewing only one page. A high bounce rate (over 50%) isn’t good; above 70% is bad.
To get a list of high-traffic landing pages with a high bounce rate install this rather nifty bespoke report.
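If you’d rather work from a raw export, here’s a minimal sketch of the same filter, assuming a CSV exported from Google Analytics. The file and column names here are hypothetical – adjust them to match your own export.

```python
import pandas as pd

# Hypothetical export of landing-page data from Google Analytics.
# Expected columns: page, sessions, bounce_rate (as a percentage).
df = pd.read_csv("landing_pages.csv")

# High traffic: top quarter of landing pages by sessions.
high_traffic = df["sessions"] >= df["sessions"].quantile(0.75)

# High bounce: over 50% isn't good, above 70% is bad.
problem_pages = df[high_traffic & (df["bounce_rate"] > 50)]
print(problem_pages.sort_values("bounce_rate", ascending=False))
```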
b. User / Behaviour flows
This report is fantastic. It shows you how users move through the website: the order of pages they visit (shown in green) and where they drop out (shown in red).
The report begins at landing pages, and the pages we are interested in are the ones with a large drop-out, shown as red arrows pointing downward. If users are dropping out, something is wrong with the user experience on that page / step.
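The behaviour flow report does this visually, but the underlying sum is simple. Here’s a minimal sketch of the drop-out calculation, with hypothetical step names and counts:

```python
# Hypothetical funnel: page views at each step of a journey.
funnel = [
    ("Landing page", 10_000),
    ("Product page", 6_200),
    ("Cart", 1_800),
    ("Enter payment", 1_600),
]

# Drop-out between consecutive steps – the 'red arrows' in the report.
for (step, users), (next_step, next_users) in zip(funnel, funnel[1:]):
    drop_out = 1 - next_users / users
    print(f"{step} -> {next_step}: {drop_out:.0%} drop out")
```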
Update for 2020: This overview is great: Metrics that Matter to Product Managers
Mixpanel / Amplitude
Mixpanel and Amplitude are the most common analytics tools for apps. Here’s a guide to Mixpanel for beginners and one for Amplitude.
MVT / A|B Programme
If the organisation runs a multivariate testing programme, find out what has been tried: what has worked, what hasn’t, and the numbers to prove it.
This can help with any recommendations that you make. Perhaps an idea has been tried and hasn’t worked. What has worked, and what can it tell us about our users?
If you aren’t undertaking MVT, or if you are but aren’t keeping track of the results of the experiments, make that a recommendation as part of the review. Here’s a guide to Planning Conversion Optimisation Experiments and keeping track of them, and a great overview of why you should be running design experiments.
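A tracking system doesn’t need to be elaborate. Here’s a minimal sketch of an experiment log (all names and numbers are hypothetical) that makes past results easy to search when you’re reviewing a page:

```python
# A minimal experiment log; every value here is hypothetical.
experiments = [
    {
        "name": "Free returns banner on product page",
        "hypothesis": "Highlighting returns up front reduces drop-out",
        "page": "product page",
        "result": "won",      # won / lost / inconclusive
        "uplift": 0.042,      # relative change in conversion rate
    },
]

# When reviewing a page, check what has already been tried on it.
tried = [e for e in experiments if e["page"] == "product page"]
for e in tried:
    print(e["name"], "->", e["result"])
```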
Email marketing
If you are using MailChimp or another email provider, look at the analytics data. Which emails have the highest open rate? What are those subject lines? Again we have some user intent. Equally, look at which links have got the most clicks and what terms are used. More intent.
Here’s a guide to Email open rates and a Comprehensive Guide to Email Analytics.
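The headline numbers are simple ratios. A minimal sketch, with hypothetical campaign counts:

```python
# Hypothetical counts for one email campaign.
delivered, opens, clicks = 12_000, 2_760, 410

open_rate = opens / delivered     # how compelling the subject line was
click_rate = clicks / delivered   # click-through rate per delivered email
click_to_open = clicks / opens    # how engaging the email body was

print(f"Open rate {open_rate:.1%}, CTR {click_rate:.1%}, CTOR {click_to_open:.1%}")
```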
Business metrics
What are the numbers your organisation cares about? Is it sales revenue, conversion rate, or monthly active users? How is success measured? If you can get the business numbers it will help you prioritise the issues you identify in the review.
Business metrics help us frame the outcomes we want to achieve from the review’s recommendations. For example, you highlight the problem that a user needs an account to purchase an item from the store, and you recommend that account creation be merged with checkout. Adding a business outcome will strengthen your recommendation. Outcome: basket / cart abandonment will go down. The PIE framework we’ll look at later for prioritising UX recommendations also works better with business metrics included.
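To make that concrete, here’s a minimal sketch of framing the recommendation as a business outcome. All the numbers are hypothetical, and the recovery rate is an assumption to be tested, not a prediction:

```python
# Hypothetical checkout numbers.
sessions_reaching_cart = 5_000
completed_purchases = 1_100
average_order_value = 64.0  # £, hypothetical

abandonment = 1 - completed_purchases / sessions_reaching_cart
print(f"Basket abandonment: {abandonment:.0%}")

# If merging account creation into checkout recovered even 5% of
# abandoned baskets (an assumption, not a promise):
recovered_orders = (sessions_reaching_cart - completed_purchases) * 0.05
print(f"Estimated extra revenue: £{recovered_orders * average_order_value:,.0f}")
```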
Here’s a video of me talking about What to measure and what to expect when measuring and improving a user experience.
Update for 2020 Metrics are a huge part of what I do. This guide for business metrics for start-ups is invaluable. It’s a deep read but you will come out of it with a greater understanding of how companies make money.
User Personas
We have some user journeys; now we need to understand more about our users’ motivations and goals. If we can’t run user research, we still need a way to include our users in the process.
I have a love-hate relationship with personas, but used properly these portraits of users can help bring a human touch to our UX review.
The key to using personas is to make the details actionable.
The easiest way is to create simple archetype users. The key thing is that we need behavioural information (what they will do), not demographics (how old they are, what newspaper they read). Interaction is about behaviour, not belief.
You can see the various parts of the persona contain solid, actionable insights. If you don’t have enough information about the persona to do the review, ask your team / client to help complete it. You can gather the data from existing user research, sales teams or customer service.
You can download the PowerPoint / Keynote user persona format I use (both an example and a blank) at the end of the article.
Here’s some help with Persona photos and where to find them.
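As a rough illustration of ‘actionable, not demographic’, here’s a hypothetical archetype kept as structured data – every field describes a behaviour or goal the page under review can be checked against:

```python
# A hypothetical archetype user: behaviours and goals, not demographics.
persona = {
    "name": "Sarah",
    "goal": "Buy comfortable shoes she can wear to work by the weekend",
    "behaviours": [
        "Reads user reviews to judge fit before buying",
        "Checks the returns policy before entering payment details",
        "Shops on her phone during her commute",
    ],
    "pain_points": ["Unclear sizing", "Slow or costly delivery"],
}

# Each behaviour becomes a question to ask of the page under review.
for behaviour in persona["behaviours"]:
    print(f"Does the page support: {behaviour}?")
```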
Doing the UX Review
Now we have the inputs we need to start the review.
1. Break the site down into common user journeys
a. Attraction – out there getting users to come to you
b. Activation – when they come to you engaging
c. Retention – encouraging users to come back
User journeys don’t start at the website or app. For new users of an app it’ll be the app store. For eCommerce it’ll be a Google search. It could equally be a post on Facebook.
So an example attraction journey would look like:
Facebook post » Android App store » Installation » Onboarding
Other example journeys:
Google Ad » Landing page » Sign up » Onboarding
Google Ad » Product Page » Cart » Enter payment details
We’ll take the example of an eCommerce website selling shoes.
We should then walk through the user journey, reviewing each step / page for UX issues.
2. Identify the UX issues
Firstly look at the data. Are there pages with high traffic and high exit rate?
Are there pages in the activation flow with high drop-out? These are worth fixing as a priority, as time, money and effort have been put into attracting users. The Enter Payment page has an 11% exit rate. That is quite low, but this late in the journey it suggests something is wrong, and fixing UX problems on this page will have a more dramatic effect.
This is why data is important. The data gives us clues where to look for UX issues.
The severity of the issues, and the priority in which they should be fixed, are also defined by the data. A high-traffic page with a high exit rate = severe. A page late in the activation flow = severe. The data is a way to sort the issues in terms of priority; we’ll look at this again later with the PIE framework.
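One simple way to turn that into a sortable severity score – weighting lost sessions by how deep in the journey the page sits – is sketched below. The issues and numbers are hypothetical, and the weighting is one reasonable choice, not a standard:

```python
# Hypothetical issues pulled from the analytics data.
issues = [
    {"page": "Homepage", "sessions": 12_000, "exit_rate": 0.30, "funnel_step": 1},
    {"page": "Product page", "sessions": 8_000, "exit_rate": 0.46, "funnel_step": 2},
    {"page": "Enter payment", "sessions": 1_600, "exit_rate": 0.11, "funnel_step": 4},
]

def severity(issue):
    # Lost sessions, weighted by how late in the activation flow the page is.
    return issue["sessions"] * issue["exit_rate"] * issue["funnel_step"]

for issue in sorted(issues, key=severity, reverse=True):
    print(f"{issue['page']}: severity {severity(issue):,.0f}")
```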
3. Identify the cause of the UX issues
We can see from our data that the product page is causing problems.
In an ideal world we’d run some user research to understand why drop out is high here and what we could do to fix it.
We can use the personas to help understand why.
We look at goals and motivations. Can they be met on the page? The example here is to understand the fit of the shoe.
If we look at an example of the screen we are reviewing:
This is a wireframe showing a poor product page; obviously we’d use a real screenshot in the review.
Can we see anything to help her understand the fit of the shoe? Yes, we can see it in the user reviews. We go through each part of the persona, checking whether the page under review has content / UI elements that match.
We continue through each persona element until we hit returns. There is nothing on the page about returning the item (a big deal when buying clothing online, as it might not fit). This could be the cause of the drop-out.
a. Quick Wins / Usability Issues
These are easy things to fix: changing the copy on a button, or adding info about returns (our example issue identified here).
b. User Experience / Business Challenges.
As the name suggests, these are not trivial to address. An example we might see here is that our persona wants her shoes by the weekend, yet the business doesn’t offer fast shipping. Making the suggestion to add fast shipping is easy; setting up the logistics to offer it is hard. That’s a lot of time, effort and money. If there is significant development to be done to address these issues, they can be added to the product backlog, or, if they are new features, to the product roadmap.
4. Make recommendations to fix UX issues
Making recommendations is the hard part. There are two schools of thought. 1. The team as a whole should decide what to do about an issue. 2. The reviewer makes a suggestion.
I’m one for making a suggestion, at least to get the team thinking about possible solutions. At this point review any A|B tests run before on this page. Results from those can tell you about ideas tried that haven’t worked before. Adding the business outcome to any recommendations you make will help when it comes to prioritising.
If the problem is on one page, it’s rarely just one thing that’s wrong. It’s far more common for it to be a series of small issues that cumulatively create a poor user experience.
I’ve written an article called How to design the perfect eCommerce product page that might help with recommendations.
Suggestion for adding shipping to the product page: easy to design, hard from a business point of view to implement.
A good adage is ‘Design Like You’re Right, Test Like You’re Wrong’.
To validate your recommendations, run an A|B test. If you aren’t running A|B tests, now is the perfect time to start. Here’s a great resource to get started: AB testing – The Complete Guide.
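When the results come in, check they’re statistically meaningful before acting on them. Your testing tool will normally do this for you, but here’s a minimal two-proportion z-test sketch with hypothetical counts:

```python
from math import sqrt
from statistics import NormalDist

# Hypothetical A|B test results: conversions and visitors per arm.
control_conv, control_n = 220, 4_800
variant_conv, variant_n = 270, 4_750

p1, p2 = control_conv / control_n, variant_conv / variant_n
pooled = (control_conv + variant_conv) / (control_n + variant_n)
se = sqrt(pooled * (1 - pooled) * (1 / control_n + 1 / variant_n))
z = (p2 - p1) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-tailed

print(f"Control {p1:.2%} vs variant {p2:.2%}, p = {p_value:.3f}")
```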
The UX Review Report
A report is of course optional, especially in these days of lean UX. The report should be as succinct as possible, as nobody likes to trawl through 100+ pages.
The objective of the report is to give the wider context behind each UX issue identified. It’s designed to help people who were not at the workshop get up to speed, and to act as a reminder to those who were.
You can download the report in Keynote / PowerPoint (see later).
A good format for the review is to list the user journey at the start of the report along with the data, then list the personas.
Then for each step in the user journey create a slide for each issue identified.
As you can see I suggest using a heading to represent the step in the user journey and then a heading describing the issue identified. For each step in the journey you’ll probably find multiple issues.
For each issue identified offer some data and a ‘quote’ from the persona. The data shows the extent of the issue, and the quote the human side of the problem.
For the review to be as useful as possible, offer a recommendation. This is optional – you could use the workshop, or even a later design workshop, to identify solutions to key issues. But I find offering a recommendation, even if it’s not 100% agreed upon, is the best way of getting stuff fixed more quickly. Where possible include a business outcome, e.g. increased conversion, to help justify making the fix.
Bonus tip: Use Skitch for editing screenshots or Awesome Screenshot Chrome Extensions to grab screenshots.
UX recommendations into the MVT / A|B test programme
Alongside the report you could produce a spreadsheet of the issues identified, along with the severity and the recommendation. This can help the team to sort the issues, plan how to fix them, and then MVT / A|B test the recommendations. Beware of providing just the spreadsheet: the report offers the context for each UX issue, and without context UX issues can easily be ignored.
I favour using the PIE framework for prioritising.
Read more on How to prioritize conversion rate optimization tests using PIE
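For illustration, here’s a minimal PIE scoring sketch. Each issue gets a 1–10 score for Potential (room to improve), Importance (value of the traffic affected) and Ease (how cheap it is to fix); the issues and scores below are hypothetical:

```python
# Hypothetical issues scored 1-10 on each PIE dimension.
issues = {
    "Add returns info to product page": {"potential": 8, "importance": 9, "ease": 9},
    "Offer fast shipping": {"potential": 9, "importance": 9, "ease": 2},
    "Rewrite homepage hero copy": {"potential": 5, "importance": 7, "ease": 8},
}

# PIE score is the average of the three ratings; rank highest first.
pie = {name: sum(s.values()) / 3 for name, s in issues.items()}
for name, score in sorted(pie.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{score:.1f}  {name}")
```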
Improvements to business outcomes / metrics
The more senior members of your team / business will almost certainly ask questions around the business outcome of the recommendations you are suggesting. I’ve been caught out by this before. This is why it’s good to gather the data on business metrics before you begin.
Predicting outcomes and, in turn, business metric improvements is no easy task. Here’s a simple guide for eCommerce: The 3 Most Important Ecommerce Benchmarks. I did a conference talk a few years ago on predicting business outcomes from improved UX: UX and ROI: What to measure and what to expect.
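A rough back-of-envelope projection can still anchor the conversation. A minimal sketch, with hypothetical traffic and order values – treat the scenarios as ranges, not promises:

```python
# Hypothetical baseline numbers.
monthly_sessions = 120_000
current_conversion = 0.021   # 2.1%
average_order_value = 58.0   # £

# Scenario planning: relative conversion-rate uplifts.
for uplift in (0.05, 0.10, 0.20):
    new_conversion = current_conversion * (1 + uplift)
    extra_orders = monthly_sessions * (new_conversion - current_conversion)
    print(f"{uplift:.0%} uplift -> ~£{extra_orders * average_order_value:,.0f} extra per month")
```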
Workshop and gaining consensus
The workshop is the most important part of the review. You need the team to buy into your reasoning and recommendations but most important of all you need to make sure stuff gets fixed. There is only so much that can be covered in a report. Allow three hours for the workshop.
Who should attend
Anyone who needs to take ownership of the findings as well as those who are the decision makers in terms of product strategy.
Typically this would mean the Product Manager, Product Owner, Designer and Lead Developer. Also, as we’re looking cross-channel, the Marketing and Social leads.
Use the report we generated as part of the review. When presenting the report, explain the method you used to conduct the review, the data sources, personas and the reasoning behind the issues you found. Start by going through the usability issues. Often these won’t be contentious and you can build trust and improve your credibility by making simple, easy to implement changes.
The most valuable part of the workshop is conversation around each issue, especially the experience problems. The workshop should include time to talk through each experience issue and how the team will address it.
As I mentioned earlier break the issues down into two groups:
a. Quick Wins / Usability Issues
What can we fix quickly? How quickly can we do it?
b. User Experience / Business Challenges.
These are not easy to fix and should be talked about with an understanding that they aren’t easy to address. Identifying challenging issues and discussing them with that understanding helps the person whose job it is to sort them feel better about what you are suggesting.
I collect actions on cards throughout the workshop and make a note of who on the team will take what action for each problem, as well as whether it is a quick win or something more challenging. Again, where possible include the business outcome for each issue.
Physically handing the card to the members of the team at the end of the workshop is a way of handing them responsibility.
Bringing it all together: free download template
I’ve created a free template you can use for the review. It includes all the elements mentioned above: personas, PIE framework, screenshot formats and section headings.
Good luck with your UX review and let me know how you get on by adding a comment below. If you’d like me to do the UX review on your behalf get in touch.