What Data Can Tell You About Your User Experience
Analytics gives you a lot of information and can alert you to major user experience (UX) issues - if you know what warning signs to keep an eye out for.
Think of Google Analytics as your user experience’s first line of defense.
Looking at your analytics account should provide high-level insight into your UX.
Are things behaving the way you want them to? Are people taking the course of action you want them to take?
Are users navigating your website (or mobile app) as you would expect them to?
If not, your user experience could be to blame.
Before we discuss what warning signs to watch for, it’s important to understand that metrics and analytics cannot tell you why things are happening.
They can point out that there is an issue, but they don’t provide any context.
Analytics is only your first line of defense. It’s the warning sign that something isn’t quite right. Metrics like time on page are ultimately just numbers. They mean nothing in terms of which elements are influencing your users and how. The metrics don’t tell us if the user is confused or uninterested or if something on the page is broken.
When it comes to finding out why, you need more information. You need to conduct user testing.
That being said, there are certain things to watch for in your analytics program that can indicate issues with your user experience. The following metrics are ones you are likely already tracking in your analytics program. Each of these can point out user experience shortcomings, and inform your next steps.
Completion rate

If you have a low completion rate, users are failing to complete the tasks you have set up for them. If users cannot complete a task successfully, you’ve got some work to do.
The goal here is to determine why users are failing at the task you have lined up for them. The answer will depend greatly on the task at hand. Examples of these tasks include signing up for a newsletter or completing a purchase. Below are more details on how to handle a few common types of completion rates.
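As a minimal sketch, a completion rate is just completions divided by attempts. The event records and field names below are hypothetical, not from any particular analytics API:

```python
# Hypothetical event log: each entry records whether a user who started
# a task (e.g. a newsletter signup) actually finished it.
events = [
    {"user": "a", "started": True, "completed": True},
    {"user": "b", "started": True, "completed": False},
    {"user": "c", "started": True, "completed": True},
    {"user": "d", "started": True, "completed": False},
]

started = sum(1 for e in events if e["started"])
completed = sum(1 for e in events if e["completed"])
completion_rate = completed / started if started else 0.0
print(f"Completion rate: {completion_rate:.0%}")
```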
Form abandonment rate

A type of completion rate, form abandonment is definitely a red flag. If users are reaching a form page, they should be ready and willing to complete the form. Abandoning it is not a good sign. If you see high form abandonment rates in your analytics, you need to take another look at your forms.
Take a look at the form in question. Some variables to test include length (no user wants to fill out 20 questions just to download a whitepaper), responsiveness (is your form showing up correctly on mobile?), and question type (do you have a lot of open-ended questions? Or scales that don’t make sense?).
Consider testing to see when users are leaving. Are they giving up right off the bat or halfway through? Do users reach a certain question and then give up? This can help you home in on the complication.
Remember, test one variable at a time.
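Finding the question users quit on can be as simple as tallying the last field each abandoning user touched. This is a sketch with made-up field names and data, assuming you can export that last-field event from your analytics tool:

```python
from collections import Counter

# Hypothetical export: for each user who abandoned the form, the last
# field they interacted with before leaving.
last_field_before_abandon = [
    "email", "phone", "phone", "company_size", "phone", "email",
]

drop_offs = Counter(last_field_before_abandon)
# The field most users quit on is the first candidate for testing.
worst_field, count = drop_offs.most_common(1)[0]
print(worst_field, count)
```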
Form errors

Are you seeing form responses that are riddled with errors?
This could be because of formatting inconsistencies (think of all the different ways to write a phone number) or answers that seem to totally miss the mark.
If you’re seeing a lot of mistakes and a high form completion time, it’s likely that your users are not having a smooth experience. A lot of mistakes paired with a rapid form completion time isn’t any better: it suggests that users think they are completing the form successfully when they are not.
You’ll want to review the mistaken form submissions. Look for trends or patterns. Then, you’ll want to take another look at your forms. Run testing based on the patterns you’ve discovered. Variables to test include the wording of your questions, question descriptions, and question type.
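To illustrate the formatting-inconsistency problem (all the ways to write a phone number), a crude normalization pass can surface the submissions that don’t clean up. The data and the 10-digit assumption are illustrative only:

```python
import re

# Hypothetical raw submissions: the same field, written several ways.
raw_phones = ["(555) 123-4567", "555.123.4567", "5551234567", "555-123-456"]

def normalize(phone):
    """Strip non-digits; keep the result only if it looks like a 10-digit number."""
    digits = re.sub(r"\D", "", phone)
    return digits if len(digits) == 10 else None

cleaned = [normalize(p) for p in raw_phones]
# Entries that don't normalize cleanly are the ones to review for trends.
invalid = [raw for raw, c in zip(raw_phones, cleaned) if c is None]
print(invalid)
```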
Drop-off point

Is everyone exiting your site at the same point? Perhaps on the same page or at the same journey stage?
It’s critical not to assume you know the cause of the drop-off. It could be a UX issue, but this could just as easily be a marketing problem. You won’t know until you dive a little deeper.
Drill down as much as you can. Try to narrow down to the exact drop-off page. Jared Spool, Founding Principal at UIE, discusses a great case study for why this is important in a presentation he gave at ProductCamp Boston.
A large e-commerce company was losing $300,000,000 a year because of abandoned carts. Or so they thought. Through user testing, it was determined that users were required to log in after reviewing their cart and deciding to buy. Many of those users were then experiencing sign-in issues and consequently giving up on the purchase.
Once you’ve narrowed down your drop-off point, you should start usability testing. Keep in mind the variables to examine will vary depending on the type of page and type of site.
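Narrowing down a drop-off point usually means comparing user counts between consecutive funnel steps. A sketch with invented step names and numbers (loosely echoing the sign-in example above):

```python
# Hypothetical funnel counts from an analytics export: unique users
# reaching each step of a checkout flow.
funnel = [
    ("view_cart", 1000),
    ("sign_in", 820),
    ("payment", 340),
    ("confirmation", 310),
]

# The drop-off between consecutive steps pinpoints where users give up.
for (step, n), (next_step, next_n) in zip(funnel, funnel[1:]):
    lost = n - next_n
    print(f"{step} -> {next_step}: lost {lost} users ({lost / n:.0%})")
```

Here the sign_in-to-payment step loses by far the most users, so that is where usability testing should start.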
Time on page or task
Are your users spending an inordinately long time viewing a brief blog post or completing a short form?
It doesn’t take very long to read a 300-word article or to type in your name and email address. When it comes to time on task, watch for any tasks that have an unusually short or long time to completion.
In this instance, it may be useful to try a tool like Hotjar, which can show heatmaps, visual recordings, and all sorts of analyses and visuals of how users are interacting with your site. If you aren’t using a tool, you’ll want to consider a focus group to provide insight on whether users are totally confused or totally uninterested.
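A simple way to spot unusually short or long times on task is to flag sessions far from the median. The sample durations and the 3x threshold below are arbitrary choices for illustration:

```python
import statistics

# Hypothetical time-on-task samples (seconds) for a short signup form.
times = [28, 31, 25, 30, 27, 190, 29, 4, 26, 33]

median = statistics.median(times)
# Very long sessions may mean confusion; very short ones may mean users
# skipped the task entirely. The 3x cutoff is arbitrary.
outliers = [t for t in times if t > 3 * median or t < median / 3]
print(f"median={median}s, outliers={outliers}")
```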
Page popularity

Do you have one hundred pages of content? Are there five pages that no one ever views?
Page popularity can also help to inform the way you structure your website’s navigation. You want your most powerful content to be easily accessible, and you don’t want ineffective content pieces getting in the way of that.
Look closely at the content that is underperforming. Is it a series of blog posts? Or is it a page of legal jargon that you are required to have somewhere? If it’s the former, consider deleting it.
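Surfacing that underperforming content is a matter of filtering a pageview export against a cutoff. The paths, counts, and threshold here are all hypothetical:

```python
# Hypothetical pageview counts from a 30-day analytics export.
pageviews = {
    "/blog/launch-recap": 4210,
    "/services": 1890,
    "/blog/old-post-1": 3,
    "/blog/old-post-2": 0,
    "/legal/terms": 12,
}

# Pages under an arbitrary threshold are candidates for review:
# prune them, or stop letting them clutter the navigation.
THRESHOLD = 20
underperforming = sorted(p for p, v in pageviews.items() if v < THRESHOLD)
print(underperforming)
```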
Site search queries

If you’re tracking site search queries - and you should be - they can provide valuable insight into the information your users are seeking.
Are they searching for basic information, like your services or contact page?
Turning to site search to find information that should be found via site navigation is not a great sign. Numerous searches seeking help (how do I check out or where do I reset my password) also hint at a poor user experience.
Look at those search queries. See any patterns or trends? Use them to inform your next steps.
In this situation, a focus group may be helpful in understanding why users are searching for certain queries and how you can make that information more easily accessible.
Error messages

Error messages offer great insight into the user’s pain points. They can also cause users to leave your site and traffic to drop.
Figure out what error messages are the most common. When are users seeing them? Is there a problem on your end that is causing an error? A broken link or 404? Or is it something the customer is doing, like leaving a form field blank?
Prioritize which errors to fix first based on how frequently they are served to users and the resulting effect. For instance, an error that results in the user abandoning their cart and not making a purchase would be prioritized over fixing a broken link in a blog post. Once you’ve got your list, start fixing.
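The frequency-plus-impact prioritization described above can be sketched as a sort over error records. The error names, counts, and the choice to weight abandonment ahead of frequency are all assumptions:

```python
# Hypothetical error log summary: (error, times served, users who then
# abandoned the session).
errors = [
    ("card declined on checkout", 120, 95),
    ("404 on blog link", 400, 30),
    ("blank form field warning", 900, 40),
]

# Rank by resulting abandonment (impact) first, then by frequency, so a
# checkout-blocking error outranks a more common but harmless 404.
priority = sorted(errors, key=lambda e: (e[2], e[1]), reverse=True)
for name, served, abandoned in priority:
    print(f"{name}: served {served}x, {abandoned} abandonments")
```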
Mobile vs desktop analytics
Though inherently different platforms, wide disparities in desktop versus mobile analytics are another point of concern. For instance, say your desktop site traffic is great, but your mobile site has a bounce rate that’s through the roof. Or maybe you’re noticing that people are frequenting the help or customer support sections of your app or mobile site, but rarely reach out for assistance on desktop.
Both of these scenarios suggest a poor mobile experience.
Your course of action here is dependent upon where you are noticing a disparity. Make sure your site and content are optimized for mobile. Then - depending on the metric - try the suggestions above to help guide your next steps.
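Spotting a mobile-versus-desktop disparity comes down to computing the same metric per platform and comparing. A sketch with invented session counts:

```python
# Hypothetical per-platform metrics from the same date range.
metrics = {
    "desktop": {"sessions": 50_000, "bounces": 16_000},
    "mobile": {"sessions": 42_000, "bounces": 29_400},
}

# A wide gap in the same metric across platforms is the warning sign.
rates = {p: m["bounces"] / m["sessions"] for p, m in metrics.items()}
gap = rates["mobile"] - rates["desktop"]
print({p: f"{r:.0%}" for p, r in rates.items()}, f"gap={gap:.0%}")
```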
You’ll notice I tried not to make any suggestions as to why you might see these metrics. It’s all about separating your observations from your inferences.
Remember, analytics doesn’t tell us why, and it’s crucial not to guess or assume you know the underlying cause (we all know what happens when you assume).
Instead, take what you have observed in your analytics platform, and apply it. Use it to inform your next steps, to support and prioritize your user testing initiatives.
Design and data don’t have to be rivals. Like Jared Spool said, “As designers, we need to accept and embrace the world of metrics and use their amazing powers to change the way we’re doing things.”
If you're seeing a lot of issues with your user experience or you're not sure how to get started, it might be time to seek some expert advice. We'd be happy to help walk you through the process and do some heavy lifting. Let us know how we can help.