Now that you’re tracking your KPIs, how do you analyze and react to them? What does it mean if you go over or under your targets or baselines? And what do “use it to prioritize your roadmap” or “plan for your next sprint” actually mean?

At the end of the day, remember that tracking metrics isn't the end of your analytics journey: use your data to help you make decisions and to tell a story around them.

Here are some suggestions for reacting to your KPIs:

Dig deeper into your metrics

If a metric is above or below your baseline or target, always ask why.

What happened to make this metric move? Is something broken? Is a design difficult to use? If so, you may need to re-prioritize your next sprint or your roadmap to fix the issue. User research can also help you dig deeper into how veterans experience the product.

Compare metrics

User behavior doesn't happen in a vacuum. Compare your KPIs to see whether behaviors correlate or whether something is off.

If Google Analytics traffic looks fine, but several users are calling in with feedback about a specific part of your service, look into your performance metrics to see if something is performing below standard.

If you're providing quick-glance content, but users are scrolling to the end of your page and calling in to ask about a specific detail at the bottom, you may want to re-evaluate your sprint objectives or roadmap and move that content to an easier-to-reach spot.

Remember: correlation does not equal causation!
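As a sketch of what comparing metrics can look like in practice, here's a minimal Python example, using entirely hypothetical weekly numbers, that computes the Pearson correlation between page views and help desk calls. A high correlation only flags the pair for a closer look; it never establishes causation on its own.

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical weekly numbers: page views vs. contact center calls
page_views = [1200, 1350, 1100, 1800, 1750, 1300, 1250]
help_calls = [30, 32, 28, 55, 52, 31, 29]

r = pearson(page_views, help_calls)
print(f"correlation: {r:.2f}")  # a high r flags the pair for investigation
```

The same comparison works for any pair of KPI series you track, such as scroll depth vs. call volume in the example above.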

Don’t let the data drive you

Pay attention to outliers and other possible explanations when the data points to a shift. Does your traffic go up significantly on Wednesdays? That could mean your service is needed on Wednesdays, or it could mean marketing drives referrals on Wednesdays but users exit before using your service. Continue to ask why the data has shifted before drawing conclusions. Data will tell you that something is happening; it will not tell you WHY it is happening.
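To illustrate spotting a recurring pattern like the Wednesday example, here's a minimal Python sketch, with hypothetical daily session counts, that groups traffic by weekday so a standout day becomes visible:

```python
from collections import defaultdict
from datetime import date, timedelta

# Hypothetical daily session counts for two weeks; 2024-01-01 is a Monday
start = date(2024, 1, 1)
sessions = [900, 880, 1600, 910, 870, 400, 380,
            920, 905, 1650, 930, 890, 410, 395]

by_weekday = defaultdict(list)
for offset, count in enumerate(sessions):
    day = start + timedelta(days=offset)
    by_weekday[day.strftime("%A")].append(count)

for weekday, counts in by_weekday.items():
    avg = sum(counts) / len(counts)
    print(f"{weekday}: avg {avg:.0f} sessions")
# Wednesday's average stands out -- the data shows *that*, not *why*
```

A chart in your analytics tool shows the same thing; the point is to isolate the pattern first, then investigate its cause.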

Use prioritization methods to take what you find from your analyses into consideration as you plan your roadmaps and sprints.

What do the different types of metrics mean for VA.gov?

Customer Satisfaction & Trust

We provide essential digital services for veterans and their families. Our products represent the VA's ability to deliver for veterans, and they may be the only point of contact our users have with the VA or the only way veterans can receive certain services. Track user satisfaction and trust to find out whether your product meets users' expectations and needs.

Findability

Many of the services on VA.gov are vital to veterans and their families. Users should be able to find your service regardless of referral source or operating system; otherwise, they can't use it.

(Task) Service Completion

When looking at user behavior, it’s helpful to segment your users and ask what interactions they are performing or not performing and why.

Did users receive the service they needed? Did they find the information they were looking for, successfully use the tool, or apply for the benefit they needed and were eligible for?

For veteran services, conversions don't necessarily mean success. For example, if users come to your form's information page and then leave before starting the form, they may have found that they don't meet the eligibility criteria or don't have the documents necessary to apply for the benefit. On the other hand, if several users who are ineligible for the benefit do complete the form, then your onboarding information page did not succeed in deterring ineligible users from taking the time to complete it.

It's important to ask which user behaviors indicate success and which call for additional UX tweaks.
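Segmenting completions by eligibility, as described above, can be sketched in a few lines of Python. The funnel counts and segment labels here are hypothetical, assuming the information page includes some kind of eligibility screener:

```python
# Hypothetical funnel counts, segmented by eligibility
funnel = {
    "eligible":   {"visited_intro": 500, "started_form": 420, "submitted": 380},
    "ineligible": {"visited_intro": 300, "started_form": 90,  "submitted": 60},
}

for segment, counts in funnel.items():
    completion = counts["submitted"] / counts["visited_intro"]
    print(f"{segment}: {completion:.0%} of intro-page visitors submitted the form")
# A high ineligible completion rate suggests the information page
# isn't communicating eligibility criteria clearly
```

Here a drop-off among ineligible users is the desired behavior, while the same drop-off among eligible users would signal a UX problem.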

Help Desk (Contact Centers)

If a user has difficulty accessing or using your digital service, they should be able to call the VA contact centers for assistance. If you launch a large UX change or a new product on VA.gov, it is vital that you complete a contact center review so that the contact centers understand your product and can help veterans through any issues they may have.

When a large portion of users calls the contact centers about an issue with your product or service, treat this as direct user feedback on an issue your product is experiencing, whether it's a bug, a problem with an external service, or a design change.
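One simple way to notice such a surge is to compare the latest call volume against a trailing baseline. This Python sketch uses hypothetical weekly call counts and an arbitrary 2x threshold; tune both to your product's actual volumes:

```python
# Hypothetical weekly contact-center call counts tagged to your product
weekly_calls = [40, 38, 45, 42, 41, 44, 39, 118]

baseline = sum(weekly_calls[:-1]) / len(weekly_calls[:-1])
latest = weekly_calls[-1]

if latest > 2 * baseline:  # 2x is an example threshold, not a standard
    print(f"Spike: {latest} calls vs. baseline {baseline:.0f} -- "
          "investigate for a bug, external-service issue, or design change")
```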

The Contact Center team representatives will also triage any relevant Tier 3 issues to your team to help escalate veteran-facing issues.

Back end services & website performance vs. user errors

For your product's design to be front and center in the user experience, your application must also perform to minimum quality standards.

If your content is slow to load, or your tool takes half a minute to buffer before displaying what the veteran is looking for, their first experience with your service will be frustrating, or non-existent if they exit.
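Performance checks like this usually look at a high percentile rather than the average, since a few very slow loads can hide behind a healthy mean. A minimal Python sketch, with hypothetical load times and an example 3-second budget:

```python
# Hypothetical page load times in seconds from real-user monitoring
load_times = [1.2, 0.9, 1.5, 2.1, 1.1, 8.4, 1.3, 1.0, 2.4, 1.6]

def percentile(values, pct):
    """Nearest-rank percentile of a list of numbers."""
    ordered = sorted(values)
    rank = max(1, round(pct / 100 * len(ordered)))
    return ordered[rank - 1]

p95 = percentile(load_times, 95)
budget = 3.0  # example performance budget, in seconds
print(f"p95 load time: {p95:.1f}s ({'over' if p95 > budget else 'within'} budget)")
```

Here the median load looks fine, but the 95th percentile exposes the slow outlier that some veterans actually experience.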

Coming Soon:

  • How to use data for experimentation, including but not limited to A/B testing

  • How to use data to help debug and enhance performance

  • How to use data to prioritize roadmaps

  • How to use data for predictive analytics

  • How to use data for prescriptive analytics

Platform resource links:

View the training slides below or view our customer guide to learn more about getting support from the Analytics & Insights team.

Insights Training - Exploratory Analysis.pdf

External resource links: