But data can be incredibly difficult to sort through; it’s not automatically helpful and instructive.
You want to build a product powered by premium data fuel that goes the distance. When you partner with an external UX research team, you can be sure you’ll have a more nuanced understanding of data. You’ll also know which data is trustworthy and relevant, and which isn’t, as you move forward with product initiatives.
Data is a valuable resource. There’s no debate about that. But it can be difficult to know how to use this resource responsibly and beneficially — especially when there’s an abundance of it, as tends to be the case.
You may have many research reports in a given year, and each report tells a story with actionable insights. If you conduct all your research in-house, squeezing the right insights out of that data story can be tricky. Volume, bandwidth, and bias are formidable obstacles for your team, and so is discerning which data matters and how best to measure it.
Focusing on the wrong data can set your product back. You can waste time, energy, and budget heading in the wrong direction because you haven’t prioritized the right research findings. Ideally, data collection tools should help keep you on the right track. Sophisticated platforms like Google Analytics and Pendo provide opportunities to know how your users really interact with your product — at least in theory.
These tools can be avenues to key insights, but if used incorrectly, they can easily become dead-end streets. Why? Because of the allure of vanity metrics — statistics that look impressive on the surface but do little (if anything) for your business objectives. Vanity metrics aren’t relegated to social media likes and shares. They’re alive and well in the EdTech space, too.
Vanity metrics lead to assumptions, shallow explorations, and rushed conclusions.
Another glaring issue with vanity metrics is that they measure past performance. Checking the rearview mirror when making product decisions is smart. After all, you have to take time to evaluate what you’ve already done. But even if you experienced success, you can’t keep your eyes in the rearview.
Past success doesn’t mean rinsing and repeating those same processes will lead to product growth.
When you work with Openfield, we’ll help you keep your eyes on the road ahead. We pay close attention to both lagging and leading indicators and conduct both quantitative and qualitative research.
Lagging indicators are outputs that confirm an in-progress or past pattern. Leading indicators are inputs that point to future events. Both are pertinent to your EdTech product decisions.
For example, if instructors are not visiting the reports page, your data will show low page views (lagging indicator). It’s good to know the page isn’t being used as intended. But based on this data alone, you might assume the reports page isn’t valuable to instructors and be tempted to throw it out.
After conducting appropriate research on the instructor reports page, we may determine that certain actions are necessary to drive instructor engagement in the future (leading indicator). Instead of throwing out a potentially powerful resource for instructors, we can work toward solutions that support and encourage their use of it.
Just as you need to be attentive to both past and future indicators, you need to conduct both quantitative and qualitative research. One without the other gives you an incomplete view of your product’s reality.
Quantitative research collects hard numbers. Numbers can be very compelling, but they need context. As far as UX research is concerned, observed numbers are just a starting point. Knowing a product page has low page views is critical. But more critical is our follow-up question: “Why?”
Our UX researchers carry out qualitative research in order to give proper context to the numbers and dig deeper. Qualitative research strives to understand user behavior, desires, preferences, and thought processes. Because of that, it yields deeper, richer insights that have the power to meaningfully shape successful products.
In the case of the instructor reports page, our researchers might ask why the page isn’t being used as intended and what would encourage instructors to engage with it. We answer these questions with user testing. Surveys, interviews, and focus groups are all part of understanding the numbers and driving an appropriate solution.
Your data tells a story. An external partner like Openfield ensures you are able to listen to the story it tells. We’ll focus on the data that matters, trim out what doesn’t, and help you understand the difference.
Let’s talk about your product’s data story — and how to use it as fuel to drive your product strongly into the future.