Data reigns supreme in today’s world, which is why data quality is paramount for any business or brand.
The reliability, accuracy and relevance of data need to be non-negotiable. But the reality is that poor survey design, bad respondents and a lack of standardization can still make the data quality landscape a bumpy one.
Jack Millership, Head of Research Expertise, and Tassia Henkes, Research Director, at Zappi joined us on the Inside Insights podcast to address the state of data quality today and talk through their advice on how to tackle the data quality crisis.
Read on to get our top three takeaways for tackling data quality based on their conversation.
One key takeaway from our discussion is that poor survey design is a major culprit in driving respondents away.
Long, tedious surveys with repetitive or tricky questions can alienate participants. For example, Jack notes in the episode that respondents are often asked their age and gender multiple times because data flows poorly between systems. That frustration can bleed into the survey itself, skewing responses toward the negative.
Tassia highlighted another survey design issue: panel bias. When using a small, static panel, there’s a chance respondents might be friends or acquaintances, leading to biased data. A broader pool from a private marketplace mitigates this by providing more diverse, unbiased data, making it a worthwhile investment for more reliable results.
It’s also crucial to adopt a mobile-first approach to survey design. Many current practices are outdated and fail to account for the fact that mobile is now the default for many respondents, so designing surveys with mobile users in mind enhances both accessibility and engagement.
Consistency and standardization in sampling are essential, as the sample can heavily influence survey scores.
To ensure reliable benchmarks, you need consistent sampling methods. This means aligning around broad audiences and using standardized measures across the industry.
Tassia added that while broad sampling is ideal, it's possible to filter down to specific demographics in your analysis, ensuring both wide reach and targeted insights. This approach enhances the accuracy and stability of survey data, making it more reflective of real-world conditions.
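As a rough illustration of that “sample broad, filter in the analysis” idea, here’s a short sketch in pandas. The column names (age, gender, purchase_intent) and the data are hypothetical, and this is not Zappi’s tooling; it simply shows benchmarking on the full sample while still reporting on a narrower audience.

```python
import pandas as pd

# Hypothetical broad sample of survey responses (illustrative data only).
sample = pd.DataFrame({
    "respondent_id": ["r1", "r2", "r3", "r4", "r5"],
    "age": [22, 34, 29, 51, 45],
    "gender": ["F", "M", "F", "F", "M"],
    "purchase_intent": [4, 3, 5, 2, 4],  # 1-5 rating from the survey
})

# Benchmark on the full, broad sample so scores stay comparable across studies...
overall_score = sample["purchase_intent"].mean()

# ...then filter down to a specific audience for targeted insight,
# e.g. women aged 18-35, without changing how the sample was collected.
target = sample[(sample["gender"] == "F") & (sample["age"].between(18, 35))]
target_score = target["purchase_intent"].mean()

print(f"Overall: {overall_score:.2f}, target audience: {target_score:.2f}")
```

The key point is that the filter is applied at the analysis stage, not the sampling stage, so the benchmark stays stable while the targeted cut still delivers the specific insight you need.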
Double opt-in panels, reward networks and programmatic sources have all been designed to give us more consumer access. But has this come at a cost?
The industry has seen a compression in cost per interview (CPI), driven by price competition. This downward pressure means respondents are paid less and asked to do more: 30 to 40-minute surveys with repetitive questions (like those mentioned above) for minimal compensation. The result is a shrinking pool of willing participants, as many people are turned off after a single bad experience.
Ensuring high-quality data also means distinguishing between disengaged respondents and bad actors. While some respondents might provide short, less insightful answers, this doesn't necessarily indicate poor quality. They might still be answering in good faith.
However, patterns of speeding, straight-lining or identical responses across multiple surveys are red flags for “bad actors,” such as bot farms or click farms. Tassia mentioned that you also need to be mindful of linguistic diversity in responses, ensuring you don't dismiss valid data due to non-standard language use or slang.
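To make those red flags a little more concrete, here’s a minimal sketch of what basic speeding, straight-lining and duplicate-pattern checks could look like. It uses pandas with hypothetical column names (duration_seconds and rating questions q1 to q5) on made-up data; it illustrates the general idea rather than Zappi’s actual quality-control pipeline.

```python
import pandas as pd

# Hypothetical response data: one row per respondent, rating questions q1-q5
# plus the time (in seconds) each respondent took to complete the survey.
responses = pd.DataFrame({
    "respondent_id": ["r1", "r2", "r3", "r4"],
    "duration_seconds": [420, 55, 390, 410],
    "q1": [4, 5, 2, 4],
    "q2": [3, 5, 2, 3],
    "q3": [5, 5, 2, 5],
    "q4": [2, 5, 2, 2],
    "q5": [4, 5, 2, 4],
})

rating_cols = ["q1", "q2", "q3", "q4", "q5"]

# Speeding: flag respondents who finish implausibly fast,
# here anyone under a third of the median completion time.
speed_cutoff = responses["duration_seconds"].median() / 3
speeding = responses["duration_seconds"] < speed_cutoff

# Straight-lining: flag respondents who give the same rating to every question.
straight_lining = responses[rating_cols].nunique(axis=1) == 1

# Identical responses: flag duplicate answer patterns across respondents,
# which can point to bot farms or click farms submitting the same grid.
duplicate_pattern = responses.duplicated(subset=rating_cols, keep=False)

flags = pd.DataFrame({
    "respondent_id": responses["respondent_id"],
    "speeding": speeding,
    "straight_lining": straight_lining,
    "duplicate_pattern": duplicate_pattern,
})
print(flags)
```

In practice, a single flag wouldn’t be grounds for exclusion on its own; as Jack and Tassia point out, it’s repeated patterns across multiple surveys that signal bad actors, and open-ended answers still need review that allows for non-standard language and slang.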
If you prefer to listen, check out the full podcast episode here.
Jack and Tassia’s advice on improving data quality involves a multifaceted approach:
Engage and respect respondents: Shorten surveys, make them engaging and ensure fair compensation.
Standardize and align: Consistent sampling methods and measures are crucial for reliable data.
Detect and address quality issues: Distinguish between disengaged respondents and bad actors, and ensure inclusive language is used.
By starting to address these issues, you can collect higher quality data that reflects the diverse populations we all aim to understand, and ultimately make business decisions with more confidence. While this is by no means an exhaustive list for solving the data quality crisis, these tips will certainly help you along the way.
At Zappi, data quality is very important to us. That’s why we work with partners who care just as deeply and are transparent about what they do to ensure good quality data. For more on our approach, check out our article 5 steps to better quality consumer insights data.
Let's continue this important conversation and work towards better data quality in our industry!