For Stiegl, one of Austria’s most traditional private breweries, brand tracking is not just a routine obligation – it is a strategic management tool. At regular intervals, the media and market research agency Media1 conducts brand tracking studies to measure key indicators such as brand awareness, slogan recall, and advertising effectiveness. However, with the increasing prevalence of survey fraud in the industry, signs of problematic data quality began to accumulate: unusual responses, suspicious patterns, early indications of automated participation. For Media1, it was clear that they did not want to leave data quality in studies for Stiegl and other clients to chance.
Challenge
Previously, data cleaning at Media1 was carried out manually after fieldwork was completed. Project managers reviewed response times, read through open-ended answers, and flagged suspicious cases for removal – a time-consuming and subjective process that took roughly two hours per wave.
On average, around 6–7% of interviews were excluded through this manual process. However, more and more cases began to raise red flags: responses appeared that didn’t match the language or tone of the target group, closely resembled content from Wikipedia or Google, or showed unusual repetition patterns. The suspicion: a portion of the sample might include dishonest or even fraudulent participants. But there were no objective tools in place to confirm this. Uncertainty grew – as did the concern that flawed data might slip through unnoticed, potentially leading to misguided strategic decisions for Stiegl.
Solution
Media1 activated ReDem within the survey software Keyingress, where ReDem is integrated via API. From that moment on, every incoming survey participant was checked in real time, and responses of low quality were not counted as ‘completes.’ Thanks to real-time quota management, re-recruitment became unnecessary.
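The gating pattern described above – score each interview as it arrives, reject low-quality responses before they count against a quota – can be sketched as follows. ReDem's actual API and its Keyingress integration are not documented here, so every name, field, and threshold in this snippet is a hypothetical illustration of the pattern, not ReDem's real interface.

```python
# Hypothetical sketch of real-time quality gating with quota management.
# Class names, score fields, and the threshold are illustrative only;
# they do not reflect ReDem's or Keyingress's actual API.

from dataclasses import dataclass, field

@dataclass
class Quota:
    target: int         # completes needed for this quota cell
    completes: int = 0  # only quality-passed interviews count

@dataclass
class SurveyGate:
    threshold: float                  # minimum acceptable quality score
    quotas: dict = field(default_factory=dict)

    def check_response(self, cell: str, quality_score: float) -> str:
        """Decide in real time whether an interview counts as a complete.

        Rejected interviews never fill the quota, so the open target
        stays accurate during fieldwork and no re-recruitment is
        needed afterwards.
        """
        quota = self.quotas[cell]
        if quota.completes >= quota.target:
            return "quota_full"
        if quality_score < self.threshold:
            return "rejected"  # low quality: not counted as a complete
        quota.completes += 1
        return "complete"

gate = SurveyGate(threshold=60.0, quotas={"18-29": Quota(target=2)})
print(gate.check_response("18-29", 85.0))  # complete
print(gate.check_response("18-29", 30.0))  # rejected, quota unchanged
print(gate.check_response("18-29", 90.0))  # complete
print(gate.check_response("18-29", 95.0))  # quota_full
```

The key design point the case study relies on is the third branch: a rejected interview leaves the quota counter untouched, which is what makes re-recruitment after fieldwork unnecessary.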
Impact
The use of ReDem delivered two key advantages:
- Time savings of 75%: The manual cleaning effort dropped from around two hours to less than 30 minutes per wave. Instead of spending valuable time reviewing datasets, the team could focus more on analyzing and interpreting the results.
- Significant improvement in data quality: With ReDem in place, 16% of interviews in the Stiegl brand tracking study were identified as low-quality responses – more than twice as many as had previously been detected through manual checks.
Had these 16% low-quality cases remained in the dataset, they would have significantly distorted key metrics. For example, aided brand awareness would have appeared 20 percentage points lower – a clear deviation from the steady trend observed over recent years. Slogan awareness would have shown a decline of 18%, falsely suggesting a drop in advertising effectiveness. At the same time, other values would have been skewed in the opposite direction: consideration – the willingness to purchase or try the product – would have been overstated by 10%. The most extreme distortion would have been in TV spot recognition, which would have shown an increase of 26%, completely disconnected from the actual campaign performance.
These deviations clearly demonstrate: low-quality data doesn’t just create noise – it introduces systematic bias that can fundamentally distort the interpretation of a brand tracking study.
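The mechanism behind such systematic bias can be shown with a simple mixture model: the observed metric is a weighted average of what genuine respondents report and what fraudulent respondents report, so contamination always pulls the result toward the fraudsters' answer pattern rather than averaging out like noise. The numbers below are made up for illustration; they are not the study's figures.

```python
# Illustrative mixture model of contamination bias.
# All rates below are invented for the example.

def observed_rate(true_rate: float, fraud_rate: float, contamination: float) -> float:
    """Metric observed when a share `contamination` of interviews is fraudulent.

    observed = (1 - p) * true + p * fraud
    """
    return (1 - contamination) * true_rate + contamination * fraud_rate

# If genuine aided awareness were 80% but fraudulent respondents
# recognized the brand only 20% of the time, a 16% contaminated
# sample would read 9.6 points too low:
biased = observed_rate(0.80, 0.20, 0.16)
print(round(biased, 3))  # 0.704
```

Because the fraudulent answer pattern is fixed, the error has a consistent direction for each metric – which is why some indicators in the study were pulled down while others were inflated.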
Conclusion: Data quality isn’t a nice-to-have – it’s a strategic imperative.
"The risk of systematic bias was significantly lower in this year’s brand tracking study due to the use of ReDem. We are pleased to have implemented the tool, not only to obtain more reliable data but also to ensure plausible results in our longitudinal comparisons." - Christoph Auböck, Media1