The Bible Society and Garbage In, Gospel Out: Why Data Due Diligence Can’t Be Outsourced.

In 2024, the Bible Society published research suggesting a significant surge in church attendance across the UK. They called it a “Quiet Revival.” The data, sourced from the respected polling firm YouGov, indicated that people were returning to pews in droves, especially young adults and men. For an organisation dedicated to the Bible’s influence, this was the headline of a lifetime.

Then the floor fell away: the Bible Society had to issue a public retraction.

YouGov admitted that its quality control systems failed to catch flawed data. The “revival” wasn’t actually happening, at least not in the numbers the data suggested. The Bible Society stated they could not have known about the errors. However, they also admitted the results “resonated” with their existing beliefs and experiences.

This is a classic case study in why data due diligence cannot be outsourced, even to the most prestigious third-party providers. I work with leaders on data and AI strategy to ensure their decisions rest on solid ground, not just convenient numbers that suit a narrative.

The Fallacy of Outsourced Accountability

Many organisations believe that by hiring a top-tier data provider like YouGov or Nielsen, they are buying “the truth.” In reality, they are buying a service like any other. While these data providers have strong methodologies, they are not immune to issues. This is why due diligence is so important. Potential issues include technical glitches; we all know that Excel can do weird and wonderful things when it converts data types from one format to another. There can also be sampling errors, or even AI bot interference in digital surveys.

When you use a dataset that doesn’t belong to you, the responsibility for the conclusions you draw still sits with you. You cannot outsource accountability. If your organisation publishes a report that turns out to be false, the reputational damage hits your brand, not just the data provider’s.

In the case of the Bible Society, the trust placed in the third-party provider acted as a shield against internal skepticism. Because the source was deemed “reputable,” the internal checks were likely less rigorous than they would have been for an unknown source.


The Danger of “Resonance”

The most revealing part of the Bible Society’s retraction was the admission that the data “resonated” with them. In the world of business analytics, “resonance” is often a polite word for confirmation bias.

Confirmation bias is the tendency to search for, interpret, and favour information that confirms our pre-existing beliefs. When data tells us something we already believe to be true, or something we desperately want to be true, we lower our guard. We stop asking difficult questions.

When results “resonate,” that is exactly when you should be most skeptical. If a dataset shows a 50% increase in a metric that has been declining for decades, that isn’t a “revival”. It is a huge question mark, and it should be viewed as a statistical anomaly that requires aggressive interrogation.

Practical Due Diligence: A Technical Checklist

How do you verify a dataset that isn’t yours? You don’t need a PhD in statistics, but you do need a systematic approach. Before you go public with any finding, run these basic checks.

1. Basic Descriptive Statistics

Start with the basics: mean, median, and mode.

  • Mean vs. Median: If the mean (average) is significantly higher than the median (middle value), you have outliers. In the case of church attendance, a few respondents claiming they attend church 50 times a week would skew the average.
  • The Mode: What is the most common answer? If the mode is an extreme value, it suggests a problem with the survey design or respondent honesty.
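These first checks can be run in a few lines. Below is a minimal sketch using Python’s standard `statistics` module on a hypothetical, made-up survey column (self-reported church visits per month); the data and the 2× threshold are illustrative assumptions, not from the YouGov dataset.

```python
import statistics

# Hypothetical survey column: self-reported church visits per month.
# The two values at the end are implausible outliers (bots or dishonest answers).
visits = [1, 2, 1, 0, 4, 2, 1, 3, 2, 2, 50, 48]

mean = statistics.mean(visits)      # dragged upwards by the outliers
median = statistics.median(visits)  # robust "middle" respondent
mode = statistics.mode(visits)      # most common single answer

print(f"mean={mean:.1f}, median={median}, mode={mode}")

# Rule of thumb (an assumption, tune to your data): a mean far above
# the median means outliers are distorting the average.
if mean > 2 * median:
    print("Warning: mean is more than double the median - inspect outliers")
```

Here the median respondent attends roughly twice a month, while a handful of extreme values push the mean far higher, which is exactly the pattern that should trigger a manual look at the raw rows.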

2. Frequency Distributions and “Straight-Lining”

Look at how the data is distributed. In digital surveys, “straight-lining” is a common sign of bot activity or disinterested respondents. This happens when a user clicks the same option (e.g., “Strongly Agree”) for every single question just to finish the survey. If you see a large cluster of respondents with identical, repetitive patterns, your data is likely compromised.
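Straight-lining is easy to screen for programmatically. The sketch below uses invented Likert-scale responses (1 = Strongly Disagree … 5 = Strongly Agree); the data and flagging rule are assumptions for illustration.

```python
from collections import Counter

# Hypothetical responses: each row is one respondent's answers
# to five Likert questions.
responses = [
    [5, 5, 5, 5, 5],   # straight-liner
    [2, 3, 4, 3, 2],
    [5, 5, 5, 5, 5],   # straight-liner
    [1, 2, 1, 4, 3],
    [4, 4, 4, 4, 4],   # straight-liner
]

# Flag respondents who gave the identical answer to every question.
straight_liners = [i for i, row in enumerate(responses) if len(set(row)) == 1]
rate = len(straight_liners) / len(responses)
print(f"straight-lining rate: {rate:.0%}")

# Also count identical answer patterns shared across respondents,
# a common fingerprint of bot activity.
pattern_counts = Counter(tuple(row) for row in responses)
duplicates = {p: c for p, c in pattern_counts.items() if c > 1}
print(f"duplicated patterns: {duplicates}")
```

A straight-lining rate of a few percent is normal survey fatigue; a rate like the one in this toy example would mean the panel itself is compromised.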

3. Historical Benchmarks and External Proxies

Data does not exist in a vacuum. If your data shows a massive spike in churchgoing, look for external proxies to verify it.

  • Did book sales for Bibles increase?
  • Did website traffic for major denominations spike?
  • Did public transport data show more movement on Sunday mornings?

If the “Quiet Revival” data showed a massive surge but every other indicator didn’t move much, the data was almost certainly wrong.
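The proxy check can be reduced to a simple rule: a large headline movement with no corroborating external signal is a red flag, not a finding. The figures and the one-fifth corroboration threshold below are invented assumptions purely to illustrate the shape of the check.

```python
# Hypothetical year-over-year changes for the headline metric and proxies.
headline_change = 0.50          # +50% reported attendance
proxies = {
    "bible_sales": 0.02,
    "denomination_web_traffic": 0.01,
    "sunday_transport_usage": -0.01,
}

# Assumption: a proxy "corroborates" if it moved at least one-fifth
# as much as the headline metric. Tune this to your domain.
corroborating = [name for name, change in proxies.items()
                 if change >= headline_change / 5]

if headline_change > 0.2 and not corroborating:
    print("Red flag: large headline spike with no supporting external proxy")
```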


Counteracting Bias through Red Teaming

To avoid falling into the trap of confirmation bias, you must build dissent into your data science process.

One effective method is “Red Teaming.” In a Red Team exercise, you appoint a group of people (or a consultant) whose sole job is to prove the data wrong. Their goal is to find the flaws, identify the biases, and provide alternative explanations for the results. Personally, I love doing this exercise.

Ask the “What if?” questions:

  • What if these respondents are all bots?
  • What if the question was phrased in a leading way?
  • What if this spike is a seasonal fluke rather than a trend?

By actively looking for data that proves you wrong, you strengthen the integrity of your final conclusion. If the data survives a Red Team attack, you can present it with much higher confidence.

Faith as a Data Filter

The Bible Society incident highlights a unique challenge at the intersection of faith and data. Faith is, by definition, a belief in things unseen. It is a powerful internal narrative. However, when faith becomes a filter for empirical data, it acts as a permanent confirmation bias.

Religious or mission-driven organisations often look for signs of success to justify their work and encourage their supporters. This makes them particularly vulnerable to “good news” data. I remember working for one organisation (not the Bible Society, note!) where a team member was a committed Christian. I asked her if she’d checked the data was correct and she said yes, because she had prayed over it; God would have shown her if the data was wrong, and she didn’t need my help. After hearing this a few times, I gently explained that perhaps God had sent me to show her that the data was wrong, and that I was possibly the highly unlikely answer to her prayers. She sat in stunned silence as I picked my way through the data and pointed out the errors. I fixed the issues, and she didn’t say another word after that.

In a professional business intelligence context, we must separate our hopes from our observations. Whether you are a non-profit driven by faith or a corporation driven by a specific market vision, your desires shouldn’t influence your data validation. Stay objective, even when it hurts. The truth will come out in the end.

Lessons for the Future

The Bible Society has a chance to turn this failure into a blueprint for better governance. For any organisation moving forward, these should be the standard operating procedures:

  1. Demand Transparency: Ask your data provider for the raw, anonymised data, not just a summary report. You cannot perform due diligence on a PowerPoint deck or a PDF; a polished slide can only make bad data look pretty.
  2. Internal Validation: Run your own internal checks (descriptive stats and distributions) before any public release.
  3. The “Skepticism Threshold”: If a result looks too good to be true, assume it is. Set a higher threshold of evidence for “extraordinary” findings.
  4. Public Honesty: If you find an error, own it immediately. The Bible Society did eventually retract, but the goal should be to catch these errors before they reach the public domain. It is better to find out before other people do.
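The “skepticism threshold” in point 3 can be made concrete with a standard anomaly check: compare the new result against the historical baseline in standard deviations. The historical figures and the 3-sigma cut-off below are illustrative assumptions, not real attendance data.

```python
import statistics

# Hypothetical historical annual attendance figures (a long, slow decline).
history = [28.0, 27.4, 26.9, 26.1, 25.8, 25.2, 24.7, 24.1]
new_value = 36.0  # the "too good to be true" new reading

mean = statistics.mean(history)
stdev = statistics.stdev(history)
z = (new_value - mean) / stdev  # how many standard deviations from history

# Assumption: anything beyond 3 standard deviations is "extraordinary"
# and needs extraordinary evidence before publication.
if abs(z) > 3:
    print(f"Extraordinary finding (z={z:.1f}): hold release, escalate review")
```

A reading several sigma outside a decades-long trend is far more likely to be a data defect than a genuine reversal, so the default action is to escalate review, not to publish.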

Data is a tool, but it is also a liability if handled poorly. In an era where AI and automated polling are becoming more common, the risk of “garbage in, gospel out” has never been higher. I help organisations navigate these complexities. From AI strategy to core data governance, I ensure your insights are based on reality, not just resonance.

If you are concerned about the integrity of your datasets or need to build a more robust data and AI strategy, let’s talk.
