⚠️ When Data Becomes a Trap:
The McNamara Fallacy in Library Performance

A hidden danger in library analytics: when the numbers look right, but reality is wrong.

People often assume that working with national-scale library data is all about complexity.

They ask about machine learning models, pipelines, or transformations. But the real threat is much simpler, and far more dangerous.

The McNamara Fallacy

“The tendency to measure what is easily measurable, while ignoring what truly matters.”

🧠 When Data Starts Misleading Us

In modern libraries, we are no longer facing a data scarcity problem. We have dashboards, reports, and indicators everywhere.

But here is the uncomfortable truth:

More data does not automatically mean better understanding.

📊 Where the Trap Happens

📚 Collection Size vs Collection Value

Counting books is easy. Measuring whether they are actually used, relevant, and impactful is much harder. When we focus only on numbers, libraries may optimize for quantity — not usefulness.

🚶 Visitor Count vs Real Engagement

A high visitor number looks impressive. But does it reflect learning, literacy improvement, or meaningful interaction? Not necessarily.

📄 Compliance vs Service Quality

Reports are measurable. Human experience is not. When systems reward documentation more than impact, libraries risk becoming efficient on paper, but ineffective in reality.
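To make the collection-size point concrete, here is a minimal sketch. Every name and figure below is invented for illustration; the only point is that the same headline number can conceal very different realities.

```python
# Hypothetical illustration: two libraries report the same raw checkout
# count, yet their collections are used very differently.
# All figures are invented.

def usage_rate(checkouts: int, collection_size: int) -> float:
    """Checkouts per item held: a rough proxy for how useful the collection is."""
    return checkouts / collection_size

# Library A: large collection, lightly used.
a = usage_rate(checkouts=12_000, collection_size=200_000)

# Library B: smaller collection, heavily used.
b = usage_rate(checkouts=12_000, collection_size=40_000)

# Ranked by collection size alone, A "wins"; ranked by usage, B does.
print(f"A: {a:.2f} checkouts per item, B: {b:.2f} checkouts per item")
```

Which library is "better" depends entirely on which metric the dashboard happens to show.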

⚠️ The Real Risk: Data-Misled Policy

When metrics drift away from reality, decisions follow the wrong direction.

We think we are practicing evidence-based policy,
but we are actually practicing data-misled policy.

💬 Final Thought

The question is not:
“What can we measure?”

But:
“Are we measuring what truly matters?”

Because in the end, data is not the goal.
Impact is.