Databrain is a cognitive bias, a cultural pathology, and a failure of epistemology.
It emerges when data, especially quantitative, discrete, and legible data, is treated as inherently authoritative, determinative, or self-interpreting, regardless of what has been excluded, simplified, or misunderstood in the process. It represents a truncation of human knowing (an overreliance on the propositional) and a drift toward left-hemisphere-dominant processing, resulting in flawed reasoning, impaired judgment, and dangerous systemic blindness.
It is a form of simulated thinking: a misrecognition of processed outputs (data) as direct inputs from the world. It short-circuits sensemaking by replacing insight with spreadsheet artifacts, context with categories, and relevance with recency.
Databrain is defined not by the use of data, but by collapsing the world into what can be measured, tracked, and modeled, and then trusting that model more than reality itself. It is the mistaking of data for reality.
The remedy isn’t to reject data, but to reintegrate it into a fuller ecology of knowing, rooted in embodied skill, shared perspective, and attuned participation.
Symptoms of Databrain
- Treats metrics as meaning.
- Mistakes numbers for truth.
- Values quantity over quality.
- Mistakes the map for the territory.
- Discounts context, perspective, and value judgment.
- Prefers legibility over what is relevant and meaningful.
- Ignores long feedback loops or emergent, invisible dynamics.
- Deploys before understanding, then measures for harm too late.
- Believes data can speak for itself, ignoring who’s interpreting it, how, and why.