Published research from the Institute.
Summaries of ongoing work at the intersection of complexity science, evolutionary biology, and organizational adaptation.
Policy Lag in an Exponential World
Regulatory frameworks built for linear change fail under exponential conditions
Regulatory frameworks are designed retrospectively — they codify observed failures and established best practices. In a world where technological and social change was roughly linear, retrospective regulation worked: the next version of the technology resembled the last closely enough that last cycle's rules still applied. Under exponential change that assumption fails: successive versions diverge faster than rule-making cycles can track, and retrospective rules arrive already obsolete.
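The structural difference can be sketched with a toy model (all numbers here, the regulatory lag, the growth rate, and the doubling time, are hypothetical, chosen only to illustrate the argument, not taken from the article):

```python
def linear_coverage(t: float, lag: float) -> float:
    """Fraction of today's capability described by rules written `lag`
    years ago, when capability grows linearly. The uncovered gap
    (lag / t) shrinks toward zero as t grows, so old rules catch up."""
    return (t - lag) / t

def exponential_coverage(t: float, lag: float, doubling: float = 2.0) -> float:
    """Same fraction when capability doubles every `doubling` years.
    Simplifies to 2 ** (-lag / doubling): a constant gap that never
    closes, no matter how long the regime runs."""
    return 2 ** ((t - lag) / doubling) / 2 ** (t / doubling)
```

With a hypothetical 4-year rule-making lag and a 2-year doubling time, exponentially growing capability is always 75% uncovered, whereas under linear growth the uncovered share keeps shrinking over time.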
Resilience vs. Efficiency: The Tradeoff Hypernovelty Exposes
Decades of optimization for efficiency have eliminated the slack that resilience requires
Modern organizations are optimized for efficiency in stable conditions. Lean operations, just-in-time supply chains, minimal redundancy, and maximum utilization of human capacity are all efficiency strategies that work well when the environment is predictable. They are liabilities when it is not.
The Shrinking Half-Life of Organizational Knowledge
Strategic knowledge expires faster than organizations can replace it
The half-life of a strategic insight — the time after which it is as likely to be wrong as right — has collapsed from decades to months in many industries. Organizations built around planning cycles of 12–36 months are operating with knowledge that expires mid-cycle.
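The half-life framing is literal exponential decay, and the mid-cycle expiry claim follows directly from it. A minimal sketch (the 6-month half-life and 24-month planning cycle are hypothetical placeholders, not figures from the article):

```python
def p_still_valid(months_elapsed: float, half_life_months: float) -> float:
    """Probability an insight is still correct, modeled as exponential
    decay: at t = half_life the insight is as likely wrong as right."""
    return 0.5 ** (months_elapsed / half_life_months)

# An insight with a 6-month half-life inside a 24-month planning cycle:
# by the midpoint it is already more likely wrong than right.
for month in (0, 6, 12, 24):
    print(f"month {month:2d}: p(valid) = {p_still_valid(month, 6):.3f}")
```

The exact numbers matter less than the shape: once the half-life drops below the planning horizon, most of the cycle is executed on knowledge that has passed its own expiry date.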
Evolutionary Mismatch and the Attention Economy
Why human attention architecture is a structural liability in hypernovelty
Human attention evolved for environments characterized by salient physical threats, social bonding signals, and low-frequency novel stimuli. The modern information environment exploits each of these evolutionary adaptations simultaneously and at scale, producing a systematic mismatch between our attentional architecture and the environments we have constructed.
Cognitive Debt and the AI Acceleration Trap
How automation offloads thinking without transferring understanding
As AI systems absorb more of the cognitive load of knowledge work, organizations are accumulating cognitive debt — the atrophy of human judgment, contextual understanding, and domain expertise that cannot be rebuilt quickly when AI systems fail or encounter novel situations.