The pursuit of digital immortality and universal brain-computer interfaces has encountered a fundamental paradox. According to a report by Konstantin Willeke and his group, published on arXiv, the OmniMouse model is changing the rules of the game, but not in the way Silicon Valley is used to. Trained on 150 billion neural tokens, the model demonstrates an inversion of the familiar scaling law: in neuroscience, data volume matters more than parameter count.

While the AI industry spends billions on the assumption that adding model weights automatically yields more intelligence, OmniMouse points the other way. As the researchers note, performance improves reliably as new data is added, but quickly hits a ceiling when the model itself is made larger. For business, this is a wake-up call: even the comparatively simple mouse visual cortex is held back by a shortage of data, not of computing power. Attempts to simulate the human brain by simply throwing hardware at the problem no longer look viable.
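To make that asymmetry concrete, here is a minimal sketch assuming a Chinchilla-style error decomposition with a small parameter exponent and a larger data exponent. Every constant and exponent in it is an illustrative assumption chosen to mimic the reported pattern, not a value taken from the OmniMouse paper.

```python
# Illustrative only: the constants and exponents below are assumptions chosen
# to mimic "more data keeps helping, more parameters do not"; none of these
# numbers come from the OmniMouse paper.
def predicted_error(params_millions, tokens_billions,
                    floor=0.30,          # irreducible error no model can beat
                    a=0.10, alpha=0.08,  # weak, quickly saturating parameter term
                    b=0.50, beta=0.25):  # stronger, slower-saturating data term
    """Chinchilla-style decomposition: error = floor + A/N^alpha + B/D^beta."""
    return floor + a / params_millions**alpha + b / tokens_billions**beta

base        = predicted_error(params_millions=100, tokens_billions=150)
double_size = predicted_error(params_millions=200, tokens_billions=150)
double_data = predicted_error(params_millions=100, tokens_billions=300)

print(f"baseline error   : {base:.3f}")
print(f"2x parameters    : {double_size:.3f} (improvement {base - double_size:.4f})")
print(f"2x training data : {double_data:.3f} (improvement {base - double_data:.4f})")
```

Under these assumed exponents, doubling the training data reduces the error roughly six times more than doubling the parameter count, which is the qualitative pattern the researchers describe.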

OmniMouse achieves state-of-the-art performance across three regimes: brain activity prediction, behavioral decoding, and neural forecasting. According to Willeke, the model outperforms specialized systems, drawing on recordings from 3.1 million neurons collected over hundreds of sessions with laboratory mice. In our view, this is a clear signal for the neurotech sector: the era of a "zoo" of bespoke solutions is ending, giving way to universal biological processors.
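For a concrete picture of what those three regimes involve, the toy sketch below writes each one as an input-to-output mapping. The tensor shapes, function names, and placeholder outputs are assumptions for illustration only and do not reflect the paper's actual interface.

```python
# Toy sketch of the three evaluation regimes; shapes and names are illustrative
# assumptions, not the actual OmniMouse interface.
import numpy as np

rng = np.random.default_rng(0)

n_neurons, n_timesteps, n_behaviors = 500, 100, 3   # hypothetical session sizes
stimulus = rng.random((n_timesteps, 36, 64))        # e.g. video frames shown to the mouse
activity = rng.random((n_timesteps, n_neurons))     # recorded responses, one column per neuron

def predict_activity(stimulus_frames):
    """Regime 1: map sensory input to neural responses (encoding)."""
    return rng.random((stimulus_frames.shape[0], n_neurons))  # placeholder output

def decode_behavior(neural_activity):
    """Regime 2: map neural activity to behavioral variables (e.g. running speed)."""
    return rng.random((neural_activity.shape[0], n_behaviors))  # placeholder output

def forecast_activity(neural_activity, horizon=10):
    """Regime 3: predict the next `horizon` timesteps of activity from the past."""
    return rng.random((horizon, neural_activity.shape[1]))  # placeholder output

print(predict_activity(stimulus).shape)   # (100, 500)
print(decode_behavior(activity).shape)    # (100, 3)
print(forecast_activity(activity).shape)  # (10, 500)
```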

However, the current reality for investors looks sobering. Whereas in LLMs such as GPT-4 massive datasets make parameter scaling the primary driver of progress, in brain modeling even 150 billion tokens are not enough for emergent properties to appear. As the analysis suggests, recordings from roughly 3 million neurons are a drop in the ocean of cortical complexity. The hardware is ready for breakthroughs, but the biological data is not.
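As a back-of-envelope comparison (the LLM figure below is a ballpark for recent frontier text corpora, not a number from the paper), 150 billion neural tokens amount to roughly one percent of the data volume on which frontier language models are trained:

```python
# Order-of-magnitude comparison; the LLM corpus size is a rough ballpark for
# recent frontier models, not a figure from the OmniMouse paper.
neural_tokens = 150e9   # OmniMouse training corpus, per the article
llm_tokens = 15e12      # ballpark corpus size for recent frontier LLMs

print(f"neural corpus vs. LLM-scale corpus: {neural_tokens / llm_tokens:.1%}")  # -> 1.0%
```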

Management verdict: priority in neurotech is shifting from buying computing clusters to building proprietary systems for high-throughput data collection. Scaling for scaling's sake in neural modeling has hit a dead end. The winners will be the companies that solve the "information hunger" problem by automating neural recording. Until the data gap is closed, digital brain simulation will remain a costly exercise with extremely low ROI.

Tags: Artificial Intelligence, Neural Networks, AI Investment, Digital Transformation, OmniMouse