Neurosurgeon grows living neurons on electrodes to make AI models faster and cheaper — raises $25M seed
Feb 12, 2026 with Alexander Ksendzovsky
Key Points
- Biological Computing Company raises $25 million seed to deploy living neurons on electrode arrays as a practical efficiency layer for AI models, with a software adapter already integrated into video generation systems.
- The company's neurons consume far less energy than silicon and rewire themselves during learning, addressing AI's compute constraints by closing a 40-year gap between neuroscience and AI architecture that widened when backpropagation replaced biology-based principles.
- Ksendzovsky is targeting foundational model labs and hyperscalers constrained by compute, positioning the adapter as a drop-in tool to make existing models faster and cheaper rather than as speculative research.
Summary
Alexander Ksendzovsky, a neurosurgeon and neuroscientist with two decades of experience growing neurons on electrodes, is positioning biological computing as a practical solution to AI's energy crisis. His company grows living neurons on electrode arrays, about 5,000 electrodes per dish, and uses them to enhance artificial neural networks. Neurons consume far less energy than silicon and adapt by rewiring their connections during learning, whereas current hardware is rigid and static.
The company is already shipping two products. The first is a software adapter that plugs directly into artificial neural networks to improve their performance, speed, and cost; it is currently integrated into video generation models. The second is an algorithm discovery platform that uses real brain cells to identify architectural principles that might succeed transformers: the company builds state-of-the-art models in-house, identifies neuroscience principles that improve them, and feeds those insights back into the development loop.
Ksendzovsky frames the problem as a 40-year divergence in AI architecture. Early perceptrons in the 1950s mimicked neuroscience, but backpropagation, which became standard in the 1980s and whose pioneers later won a Nobel Prize, is not a real neuroscience principle. That mathematical break sent AI down a biologically implausible path. Neuroscience itself flourished separately, with researchers learning to grow brain cells in dishes and decode neural signals, a foundation for today's brain-computer interface companies. Ksendzovsky started his company three years ago to close that gap, applying current neuroscience understanding back into AI systems.
The adapter is live in video generation models. He is targeting foundational model labs constrained by compute (essentially all of them) and the research arms of hyperscalers interested in post-transformer architectures.
The company raised a $25 million seed round. Ksendzovsky says customers, collaborations, and design partnerships are underway, with plans to expand beyond video generation into broader foundational model optimization and next-generation algorithm discovery.