Innerphases is building human-native communication devices that interface directly with the nervous system
May 27, 2025 with Aidan Smith
Key Points
- Innerphases, founded by a former Neuralink researcher, is building a non-invasive neural interface that pairs weak biosignals with language models to infer user intent without recording external conversations.
- The startup targets a regulatory middle ground between consumer wearables and brain implants, positioning its device as a minimally invasive procedure rather than neurosurgery so it can avoid the FDA pathway Neuralink must follow.
- Innerphases applies EMG signal processing similar to the wristband controller for Meta's Orion AR prototype, but focuses on continuous speech synthesis and personal behavior modeling instead of discrete gesture control.
Summary
A former Neuralink researcher who led the company's speech synthesis project is building a non-invasive neural interface startup called Innerphases, pitched as a middle path between a consumer wearable and a brain implant.
The core technical bet is that neural signals captured non-invasively (EMG or similar) are inherently noisy and low-fidelity, but that this doesn't matter if they are paired with enough context. If a device picks up only the words "guy" and "left," a language model can infer the full query. The insight, borrowed from a former colleague, is that machine learning can extract value through data arbitrage or time arbitrage: by collecting signals continuously over long horizons, a model can build high-confidence predictions about intent even from weak inputs.
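Innerphases has not published its method, but the "time arbitrage" idea maps onto a standard statistical pattern: many independent readings that are each only slightly better than chance compound into a confident estimate. A minimal sketch, with entirely hypothetical names and numbers, accumulates weak per-sample intent probabilities via log-odds (naive Bayes):

```python
import math

def accumulate_evidence(per_sample_probs, prior=0.5):
    """Combine many weak, independent intent predictions into one
    estimate by summing log-odds (naive Bayes with a uniform prior).

    per_sample_probs: per-observation P(intent | signal), each of
    which may be only slightly better than a coin flip.
    """
    log_odds = math.log(prior / (1 - prior))
    for p in per_sample_probs:
        # Each noisy reading nudges the running belief a little.
        log_odds += math.log(p / (1 - p))
    # Convert accumulated log-odds back to a probability.
    return 1 / (1 + math.exp(-log_odds))

# Forty readings at 55% confidence each compound into near-certainty,
# while pure-chance readings leave the belief at the prior.
confidence = accumulate_evidence([0.55] * 40)
```

The independence assumption is doing a lot of work here; real biosignal samples are correlated, so a production system would discount repeated evidence. But the direction of the effect is the point the founder is making.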
The product gap he is targeting
He is skeptical of the current generation of ambient AI devices — Limitless pendants, smartwatches — because voice is a socially disruptive interface. Asking a device out loud who someone is mid-conversation is awkward. The goal is an interface that is subtle enough to be used continuously without invading the privacy of people around the user, capturing data about the user's own state and intent rather than recording external conversations.
The device concept runs on continuously collected biosignal data, processed locally to model the user over time. The dual output is a higher-bandwidth interface for interacting with LLMs and a steadily improving personal model of the user's behavior and intentions. He is withholding the specific sensing method, saying he will demo it in a few months.
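Nothing has been disclosed about how the personal model works. As one plausible sketch of "a model of the user that improves over time," here is an exponentially weighted profile of biosignal-derived features, updated locally on every observation; the class, feature names, and smoothing factor are all invented for illustration:

```python
class UserModel:
    """Hypothetical on-device personal model: an exponentially
    weighted profile of intent features, so recent observations
    dominate while long-run behavior still shapes predictions."""

    def __init__(self, alpha=0.1):
        self.alpha = alpha    # adaptation rate: higher = forgets faster
        self.profile = {}     # feature name -> smoothed weight

    def update(self, features):
        # features: dict of biosignal-derived cues observed this tick,
        # e.g. {"asks_about_nearby_person": 1.0}
        for name, value in features.items():
            old = self.profile.get(name, 0.0)
            self.profile[name] = (1 - self.alpha) * old + self.alpha * value

    def score(self, feature):
        # Current smoothed strength of a feature (0.0 if never seen).
        return self.profile.get(feature, 0.0)
```

Keeping state as a small dictionary of smoothed weights is one way such a model could stay entirely on-device, consistent with the local-processing framing above.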
Regulatory positioning
He wants to avoid Neuralink's FDA pathway if possible, describing a target device somewhere between a contraceptive implant and mole-removal surgery in terms of invasiveness and risk — minor, everyday procedures rather than neurosurgery. He remains openly bullish on Neuralink's long-term trajectory but sees a faster commercial path for something minimally invasive.
Meta's Reality Labs is the closest public reference point: its Orion AR prototype pairs the headset with a wristband that reads EMG signals for finger-pinch controls, built on technology from CTRL-labs, a startup Meta acquired in 2019. He frames Innerphases as applying a similar non-invasive signal approach to speech and continuous personal modeling rather than discrete gesture control.
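Meta has not published Orion's detection pipeline, but the textbook shape of discrete EMG gesture detection, as opposed to the continuous decoding Innerphases is after, is: rectify the raw signal, smooth it into an envelope, and fire an event when the envelope crosses a threshold. A toy version, with made-up window and threshold values:

```python
def detect_pinch(emg_samples, window=5, threshold=0.3):
    """Toy discrete EMG event detector (illustrative only).

    Rectifies the raw signal, smooths it with a moving-average
    envelope, and reports the start indices of windows where the
    envelope exceeds the threshold. Parameters are hypothetical.
    """
    rectified = [abs(s) for s in emg_samples]
    events = []
    for i in range(len(rectified) - window + 1):
        envelope = sum(rectified[i:i + window]) / window
        if envelope > threshold:
            events.append(i)
    return events
```

The contrast the founder draws is visible in the output type: a detector like this emits a handful of discrete events per second, whereas continuous speech synthesis needs a dense stream of decoded features, which is why the weak-signal-plus-context bet matters more for his use case.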
The company is pre-launch with no disclosed funding.