Shishir Mehrotra on Superhuman: proactive AI that acts before you ask
Oct 31, 2025 with Shishir Mehrotra
Key Points
- Superhuman rebrands Grammarly's parent company around four products—Grammarly, Coda, Mail, and new agent platform Superhuman Go—positioning AI as human augmentation rather than replacement.
- Superhuman Go opens Grammarly's cross-platform infrastructure to any agent, enabling proactive AI that surfaces without user prompts across a million web and mobile applications.
- Superhuman processes roughly 100 billion LLM calls weekly across 40 million users—far more per user than typical ChatGPT usage—through ambient inference triggered by keystrokes and app switches rather than explicit requests.
Summary
Shishir Mehrotra has rebranded Grammarly's parent company to Superhuman, restructuring it around four products: Grammarly, Coda, Mail (the email client formerly known as Superhuman), and a new product called Superhuman Go.
The rename follows a Google-to-Alphabet logic. Grammarly survives as a product brand, but the corporate entity is now Superhuman. Mehrotra's test for the name was twofold: broad enough to span a suite of products, and rooted in the word human rather than super. The empowerment framing matters to him because Grammarly has always positioned AI as augmenting the user rather than replacing them.
Superhuman Go is the material new development. It takes the cross-platform layer that lets Grammarly operate across roughly a million web, desktop, and mobile applications and opens it up as a general agent platform — not just a writing assistant. The pitch is that any agent can run on top of that infrastructure, embedded and proactive, appearing where users already work rather than requiring them to open a separate chat interface.
Mehrotra frames the competitive landscape in three buckets: chat players (ChatGPT and its peers, which ask you to come to them), do players (headless agents running background tasks, dominant in coding — he cites Anthropic's figure that 39% of their queries are now headless agentic tasks), and assist players, which is where Superhuman sits. The assist model is proactive: AI surfaces before you ask, triggered by keystrokes, document loads, and app switches.
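The assist pattern described here can be sketched in a few lines. This is a hypothetical illustration of event-triggered, debounced inference, not Superhuman's actual implementation; all names (`AssistTrigger`, `on_event`, `tick`) are made up for the example:

```python
import time
from typing import Callable

class AssistTrigger:
    """Fires an inference callback on ambient events (keystrokes,
    app switches, document loads), debounced so that a burst of
    typing costs one proactive call instead of one per key."""

    def __init__(self, infer: Callable[[str], None], debounce_s: float = 0.5):
        self.infer = infer              # downstream LLM call
        self.debounce_s = debounce_s    # quiet period before firing
        self._pending = None            # (context, timestamp) of last event

    def on_event(self, context: str) -> None:
        # Each ambient event just records context; no call yet.
        self._pending = (context, time.monotonic())

    def tick(self) -> None:
        # Driven by an event loop: fire once the burst has settled.
        if self._pending is None:
            return
        context, ts = self._pending
        if time.monotonic() - ts >= self.debounce_s:
            self._pending = None
            self.infer(context)  # proactive: surfaces before the user asks
```

The design choice the sketch highlights is that the user never issues a prompt: the trigger is the ambient event stream itself, with debouncing as the lever that keeps call volume (and cost) bounded.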
The scale claim is notable. Mehrotra says Superhuman runs roughly 100 billion LLM calls per week across 40 million users, which works out to roughly 2,500 calls per user per week, a few hundred per day, compared to perhaps a dozen for a heavy ChatGPT or Gemini user. The volume comes from ambient, invisible inference rather than explicit prompting.
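The back-of-envelope arithmetic behind that comparison, using the two figures quoted in the interview (the per-user breakdown and the "dozen prompts" baseline are this summary's own assumptions, not quoted numbers):

```python
# Figures from the interview
WEEKLY_LLM_CALLS = 100e9   # ~100 billion LLM calls per week
USERS = 40e6               # ~40 million users

# Assumed baseline for a heavy explicit-chat user (not a quoted figure)
HEAVY_CHAT_CALLS_PER_DAY = 12

calls_per_user_per_week = WEEKLY_LLM_CALLS / USERS        # 2,500
calls_per_user_per_day = calls_per_user_per_week / 7      # ~357
ratio = calls_per_user_per_day / HEAVY_CHAT_CALLS_PER_DAY # ~30x

print(f"~{calls_per_user_per_day:.0f} ambient calls/user/day, "
      f"~{ratio:.0f}x a heavy chat user")
```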
On the browser wars, he has no plans to compete. He argues browser fragmentation actually benefits Superhuman, since it makes a cross-browser, cross-platform assistant more valuable, not less.