17 Sep 2025
AlterEgo, an MIT Media Lab spin‑out, unveiled a wearable it bills as a “telepathic” interface that reads the neural signals the brain sends to the speech muscles before you actually speak. The startup calls the capability “Silent Sense”: sensors pick up the intent to speak (even with no visible movement), and a model maps those signals to words and actions. According to the report, early lab demos, which trace back to Arnav Kapur’s 2018 MIT work, achieved ~92% accuracy on limited vocabularies. Kapur now leads the company, which says its first mission is restoring communication for people with ALS, MS, and other speech‑impairing conditions.
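The report doesn’t describe AlterEgo’s model, but the pipeline it sketches — sensor signals in, words from a limited vocabulary out — is a standard closed‑vocabulary classification setup. The toy sketch below is purely illustrative of that idea; the vocabulary, feature dimension, and nearest‑centroid decoder are assumptions for demonstration, not the company’s method.

```python
import numpy as np

# Illustrative only: a toy closed-vocabulary decoder, NOT AlterEgo's model.
# Assumption: each silent "utterance" arrives as a fixed-length feature
# vector derived from neuromuscular sensor channels.

VOCAB = ["yes", "no", "lights on", "lights off"]  # hypothetical limited vocabulary
N_FEATURES = 16                                   # hypothetical feature dimension

rng = np.random.default_rng(0)

# Synthetic training data: one distinct signal pattern per word, plus noise.
prototypes = rng.normal(size=(len(VOCAB), N_FEATURES))

def synth_samples(word_idx: int, n: int = 50) -> np.ndarray:
    return prototypes[word_idx] + 0.3 * rng.normal(size=(n, N_FEATURES))

X = np.vstack([synth_samples(i) for i in range(len(VOCAB))])
y = np.repeat(np.arange(len(VOCAB)), 50)

# "Training": a nearest-centroid classifier -- one mean feature vector per word.
centroids = np.vstack([X[y == i].mean(axis=0) for i in range(len(VOCAB))])

def decode(signal: np.ndarray) -> str:
    """Map one feature vector to the closest word in the vocabulary."""
    dists = np.linalg.norm(centroids - signal, axis=1)
    return VOCAB[int(np.argmin(dists))]

# Simulate decoding a new silent "utterance" of "lights on".
sample = prototypes[2] + 0.3 * rng.normal(size=N_FEATURES)
print(decode(sample))  # -> "lights on" at this noise level
```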
Practical uses described include typing messages at the speed of thought, silently querying information, controlling smart devices without audible voice commands, and holding private conversations with other AlterEgo users (plus asking contextual questions via on‑device cameras). The company stresses that it doesn’t broadly “read minds”: it detects consciously intended speech signals, not arbitrary thoughts. Early‑access signups are open, but no pricing or shipping timeline has been disclosed as the team moves from lab demos toward productization.