A neurotechnology startup unveiled the first consumer-grade brain-computer interface this week — a lightweight headband that reads electrical brain activity through the scalp and translates it into commands for smartphones, computers, and smart home devices.
How It Works
The device uses an array of dry electroencephalography (EEG) sensors combined with on-device AI to decode neural patterns associated with specific intentions. Users can scroll, select, type, and navigate applications through focused thought alone. Initial reviews report accuracy rates high enough for everyday tasks, though new users face a learning curve of several hours before the device reliably recognizes their intentions.
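The article does not disclose the company's decoding method, but the general idea of mapping EEG band activity to commands can be sketched in a few lines. The following is a hypothetical, heavily simplified illustration: it estimates signal power in the alpha (~10 Hz) and beta (~20 Hz) bands from one scalp channel and maps the dominant band to a UI action. The function names, frequencies, and command labels are illustrative assumptions, not the product's actual pipeline.

```python
# Hypothetical sketch of an EEG intent-decoding step: filter a raw scalp
# signal down to band-power features, then map those features to a command.
# All names, thresholds, and band choices are illustrative.

import math

def band_power(samples, fs, freq):
    """Estimate signal power at one frequency via a single DFT bin."""
    n = len(samples)
    k = round(freq * n / fs)  # DFT bin nearest the target frequency
    re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
    im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
    return (re * re + im * im) / n

def decode_intent(samples, fs=256):
    """Map relative alpha (10 Hz) vs. beta (20 Hz) power to a UI command."""
    alpha = band_power(samples, fs, 10.0)
    beta = band_power(samples, fs, 20.0)
    # A real decoder would use many channels and a trained classifier;
    # this threshold rule is just the simplest possible stand-in.
    return "select" if beta > alpha else "idle"

# Synthetic one-second window dominated by a 20 Hz (beta-band) oscillation.
fs = 256
window = [math.sin(2 * math.pi * 20 * t / fs) for t in range(fs)]
print(decode_intent(window, fs))  # beta power dominates, so "select"
```

A production decoder would replace the threshold rule with a classifier trained per user, which is consistent with the several-hour learning curve reviewers describe.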
The company frames the launch as the beginning of a new chapter in human-computer interaction, predicting that the keyboard and touchscreen will eventually be seen as transitional technologies.
Privacy advocates have raised concerns about the collection of neural data, prompting the company to implement on-device processing with a strict no-cloud policy for raw brain signals.