Neurology
Technology
MedicalDevices
Paritosh Dubey
Brain-computer interfaces (BCIs) are shifting from lab demos to functional products. In 2025, non-invasive BCIs, particularly EEG-based wearables, now enable limited but real device control, while clinical implants are restoring communication and movement in patients with paralysis. This article organizes the landscape with clear definitions, market context, capabilities vs. limitations, and strategic implications, with sources linked throughout.
Non-invasive BCIs have crossed the usability threshold for basic control tasks (cursor moves, selections, simple macro triggers) in consumer-adjacent contexts, especially for accessibility and gaming. Companies like Neurable and NextMind pioneered accessible EEG interfaces for headsets and AR/VR control, inspiring a wave of wearables integrating EEG, eye tracking, and EMG for more robust signal fusion.
Medical-grade BCIs (mostly invasive) continue to hit major milestones: restored cursor typing rates that approach smartphone texting, neural decoding of speech from cortical signals, and spinal stimulation enabling walking in some paralysis cases. These set the ceiling, while non-invasive devices push the floor up.
Invasive BCI: implant electrodes in or on the brain (ECoG, microelectrode arrays). Highest signal quality; used in clinical research and early-stage neuroprosthetics.
Non-invasive BCI: read brain activity from outside the skull (EEG, fNIRS). Lower SNR but safe, cheaper, and wearable; suited for consumer, prosumer, and many accessibility use-cases.
Non-invasive EEG: Good for coarse intent and state detection (attention, workload, simple selection/control) with improvements from machine learning and better dry electrodes.
Typical information transfer rates are limited (a few bits per second in comfortable, real-world conditions), but they are enough for selection-based interfaces, dwell-based cursors, and macro triggers in AR/VR and desktop environments; a worked bit-rate example follows this list.
Invasive implants: Recordings from motor and speech cortex now support cursor control, synthetic speech, and handwriting decoding at tens of characters per minute, with ongoing work improving stability, longevity, and portability.
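To make the bit-rate claim above concrete, here is a minimal sketch of the standard Wolpaw information-transfer-rate calculation. The target count, accuracy, and selection pace below are illustrative assumptions, not measurements from any specific headset.

```python
import math

def wolpaw_itr_bits_per_selection(n_targets: int, accuracy: float) -> float:
    """Bits conveyed per selection under the standard Wolpaw ITR model."""
    if n_targets < 2 or not (0.0 < accuracy <= 1.0):
        raise ValueError("need >= 2 targets and accuracy in (0, 1]")
    if accuracy <= 1.0 / n_targets:
        return 0.0  # at or below chance level, no information is transferred
    bits = math.log2(n_targets) + accuracy * math.log2(accuracy)
    if accuracy < 1.0:
        bits += (1.0 - accuracy) * math.log2((1.0 - accuracy) / (n_targets - 1))
    return bits

# Illustrative numbers only: a 4-target EEG selector at 85% accuracy,
# one selection every 3 seconds.
bits = wolpaw_itr_bits_per_selection(n_targets=4, accuracy=0.85)
print(f"{bits:.2f} bits/selection, {bits / 3.0:.2f} bits/second")
```

With these assumed settings the interface conveys roughly a bit per selection; faster pacing, higher accuracy, or more targets pushes throughput toward the few-bits-per-second range cited above, which is exactly the regime where selection UIs, dwell cursors, and macros shine.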
Selected milestones and metrics
Non-invasive control: Early NextMind demos showed attention-based selection and basic control in VR environments, inspiring broader adoption of attention-as-input paradigms across EEG startups. Neurable developed EEG headphones for attention state tracking and control in digital environments, including integrations with productivity and VR apps.
Neurable headphone form factors emphasize “passive BCI” (detecting intent/state) plus “active BCI” (discrete control) to reduce user training and fatigue; see Neurable’s product pages and demos.
NextMind’s non-invasive dev kit demonstrated target-selection accuracy that allowed “mind-clicks” in VR; the team was acquired by Snap in 2022 to explore AR/VR interfaces (The Verge).
Clinical communication & control:
Neuralink reported the first human implant in early 2024 and later shared a patient using an implanted device to control a cursor and play games, marking a patient-facing demonstration of an implanted BCI in a home-like setting (Neuralink blog).
Academic groups (notably at Stanford and UCSF) published breakthrough speech BCIs achieving words-per-minute rates sufficient for basic conversation, using cortical signals to synthesize speech or text from attempted speech (Nature, 2023; UCSF overview).
The BrainGate consortium has repeatedly demonstrated cursor control and typing via microelectrode arrays, with continued progress on home use and reliability (BrainGate).
Restoring movement:
Systems combining brain signals with spinal cord stimulation have enabled brain-controlled walking in some patients with spinal cord injury, showing the power of brain–spine interfaces (Nature, 2023; WIMAGINE/EPFL/CHUV press).
Note: Exact bit rates and accuracy vary by system, electrode tech, training time, and task. Non-invasive BCIs reliably support low-bandwidth tasks; clinical implants lead in bandwidth but remain highly specialized.
Accessibility: Non-invasive wearables and hybrid systems (EEG + eye tracking) can enable pointer selection, basic keyboard input, and macro triggers for users with limited motor control. Clinical implants can achieve much higher throughput but are constrained by clinical pathways and cost.
Healthcare & rehabilitation: Monitoring cognitive load, detecting fatigue, biofeedback for ADHD/anxiety, and neurorehab protocols. Non-invasive signals are also used in sleep and mental wellness products.
AR/VR and gaming: Attention-based selection (“mind click”), context-aware UI adjustments, and cognitive-state-driven difficulty scaling. Expect more EEG integrated into headbands/headphones and VR headsets, marrying BCI with gaze and hand tracking for robust multimodal interfaces.
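As a sketch of the cognitive-state-driven difficulty scaling mentioned above: the controller below assumes some upstream EEG pipeline already emits an engagement score in [0, 1] about once per second; the smoothing factor and thresholds are invented for illustration, not taken from any shipping product.

```python
class EngagementDifficultyController:
    """Maps a smoothed engagement estimate to a coarse difficulty level."""

    def __init__(self, alpha: float = 0.3):
        self.alpha = alpha        # exponential-smoothing factor (illustrative)
        self.smoothed = 0.5       # start from a neutral baseline
        self.difficulty = "medium"

    def update(self, engagement: float) -> str:
        """Feed one engagement score in [0, 1]; return the current difficulty."""
        # Exponential moving average damps moment-to-moment EEG noise.
        self.smoothed = self.alpha * engagement + (1 - self.alpha) * self.smoothed
        # Wide dead bands so difficulty only shifts on clear, sustained changes.
        if self.smoothed > 0.7:
            self.difficulty = "hard"
        elif self.smoothed < 0.3:
            self.difficulty = "easy"
        elif 0.4 <= self.smoothed <= 0.6:
            self.difficulty = "medium"
        return self.difficulty

controller = EngagementDifficultyController()
for score in (0.55, 0.62, 0.75, 0.80, 0.82):   # fake engagement stream
    level = controller.update(score)
# With this stream the smoothed score climbs past 0.7, so level ends as "hard".
```

The same pattern (smooth first, act only on sustained shifts) applies to attention-aware notifications and UI adjustments, where false positives annoy users far more than a slightly slow response.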
Non-invasive players: Neurable (EEG wearables for attention/control), Emotiv (EEG headsets for research and productivity), OpenBCI (hardware + SDK; Galea for VR), Cognixion (accessibility-focused interfaces), and academic spinoffs targeting fNIRS/EEG hybrids. NextMind’s tech continues under Snap for AR R&D.
Invasive/clinical: Neuralink (implantable BCI), Synchron (endovascular stent-electrode array with less invasive placement), Precision Neuroscience (subdural interfaces), Paradromics (high-channel implants), and BrainGate (academic consortium).
Platforms and ecosystems: Toolchains (Matlab, Python/MNE, BrainFlow), SDKs (OpenBCI, Emotiv), and growing datasets enabling better ML models for signal decoding.
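For a feel of those toolchains, here is a minimal sketch using BrainFlow’s synthetic board, which generates simulated signals so no headset is required. It assumes a reasonably recent BrainFlow Python release; check the call names against the version you install before relying on them.

```python
import time

from brainflow.board_shim import BoardShim, BoardIds, BrainFlowInputParams
from brainflow.data_filter import DataFilter

# The synthetic board produces simulated data; swap in a real BoardIds value
# (plus serial port / connection parameters) for actual hardware.
board_id = BoardIds.SYNTHETIC_BOARD
board = BoardShim(board_id, BrainFlowInputParams())

board.prepare_session()
board.start_stream()
time.sleep(5)                      # collect roughly five seconds of data
data = board.get_board_data()      # 2-D array: rows are channels, columns are samples
board.stop_stream()
board.release_session()

eeg_channels = BoardShim.get_eeg_channels(board_id)
sampling_rate = BoardShim.get_sampling_rate(board_id)

# Average power in the delta/theta/alpha/beta/gamma bands across EEG channels;
# the final True asks BrainFlow to filter the signal before estimating power.
band_powers, _stddevs = DataFilter.get_avg_band_powers(
    data, eeg_channels, sampling_rate, True
)
for name, power in zip(("delta", "theta", "alpha", "beta", "gamma"), band_powers):
    print(f"{name}: {power:.3f}")
```

Band powers like these (especially alpha and theta ratios) are the usual raw material for the attention and workload estimates discussed throughout this article.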
Useful primers and overviews
IEEE Spectrum BCI trackers and features.
Nature/Science feature coverage on speech BCIs and brain–spine interfaces (Nature news).
FDA and regulatory pathways for neural interface devices (FDA CDRH).
What non-invasive BCIs can do reliably:
Detect attention/engagement levels; trigger selections (“mind clicks”), simple directional control, and macro activation in controlled settings (a minimal trigger sketch follows below).
Support accessibility tasks when combined with assistive tech (eye tracking, switches) to reduce input burden.
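A minimal sketch of what a “mind click” trigger can look like, assuming an upstream decoder already outputs a selection-confidence score in [0, 1]; the threshold, hold length, and refractory period are illustrative choices, not parameters from any named product.

```python
from collections import deque


class MindClickTrigger:
    """Fire a click when the decoder's confidence stays high for a short hold
    window, then enforce a refractory period so one sustained burst cannot
    produce repeated clicks."""

    def __init__(self, threshold: float = 0.75, hold_samples: int = 5,
                 refractory_samples: int = 25):
        self.threshold = threshold
        self.recent = deque(maxlen=hold_samples)   # last few confidence scores
        self.refractory_samples = refractory_samples
        self._cooldown = 0

    def update(self, confidence: float) -> bool:
        """Feed one decoder score per sample; return True when a click fires."""
        if self._cooldown > 0:
            self._cooldown -= 1
            return False
        self.recent.append(confidence)
        if (len(self.recent) == self.recent.maxlen
                and all(c >= self.threshold for c in self.recent)):
            self.recent.clear()
            self._cooldown = self.refractory_samples
            return True
        return False
```

The hold window trades a little latency for far fewer accidental activations, which matters more than raw speed in accessibility use.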
What they struggle with:
Continuous, high-precision control without assistance; variable signal quality with hair type, motion, and sweat; calibration fatigue; and lower bit rates.
What invasive BCIs can do:
High-bandwidth control for cursor typing, speech decoding, and motor prosthetic control; still limited by surgery, longevity, and medical follow-up.
For product teams
Design for multimodal input: Combine EEG with gaze, head pose, hand tracking, or EMG to boost accuracy and comfort. EEG alone is fragile; fusion yields an experience users can trust (see the fusion sketch after this list).
Shift from “telepathic control” to “intent-aware assistance”: Use BCI to enhance context (attention, fatigue), not just explicit commands. This lowers error costs and increases user satisfaction.
Focus on low-friction onboarding: Minimize calibration; deliver immediate value (e.g., attention-aware notifications, dwell-based selection) before advanced features.
Prioritize privacy and on-device processing: Biosignals are sensitive. Offer transparent policies, edge inference, and secure data export options.
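As a concrete (and entirely hypothetical) version of the fusion advice above: gaze dwell proposes a target and a thresholded EEG confidence confirms the click, so neither noisy channel can fire a selection on its own. The `GazeSample` shape, dwell time, and threshold are assumptions for illustration, not any vendor's API.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class GazeSample:
    target_id: Optional[str]   # UI element currently under the gaze point, if any
    timestamp: float           # seconds


class GazeEegSelector:
    """Hypothetical fusion rule: gaze dwell proposes, EEG confirms."""

    def __init__(self, dwell_seconds: float = 0.6, eeg_threshold: float = 0.8):
        self.dwell_seconds = dwell_seconds
        self.eeg_threshold = eeg_threshold
        self._candidate: Optional[str] = None
        self._dwell_start = 0.0

    def update(self, gaze: GazeSample, eeg_confidence: float) -> Optional[str]:
        """Return a target id when a selection fires, otherwise None.

        `eeg_confidence` stands in for whatever 0..1 score an upstream decoder
        assigns to "the user intends to select right now".
        """
        if gaze.target_id != self._candidate:
            # Gaze moved to a different element: restart the dwell timer.
            self._candidate = gaze.target_id
            self._dwell_start = gaze.timestamp
            return None
        dwelled = (self._candidate is not None
                   and gaze.timestamp - self._dwell_start >= self.dwell_seconds)
        if dwelled and eeg_confidence >= self.eeg_threshold:
            selected = self._candidate
            self._candidate = None     # require a fresh dwell before the next click
            return selected
        return None
```

Gaze supplies the pointing precision EEG lacks, while the EEG score acts as a low-false-positive confirm, which is why fused interfaces tend to feel more trustworthy than EEG-only control.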
For healthcare and accessibility stakeholders
Non-invasive BCIs are viable adjuncts today: Pair EEG with established assistive technologies for meaningful gains in communication and control.
Clinical BCIs are expanding indications: Watch for FDA designations, reimbursement frameworks, and home-use trials; partner early to shape workflows and training.
For investors and ecosystem partners
Near-term revenue: Prosumer and enterprise (training, rehab, productivity, VR labs), SDK licensing, and partnerships with headset OEMs.
Medium-term catalysts: FDA clearances for specific indications, hybrid signal devices, and integrated AR/VR products with embedded EEG and eye tracking.
Long-term upside: Speech BCIs, high-throughput implants, and brain–spine interfaces; requires patience and regulatory muscle.
What to watch
FDA milestones for implantable systems (Neuralink, Synchron, Precision Neuroscience) and non-invasive digital therapeutics using EEG/fNIRS.
Headset integrations (VR/AR devices embedding EEG sensors) that normalize BCI-lite controls in consumer products.
Decoding advances from foundation models trained on large multimodal neuro datasets; better generalization could reduce calibration time.
Privacy frameworks and neuro-rights legislation shaping data access and portability.
Company and product
Neurable: Product overview and demos – https://neurable.com/
NextMind acquired by Snap (2022) – https://www.theverge.com/2022/3/23/22992750/snap-acquires-nextmind-brain-computer-interface-ar-vr
OpenBCI Galea and SDK – https://openbci.com/
Emotiv EEG systems – https://www.emotiv.com/
Clinical and research
UCSF speech prosthesis updates (2023) – https://www.ucsf.edu/news/2023/08/426291/scientists-develop-speech-prosthesis-paralysis
BrainGate consortium publications and demos – https://www.braingate.org/
Nature coverage: brain–spine interface enabling walking – https://www.nature.com/articles/d41586-023-01651-3
IEEE Spectrum BCI coverage and trackers – https://spectrum.ieee.org/brain-computer-interface
FDA medical device resources for neural interfaces – https://www.fda.gov/medical-devices/
Regulatory and ethics
Neuro-rights and data governance discussions (OECD/UNESCO primers) – https://www.oecd.org/sti/emerging-tech/ and https://unesdoc.unesco.org/
Notes on metrics and claims
Non-invasive BCIs today typically support low-bandwidth control suitable for selection-based tasks. Published clinical studies for invasive systems show far higher throughput for speech and cursor control, but with surgical requirements and long-term support needs.
Vendor performance numbers vary; always evaluate in the target environment (movement, sweating, hair, and lighting affect EEG quality).