Mark Jackson has ALS. He can't move his hands, speak clearly, or use traditional input devices. But he just controlled an iPad with nothing but his thoughts—navigating apps, composing messages, and accessing digital content through pure neural intent. This isn't science fiction. It's Apple's new Brain-Computer Interface Human Interface Device protocol working with Synchron's Stentrode implant, and it represents one of the most significant breakthroughs to date in direct neural control of consumer technology.
Synchron's Stentrode implant sits in Mark's superior sagittal sinus—the main vein over his motor cortex—capturing neural signals without requiring open brain surgery. The device resembles a conventional stent but carries a mesh of electrodes, navigating through blood vessels to position those electrodes directly above the brain's motor regions. When Mark thinks about moving his finger, the Stentrode captures those neural firing patterns and wirelessly transmits them to an external decoder.
This endovascular approach represents a crucial advantage over competitors like Neuralink, which requires invasive craniotomy procedures. Synchron's method uses catheter-based insertion through the jugular vein, making it significantly safer and more scalable for widespread adoption. The company has now implanted Stentrodes in 10 patients across the United States and Australia under an FDA Investigational Device Exemption.
Apple's breakthrough isn't just supporting brain-computer interfaces—it's treating neural signals as a fundamental input method equivalent to keyboards, mice, or touchscreens. The BCI Human Interface Device protocol creates a standardized "language" that allows any brain-computer interface to communicate with Apple devices seamlessly.
Traditional assistive technologies merely replicate physical inputs—pressing virtual buttons or simulating mouse clicks. Apple's BCI HID enables bidirectional communication between the brain interface and the device. The iPad can share contextual information about what's displayed on screen, allowing the neural decoder to optimize performance based on current interface elements and user focus.
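Apple has not published the BCI HID wire format, so the exchange can only be sketched. Assuming, as the article describes, that the device shares on-screen context with the decoder and the decoder returns decoded intents, the round trip might look roughly like this—every type name, field, and number below is hypothetical:

```python
from __future__ import annotations
from dataclasses import dataclass

# Hypothetical message types: Apple has not published the BCI HID
# protocol, so these names and fields are illustrative only.

@dataclass
class UIContext:
    """Context the device shares with the decoder (device -> decoder)."""
    focusable_elements: list[str]  # e.g. elements currently on screen

@dataclass
class NeuralInputEvent:
    """Decoded intent sent back to the device (decoder -> device)."""
    target: str
    confidence: float  # 0.0 - 1.0

def decode_intent(raw_scores: dict[str, float], context: UIContext,
                  threshold: float = 0.6) -> NeuralInputEvent | None:
    """Restrict decoding to elements actually on screen -- the
    advantage that bidirectional context sharing provides."""
    valid = {t: s for t, s in raw_scores.items()
             if t in context.focusable_elements}
    if not valid:
        return None
    target, score = max(valid.items(), key=lambda kv: kv[1])
    if score < threshold:
        return None
    return NeuralInputEvent(target=target, confidence=score)

ctx = UIContext(focusable_elements=["Home", "Messages", "Settings"])
# "Camera" scores highest but is not on screen, so context filters it out.
scores = {"Home": 0.35, "Messages": 0.82, "Camera": 0.91}
event = decode_intent(scores, ctx)
print(event)
```

The design point is the filtering step: because the device tells the decoder what is actually selectable, the decoder never wastes probability mass on impossible targets, which is exactly what button-press emulation cannot do.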
The system integrates with Apple's existing Switch Control accessibility feature, providing visual feedback that shows Mark the strength of his neural signals in real-time. As he focuses his thoughts on specific interface elements, colored overlays indicate signal intensity, creating a closed-loop system that improves with practice.
While Synchron's current focus targets the 150,000 Americans with severe upper limb disabilities, the implications extend far beyond medical applications. Apple's decision to build BCI HID support directly into iOS, iPadOS, and visionOS signals recognition that neural interfaces represent the next evolution of human-computer interaction.
The technology also suggests that direct neural control could be more private and secure than traditional inputs. Brain signals are highly individual and difficult for external parties to observe or replicate. Unlike passwords, biometrics, or behavioral patterns, neural signatures could serve as an authentication factor that is exceptionally hard to steal—though their security properties remain an open research question rather than a settled guarantee.
Dr. Tom Oxley, Synchron's CEO, positions this as "a next-generation interface layer" rather than specialized medical equipment. The company's vision extends toward making brain-computer interfaces "ubiquitous like the keyboard and the mouse"—suggesting neural control could eventually become a standard interaction method for general computing.
The Stentrode system employs sophisticated machine learning algorithms to translate Mark's neural intentions into precise digital commands. When he thinks about tapping his index finger, the system doesn't just detect the signal—it learns his specific neural patterns and improves accuracy over time.
The AI component can distinguish between different types of intended movements, allowing for complex interactions through simple thought patterns. Mark can assign specific neural intentions to custom shortcuts—thinking about finger movements to return to the home screen, or focusing on hand gestures to compose messages. Each user's neural signature becomes a personalized control language.
This learning capability extends to the closed-loop feedback system. As Mark uses the device, it builds a model of his neural response patterns, optimizing the interface to match his specific cognitive rhythms and signal strengths. The more he uses it, the more intuitive and responsive the control becomes.
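Synchron has not published its decoding algorithms, but the adaptation the article describes—building a per-user model that improves with feedback—can be sketched with a deliberately simple stand-in: a nearest-centroid classifier whose centroids drift toward each confirmed example. Every class name and value here is hypothetical:

```python
from __future__ import annotations

# Minimal stand-in for per-user adaptation. The real Stentrode decoder
# is unpublished; this nearest-centroid sketch only illustrates the
# idea of a model that refines itself from confirmed examples.

class AdaptiveDecoder:
    def __init__(self, learning_rate: float = 0.2):
        self.centroids: dict[str, list[float]] = {}
        self.lr = learning_rate

    def classify(self, features: list[float]) -> str | None:
        """Return the intent whose stored centroid is closest."""
        if not self.centroids:
            return None
        def sq_dist(c: list[float]) -> float:
            return sum((a - b) ** 2 for a, b in zip(features, c))
        return min(self.centroids, key=lambda k: sq_dist(self.centroids[k]))

    def update(self, intent: str, features: list[float]) -> None:
        """Nudge the intent's centroid toward a confirmed example, so
        accuracy improves the more the user practices."""
        if intent not in self.centroids:
            self.centroids[intent] = list(features)
            return
        c = self.centroids[intent]
        self.centroids[intent] = [a + self.lr * (b - a)
                                  for a, b in zip(c, features)]

decoder = AdaptiveDecoder()
decoder.update("tap_index_finger", [1.0, 0.1])  # calibration examples
decoder.update("home_screen", [0.1, 1.0])
print(decoder.classify([0.9, 0.2]))  # closest to the finger-tap centroid
```

The `update` step is the closed loop in miniature: each confirmed selection pulls the stored pattern toward how this user actually produces the signal, which is why the same hardware behaves like a personalized control language for each patient.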
One of the most remarkable aspects of Mark's demonstration is its complete autonomy. The Stentrode system operates wirelessly via Bluetooth without requiring additional hardware, caregiver assistance, or complex setup procedures. When Mark wants to use his iPad, he simply thinks about it, and the device responds.
This autonomous operation represents a crucial breakthrough for brain-computer interface adoption. Previous systems required extensive technical support, calibration procedures, or physical connections that limited practical usability. Synchron's approach eliminates these barriers, creating a neural interface that works as seamlessly as picking up a smartphone.
The wireless architecture also enables broader applications beyond tablets. Mark has previously controlled Apple Vision Pro headsets through thought, and the system can extend to iPhones, smart home devices, and any technology that supports the BCI HID protocol. This expandability suggests neural control could become a universal interface method.
Synchron became the first BCI company to begin clinical testing of permanently implantable systems in 2019, and now leads the field in practical neural interface deployment. With commercial approval expected by 2030, the company is positioned to bring thought-controlled computing to mainstream markets within this decade.
The collaboration with Apple accelerates this timeline significantly. By integrating with the world's most popular consumer technology platform, Synchron gains access to existing infrastructure, developer ecosystems, and user bases that would take decades to build independently. Apple's commitment to BCI HID suggests neural interfaces will become standard platform features rather than specialized medical devices.
The technical foundation is already expanding beyond individual devices. Synchron envisions neural interfaces controlling smart home environments, autonomous vehicles, and industrial systems. The same neural patterns that allow Mark to navigate his iPad could eventually operate entire digital ecosystems through pure thought.
Mark Jackson's demonstration represents more than assistive technology—it's proof that direct neural control of digital devices has moved from laboratory concept to practical reality. The fact that he can navigate complex interfaces, compose messages, and operate applications through thought alone signals a fundamental shift in human-computer interaction.
The technology that enables Mark to regain digital independence today will likely become available to broader populations tomorrow. Neural interfaces offer potential advantages beyond accessibility: they require neither hands nor voice, they are inherently personal to each user, and as decoders improve they could one day rival typing and gesturing for speed.
As Apple rolls out BCI HID support across its platforms in 2025, thought-controlled computing will transition from experimental medical technology to a mainstream interface option. The same neural patterns that allow Mark to control his iPad could eventually become as common as touchscreen gestures are today.