This Startup Wants to Put Its Brain-Computer Interface in the Apple Vision Pro


Startup Cognixion announced today that it is launching a clinical trial of its wearable brain-computer interface technology integrated with the Apple Vision Pro to help paralyzed people with speech disorders communicate with their thoughts.

Cognixion is one of several companies, including Elon Musk’s Neuralink, that are developing brain-computer interfaces, or BCIs: systems that capture brain signals and translate them into commands to control external devices.

While Neuralink and others are working on implants that are surgically placed in the head, Cognixion’s technology is noninvasive. The Santa Barbara, California, company is testing both a software component (an augmented reality BCI app) and a hardware add-on (a custom headband that can read brain signals) with the Vision Pro. The trial will include up to 10 participants in the US with speech impairments due to paralysis from spinal cord injury, stroke, traumatic brain injury, or amyotrophic lateral sclerosis, also known as ALS or Lou Gehrig’s disease.

Cognixion’s goal is to get BCI technology to as many people as possible, and it sees the Vision Pro as a way to do that. “In order to democratize access, you need to do it in such a way that’s the least risky and the most acceptable for adoption for the majority of people,” says Andreas Forsland, the company’s CEO.

Forsland started Cognixion after his mother got sick with pneumonia and was intubated in the ICU. She was fully conscious of everything going on around her but was unable to speak; Forsland became her communication partner when she was in the hospital.

“As a result of that, I experienced tremendous breakdowns of communication between her and her care providers, where I had to intervene,” he says. He started thinking about how people with speech motor disabilities need better ways to communicate.

The company has already designed its own headset, called the Axon-R, and tested it with ALS patients earlier this year. Its custom software uses generative AI models trained on an individual user’s speech patterns. Together, the headset and software enabled participants to “speak” through the device at a rate approaching normal conversational speed. That study showed that patients could comfortably use the BCI for a few hours a day, several times a week.

Now, Cognixion is bringing its AI communication app to the Vision Pro, which Forsland says has more functionality than the purpose-built Axon-R. “The Vision Pro gives you all of your apps, the app store, everything you want to do,” he says.

Apple opened the door to BCI integration in May, when it announced a new protocol to allow users with severe mobility disabilities to control the iPhone, iPad, and Vision Pro without physical movement. Another BCI company, Synchron, whose implant is inserted into a blood vessel adjacent to the brain, has also integrated its system with the Vision Pro. (Apple is not known to be developing its own BCI.)

In Cognixion’s trial, the company has swapped out Apple’s headband for its own, which is embedded with six electroencephalographic, or EEG, sensors. These collect signals from the brain’s visual and parietal cortices, toward the back of the head. Specifically, Cognixion’s system identifies visual fixation signals, which occur when a person holds their gaze on an object. This allows users to select from a menu of options in the interface using mental attention alone. A neural computing pack worn at the hip processes the brain data outside of the Vision Pro.
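The article doesn’t spell out how fixation is decoded, but a common way noninvasive BCIs implement gaze-based selection is frequency tagging (often called SSVEP in the research literature): each on-screen option flickers at its own rate, and the option the user fixates on dominates the EEG power spectrum. The sketch below illustrates that general idea with a few lines of Python; the sampling rate, channel count, and candidate frequencies are illustrative assumptions, not details of Cognixion’s system.

```python
# Minimal sketch of frequency-tagged gaze selection from EEG. This is an
# assumed SSVEP-style approach for illustration, not Cognixion's pipeline.
import numpy as np
from scipy.signal import welch

FS = 250.0                      # assumed EEG sampling rate in Hz
OPTION_FREQS = {                # assumed flicker frequency per menu option
    "yes": 8.0,
    "no": 10.0,
    "more options": 12.0,
}

def pick_fixated_option(eeg: np.ndarray) -> str:
    """eeg: array of shape (channels, samples) from occipital/parietal sensors."""
    # Average the power spectral density across channels.
    freqs, psd = welch(eeg, fs=FS, nperseg=512, axis=-1)
    mean_psd = psd.mean(axis=0)

    scores = {}
    for label, f0 in OPTION_FREQS.items():
        # Sum power in a narrow band around each option's flicker frequency.
        band = (freqs >= f0 - 0.5) & (freqs <= f0 + 0.5)
        scores[label] = mean_psd[band].sum()
    return max(scores, key=scores.get)

if __name__ == "__main__":
    # Synthetic example: 6 channels, 4 seconds of noise with a 10 Hz component.
    t = np.arange(0, 4, 1 / FS)
    rng = np.random.default_rng(0)
    eeg = rng.normal(size=(6, t.size)) + 0.5 * np.sin(2 * np.pi * 10.0 * t)
    print(pick_fixated_option(eeg))   # expected: "no"
```

In a real system the hard part is exactly what Kao describes later in this story: pulling those frequency signatures out of weak, noisy scalp recordings quickly and reliably enough to feel like conversation.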

“The philosophy of our approach is around reducing the amount of burden that is being generated by the person’s communication needs,” says Chris Ullrich, Cognixion’s chief technology officer.

Current communication tools can help but aren’t ideal. For instance, low-tech handheld letterboards allow patients to look at certain letters, words, or pictures so that a caregiver can guess their meaning, but they’re time-consuming to use. And eye tracking technology is still expensive and not always reliable.

“We actually build an AI for each individual participant that is customized with their history of speaking, their style of their humor, anything they’ve written, anything they’ve said, that we can gather. We crunch all that down into something that is a user proxy,” Ullrich says.
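Ullrich doesn’t describe the architecture behind that “user proxy,” but one straightforward way to realize the idea is to condition a language model on excerpts of the person’s own writing and speech when proposing replies. The sketch below shows that pattern; the class, function names, and the pluggable generate() callable are assumptions for illustration, not Cognixion’s implementation.

```python
# Illustrative sketch of a per-user "proxy" that conditions reply suggestions
# on a person's own writing and speech history. The generate() callable is a
# hypothetical stand-in for whatever language model the system actually uses.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class UserProxy:
    name: str
    history: List[str] = field(default_factory=list)  # things the user has written or said

    def add_samples(self, samples: List[str]) -> None:
        self.history.extend(samples)

    def build_prompt(self, incoming: str, n_suggestions: int = 3) -> str:
        # Keep only recent material so the prompt stays small; a real system
        # might summarize the corpus or retrieve the most relevant excerpts.
        excerpts = "\n".join(f"- {s}" for s in self.history[-20:])
        return (
            f"You are drafting short replies on behalf of {self.name}.\n"
            f"Match their vocabulary, tone, and humor, based on these samples:\n"
            f"{excerpts}\n\n"
            f'They were just asked: "{incoming}"\n'
            f"Propose {n_suggestions} brief replies they might choose from."
        )

    def suggest(self, incoming: str, generate: Callable[[str], List[str]]) -> List[str]:
        return generate(self.build_prompt(incoming))

if __name__ == "__main__":
    proxy = UserProxy("R.")
    proxy.add_samples(["I try to see the good in every situation."])
    # A trivial stand-in generator so the sketch runs without any model.
    print(proxy.suggest("How do you stay so positive?",
                        lambda prompt: ["Faith, mostly.", "One day at a time."]))
```

The payoff of this kind of personalization is speed: instead of spelling out a sentence letter by letter, the user only has to confirm or lightly edit a suggestion that already sounds like them.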

From inside the headset, users are presented with AI-generated suggestions, or they can compose their own. Cognixion’s system can also use eye tracking or head movement for control, which the Vision Pro enables, but users with severe motor disabilities may not be able to use those options. That’s where reading brainwaves comes in.

In a video of one participant in the Axon-R study, a rabbi with ALS, he’s asked, “How do you stay so positive?” Using Cognixion’s technology, he answers, “I try to see the good in every situation.”

Cognixion thinks its noninvasive approach will also reach patients faster than any of the implanted brain chips in development. For one, safety is a no-brainer—literally. To get clearance as a medical device, it will still need to prove efficacy. That will require a bigger pivotal trial with around 30 patients, Ullrich says. The Food and Drug Administration would likely evaluate the device based on user-friendliness and how much it improves a person’s quality of life.

Jonathan Kao, a BCI researcher at UCLA who isn’t involved with Cognixion, sees enormous potential in noninvasive BCIs because of the risks that come with surgically implanted devices. But the main challenge is that the brain signals captured by noninvasive systems are much weaker and noisier than those picked up by implants.

“No one has been able to put out a consumer device that matches the performance of an invasive device or even comes close,” Kao says. “The signal quality is very poor, so it’s a challenge to decode quickly, reliably, and robustly.”

It makes for a frustrating and slow user experience. But Kao says AI copilots like the one Cognixion and his own lab are developing could bridge that gap in performance, making noninvasive systems more usable for patients.

“If we’re able to do that,” Kao says, “we could have a widespread device that doesn’t require surgery.”
