March 8, 2025

UC Researchers Help Paralyzed Man Control Robotic Arm

Photo source: pexels.com

UC San Francisco researchers have achieved a milestone by enabling a paralyzed man to manipulate a robotic arm via an AI-enhanced brain-computer interface (BCI).

This innovation sets itself apart by maintaining accuracy over a remarkable seven-month period, a feat that has eluded previous systems and brings hope for restoring movement to paralyzed individuals.

A Breakthrough in Brain-Computer Interface Technology

While traditional brain-computer interfaces required constant recalibration due to shifting brain activity, UC San Francisco’s Weill Institute for Neurosciences has made a breakthrough with an AI-powered BCI that adapts to these fluctuations. Dr. Karunesh Ganguly, the neurologist leading the study, shared,

“The key was the discovery of how activity shifts in the brain day to day as a study participant repeatedly imagined making specific movements.” He added,

“Once the AI was programmed to account for those shifts, it worked for months at a time.”

This advancement enabled a participant, paralyzed by a stroke, to control a robotic arm for a period far longer than ever before with similar technology.

How the AI-Enhanced System Works

The brain-computer interface relies on sensors placed on the surface of the participant’s brain to detect neural signals when he imagines moving different body parts. These signals are then translated by an AI system into commands that control a robotic arm.
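At a high level, the loop the article describes — neural signals in, arm commands out — can be sketched in a few lines of Python. Everything below (the feature vectors, the template-matching classifier, the command names) is a simplified illustration of that idea, not the study's actual decoder.

```python
# Toy sketch of a BCI decoding loop: imagined-movement signals are
# matched against known patterns and translated into arm commands.
# All names and numbers here are illustrative, not from the UCSF study.

def correlate(a, b):
    """Simple similarity score between two feature vectors."""
    return sum(x * y for x, y in zip(a, b))

# Hypothetical "templates": average neural features recorded while the
# participant imagined each movement during training.
TEMPLATES = {
    "reach_forward": [0.9, 0.1, 0.0],
    "close_grip":    [0.1, 0.8, 0.2],
    "rotate_wrist":  [0.0, 0.2, 0.9],
}

def decode(signal):
    """Translate a neural feature vector into the best-matching arm command."""
    return max(TEMPLATES, key=lambda cmd: correlate(signal, TEMPLATES[cmd]))

# A noisy reading that most resembles the "close_grip" pattern:
print(decode([0.2, 0.7, 0.3]))  # → close_grip
```

In the real system the "features" come from sensors on the brain's surface and the decoder is a learned AI model, but the shape of the problem — classify the imagined movement, then issue the matching command — is the same.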

The participant first trained with a virtual robotic arm to improve control, allowing the AI to learn how to interpret his brain activity. “Eventually, he got the virtual arm to do what he wanted it to do,” researchers observed.

After transitioning to a real robotic arm, the participant was able to perform intricate tasks such as picking up blocks, turning them, and even retrieving a cup from a cabinet to hold it up to a water dispenser.

Adapting to Brain Activity Shifts

Brain-computer interfaces have often struggled with the natural fluctuation of brain activity, but Dr. Ganguly and his team discovered that while brain activity patterns stayed the same in shape, their location shifted slightly each day.

To solve this, the AI model was designed to adapt continuously to these daily changes, preserving the system’s accuracy. “This blending of learning between humans and AI is the next phase for these brain-computer interfaces,” said Ganguly. “It’s what we need to achieve sophisticated, lifelike function.” Months later, the participant demonstrated the system’s lasting viability, using the robotic arm after just a 15-minute “tune-up.”
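The team's key observation — the pattern keeps its shape but drifts in position from day to day — suggests a simple picture of how such adaptation could work: estimate today's offset against the training-day reference, realign, then decode as usual. The sketch below illustrates that idea on toy data; it is a didactic analogy, not the study's actual model.

```python
# Illustrative sketch: a movement's neural pattern keeps its shape but
# shifts position slightly each day. Estimating that shift and realigning
# the signal restores accuracy without full recalibration.
# Purely didactic toy data, not the study's method.

def similarity(a, b):
    return sum(x * y for x, y in zip(a, b))

def shifted(sig, k):
    """Rotate a signal by k positions (toy stand-in for a spatial shift)."""
    n = len(sig)
    return [sig[(i + k) % n] for i in range(n)]

def estimate_shift(reference, today, max_shift=3):
    """Find the offset that best aligns today's pattern with the reference."""
    return max(range(-max_shift, max_shift + 1),
               key=lambda k: similarity(reference, shifted(today, k)))

# Reference pattern from training, and today's version drifted by one channel:
reference = [0.0, 0.2, 0.9, 0.2, 0.0]
today     = [0.0, 0.0, 0.2, 0.9, 0.2]

k = estimate_shift(reference, today)
print(k, shifted(today, k))  # a shift of 1 recovers the original pattern
```

Running this alignment step before decoding is one plausible reading of the brief daily "tune-up" the article mentions: the shape of the pattern is reused, and only its position is re-estimated.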

Potential for Real-World Impact

Regaining even simple movements can make a huge difference in the lives of people with paralysis. Tasks like grasping objects and using basic tools could become possible again. Dr. Ganguly and his team are working to refine the AI model to make the robotic arm’s movements smoother and more intuitive.

The next step will be testing the BCI in home environments, bringing the technology closer to everyday use.

“I’m very confident that we’ve learned how to build the system now, and that we can make this work,” said Ganguly. If successful, this development could lead to long-term solutions that help individuals with paralysis regain independence.