A US-developed robotic arm, controlled by the mind, restores the sense of touch.
“A sense of touch is about more than just increased dexterity. It’s not just the ability to reach into your pocket and grab your keys. It’s also the ability to hold a loved one’s hand and feel that emotional connection,” says Nathan Copeland, 34, who was paralyzed from the chest down in a car accident in 2004. He volunteered to participate in scientific research, and six years ago underwent a major operation to have tiny electrodes implanted in his brain.
Previous versions of the system required Copeland to guide the arm using vision alone. “When I only had visual feedback, I could see that the hand had touched the object,” Copeland says. “But sometimes I would go to pick it up and it would fall out.”
A typical grasping task also took Copeland about 20 seconds to complete. “With sensory feedback, he was able to complete it in 10,” says Jennifer Collinger, an associate professor in the department of physical medicine and rehabilitation at the University of Pittsburgh.
Collinger and a team of researchers have been working with Copeland, who has learned to control the motions of the robotic arm through a brain-computer interface.
Beyond the electrodes that let Copeland control the arm, what makes his case unique is an additional set implanted in his somatosensory cortex, the region of the brain that receives and processes sensations. That allows the team to use electrical pulses to simulate a range of touch sensations.
“When we’re grabbing objects, we use this sense of touch very naturally to improve our ability to control,” says Robert Gaunt, another researcher on the Pittsburgh team.
After operating on Copeland to install the electrodes, the team held their breath.
“No one knew what to expect because this had only been done in monkeys, and you can’t ask a monkey what something feels like,” says Copeland.
Then came the moment of truth, when they tried sending their first touch signal.
Before the interface could be put to work with the robotic arm, the scientists had to perform a series of tests with Copeland.
Finally, it was time to try it out.
Copeland sat next to the metallic black robotic arm and was asked to pick up a series of small objects, such as rocks and spheres, and place them on a box, with the tactile sensors switched either on or off.
He completed each task, on average, twice as fast when the sensors were enabled, and was even able to perform more complex tasks, such as picking up a glass and pouring its contents into another.
“The sensation gave me that assurance and confidence to know that I definitely had a good grab on the object and I could lift it up,” says Copeland.
Copeland got his brain-computer interface set up at home when the Covid pandemic shut down the university, and he has used the downtime to learn to draw on a tablet and even play video games.