Wearable cap lets amputees grasp objects with their mind

Atmel | Bits & Pieces


Researchers at the University of Houston have built a brain-machine interface to control prosthetic hands.


When it comes to brain-controlled interfaces, the field has come a long way since its earliest days of research at UCLA in the 1970s. Funded by a grant from the National Science Foundation and later a contract from DARPA, the papers published from that work marked the first appearance of the expression BCI in scientific literature. Fast forward nearly 40 years, and scientists are opening up a wide range of possibilities, including enabling amputees to command robotic limbs with their minds.


That’s exactly what one team from the University of Houston has done. The researchers developed an algorithm that allowed a man to grasp a bottle and other objects with a prosthetic hand, powered merely by his thoughts. Instead of implants, this non-invasive method uses a wearable EEG cap that monitors brain activity externally through the scalp. During its demonstration, a 56-year-old man whose right hand had been amputated was able to successfully clutch selected items…
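To give a feel for what "decoding grasp intent from scalp EEG" can look like in practice, here is a minimal sketch: band-pass filter multi-channel EEG, extract a simple per-channel feature, and classify windows as "grasp" versus "rest" with a linear classifier. This is not the Houston team's published algorithm; the sampling rate, channel count, frequency band, features, and synthetic data below are all illustrative assumptions.

```python
# Illustrative sketch only -- NOT the University of Houston algorithm.
# Assumptions: synthetic 8-channel "EEG" at 250 Hz, a low-frequency band
# (0.5-4 Hz) as a stand-in for movement-related activity, and a simple
# linear classifier deciding "grasp" vs. "rest" for each 1-second window.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 250            # sampling rate in Hz (assumed)
N_CHANNELS = 8      # number of scalp electrodes (assumed)
WINDOW = FS         # 1-second analysis window

def bandpass(data, lo, hi, fs, order=4):
    """Zero-phase band-pass filter applied along the time axis."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, data, axis=-1)

def extract_features(window):
    """Per-channel mean amplitude of the filtered window (toy feature)."""
    filtered = bandpass(window, 0.5, 4.0, FS)
    return filtered.mean(axis=-1)

# Synthetic training data: 40 one-second windows, half "rest", half "grasp".
rng = np.random.default_rng(0)
rest = rng.normal(0.0, 1.0, size=(20, N_CHANNELS, WINDOW))
grasp = rng.normal(0.5, 1.0, size=(20, N_CHANNELS, WINDOW))  # shifted mean
X = np.array([extract_features(w) for w in np.concatenate([rest, grasp])])
y = np.array([0] * 20 + [1] * 20)                            # 0=rest, 1=grasp

clf = LinearDiscriminantAnalysis().fit(X, y)

# Decode a new window and (hypothetically) drive the prosthetic hand.
new_window = rng.normal(0.5, 1.0, size=(N_CHANNELS, WINDOW))
if clf.predict([extract_features(new_window)])[0] == 1:
    print("grasp intent detected -> close prosthetic hand")
else:
    print("no grasp intent -> hold position")
```

In a real system the features, frequency bands, and classifier would be far more sophisticated and calibrated per user, but the overall pipeline of filtering, feature extraction, and intent classification is the general shape such non-invasive decoders tend to take.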

