8 Amazing Mind-Reading Technologies

Bionic prostheses have made enormous strides in recent years — and the concept of a mind-controlled robot limb is now very much a reality. In one example, engineers at Johns Hopkins built a successful prototype of such a robot arm that allows users to wiggle each prosthetic finger independently, using nothing but the power of the mind.

Perhaps even more impressively, earlier this year a team of researchers from Italy, Switzerland, and Germany developed a robot prosthesis which can actually feed sensory information back to a user’s brain — essentially restoring the person’s sense of touch in the process.

“We ‘translate’ information recorded by the artificial sensors in the [prosthesis’] hand into stimuli delivered to the nerves,” Silvestro Micera, a professor of Translational Neuroengineering at the Ecole Polytechnique Fédérale de Lausanne School of Engineering, told Digital Trends. “The information is then understood by the brain, which makes the patient feel pressure at different fingers.”
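
To give a flavor of the sensor-to-stimulus mapping Micera describes, here is a minimal Python sketch that converts fingertip pressure readings into per-finger stimulation amplitudes. The pressure range, current limits, and linear mapping are illustrative assumptions, not the parameters used in the actual prosthesis.

```python
# Illustrative sketch only: map fingertip pressure readings to nerve-stimulation
# amplitudes. The sensor range, stimulation bounds, and linear mapping are
# assumptions for illustration, not the values used by Micera's group.

def pressure_to_stimulation(pressure_n, max_pressure_n=10.0,
                            min_amp_ua=20.0, max_amp_ua=200.0):
    """Linearly map a fingertip pressure (newtons) to a stimulation
    current amplitude (microamps), clamped to a safe range."""
    fraction = max(0.0, min(pressure_n / max_pressure_n, 1.0))
    return min_amp_ua + fraction * (max_amp_ua - min_amp_ua)

# One amplitude per instrumented finger of the prosthetic hand (example values)
readings = {"thumb": 2.5, "index": 6.0, "middle": 0.0}  # newtons
stimuli = {finger: pressure_to_stimulation(p) for finger, p in readings.items()}
print(stimuli)  # e.g. {'thumb': 65.0, 'index': 128.0, 'middle': 20.0}
```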

Read more: https://www.digitaltrends.com/cool-tech/8-examples-of-amazing-mind-reading-tech/

Paralyzed Man Regains Sense of Touch with Brain-Controlled Robot Arm

Source: sputniknews.com

In the winter of 2004, Nathan Copeland suffered an accident that left him unable to feel any sensation in his arms and fingers. More than a decade later, the 28-year-old has regained his sense of touch through a mind-controlled robotic arm that is directly connected to his brain.

Dr. José del R. Millán: “Brain-Machine Interfaces: The Perception-Action Closed Loop”

Source: eecs.berkeley.edu

Future neuroprosthetics will be tightly coupled with the user in such a way that the resulting system can replace and restore impaired upper-limb functions, because it is controlled by the same neural signals as its natural counterparts. However, robust and natural interaction between subjects and sophisticated prostheses over long periods of time remains a major challenge. To tackle this challenge, we can draw inspiration from natural motor control, where goal-directed behavior is dynamically modulated by perceptual feedback resulting from executed actions.

Current brain-machine interfaces (BMI) partly emulate human motor control, as they decode cortical correlates of movement parameters (from movement onset to direction to instantaneous velocity) in order to generate the sequence of movements for the neuroprosthesis.
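
As a rough illustration of that decoding step, the sketch below fits a simple linear decoder that maps per-channel firing rates to instantaneous hand velocity. The synthetic data and the plain least-squares fit are assumptions for illustration; practical BMIs typically use Kalman filters or more elaborate decoders trained on recorded cortical activity.

```python
# Minimal sketch of linear velocity decoding from neural features.
# Synthetic data and least-squares fit are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_channels = 500, 32

true_weights = rng.normal(size=(n_channels, 2))          # channels -> (vx, vy)
rates = rng.poisson(lam=5.0, size=(n_samples, n_channels)).astype(float)
velocity = rates @ true_weights + rng.normal(scale=0.5, size=(n_samples, 2))

# Least-squares fit of decoder weights, then decode velocity for new activity
weights, *_ = np.linalg.lstsq(rates, velocity, rcond=None)
new_rates = rng.poisson(lam=5.0, size=(1, n_channels)).astype(float)
vx, vy = (new_rates @ weights)[0]
print(f"decoded velocity: ({vx:.2f}, {vy:.2f})")
```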


Read more here: https://eecs.berkeley.edu/research/colloquium/160907


Interscatter Enables Inter-Technology Communication for Implantable Devices

A team of UW CSE and EE researchers introduces Interscatter, a novel approach that enables inter-technology communication by converting Bluetooth signals into Wi-Fi transmissions over the air. The system enables power-limited devices such as brain implants, contact lenses, and credit cards to communicate with everyday devices such as smartphones and smartwatches. Through Interscatter, the UW researchers demonstrate the potential to transform health care and unleash the power of ubiquitous connectivity. Learn more at http://interscatter.cs.washington.edu.
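
The core trick is backscatter: the implant does not generate its own radio signal, but toggles its antenna's reflection state at a rate Δf, shifting an incoming single-tone Bluetooth carrier toward a nearby Wi-Fi channel. The scaled-down frequencies and simple two-state toggling below are illustrative assumptions; the real system uses single-sideband backscatter at actual Bluetooth and Wi-Fi frequencies.

```python
# Conceptual sketch of the frequency shift behind backscatter communication:
# toggling a reflector at rate delta_hz moves a single-tone carrier to
# carrier_hz +/- delta_hz. Frequencies are scaled-down stand-ins, not the
# real 2.4 GHz Bluetooth/Wi-Fi bands used by Interscatter.
import numpy as np

fs = 1_000_000            # sample rate (Hz)
carrier_hz = 100_000      # stand-in for the Bluetooth single-tone carrier
delta_hz = 30_000         # stand-in for the shift to the target Wi-Fi channel

t = np.arange(0, 0.01, 1 / fs)
carrier = np.cos(2 * np.pi * carrier_hz * t)
toggle = np.sign(np.cos(2 * np.pi * delta_hz * t))       # +/-1 reflection state
backscattered = carrier * toggle

spectrum = np.abs(np.fft.rfft(backscattered))
freqs = np.fft.rfftfreq(len(t), 1 / fs)
peaks = freqs[np.argsort(spectrum)[-2:]]
print(sorted(peaks))  # strongest energy near carrier_hz - delta_hz and carrier_hz + delta_hz
```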

Mind-Controlled Prosthetic Fingers Can Now Move Individually

Source: inverse.com

Throughout human history, the hand has been irreplaceable — prostheses, as clever as they were, couldn’t match nerves, bones, and sinew. As robotics improve, actuators and metal do a decent thumb-and-index impression. On the other hand, robotic fingers on a prosthetic hand generally clench in unison, which is great if you’re trying to catch a ball, but less so if you want to hold a pen or pick up earphones.

Read more here: https://www.inverse.com/article/11639-mind-controlled-prosthetic-robot-arm-waggles-fingers-for-first-time


Research on Hybrid EEG-fNIRS Asynchronous BCI for Multiple Motor Tasks

Source: journals.plos.org

Non-invasive brain-computer interfaces (BCI) have demonstrated great promise for neuroprosthetics and assistive devices. Here we aim to investigate methods to combine electroencephalography (EEG) and functional near-infrared spectroscopy (fNIRS) in an asynchronous sensorimotor rhythm (SMR)-based BCI. We attempted to classify four different executed movements, namely right-arm, left-arm, right-hand, and left-hand tasks.
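
For a sense of how such a hybrid pipeline can be put together, the sketch below concatenates EEG-style and fNIRS-style feature vectors and trains a single four-class classifier. The synthetic features and the LDA classifier are assumptions for illustration, not the exact method of the paper.

```python
# Minimal hybrid-feature sketch: concatenate EEG features (e.g. band power)
# with fNIRS features (e.g. mean HbO/HbR changes) and classify four movements.
# Synthetic data and the LDA classifier are illustrative assumptions.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n_trials = 200
labels = rng.integers(0, 4, size=n_trials)   # right-arm, left-arm, right-hand, left-hand

eeg_features = rng.normal(size=(n_trials, 16)) + labels[:, None] * 0.3   # band-power-like
fnirs_features = rng.normal(size=(n_trials, 8)) + labels[:, None] * 0.2  # HbO/HbR-like
hybrid = np.hstack([eeg_features, fnirs_features])

clf = LinearDiscriminantAnalysis()
scores = cross_val_score(clf, hybrid, labels, cv=5)
print(f"4-class cross-validated accuracy: {scores.mean():.2f}")
```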


Read more here: http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0146610

Plugging into the Brain

Source: huffingtonpost.com

A few weeks ago, I wrote about Ray Kurzweil’s wild prediction that in the 2030s, nanobots will connect our brains to the cloud, merging biology with the digital world.

Let’s talk about what’s happening today.

Over the past few decades, billions of dollars have been poured into three areas of research: neuroprosthetics, brain-computer interfaces and optogenetics.

Read more here: http://www.huffingtonpost.com/peter-diamandis/plugging-into-your-brain_b_8624288.html