8 Amazing Mind-Reading Technologies

Bionic prostheses have made enormous strides in recent years — and the concept of a mind-controlled robot limb is now very much a reality. In one example, engineers at Johns Hopkins built a successful prototype of such a robot arm that allows users to wiggle each prosthetic finger independently, using nothing but the power of the mind.

Perhaps even more impressively, earlier this year a team of researchers from Italy, Switzerland, and Germany developed a robot prosthesis which can actually feed sensory information back to a user’s brain — essentially restoring the person’s sense of touch in the process.

“We ‘translate’ information recorded by the artificial sensors in the [prosthesis’] hand into stimuli delivered to the nerves,” Silvestro Micera, a professor of Translational Neuroengineering at the École Polytechnique Fédérale de Lausanne School of Engineering, told Digital Trends. “The information is then understood by the brain, which makes the patient [feel] pressure at different fingers.”
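The article doesn’t detail the encoding itself, but the “translation” Micera describes can be pictured as a mapping from each fingertip sensor’s reading to the parameters of the electrical pulses delivered to the nerve. The sketch below is a deliberately simplified illustration; the pressure range, amplitude bounds, and linear encoding are assumptions, not values from the study.

```python
# Illustrative sketch only: maps a fingertip pressure reading to a
# nerve-stimulation pulse amplitude. The ranges and the linear encoding
# are assumptions for demonstration, not the encoding used by the study.

def pressure_to_stimulus(pressure_n, max_pressure_n=10.0,
                         min_amp_ua=20.0, max_amp_ua=80.0):
    """Map a pressure reading (newtons) to a stimulation amplitude (microamps).

    Clamps out-of-range readings, then scales linearly between an assumed
    perception threshold (min_amp_ua) and a safe ceiling (max_amp_ua).
    """
    pressure_n = max(0.0, min(pressure_n, max_pressure_n))
    fraction = pressure_n / max_pressure_n
    return min_amp_ua + fraction * (max_amp_ua - min_amp_ua)

# Each finger's sensor is translated independently, so the wearer can
# feel pressure at different fingers, as described in the quote.
for finger, reading in {"index": 2.5, "thumb": 7.0}.items():
    print(finger, round(pressure_to_stimulus(reading), 1), "microamps")
```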

Read more: https://www.digitaltrends.com/cool-tech/8-examples-of-amazing-mind-reading-tech/

Tetraplegics benefit from mutual learning via brain-computer interface

Brain-computer interfaces (BCIs) are seen as a potential means by which severely physically impaired individuals can regain control of their environment, but establishing such an interface is not trivial. A study published May 10 in the open-access journal PLOS Biology, by a group of researchers at the École Polytechnique Fédérale de Lausanne in Geneva, Switzerland, suggests that letting humans adapt to machines improves their performance on a brain-computer interface. The study, of tetraplegic subjects training to compete in the Cybathlon avatar race, indicates that the most dramatic improvements in computer-augmented performance are likely to occur when both human and machine are allowed to learn.

BCIs, which use the electrical activity in the brain to control an object, have seen growing use in people with high spinal cord injuries, for communication (by controlling a keyboard), mobility (by controlling a powered wheelchair), and daily activities (by controlling a mechanical arm or other robotic device).

Typically, the electrical activity is detected at one or more points on the scalp, using non-invasive electroencephalographic (EEG) electrodes, and fed through a computer program that, over time, improves its responsiveness and accuracy through learning.

As machine learning algorithms have become both faster and more powerful, researchers have largely focused on increasing decoding performance by identifying optimal pattern recognition algorithms. The authors hypothesized that performance could be improved if the operator and the machine both engaged in learning their mutual task.
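The study’s own pipeline isn’t reproduced here, but the machine half of this loop is conventionally a pattern-recognition stage: extract frequency-band power from each EEG epoch and train a linear classifier on labeled trials. The sketch below runs on synthetic data, and the sampling rate, channel count, and two-class motor-imagery setup are all assumptions.

```python
# Minimal sketch of an EEG decoding pipeline: band-power features plus a
# linear classifier. Synthetic data stands in for real recordings; this is
# not the pipeline used in the PLOS Biology study.
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

fs = 250                      # sampling rate (Hz), typical for consumer EEG
rng = np.random.default_rng(0)

def band_power(epoch, lo, hi):
    """Mean power of one epoch (channels x samples) in a frequency band."""
    freqs, psd = welch(epoch, fs=fs, nperseg=fs)
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[:, mask].mean(axis=1)

def features(epoch):
    # Mu (8-12 Hz) and beta (18-26 Hz) power per channel: classic
    # motor-imagery features.
    return np.concatenate([band_power(epoch, 8, 12), band_power(epoch, 18, 26)])

# Synthetic two-class data: class 1 epochs get extra 10 Hz power on channel 0.
epochs, labels = [], []
t = np.arange(fs * 2) / fs
for i in range(100):
    epoch = rng.normal(size=(4, fs * 2))          # 4 channels, 2 s
    label = i % 2
    if label:
        epoch[0] += 0.8 * np.sin(2 * np.pi * 10 * t)
    epochs.append(features(epoch))
    labels.append(label)

X, y = np.array(epochs), np.array(labels)
clf = LinearDiscriminantAnalysis().fit(X[:80], y[:80])
print("held-out accuracy:", clf.score(X[80:], y[80:]))
```

In a mutual-learning setup of the kind the authors describe, a classifier like this would be retrained periodically while the user simultaneously practices producing more separable brain patterns.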

Read more: https://www.sciencedaily.com/releases/2018/05/180510145828.htm

BCI and the Future of Movies

The film industry has recently shown interest in emerging technologies, such as Virtual Reality (VR). A milestone in this direction was the special award presented by the Academy of Motion Picture Arts and Sciences Board in 2017 to Carne y Arena, directed by Alejandro G. Iñárritu. Carne y Arena is a VR installation which was said to be opening “new doors of cinematic perception”. This follows on from the work of an increasing number of festivals (like the Berlinale and the Venice Film Festival), filmmakers and researchers who are investigating the potential of using new interactive technologies in cinema.

Among the most recent innovations are new wireless Brain-Computer Interfaces, which are now available on the market as low-cost headsets. They are already used in computer games and the arts, but more recently they have been applied in interactive filmmaking as well. For example, Hollywood studios such as Universal and 20th Century Fox have released interactive versions of their films, where the spectator can control key moments of the plot with the use of a BCI headset.
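Neither article spells out how the studios implemented this, but the basic interaction pattern is easy to picture: the headset exposes a scalar attention or engagement value, and at each decision point the playback engine compares it to a threshold to pick a plot branch. Everything in the sketch below (the sample stream, the threshold, the branch names) is hypothetical.

```python
# Hypothetical sketch of BCI-driven branching playback. Real consumer
# headsets expose vendor-specific APIs; a list of made-up "attention"
# samples (0-100) stands in for the live signal here.
from statistics import mean

def choose_branch(attention_samples, threshold=60):
    """Average the attention samples over the decision window, pick a branch."""
    return "tense_branch" if mean(attention_samples) >= threshold else "calm_branch"

decision_window = [72, 65, 58, 70, 61]   # samples collected during the scene
print("next scene:", choose_branch(decision_window))
```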

Read more: http://theconversation.com/new-research-shows-how-brain-computer-interaction-is-changing-cinema-94832

Control a Computer with a Thought and a Twitch

Every so often a news article appears that shows a disabled person directing movement of a computer cursor or a prosthetic hand with thought alone. But why would anyone choose to have a hole drilled through his or her skull to embed a computer chip in the brain unless warranted by a severe medical condition?

A more practical solution that lets you hook your brain up to the outside world may now be here. CTRL-Labs, a start-up launched by Thomas Reardon, the creator of Microsoft Internet Explorer, and his partners, has demonstrated a novel approach to a brain-computer interface (BCI) that ties an apparatus strapped to your wrist to signals in the nervous system.

Physiologically, Reardon observes, all transfer of information among humans is carried out via fine motor control. Motor control of the tongue and throat gives us speech. Facial expression and body posture convey emotion and intention. Writing takes place by controlling fingers that scrape chalk on a blackboard, stroke paint, manipulate pen or pencil, or punch keys. If everything the brain does to interact with the world involves muscles, why not use the motor system to more directly interface mind and machine?
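CTRL-Labs’ wristband picks up the electrical signatures of motor activity at the wrist, in the spirit of surface electromyography (EMG). Its actual decoding is far more sophisticated, but the basic idea of spotting a deliberate twitch can be sketched as envelope detection on a rectified signal; the trace below is synthetic and the detection threshold is an assumption.

```python
# Sketch of twitch detection from a surface-EMG-like signal: rectify,
# smooth into an envelope, and threshold. Synthetic data; not CTRL-Labs'
# decoder.
import numpy as np

fs = 1000                                  # 1 kHz sampling, typical for EMG
rng = np.random.default_rng(1)
t = np.arange(2 * fs) / fs
signal = 0.05 * rng.normal(size=t.size)    # baseline noise
burst = (t > 0.8) & (t < 1.0)              # a 200 ms voluntary twitch
signal[burst] += 0.6 * rng.normal(size=burst.sum())

# Rectify and smooth with a 50 ms moving average to get the envelope.
envelope = np.convolve(np.abs(signal), np.ones(50) / 50, mode="same")
twitch = envelope > 0.15                   # assumed detection threshold

# Indices where the envelope first crosses the threshold (rising edges).
onsets = np.flatnonzero(np.diff(twitch.astype(int)) == 1) + 1
print("twitch onset(s) at", t[onsets], "s")
```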

Read more: https://www.scientificamerican.com/article/wristband-lets-the-brain-control-a-computer-with-a-thought-and-a-twitch/

Researchers use EEG to reconstruct what brain is seeing

A crime happens, and there is a witness. Instead of a sketch artist drawing a portrait of the suspect based on verbal descriptions, the police hook the witness up to EEG equipment. The witness is asked to picture the perpetrator, and from the EEG data, a face appears.

While this scenario exists only in the realm of science fiction, new research from the University of Toronto Scarborough brings it one step closer to reality. Scientists have used EEG data (“brainwaves”) to reconstruct images of faces shown to subjects. In other words, they’re using EEG to tap into what a subject is seeing.
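The Toronto team’s reconstruction method isn’t spelled out in the article. One common way to frame this kind of problem, shown here purely as an illustration on synthetic data, is to learn a regression from EEG feature vectors to the coefficients of a low-dimensional “face space” and then invert that representation back into an image.

```python
# Illustrative sketch (not the Toronto group's published method): learn a
# linear map from EEG feature vectors to coefficients in a PCA "face
# space", then invert the PCA to get an image. All data is synthetic.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Ridge

rng = np.random.default_rng(2)
n_faces, img_dim, eeg_dim, n_comp = 200, 32 * 32, 128, 20

faces = rng.normal(size=(n_faces, img_dim))          # stand-in face images
pca = PCA(n_components=n_comp).fit(faces)
coeffs = pca.transform(faces)                        # each face as 20 numbers

# Pretend the EEG responses carry a noisy linear trace of those coefficients.
mixing = rng.normal(size=(n_comp, eeg_dim))
eeg = coeffs @ mixing + 0.5 * rng.normal(size=(n_faces, eeg_dim))

model = Ridge(alpha=1.0).fit(eeg[:150], coeffs[:150])          # train
recon = pca.inverse_transform(model.predict(eeg[150:]))        # reconstruct
err = np.mean((recon - faces[150:]) ** 2)
print("reconstruction MSE on held-out faces:", round(err, 3))
```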

Is it mind reading? Sort of.

Read more: https://www.smithsonianmag.com/innovation/new-study-brings-scientists-one-step-closer-to-mind-reading-180968338/

‘Mind-reading’ computers could enhance human brain and help police

“Mind-reading” computers could be routinely used to enhance the human brain in as little as five years, Sky News has been told. The technology, which monitors brain activity, could help people analyse complex information and make critical decisions. Radiologists scanning X-rays for cancer, City traders making multi-million-pound transactions, and police searching a crowd for a suspect’s face could be the first to benefit. But Dr Davide Valeriani, a senior research officer at the University of Essex, said any task requiring prolonged concentration could be made safer and more accurate with the help of computers.

Read more: https://news.sky.com/story/mind-reading-computers-could-enhance-human-brain-and-help-police-surgeons-and-city-traders-11238096

The Future of Brain Computer Interface Tech

Brain-computer interfaces (BCIs) are increasingly becoming reliable pieces of technology, changing the lives of patients, particularly those who suffer from paralysis or similar conditions. A BCI is defined as computer technology that can interact with neural structures by decoding and translating information from thoughts (i.e., neuronal activity) into actions. BCI technology may be used for thought-to-text translation or to control movements of a prosthetic limb.

The umbrella term BCI covers invasive, partially invasive and non-invasive approaches. Invasive BCI involves implanting technology within the human body, such as surgically placed electrodes that directly detect electrical potentials. Partially invasive BCI devices record from electrodes implanted beneath the skull but outside the brain tissue itself. An example is electrocorticography (ECoG), which records the activity of the brain via a surgically placed electrode grid; the approach is considered “partially” invasive because the grid sits directly on the surface of the brain rather than being implanted within it. Non-invasive BCI technology relies on external sensors/electrodes, as seen with electroencephalography (EEG).

Read more: http://in-training.org/future-brain-computer-interface-technology-15655

Tetraplegics easily learn to use brain-computer interfaces

For a brain-computer interface (BCI) to be truly useful for a person with tetraplegia, it should be ready whenever it’s needed, with minimal expert intervention, including the very first time it’s used. In a new study in the Journal of Neural Engineering, researchers in the BrainGate collaboration demonstrate new techniques that allowed three participants to achieve peak BCI performance within three minutes of engaging in an easy, one-step process.

One participant, “T5,” a 63-year-old man who had never used a BCI before, needed only 37 seconds of calibration time before he could control a computer cursor to reach targets on a screen, just by imagining using his hand to move a joystick.
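BrainGate’s production decoders are Kalman-filter based and the paper’s calibration procedure is more involved than this, but the essential step (fitting a map from neural firing rates to intended 2-D cursor velocity on a short calibration block) can be sketched with a linear model on synthetic data. The array size, bin width, and noise model below are assumptions.

```python
# Minimal stand-in for cursor-velocity decoding (not BrainGate's actual
# Kalman-filter decoder): fit a linear map from firing-rate vectors to
# intended 2-D cursor velocity on a short calibration block, then drive
# the cursor with it. Data is synthetic.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(3)
n_units, n_bins = 96, 740            # ~96-channel array; 50 ms bins (~37 s)

intended = rng.normal(size=(n_bins, 2))    # intended (vx, vy) in each bin
tuning = rng.normal(size=(2, n_units))     # each unit's velocity tuning
rates = intended @ tuning + rng.normal(size=(n_bins, n_units))

decoder = Ridge(alpha=10.0).fit(rates, intended)   # the "calibration" step

pos = np.zeros(2)
for rate_vec in rates[:20]:                # closed-loop-style cursor update
    pos += 0.05 * decoder.predict(rate_vec[None])[0]   # 50 ms time step
print("cursor position after 1 s:", pos.round(2))
```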

Dr. David Brandman, lead author of the study and an engineering postdoctoral researcher at Brown University, said that while additional innovations will be needed to move implantable BCIs like BrainGate toward clinical availability for patients, rapid, intuitive calibration is a key advance. It could allow future users and their caregivers to start using the system much more quickly and to keep it calibrated over the long term.

Read more: https://www.sciencedaily.com/releases/2018/01/180124123204.htm

Neural Network Reads Minds

It already seems a little like computers can read our minds; features like Google’s auto-complete, Facebook’s friend suggestions, and the targeted ads that appear while you’re browsing the web sometimes make you wonder, “How did they know?” For better or worse, it seems we’re slowly but surely moving in the direction of computers reading our minds for real, and a new study from researchers in Kyoto, Japan, is an unequivocal step in that direction.

A team from Kyoto University used a deep neural network to read and interpret people’s thoughts. Sound crazy? This actually isn’t the first time it’s been done. The difference is that previous methods—and results—were simpler, deconstructing images based on their pixels and basic shapes. The new technique, dubbed “deep image reconstruction,” moves beyond binary pixels, giving researchers the ability to decode images that have multiple layers of color and structure.
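As reported, the technique works by optimizing an image until its features inside a deep neural network match features decoded from brain activity. The sketch below shows only that optimization loop: a tiny randomly initialized network stands in for a pretrained one, and random targets stand in for decoded features, so both are assumptions for illustration.

```python
# Sketch of the feature-matching idea behind "deep image reconstruction":
# adjust the pixels of a candidate image until its features in a neural
# network match target features (in the study, features decoded from
# brain activity). Stand-in network and targets; not the Kyoto pipeline.
import torch
import torch.nn as nn

torch.manual_seed(4)
extractor = nn.Sequential(                 # stand-in feature extractor
    nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
    nn.Conv2d(8, 8, 3, padding=1), nn.ReLU(),
).eval()

with torch.no_grad():
    target = extractor(torch.rand(1, 3, 32, 32))   # stand-in "decoded" features

image = torch.rand(1, 3, 32, 32, requires_grad=True)  # start from noise
opt = torch.optim.Adam([image], lr=0.05)

for step in range(200):                    # push pixels toward matching features
    opt.zero_grad()
    loss = ((extractor(image) - target) ** 2).mean()
    loss.backward()
    opt.step()
print("final feature-matching loss:", loss.item())
```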

Read more: https://singularityhub.com/2018/01/14/this-neural-network-built-by-japanese-researchers-can-read-minds/

Interfacing brains and machines

In the gleaming facilities of the Wyss Centre for Bio and Neuroengineering in Geneva, a lab technician takes a well plate out of an incubator. Each well contains a tiny piece of brain tissue derived from human stem cells and sitting on top of an array of electrodes. A screen displays what the electrodes are picking up: the characteristic peak-and-trough wave forms of firing neurons.

To see these signals emanating from disembodied tissue is weird. The firing of a neuron is the basic building block of intelligence. Aggregated and combined, such “action potentials” retrieve every memory, guide every movement and marshal every thought. As you read this sentence, neurons are firing all over your brain: to make sense of the shapes of the letters on the page; to turn those shapes into phonemes and those phonemes into words; and to confer meaning on those words.

This symphony of signals is bewilderingly complex. There are as many as 85bn neurons in an adult human brain, and a typical neuron has 10,000 connections to other such cells. The job of mapping these connections is still in its early stages. But as the brain gives up its secrets, remarkable possibilities have opened up: of decoding neural activity and using that code to control external devices.
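The “peak-and-trough wave forms” the electrodes pick up are action potentials. The classic minimal model of how one arises is the leaky integrate-and-fire neuron, sketched below with illustrative textbook parameters rather than values from any particular recording.

```python
# Leaky integrate-and-fire neuron: the membrane voltage charges toward the
# input, leaks back toward rest, and emits a spike (action potential) when
# it crosses a threshold. Parameters are illustrative textbook values.
dt, tau = 0.1, 10.0                    # time step and membrane constant (ms)
v_rest, v_thresh, v_reset = -70.0, -55.0, -75.0   # potentials (mV)
current = 20.0                         # constant input drive (mV equivalent)

v, spikes = v_rest, []
for step in range(int(200 / dt)):      # simulate 200 ms
    v += dt / tau * (v_rest - v + current)
    if v >= v_thresh:
        spikes.append(step * dt)       # record spike time, then reset
        v = v_reset
print(f"{len(spikes)} spikes in 200 ms, first at {spikes[0]:.1f} ms")
```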

Read more: https://www.economist.com/news/technology-quarterly/21733196-brain-computer-interfaces-sound-stuff-science-fiction-andrew-palmer