8 Amazing Mind-Reading Technologies

Bionic prostheses have made enormous strides in recent years — and the concept of a mind-controlled robot limb is now very much a reality. In one example, engineers at Johns Hopkins built a successful prototype of such a robot arm that allows users to wiggle each prosthetic finger independently, using nothing but the power of the mind.

Perhaps even more impressively, earlier this year a team of researchers from Italy, Switzerland, and Germany developed a robot prosthesis which can actually feed sensory information back to a user’s brain — essentially restoring the person’s sense of touch in the process.

“We ‘translate’ information recorded by the artificial sensors in the [prosthesis’] hand into stimuli delivered to the nerves,” Silvestro Micera, a professor of Translational Neuroengineering at the École Polytechnique Fédérale de Lausanne School of Engineering, told Digital Trends. “The information is then understood by the brain, which makes the patient [feel] pressure at different fingers.”

Read more: https://www.digitaltrends.com/cool-tech/8-examples-of-amazing-mind-reading-tech/

Tetraplegics benefit from mutual learning via brain-computer interface

Brain-computer interfaces (BCIs) are seen as a potential means by which severely physically impaired individuals can regain control of their environment, but establishing such an interface is not trivial. A study published May 10 in the open-access journal PLOS Biology by a group of researchers at the École Polytechnique Fédérale de Lausanne in Geneva, Switzerland, suggests that letting humans adapt to machines improves their performance on a brain-computer interface. The study of tetraplegic subjects training to compete in the Cybathlon avatar race suggests that the most dramatic improvements in computer-augmented performance are likely to occur when both human and machine are allowed to learn.

BCIs, which use the electrical activity in the brain to control an object, have seen growing use in people with high spinal cord injuries, for communication (by controlling a keyboard), mobility (by controlling a powered wheelchair), and daily activities (by controlling a mechanical arm or other robotic device).

Typically, the electrical activity is detected at one or more points on the surface of the scalp, using non-invasive electroencephalographic (EEG) electrodes, and fed through a computer program that, over time, improves its responsiveness and accuracy through learning.

As machine learning algorithms have become both faster and more powerful, researchers have largely focused on increasing decoding performance by identifying optimal pattern recognition algorithms. The authors hypothesized that performance could be improved if the operator and the machine both engaged in learning their mutual task.
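
For a sense of what that pipeline involves, here is a minimal sketch of a co-adaptive EEG decoder in Python, assuming a typical motor-imagery setup: band-power features extracted from each epoch feed a linear classifier that can be refit as new labeled trials arrive, so the machine keeps adapting alongside the user. The sampling rate, channel count, frequency bands, and classifier are illustrative assumptions, not details from the study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
FS = 250            # sampling rate in Hz (assumed)
N_CHANNELS = 16     # number of scalp electrodes (assumed)

def bandpower(epoch, fs, lo, hi):
    """Average spectral power per channel in the [lo, hi) Hz band."""
    freqs = np.fft.rfftfreq(epoch.shape[1], 1 / fs)
    psd = np.abs(np.fft.rfft(epoch, axis=1)) ** 2
    mask = (freqs >= lo) & (freqs < hi)
    return psd[:, mask].mean(axis=1)

def features(epoch):
    # mu (8-12 Hz) and beta (18-26 Hz) power, bands commonly
    # used for motor-imagery decoding
    return np.concatenate([bandpower(epoch, FS, 8, 12),
                           bandpower(epoch, FS, 18, 26)])

# Synthetic stand-ins for labeled calibration epochs (1 s each).
epochs = rng.standard_normal((200, N_CHANNELS, FS))
labels = rng.integers(0, 2, size=200)   # e.g. "left" vs. "right" imagery

X = np.array([features(e) for e in epochs])
clf = LogisticRegression(max_iter=1000).fit(X, labels)

# Online use: decode each new epoch, then fold it back into the
# training set so decoder and user keep adapting to each other.
new_epoch = rng.standard_normal((N_CHANNELS, FS))
print("decoded command:", clf.predict(features(new_epoch)[None, :])[0])
```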

Read more: https://www.sciencedaily.com/releases/2018/05/180510145828.htm

BCI and the Future of Movies

The film industry has recently been showing interest in emerging technologies such as virtual reality (VR). A milestone in this direction was the special award presented by the Academy of Motion Picture Arts and Sciences Board in 2017 to Carne y Arena, directed by Alejandro G. Iñárritu. Carne y Arena is a VR installation that was said to be opening “new doors of cinematic perception”. This follows on from the work of a growing number of festivals (such as the Berlinale and the Venice Film Festival), filmmakers, and researchers who are investigating the potential of new interactive technologies in cinema.

Among the most recent innovations are new wireless brain-computer interfaces, now available on the market as low-cost headsets. They are already used in computer games and the arts, and more recently they have been applied to interactive filmmaking as well. For example, Hollywood studios such as Universal and 20th Century Fox have released interactive versions of their films in which the spectator can control key moments of the plot using a BCI headset.

Read more: http://theconversation.com/new-research-shows-how-brain-computer-interaction-is-changing-cinema-94832

Neural Operating System Helps Paralyzed, Voiceless Patients Communicate

Gand calls it a neural operating system. “Just like DOS worked with a keyboard, Windows with a mouse, iOS with touch, Nuos is another level of evolution where the human being is now able to communicate and compute using neurological signals,” he says. The system uses artificial intelligence to adapt to a patient. Someone who has just suffered a stroke, for example, would start with a simplified user interface that gradually becomes more advanced. It can be customized for various settings, from an ICU to someone’s home. The interface can allow someone to browse the internet, connect with external systems like robotics, and perform a wide range of other tasks.

Read more here: https://www.fastcompany.com/40559440/this-neural-operating-system-lets-paralyzed-voiceless-patients-communicate

BCI technology to make daily life easier for seniors

Last week, residents of the Morningside of Fullerton senior community were given a preview of the cutting-edge technology. Once perfected, the brain-, eye-, and sound-controlled devices could help seniors and people with disabilities by performing regular tasks in their daily lives. In addition to the robot arm, the students, with the help of their senior volunteers, demonstrated a facial-recognition program that would allow seniors, particularly those with symptoms of Alzheimer’s and other cognitive diseases, to put names to faces via a mobile phone. There was also a mind-controlled wheelchair, as well as a device that uses a computer-generated voice to communicate with home-assist systems such as Alexa and Google Home, performing tasks such as turning electronics on and off.

Read more here: https://www.ocregister.com/2018/04/16/seniors-get-sneak-peak-at-the-technology-that-could-one-day-make-daily-life-easier/

Mind-controlled robots: the factories of the future?

The ability to control the physical world with your mind using a brain-computer interface or a mind machine has traditionally been focused on health care and, more recently, on the gaming industry. Now, thanks to cutting-edge technology pioneered by Altran, these applications are set to transform the way man and machine communicate on the factory floor.

AlterEgo Helps You “Talk” To Your Computer Without Words

Hey you! Ever wish your technology was more invasive? You love voice-to-text, but it’s just too public?

Some researchers at MIT Media Lab have come up with the perfect gadget for you. And it looks like a Bane mask crossed with a squid. Or, if you prefer: like a horror movie monster slowly encompassing your jaw before crawling into your mouth.

The researchers presented their work at the International Conference on Intelligent User Interfaces (yes, such a thing exists) in March in Tokyo.

Whenever you think of words, they’re silently and imperceptibly transmitted to your mouth. More specifically, signals arrive at the muscles that control your mouth. And those signals aren’t imperceptible to a highly sensitive computer.

The researchers call this device the AlterEgo. It’s got seven electrodes positioned around the mouth to pick up these signals. The data that the electrodes pick up goes through several rounds of processing before being transmitted wirelessly to a device awaiting instruction nearby. Oh, and it’s got bone-conduction headphones so that devices can respond.
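
As a rough illustration of that processing chain, the sketch below band-pass filters multi-channel electrode windows, reduces them to simple summary features, and classifies them into a small command vocabulary. Only the seven-electrode count comes from the article; the sampling rate, filter band, vocabulary, and classifier are hypothetical stand-ins, not MIT’s actual pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.neighbors import KNeighborsClassifier

FS = 1000          # sampling rate in Hz (assumed)
N_ELECTRODES = 7   # electrode count as reported for the prototype
VOCAB = ["call", "time", "add", "reply"]   # hypothetical command set

def preprocess(window):
    """Band-pass 1.3-50 Hz, a plausible range for faint neuromuscular signals."""
    b, a = butter(4, [1.3 / (FS / 2), 50 / (FS / 2)], btype="band")
    return filtfilt(b, a, window, axis=1)

def features(window):
    """Per-channel mean and standard deviation as a simple feature vector."""
    w = preprocess(window)
    return np.concatenate([w.mean(axis=1), w.std(axis=1)])

rng = np.random.default_rng(1)
# Synthetic labeled windows standing in for recorded training data.
train = rng.standard_normal((80, N_ELECTRODES, FS))
labels = rng.integers(0, len(VOCAB), size=80)

clf = KNeighborsClassifier(n_neighbors=3)
clf.fit([features(w) for w in train], labels)

# Classify an incoming one-second window and report the result
# (in the real device, fed back via the bone-conduction headphones).
incoming = rng.standard_normal((N_ELECTRODES, FS))
print("decoded word:", VOCAB[clf.predict([features(incoming)])[0]])
```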

Read more: https://futurism.com/alterego-talk-computer-without-words/

This device lets users ‘speak silently’ with a computer by just thinking

AlterEgo is a wearable system that allows a user to silently converse with a computing device without any voice or discernible movements, thereby enabling the user to communicate with devices, AI assistants, applications, or other people in a silent, concealed, and seamless manner. A human user can transmit queries simply by vocalizing internally (subtle internal movements) and receive aural output through bone conduction, without obstructing the user’s physical senses and without invading the user’s privacy. AlterEgo aims to combine humans and computers such that computing, the internet, and AI would weave into human personality as a “second self” and augment human cognition and abilities.

Control a Computer with a Thought and a Twitch

Every so often a news article appears that shows a disabled person directing movement of a computer cursor or a prosthetic hand with thought alone. But why would anyone choose to have a hole drilled through his or her skull to embed a computer chip in the brain unless warranted by a severe medical condition?

A more practical solution may now be here that lets you hook up your brain to the outside world. CTRL–Labs, a start-up launched by the creator of Microsoft Internet Explorer, Thomas Reardon, and his partners, has demonstrated a novel approach for a brain-computer interface (BCI) that ties an apparatus strapped to your wrist to signals in the nervous system.

Physiologically, Reardon observes, all transfer of information among humans is carried out via fine motor control. Motor control of the tongue and throat gives us speech. Facial expression and body posture convey emotion and intention. Writing takes place by controlling fingers that scrape chalk on a blackboard, stroke paint, manipulate pen or pencil, or punch keys. If everything the brain does to interact with the world involves muscles, why not use the motor system to more directly interface mind and machine?
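
One very simple version of that idea: treat a wrist-sensed muscle burst as a discrete input event. The sketch below rectifies a synthetic EMG-like signal, smooths it into an envelope, and fires an event on a threshold crossing; every parameter here is an arbitrary illustration, and CTRL–Labs’ actual system decodes far richer motor signals.

```python
import numpy as np

FS = 1000                      # samples per second (assumed)
rng = np.random.default_rng(3)

# Two seconds of synthetic baseline noise with a "twitch" burst at 1.2 s.
signal = 0.05 * rng.standard_normal(2 * FS)
signal[1200:1300] += 0.8 * np.sin(np.linspace(0, 40 * np.pi, 100))

# Rectify and smooth into an amplitude envelope (50-sample moving average).
envelope = np.convolve(np.abs(signal), np.ones(50) / 50, mode="same")
threshold = 5 * envelope[:FS].mean()   # calibrate on the first second

# Emit one event per upward threshold crossing.
crossings = np.flatnonzero((envelope[1:] > threshold) &
                           (envelope[:-1] <= threshold))
for i in crossings:
    print(f"twitch detected at {i / FS:.3f} s -> emit click event")
```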

Read more here: https://www.scientificamerican.com/article/wristband-lets-the-brain-control-a-computer-with-a-thought-and-a-twitch/

Researchers use EEG to reconstruct what brain is seeing

A crime happens, and there is a witness. Instead of a sketch artist drawing a portrait of the suspect based on verbal descriptions, the police hook the witness up to EEG equipment. The witness is asked to picture the perpetrator, and from the EEG data, a face appears.

While this scenario exists only in the realm of science fiction, new research from the University of Toronto Scarborough brings it one step closer to reality. Scientists have used EEG data (“brainwaves”) to reconstruct images of faces shown to subjects. In other words, they’re using EEG to tap into what a subject is seeing.
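
The general recipe behind such reconstructions can be sketched in a few lines: learn a mapping from EEG features to a low-dimensional representation of the images, then invert that representation for unseen trials. The toy example below pairs PCA with ridge regression on synthetic data; it illustrates the approach in spirit only, not the Toronto group’s actual method.

```python
import numpy as np

rng = np.random.default_rng(2)
N_TRIALS, EEG_DIM, IMG_PIX, N_COMPONENTS = 300, 128, 64 * 64, 20

faces = rng.standard_normal((N_TRIALS, IMG_PIX))   # stand-in face images
eeg = rng.standard_normal((N_TRIALS, EEG_DIM))     # stand-in EEG features

# PCA of the face images via SVD of the centered data matrix.
mean_face = faces.mean(axis=0)
_, _, Vt = np.linalg.svd(faces - mean_face, full_matrices=False)
components = Vt[:N_COMPONENTS]                     # (20, 4096) basis faces
scores = (faces - mean_face) @ components.T        # per-trial coefficients

# Ridge regression from EEG features to PCA scores.
lam = 1.0
W = np.linalg.solve(eeg.T @ eeg + lam * np.eye(EEG_DIM), eeg.T @ scores)

# "Reconstruct" the face seen on a new trial from its EEG alone.
new_eeg = rng.standard_normal(EEG_DIM)
reconstruction = mean_face + (new_eeg @ W) @ components
print(reconstruction.shape)   # (4096,) -> reshape to 64x64 to display
```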

Is it mind reading? Sort of.

Read more: https://www.smithsonianmag.com/innovation/new-study-brings-scientists-one-step-closer-to-mind-reading-180968338/#uKGD6vWVWppWqjCj.99