Tetraplegics benefit from mutual learning via brain-computer interface

Brain-computer interfaces (BCIs) are seen as a potential means by which severely physically impaired individuals can regain control of their environment, but establishing such an interface is not trivial. A study published May 10 in the open-access journal PLOS Biology by a group of researchers at the École Polytechnique Fédérale de Lausanne in Geneva, Switzerland, suggests that letting humans adapt to machines improves their performance with a brain-computer interface. The study of tetraplegic subjects training to compete in the Cybathlon avatar race suggests that the most dramatic improvements in computer-augmented performance are likely to occur when both human and machine are allowed to learn.

BCIs, which use the electrical activity in the brain to control an object, have seen growing use in people with high spinal cord injuries, for communication (by controlling a keyboard), mobility (by controlling a powered wheelchair), and daily activities (by controlling a mechanical arm or other robotic device).

Typically, the electrical activity is detected at one or more points on the surface of the scalp using non-invasive electroencephalographic (EEG) electrodes, and fed through a computer program that, over time, improves its responsiveness and accuracy through learning.

As machine learning algorithms have become both faster and more powerful, researchers have largely focused on increasing decoding performance by identifying optimal pattern recognition algorithms. The authors hypothesized that performance could be improved if the operator and the machine both engaged in learning their mutual task.
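The press release gives no implementation details, but the closed-loop idea behind "mutual learning" can be sketched: a decoder classifies band-power features from each cued trial and keeps nudging its class prototypes toward the latest data, so machine adaptation proceeds while the user's own brain-signal modulation is improving. Everything below (the feature, the nearest-mean rule, the simulated user) is an illustrative assumption, not the study's actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def band_power(epoch):
    """Toy feature: mean squared amplitude per channel of one EEG epoch."""
    return (epoch ** 2).mean(axis=1)

class CoAdaptiveDecoder:
    """Nearest-class-mean decoder whose prototypes keep adapting online."""

    def __init__(self, n_channels, lr=0.1):
        self.means = np.zeros((2, n_channels))  # one prototype per mental command
        self.lr = lr

    def predict(self, features):
        d = np.linalg.norm(self.means - features, axis=1)
        return int(np.argmin(d))

    def update(self, features, label):
        # Machine half of "mutual learning": nudge the winning prototype
        # toward the latest cued trial from the (also learning) user.
        self.means[label] += self.lr * (features - self.means[label])

# Simulate two mental commands as EEG epochs (8 channels x 250 samples)
# whose class separation grows over trials, mimicking user learning.
decoder = CoAdaptiveDecoder(n_channels=8)
correct = 0
for trial in range(200):
    label = trial % 2
    gain = 1.0 + trial / 100          # user's modulation improves with practice
    epoch = rng.normal(0, 1, (8, 250))
    epoch[label::2] *= gain           # class-specific power on alternating channels
    f = band_power(epoch)
    if trial >= 20:                   # score after a short calibration phase
        correct += decoder.predict(f) == label
    decoder.update(f, label)

print(f"accuracy after co-adaptation: {correct / 180:.2f}")
```

Because the prototypes track the user's drifting signal statistics, neither side has to converge to a fixed target: the decoder follows the user, and the user gets feedback from a decoder that stays roughly calibrated.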

Read more: https://www.sciencedaily.com/releases/2018/05/180510145828.htm

Neural Operating System Helps Paralyzed, Voiceless Patients Communicate

Gand calls it a neural operating system. “Just like DOS worked with a keyboard, Windows with a mouse, iOS with touch, Nuos is another level of evolution where the human being is now able to communicate and compute using neurological signals,” he says. The system uses artificial intelligence to adapt to a patient. Someone who has just suffered a stroke, for example, would start with a simplified user interface that gradually becomes more advanced. It can be customized for various settings, from an ICU to someone’s home. The interface can allow someone to browse the internet, connect with external systems like robotics, and support a wide range of other uses.
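The article doesn't say how that gradual leveling-up works. A minimal sketch of one plausible mechanism, with entirely invented levels, thresholds, and command names: unlock richer command sets as the user's recent decoding accuracy improves.

```python
# Hypothetical sketch of a leveled interface like the one described:
# more commands become available as decoding accuracy improves.
# Levels, thresholds, and command names are illustrative inventions.
LEVELS = [
    (0.0, ["yes", "no"]),                        # just after a stroke
    (0.7, ["call nurse", "tv on/off"]),          # basic environment control
    (0.85, ["browse web", "email", "robot arm"]),  # advanced use
]

def available_commands(recent_accuracy):
    """Return every command in levels the user has unlocked."""
    commands = []
    for threshold, cmds in LEVELS:
        if recent_accuracy >= threshold:
            commands.extend(cmds)
    return commands

print(available_commands(0.75))   # a mid-level user sees the first two tiers
```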

Read more here: https://www.fastcompany.com/40559440/this-neural-operating-system-lets-paralyzed-voiceless-patients-communicate


BCI technology to make daily life easier for seniors

Last week, residents of the Morningside of Fullerton senior community were given a preview of the cutting-edge technology. Once perfected, the brain-, eye- and sound-controlled devices could help seniors and people with disabilities by performing regular tasks in their daily lives. In addition to the robot arm, students, with the help of their senior volunteers, demonstrated a facial-recognition program that would allow seniors, particularly those with symptoms of Alzheimer’s and other cognitive diseases, to put names to faces via a mobile phone. There was also a mind-controlled wheelchair, as well as a device that could use a computer-generated voice to communicate with home-assistant devices such as Amazon’s Alexa and Google Home, performing tasks such as turning electronics on and off.

Read more here: https://www.ocregister.com/2018/04/16/seniors-get-sneak-peak-at-the-technology-that-could-one-day-make-daily-life-easier/

AlterEgo Helps You “Talk” To Your Computer Without Words

Hey you! Ever wish your technology was more invasive? You love voice-to-text, but it’s just too public?

Some researchers at MIT Media Lab have come up with the perfect gadget for you. And it looks like a Bane mask crossed with a squid. Or, if you prefer: like a horror movie monster slowly encompassing your jaw before crawling into your mouth.

The researchers presented their work at the International Conference on Intelligent User Interfaces (yes, such a thing exists) in March in Tokyo.

Whenever you think of words, they’re silently, imperceptibly, transmitted to your mouth. More specifically, signals arrive at the muscles that control your mouth. And those signals aren’t imperceptible to a highly sensitive computer.

The researchers call this device the AlterEgo. It’s got seven electrodes positioned around the mouth to pick up these signals. The data that the electrodes pick up goes through several rounds of processing before being transmitted wirelessly to a device awaiting instruction nearby. Oh, and it’s got bone-conduction headphones so that devices can respond.
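The article describes the pipeline only at this level: seven electrodes, several rounds of processing, a match against known words, a response over bone conduction. A toy sketch of that shape of system follows; the templates, smoothing step, and signal model are invented for illustration, since the device's real processing isn't public in this detail.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical AlterEgo-style pipeline: 7 electrode channels -> smoothing
# -> one feature per channel -> nearest-template word match. The word
# "signatures" below are synthetic stand-ins for trained patterns.
WORDS = ["call", "time", "reply"]
TEMPLATES = {w: rng.normal(0, 1, 7) for w in WORDS}

def smooth(raw, k=5):
    """Moving-average each channel, mimicking one round of signal cleanup."""
    kernel = np.ones(k) / k
    return np.array([np.convolve(ch, kernel, mode="valid") for ch in raw])

def decode(raw):
    features = smooth(raw).mean(axis=1)           # one value per electrode
    scores = {w: -np.linalg.norm(features - t) for w, t in TEMPLATES.items()}
    return max(scores, key=scores.get)

# Simulate silently "saying" a word: its signature plus sensor noise,
# sampled on 7 channels for 100 time steps.
word = "time"
raw = TEMPLATES[word][:, None] + rng.normal(0, 0.3, (7, 100))
print(decode(raw))  # the nearest template should recover the word
```

The real system reportedly uses a trained neural network rather than template matching, but the input/output contract is the same: muscle signals in, a discrete word out, audio back through the headphones.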

Read more: https://futurism.com/alterego-talk-computer-without-words/

Control a Computer with a Thought and a Twitch

Every so often a news article appears that shows a disabled person directing movement of a computer cursor or a prosthetic hand with thought alone. But why would anyone choose to have a hole drilled through his or her skull to embed a computer chip in the brain unless warranted by a severe medical condition?

A more practical solution may now be here that lets you hook up your brain to the outside world. CTRL-Labs, a start-up launched by the creator of Microsoft Internet Explorer, Thomas Reardon, and his partners, has demonstrated a novel approach for a brain-computer interface (BCI) that ties an apparatus strapped to your wrist to signals in the nervous system.

Physiologically, Reardon observes, all transfer of information among humans is carried out via fine motor control. Motor control of the tongue and throat gives us speech. Facial expression and body posture convey emotion and intention. Writing takes place by controlling fingers that scrape chalk on a blackboard, stroke paint, manipulate pen or pencil, or punch keys. If everything the brain does to interact with the world involves muscles, why not use the motor system to more directly interface mind and machine?
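The core signal-processing idea behind a wrist-worn motor interface can be sketched simply: surface-EMG activity looks like low-amplitude noise at rest and bursts when motor neurons fire, so rectifying the signal, smoothing it into an amplitude envelope, and thresholding turns a twitch into a "click". The thresholds and signal model below are illustrative assumptions, not CTRL-Labs' actual method.

```python
import numpy as np

rng = np.random.default_rng(2)

fs = 1000                                   # samples per second
t = np.arange(2 * fs)                       # two seconds of one EMG channel
emg = rng.normal(0, 0.05, t.size)           # resting baseline
emg[1200:1350] += rng.normal(0, 0.6, 150)   # a brief finger twitch

def detect_bursts(signal, win=50, thresh=0.2):
    """Rectify, smooth into an amplitude envelope, and threshold it."""
    envelope = np.convolve(np.abs(signal), np.ones(win) / win, mode="same")
    return envelope > thresh

events = detect_bursts(emg)
print("click detected:", events.any())
```

A real decoder would go much further, separating individual motor units and mapping them to continuous controls, but the rectify-and-envelope step is the standard front end for surface EMG.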

Read more here: https://www.scientificamerican.com/article/wristband-lets-the-brain-control-a-computer-with-a-thought-and-a-twitch/

Researchers use EEG to reconstruct what brain is seeing

A crime happens, and there is a witness. Instead of a sketch artist drawing a portrait of the suspect based on verbal descriptions, the police hook the witness up to EEG equipment. The witness is asked to picture the perpetrator, and from the EEG data, a face appears.

While this scenario exists only in the realm of science fiction, new research from the University of Toronto Scarborough brings it one step closer to reality. Scientists have used EEG data (“brainwaves”) to reconstruct images of faces shown to subjects. In other words, they’re using EEG to tap into what a subject is seeing.

Is it mind reading? Sort of.
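The Toronto pipeline isn't reproduced here, but studies in this family typically learn a linear map between brain responses and a low-dimensional image representation, then invert a new response back into that code. A generic ridge-regression sketch on synthetic data, with all dimensions and signals invented:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic stand-ins: each "face" has an 8-dim image code, and the EEG
# response is a noisy linear function of that code.
n_train, eeg_dim, img_dim = 200, 64, 8
codes = rng.normal(0, 1, (n_train, img_dim))        # image features per face
W_true = rng.normal(0, 1, (img_dim, eeg_dim))       # unknown brain encoding
eeg = codes @ W_true + rng.normal(0, 0.1, (n_train, eeg_dim))

# Ridge regression: learn a map from EEG back to image codes.
lam = 1.0
A = np.linalg.solve(eeg.T @ eeg + lam * np.eye(eeg_dim), eeg.T @ codes)

# "Reconstruct" the image code for a new, unseen brain response.
new_code = rng.normal(0, 1, img_dim)
new_eeg = new_code @ W_true + rng.normal(0, 0.1, eeg_dim)
recovered = new_eeg @ A

err = np.abs(recovered - new_code).mean()
print(f"mean reconstruction error: {err:.2f}")
```

Rendering the recovered code back into a face image requires a separate generative step; the regression only gets you to the feature space.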

Read more: https://www.smithsonianmag.com/innovation/new-study-brings-scientists-one-step-closer-to-mind-reading-180968338/#uKGD6vWVWppWqjCj.99

Tetraplegics easily learn to use brain-computer interfaces

For a brain-computer interface (BCI) to be truly useful for a person with tetraplegia, it should be ready whenever it’s needed, with minimal expert intervention, including the very first time it’s used. In a new study in the Journal of Neural Engineering, researchers in the BrainGate collaboration demonstrate new techniques that allowed three participants to achieve peak BCI performance within three minutes of engaging in an easy, one-step process.

One participant, “T5,” a 63-year-old man who had never used a BCI before, needed only 37 seconds of calibration time before he could control a computer cursor to reach targets on a screen, just by imagining using his hand to move a joystick.

Dr. David Brandman, lead author of the study and an engineering postdoctoral researcher at Brown University, said that while additional innovations will help to move implantable BCIs like BrainGate toward clinical availability for patients, this advance of rapid, intuitive calibration is a key one. It could allow future users and their caregivers to use the system much more quickly and to keep it calibrated over the long term.
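Why can calibration be so fast? During a short cued block the intended cursor velocity toward each target is known, so a linear map from neural features to 2-D velocity can be fit by least squares on seconds of data. The sketch below illustrates that idea on synthetic firing rates; the feature dimensions, rates, and tuning model are assumptions, not BrainGate's published decoder.

```python
import numpy as np

rng = np.random.default_rng(4)

n_units, n_samples = 40, 740          # ~37 s of feature frames at 20 Hz
W_true = rng.normal(0, 1, (2, n_units))

# Intended velocities point at cued targets; firing rates are noisy
# linear functions of that intention (a cosine-tuning-style model).
v_intended = rng.normal(0, 1, (n_samples, 2))
rates = v_intended @ W_true + rng.normal(0, 0.5, (n_samples, n_units))

# Calibration is one least-squares fit: decode velocity from rates.
D, *_ = np.linalg.lstsq(rates, v_intended, rcond=None)

v_decoded = rates @ D
corr = np.corrcoef(v_decoded[:, 0], v_intended[:, 0])[0, 1]
print(f"decoded/intended velocity correlation: {corr:.2f}")
```

The hard engineering problems the study addresses — no pre-existing decoder on day one, and keeping the fit stable as signals drift — sit on top of this basic regression step.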

Read more: https://www.sciencedaily.com/releases/2018/01/180124123204.htm

Magic Leap unveils AR smart glasses

Magic Leap updated its website on Wednesday morning, revealing its highly anticipated augmented-reality smart glasses for the first time.

Billed as the Magic Leap One Creator Edition, the smart glasses feature an array of sensors on the front, connected via a wire to a battery and computing pack designed to be worn on the belt, matching the details first reported by Business Insider earlier this year. A wireless controller is used as input.

Magic Leap’s glasses will integrate computer graphics into the real world, a technology often called “augmented reality” by other companies. Magic Leap calls its technology “mixed reality.”

Magic Leap is calling its glasses Lightwear, the battery pack Lightpack, and the controller is called Control.

Read more: http://www.businessinsider.com/magic-leap-reveals-smart-glasses-photos-2017-12

Artificial Intelligence Microchips Will Turn Humans Into Zombies, Says Prominent Neuroscientist

Human beings will become indistinguishable from robots if they allow microchips to be implanted into their brains. That’s according to a claim by Dr. Mikhail Lebedev, a senior neuroscientist at Duke University in Durham, North Carolina. Dr. Lebedev told CNET that improvements in brain implant technology could go awry if human beings start acting like machines.

“It is even possible that ‘humanity’ will evolve into a community of zombies,” he said. “Luckily, this is not a problem as of yet.”

Brain implants may sound like something straight out of a science fiction movie, but they aren’t as far-fetched as you might think. There are companies working on developing a human brain-computer interface that can make it easier for humans to communicate with computers. One of these companies is backed by none other than Tesla and SpaceX CEO Elon Musk.

Read more: https://www.inquisitr.com/4685012/artificial-intelligence-microchips-will-turn-humans-into-zombies-says-prominent-neuroscientist/

Ethical questions raised by brain-computer interfaces

A recent article published in BMC Medical Ethics explores the ethical aspects of brain-computer interfaces (BCI): an emerging technology where brain signals are directly translated to outputs with the help of machines. Here, two of the authors of the paper tell us more about the applications of BCI, its portrayal in the media, and some of the key ethical issues it raises.

Brain-computer interfaces (BCI) are devices that measure signals from the brain and translate them into executable output with the help of a machine such as a computer or prosthesis. This technology has varied uses, from assistive devices for disabled individuals to advanced video game control.

Read more: http://blogs.biomedcentral.com/bmcseriesblog/2017/12/18/ethical-questions-raised-by-brain-computer-interfaces/