The ability to control the physical world with your mind using a brain-computer interface or a mind machine has traditionally been focused on health care, and more recently the gaming industry. Now, thanks to cutting-edge technology pioneered by Altran, these applications are set to transform the way humans and machines communicate on the factory floor.
Hey you! Ever wish your technology was more invasive? You love voice-to-text, but it’s just too public?
Some researchers at MIT Media Lab have come up with the perfect gadget for you. And it looks like a Bane mask crossed with a squid. Or, if you prefer: like a horror movie monster slowly encompassing your jaw before crawling into your mouth.
The researchers presented their work at the International Conference on Intelligent User Interfaces (yes, such a thing exists) in March in Tokyo.
Whenever you think of words, they're silently, imperceptibly transmitted to your mouth. More specifically, signals arrive at the muscles that control your mouth. And those signals aren't imperceptible to a highly sensitive computer.
The researchers call this device the AlterEgo. It's got seven electrodes positioned around the mouth to pick up these signals. The data the electrodes collect goes through several rounds of processing before being transmitted wirelessly to a nearby device awaiting instructions. Oh, and it's got bone-conduction headphones so that devices can respond.
AlterEgo is a wearable system that allows a user to silently converse with a computing device without any voice or discernible movements, thereby enabling the user to communicate with devices, AI assistants, applications, or other people in a silent, concealed, and seamless manner. A human user could transmit queries simply by vocalizing internally (subtle internal movements) and receive aural output through bone conduction, without obstructing the user's physical senses and without invading the user's privacy. AlterEgo aims to combine humans and computers, such that computing, the internet, and AI would weave into human personality as a "second self" and augment human cognition and abilities.
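The description above amounts to a pipeline: electrodes pick up neuromuscular signals, the signals are filtered and featurized, and a classifier maps them to words. Here is a minimal, purely illustrative sketch of that shape in Python. The sampling rate, filter band, RMS features, and nearest-centroid classifier are all assumptions for the sake of the example; only the electrode count (seven) comes from the article, and this is not the researchers' actual pipeline.

```python
import numpy as np

SAMPLE_RATE = 250   # Hz; assumed sampling rate, not from the article
N_ELECTRODES = 7    # per the article

def bandpass(signal, low=1.5, high=45.0, rate=SAMPLE_RATE):
    """Crude FFT-based band-pass filter for one channel (illustrative)."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / rate)
    spectrum[(freqs < low) | (freqs > high)] = 0
    return np.fft.irfft(spectrum, n=len(signal))

def features(window):
    """Root-mean-square energy per electrode; window has shape (7, n)."""
    filtered = np.array([bandpass(ch) for ch in window])
    return np.sqrt((filtered ** 2).mean(axis=1))

class NearestCentroid:
    """Toy classifier mapping feature vectors to word labels."""
    def fit(self, X, y):
        self.labels = sorted(set(y))
        self.centroids = {
            label: np.mean([x for x, yy in zip(X, y) if yy == label], axis=0)
            for label in self.labels
        }
        return self

    def predict(self, x):
        return min(self.labels,
                   key=lambda label: np.linalg.norm(x - self.centroids[label]))
```

A real silent-speech system would use a trained neural network over far richer features, but the electrode-window-feature-label flow is the same.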
A paralyzed Ohio man was able to feed himself for the first time in eight years, after doctors implanted sensors in his brain that sent signals to his arm. (March 29)
Dr Jordan Nguyen has designed technology that allows Jess Irwin to play music with her eyes.
A clinical research publication led by Stanford University investigators has demonstrated that a brain-to-computer hookup can enable people with paralysis to type via direct brain control at the highest speeds and accuracy levels reported to date.
Future neuroprosthetics will be tightly coupled with the user, in such a way that the resulting system can replace and restore impaired upper-limb functions because it is controlled by the same neural signals as their natural counterparts. However, robust and natural interaction between subjects and sophisticated prostheses over long periods of time remains a major challenge. To tackle this challenge, we can draw inspiration from natural motor control, where goal-directed behavior is dynamically modulated by perceptual feedback resulting from executed actions.
Current brain-machine interfaces (BMIs) partly emulate human motor control: they decode cortical correlates of movement parameters (from movement onset to direction to instantaneous velocity) in order to generate the sequence of movements for the neuroprosthesis.
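The simplest version of the decoding step described above is a linear map from neural firing rates to instantaneous velocity. The sketch below shows that idea in Python; the neuron count, ridge penalty, and least-squares formulation are illustrative assumptions, not the method of any particular BMI mentioned here (real systems typically use Kalman filters or neural networks).

```python
import numpy as np

def fit_linear_decoder(rates, velocities, ridge=1e-3):
    """Fit a ridge-regularized linear map from firing rates to 2-D velocity.

    rates: (T, n_neurons) array of binned firing rates
    velocities: (T, 2) array of simultaneous hand velocities
    Returns W of shape (n_neurons + 1, 2), last row being a bias term.
    """
    X = np.hstack([rates, np.ones((rates.shape[0], 1))])  # append bias column
    # Normal equations with a small ridge penalty for stability
    W = np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ velocities)
    return W

def decode(rates, W):
    """Predict velocity from new firing rates using the fitted map."""
    X = np.hstack([rates, np.ones((rates.shape[0], 1))])
    return X @ W
```

Integrating the decoded velocities over time yields a cursor or limb trajectory, which is the closed loop the passage describes: neural activity in, movement sequence out.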
Read more here: https://eecs.berkeley.edu/research/colloquium/160907
A team of UW CSE and EE researchers introduce Interscatter, a novel approach that enables inter-technology communication by converting Bluetooth signals into Wi-Fi transmissions over the air. The system enables power-limited devices such as brain implants, contact lenses and credit cards to communicate with everyday devices such as smartphones and smartwatches. Through Interscatter, UW researchers demonstrate the potential to transform health care and unleash the power of ubiquitous connectivity. Learn more at http://interscatter.cs.washington.edu.
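The physical trick behind backscatter systems like Interscatter is that toggling an antenna's reflection coefficient at a rate Δf shifts an incoming carrier at frequency f to sidebands near f ± Δf, which is how a reflected Bluetooth signal can land in a Wi-Fi channel. The toy NumPy sketch below demonstrates only that frequency-shifting effect at heavily scaled-down frequencies; it is not Interscatter's actual single-sideband design, and all numbers are illustrative.

```python
import numpy as np

RATE = 10_000                      # samples/s; scaled-down stand-in for RF rates
t = np.arange(0, 1.0, 1.0 / RATE)

f_in = 1_000                       # incoming "Bluetooth" tone, Hz (illustrative)
delta_f = 500                      # toggle rate of the backscatter switch, Hz

carrier = np.cos(2 * np.pi * f_in * t)
# On/off antenna reflection, modeled as a square wave
switch = np.sign(np.cos(2 * np.pi * delta_f * t))
reflected = carrier * switch

# The reflected signal's energy concentrates at the shifted sidebands
spectrum = np.abs(np.fft.rfft(reflected))
peak_hz = np.fft.rfftfreq(len(t), 1.0 / RATE)[np.argmax(spectrum)]
```

A simple square-wave switch produces both sidebands (f + Δf and f − Δf); the Interscatter paper's contribution includes a single-sideband technique that suppresses the unwanted mirror copy, which this toy does not attempt.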
Scientists have built a device that can extract rough images out of a human brain. Reporting on the breakthrough, Brian Resnick of Vox went so far as to dub it a “mind-reading machine,” while admitting that “it doesn’t work all that well.”
Further reading shows that the machine is nothing so powerful as Cerebro, the telepathy device used by Professor X in the X-Men movies. That’s science fiction. In real life, what exists is a system devised by Kuhl Lab researchers that consists of an fMRI scanner hooked up to an artificial intelligence program.