Cognichrome – An EEG brain-computer interface (BCI) robot art painter.

The Cognichrome is an art installation (see http://www.cognichrome.com) that reads a person’s brain activity with an EEG headset and instructs a robot to paint an interpretation of the user’s thoughts on a real canvas. Driven by machine learning algorithms, the painting evolves as the wearer mentally interacts with the device while being exposed to new videos on its monitors. When the wearer decides to stop, they can take the canvas home with them.
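Cognichrome’s actual pipeline isn’t spelled out, but the general idea of steering a painting with EEG features can be sketched in a few lines of Python. Everything below – the sampling rate, the frequency bands, and the mapping from band power to brush parameters – is an illustrative assumption, not the installation’s real code:

```python
import numpy as np

# Hypothetical sketch: derive brush parameters from EEG band power.
# Bands, channel count, and the parameter mapping are all assumptions.

FS = 256                      # assumed EEG sampling rate in Hz
BANDS = {"alpha": (8, 13), "beta": (13, 30)}

def band_power(eeg, fs, lo, hi):
    """Mean spectral power across channels in the [lo, hi) Hz band."""
    freqs = np.fft.rfftfreq(eeg.shape[0], d=1.0 / fs)
    psd = np.abs(np.fft.rfft(eeg, axis=0)) ** 2
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].mean()

def brush_params(eeg):
    """Map relative alpha/beta power to hue and stroke size (arbitrary mapping)."""
    alpha = band_power(eeg, FS, *BANDS["alpha"])
    beta = band_power(eeg, FS, *BANDS["beta"])
    ratio = beta / (alpha + beta + 1e-12)
    return {"hue": ratio, "stroke_px": 2 + 20 * (1 - ratio)}

window = np.random.randn(FS * 2, 8)   # 2 s of fake 8-channel EEG
print(brush_params(window))
```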

Read More

The startup Neurable thinks its brain-computer interface will be fast and accurate enough for playing games in VR.

Virtual reality is still so new that the best way for us to interact within it is not yet clear. One startup wants you to use your head, literally: it’s tracking brain waves and using the result to control VR video games.

Boston-based startup Neurable is focused on deciphering brain activity to determine a person’s intention, particularly in augmented and virtual reality. The company uses dry electrodes to record brain activity via electroencephalography (EEG); then software analyzes the signal and determines the action that should occur.
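Neurable hasn’t published its algorithms, but the record-then-classify pattern described above is standard in EEG work. Here is a minimal, hypothetical Python sketch of that pattern: band-pass filter an epoch, extract simple features, and run a trained linear classifier. The filter settings, feature choice, and classifier are generic assumptions, not Neurable’s method:

```python
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 250  # assumed EEG sampling rate (Hz)

def preprocess(window):
    """Band-pass 1-30 Hz, a common EEG cleanup step (parameters assumed)."""
    b, a = butter(4, [1, 30], btype="bandpass", fs=FS)
    return filtfilt(b, a, window, axis=0)

def features(window):
    """Toy features: per-channel mean amplitude of the filtered epoch."""
    return preprocess(window).mean(axis=0)

# Train on fake labeled epochs (0 = "no action", 1 = "select"), then classify.
rng = np.random.default_rng(1)
X = np.array([features(rng.standard_normal((FS, 6))) for _ in range(40)])
y = np.array([0, 1] * 20)
clf = LinearDiscriminantAnalysis().fit(X, y)

new_epoch = rng.standard_normal((FS, 6))
print("decoded action:", clf.predict([features(new_epoch)])[0])
```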

“You don’t really have to do anything,” says cofounder and CEO Ramses Alcaide, who developed the technology as a graduate student at the University of Michigan. “It’s a subconscious response, which is really cool.”

Read More

A New Intracortical BCI Clocks Faster Typing Speeds Out of the Gate

Many people with ALS experience trouble speaking. To help them stay connected, researchers are developing brain-powered systems that type out words, much like texting on a smartphone. The approach, known as a brain-computer interface (BCI), aims to bypass damaged sections of the central nervous system so that people with ALS can reach out to family and friends without a caregiver’s assistance.

But although many of these investigational devices enable people with paralysis to communicate accurately, the technologies introduced to date have been extremely slow. Most recently, a wireless device developed by Nick Ramsey’s team at UMC Utrecht in the Netherlands enabled a person with ALS to communicate independently, but at only 2 letters/minute (see November 2016 news; Vansteensel et al., 2016).

Now, a research team led by Stanford’s Jaimie Henderson and Krishna Shenoy introduces an intracortical brain-computer interface (iBCI)-based strategy that enabled people with paralysis to communicate at up to 8 words (39.2 characters) per minute, more than 4 times faster than existing neural interfaces (Bacher et al., 2015). At the standard estimate of five characters per word, 39.2 characters per minute works out to roughly 8 words per minute. For comparison, able-bodied people average 12-18 words per minute when texting on a cell phone without word-completion assistance (Hoggan et al., 2008; MacKenzie et al., 2009). According to Shenoy, the technology could be adapted to operate digital devices including computers, tablets, and smartphones.

The strategy uses decoding algorithms previously developed by Shenoy’s team to translate brain activity into ‘point and click’ control commands that work much like a computer mouse (Gilja et al., 2012; Gilja et al., 2015; Kao et al., 2016). The approach involves prior implantation of electrode arrays in the hand area of the motor cortex and uses a cable to deliver neuronal signals to a computer interface. The device is one of a growing number of neurotechnologies being developed by BrainGate, a consortium of neuroscientists, neurosurgeons, and bioengineers that aims to restore independence to people with paralysis, in part by helping them stay connected.
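The cited decoders are Kalman-filter based; the hypothetical Python sketch below illustrates only the general ‘point and click’ shape of the computation, with simple exponential smoothing standing in for a proper Kalman filter. The weights, bin width, and click threshold are invented for illustration:

```python
import numpy as np

# Simplified sketch of a 'point and click' decoder. The cited work uses
# Kalman-filter decoders (Gilja et al., 2012); exponential smoothing stands
# in for that here, and all parameters are invented for illustration.

N_CHANNELS = 96        # e.g., one 96-electrode array
BIN_S = 0.05           # 50 ms spike-count bins (assumed)

rng = np.random.default_rng(0)
W_vel = rng.normal(scale=0.01, size=(2, N_CHANNELS))  # rates -> (vx, vy)
w_click = rng.normal(scale=0.01, size=N_CHANNELS)     # rates -> click score
CLICK_THRESHOLD = 0.5                                 # assumed

def decode_step(spike_counts, pos, vel, alpha=0.8):
    """One decode cycle: estimate velocity, integrate cursor position, test click."""
    rates = spike_counts / BIN_S
    vel = alpha * vel + (1 - alpha) * (W_vel @ rates)  # smoothed velocity estimate
    pos = pos + BIN_S * vel                            # integrate to screen position
    click = (w_click @ rates) > CLICK_THRESHOLD        # threshold a click score
    return pos, vel, click

pos, vel = np.zeros(2), np.zeros(2)
for _ in range(20):                                    # 1 s of fake neural data
    counts = rng.poisson(lam=2.0, size=N_CHANNELS)
    pos, vel, clicked = decode_step(counts, pos, vel)
print("cursor at", pos, "clicked:", clicked)
```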

Read More

Steering A Turtle With Your Thoughts

Researchers at the Korea Advanced Institute of Science and Technology (KAIST) have developed a brain-computer interface (BCI) that lets a human control a turtle with thought alone.

They chose a turtle for its cognitive abilities and its capacity to distinguish different wavelengths of light. Specifically, turtles recognize a white light source as an open space and move toward it, and they show predictable avoidance behavior toward objects that obstruct their view.

The human-turtle setup works as follows: a head-mounted display (HMD) is combined with a BCI to immerse the human operator in the turtle’s environment. The operator wears the BCI-HMD system, while the turtle carries a ‘cyborg system’ on its upper shell, consisting of a camera, Wi-Fi transceiver, computer control module, and battery. Also mounted on the shell is a black semi-cylinder with a slit, which forms the ‘stimulation device’ and can be rotated ±36 degrees via the BCI.

The human operator receives real-time video from the turtle-mounted camera and uses it to decide where the turtle should move. The operator’s thought commands are recognized by the wearable BCI system as electroencephalography signals.

The BCI distinguishes three mental states: left, right, and idle. The left and right commands activate the turtle’s stimulation device via Wi-Fi, rotating it so that it obstructs the turtle’s view on one side; this invokes the turtle’s natural instinct to move toward open light, changing its direction. The operator then receives updated visual feedback from the camera and in this way continues to remotely steer the turtle’s trajectory, as sketched below.
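Here is a minimal Python sketch of that left/right/idle loop. Only the three-state protocol and the ±36-degree rotation come from the article; the toy classifier and the command interface are assumptions:

```python
import numpy as np

# Sketch of the three-state control loop described above. The classifier is
# a toy stand-in, and the command interface to the turtle's Wi-Fi module is
# hypothetical; only the left / right / idle protocol comes from the article.

STATES = ("left", "right", "idle")

def classify_eeg(window):
    """Toy three-state classifier; a real BCI would use trained EEG features."""
    scores = window.mean(axis=0)[:3]        # pretend three channels carry state scores
    return STATES[int(np.argmax(scores))]

def command_for(state):
    """Translate a mental state into a stimulator rotation (degrees)."""
    if state == "idle":
        return None                          # leave the turtle's view unobstructed
    return -36 if state == "left" else 36    # rotate the slit cylinder +/-36 degrees

window = np.random.randn(250, 8)             # 1 s of fake 8-channel EEG at 250 Hz
state = classify_eeg(window)
print(state, "->", command_for(state))       # a real system sends this over Wi-Fi
```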
The researchers demonstrated the animal-guiding BCI in a variety of environments, with turtles moving indoors and outdoors on surfaces such as gravel and grass, and negotiating obstacles such as shallow water and trees.

The technology could be extended with positioning systems and improved augmented- and virtual-reality techniques, enabling applications such as devices for military reconnaissance and surveillance.

CSUF Students Develop Hardware and Software to Mind-Control Wheelchairs

Source: ocregister.com

With the aid of a smartphone application and hardware, Cal State Fullerton students have created a low-cost, easy-to-manage system by which an electric wheelchair can be controlled with brain waves and facial movements.
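The article doesn’t detail how the students map signals to wheelchair motion, but the general idea can be sketched. In the hypothetical Python below, an attention value and blink flags (the kind of outputs consumer EEG headsets expose) are turned into drive commands; every threshold and command name is an assumption:

```python
# Hypothetical sketch of mapping headset outputs to wheelchair commands.
# The signal names, thresholds, and command set are illustrative assumptions,
# not the students' protocol.

def wheelchair_command(attention, blink_left, blink_right):
    """Drive forward on sustained attention; steer with single-side blinks."""
    if blink_left and blink_right:
        return "stop"                 # double blink as an emergency stop (assumed)
    if blink_left:
        return "turn_left"
    if blink_right:
        return "turn_right"
    return "forward" if attention > 0.7 else "stop"

# Example: one decision cycle with fake readings from the headset app.
print(wheelchair_command(attention=0.82, blink_left=False, blink_right=False))
```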

The project – Brain-Computer Interface Controlled Driving Aid for Electric Wheelchairs – is being led by graduate computer engineering student Nikhil Shinde and undergraduate computer engineering students Graciela Cortez and Rayton Espiritu.

Read more here: http://www.ocregister.com/articles/project-702406-students-electric.html

Overcoming Paralysis with the Help of the Mind – and BCI

Source: labmate-online.com

In a feat that reflects the sheer brilliance of humankind, a team of scientists has reversed the effects of paralysis and enabled a man to walk again using the power of his mind. With no help from an exoskeleton, robotic limbs, or a brain implant, the achievement marks a huge step forward for physical rehabilitation. The results, published in the Journal of NeuroEngineering and Rehabilitation, have already got the globe chatting.

Mind over matter

The patient was a 26-year-old man who had suffered a spinal cord injury that left him with no motor function in his lower limbs. He couldn’t walk and could barely feel sensation in the lower half of his body, until now…

Read more here: http://www.labmate-online.com/news/news-and-views/5/breaking_news/man_uses_mind_to_overcome_paralysis/36457/