
Improving EEG brain-computer interface usability

EEG-based brain-computer interfaces are a promising technology for people who are unable to use traditional interfaces (mouse, keyboard, eye tracking) to control their external environment. This project area aims to improve these devices for the people who need them most.

Contributor(s):
Tatyana Dobreva
David Brown

Brain-computer interfaces, while exciting, have a long way to go before they live up to the hype present in movies like Avatar and Pacific Rim. In their present state (as of 2020), they're still rough around the edges and only useful for a few niche applications, such as:

  • Detecting mental state (e.g. sleep, calm)

  • Controlling low-dimensional movements (e.g. a drone or mouse cursor), but with relatively low precision

  • Controlling a selection interface via event-related potentials (e.g. a keyboard to type), but at low speeds of a few characters per minute (see the sketch after this list)
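
To make the event-related potential idea above a bit more concrete, here is a minimal, self-contained sketch (in Python with NumPy) of the flash-and-average scheme a P300-style speller relies on. It runs on synthetic single-channel data, and every name and parameter in it (sampling rate, number of items, repetition count, scoring window) is an illustrative assumption, not code from any of the projects described on this page.

```python
import numpy as np

# Illustrative, synthetic example of ERP-based selection (P300-speller style).
# All parameters below are assumptions made for the sake of the sketch.

FS = 250                    # sampling rate in Hz (assumed)
EPOCH_LEN = int(0.8 * FS)   # 800 ms epoch time-locked to each flash
N_CHOICES = 6               # e.g. six items on screen, flashed one at a time
N_REPS = 10                 # each item is flashed repeatedly; averaging across
                            # repetitions is what makes the weak ERP detectable

rng = np.random.default_rng(0)

def record_epoch(is_target: bool) -> np.ndarray:
    """Stand-in for one single-channel EEG epoch following a flash."""
    epoch = rng.normal(0.0, 5.0, EPOCH_LEN)      # background EEG noise
    if is_target:
        # Attended flashes evoke a positive deflection ~300 ms post-stimulus.
        t = np.arange(EPOCH_LEN) / FS
        epoch += 4.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))
    return epoch

# The user attends to item 2; every item gets flashed N_REPS times.
attended = 2
epochs = np.zeros((N_CHOICES, N_REPS, EPOCH_LEN))
for item in range(N_CHOICES):
    for rep in range(N_REPS):
        epochs[item, rep] = record_epoch(is_target=(item == attended))

# Average over repetitions, then score each item by its mean amplitude in a
# window around 300 ms; the attended item should stand out above the noise.
erps = epochs.mean(axis=1)
window = slice(int(0.25 * FS), int(0.45 * FS))
scores = erps[:, window].mean(axis=1)
print("scores:", np.round(scores, 2), "-> selected item", int(scores.argmax()))
```

The slowness mentioned above falls out of this scheme directly: a single flash response is buried in noise, so each item has to be flashed and averaged many times before one selection can be made with any confidence, which is why typing speeds sit at a few characters per minute.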


Furthermore, they require significant training, concentration, and dedication from the user to work properly. Over the years, we have been involved in many efforts to improve these devices for the populations that need them most: people who, due to injury or disease, are unable to use traditional interface devices, e.g. people with quadriplegia or spinal cord injury, people with ALS or other neurodegenerative diseases that impair motor control, etc.


Our projects have spanned a wide range, including:

  • Improving usability for people with more variable EEG signals

  • Building a multiple choice testing system for children with cerebral palsy

  • Building a partially autonomous, context-aware 3D vision and robot arm interface that patients with quadriplegia can control


We also co-founded a company, Neurable, LLC, back in 2015 to help bring these devices into broader commercial use.


We've since moved on to interfacing with biology at a more cellular level, but look forward to seeing where this field goes over the next decade!

Writings

Huggins, J. E., Alcaide‐Aguirre, R. E., Aref, A. W., Brown, D., & Warschausky, S. A. (2015). Brain‐computer interface administration of the Peabody Picture Vocabulary Test‐IV. In International IEEE/EMBS Conference on Neural Engineering, NER (Vol. 2015–July, pp. 29–32). IEEE Computer Society. https://doi.org/10.1109/NER.2015.7146552

Alcaide RE, Brown D, Ma X, Aref A, Winter B, Huggins J. Google Glass Display for a Brain‐Computer Interface, Neuroscience Conference, Abstract number 13377, Washington DC, USA, Nov 15‐19, 2014.

Huggins JE, Alcaide‐Aguirre RE, Aref AW, Brown D, Warschausky SA: "Brain‐Computer Interface Administration of a Standardized Vocabulary Test," 2014 MICHR Symposium, Coloring Outside the Lines: Innovating and Collaborating in the Changing World of Health Research, Ann Arbor, MI, October 1, 2014.

News
Coding the Future: the rise of the hackathon

This weekend’s event seemed to revolve around a theme of creation, a theme which has echoed that of past hackathons. Engineering senior David Brown, who was attending his first hackathon, said the collaborative and creative culture present at the event opened up new possibilities in his program design.

Links

Over the course of a nearly sleepless weekend, we hacked together our BCI to control a Google Glass

A senior design project at U of M to control a 3D printed robot arm via detection of intent from a BCI
