Current Research Projects
Our research seeks to augment biological neural networks with artificial neural networks and bionic devices to treat neurological disorders and to further our understanding of neural processing. Working at the intersection of artificial intelligence, robotics, and neuroscience, we are developing biologically inspired artificial intelligence and brain-machine interfaces to restore and/or enhance human function. We have received funding from the National Science Foundation (NSF), the Defense Advanced Research Projects Agency (DARPA), the National Institutes of Health (NIH), the Department of Veterans Affairs (VA), Facebook Reality Labs (FRL), Biologic Input Output Systems (BIOS), and the University of Utah.
Below is a list of the funded projects we are currently working on:
Intuitive and Dexterous Control of Multi-Articulate Bionic Arms
We are developing state-of-the-art prostheses capable of being controlled intuitively by thought. This project leverages deep learning to extract high-degree-of-freedom information from neural and electromyographic recordings in real time. This project is currently funded by the DARPA Intelligent Neural Interfaces Program in collaboration with Greg Clark. Additional funding is provided by Biologic Input Output Systems (BIOS).
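To picture the decoding pipeline described above — windowed muscle recordings mapped to multi-joint commands in real time — here is a minimal sketch using synthetic EMG and a linear decoder as a stand-in for the project's deep networks (all data, dimensions, and feature choices here are illustrative assumptions, not the actual system):

```python
import numpy as np

def mav_features(emg_window):
    """Mean absolute value per channel -- a common EMG feature."""
    return np.mean(np.abs(emg_window), axis=0)

rng = np.random.default_rng(0)
n_samples, n_channels, n_dof = 1000, 8, 6
window = 50  # samples per decoding window

# Synthetic training data: EMG windows and corresponding joint velocities.
emg = rng.standard_normal((n_samples, window, n_channels))
X = np.array([mav_features(w) for w in emg])          # (n_samples, n_channels)
W_true = rng.standard_normal((n_channels, n_dof))
y = X @ W_true + 0.01 * rng.standard_normal((n_samples, n_dof))

# Fit a linear decoder by least squares (stand-in for a deep network).
W, *_ = np.linalg.lstsq(X, y, rcond=None)

# "Real-time" use: decode one new window into 6 joint-velocity commands.
new_window = rng.standard_normal((window, n_channels))
velocities = mav_features(new_window) @ W
print(velocities.shape)  # (6,)
```

In a deployed system this decode step would run once per window at the controller's update rate, streaming the resulting commands to the prosthesis.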
Biomimetic Sensory Feedback for Prostheses and Neuromuscular Rehabilitation
We previously demonstrated that biomimetic sensory feedback – designed to mimic the natural activation patterns of the nervous system – is more intuitive and useful than non-biomimetic feedback. We are now extending these findings to a new non-invasive form of neural stimulation we recently developed. This project is currently funded by the DARPA Hand Proprioception and Touch Interfaces program in collaboration with Greg Clark.
Assistive and Rehabilitative Bionic Exoskeletons
We are collaborating with MyoMo to develop adaptive myoelectric control algorithms for powered upper-limb exoskeletons. This project leverages large-scale datasets to develop generalizable models of muscle activity that are resilient to neuromuscular changes following stroke. This project is funded by the National Institutes of Health from 2020 to 2025.
Patient-Specific Quantifiable Neuromuscular Diagnostics
We are designing new quantifiable measures of neuromuscular function using electrical recordings from the body. We are currently exploring the use of electromyography and electrical impedance myography to more quickly detect changes in muscle spasticity. This project is currently funded by the National Institutes of Health.
Inclusive Neural Interfaces for Virtual and Augmented Reality
We are developing neural interface technology that can be used to provide dexterous control over virtual and augmented reality. This project specifically seeks to ensure that the technology performs consistently across individuals of varying physical ability levels and neuromuscular impairments. This project is currently funded by Facebook Reality Labs.
Controlling Smart Devices by Thought
We previously demonstrated the ability to deploy state-of-the-art algorithms onto low-cost microcontrollers for real-time prosthetic control. We are now developing low-cost integrated circuits that communicate with smartphones via Bluetooth for internet-of-things applications. This project is currently funded by the University of Utah's Partners for Innovation, Ventures, Outreach and Technology (PIVOT) program and the Department of Veterans Affairs (VA).
EMG-based Activity Tracking for Health Monitoring
We have developed a smartwatch capable of tracking hand activity. We are now developing models that detect activities of daily living from the neuromuscular and kinematic activity of the hands. Knowing what a person is doing within their own home allows us to: 1) make assistive technology smarter, 2) track rehabilitation compliance and outcomes, 3) monitor patient compliance remotely (e.g., alerting caregivers when patients with memory impairments haven't taken their medication), and 4) identify at-risk individuals (e.g., suicide and addiction relapse often coincide with deviations from normal routines).
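One simple way to sketch activity detection of this kind is a nearest-centroid classifier over per-window feature vectors. Everything below — the activity labels, the four-dimensional features, and the synthetic data — is an illustrative assumption, not the lab's actual model:

```python
import numpy as np

rng = np.random.default_rng(2)
activities = ["eating", "brushing_teeth", "typing"]

# Synthetic feature vectors (e.g., EMG power + wrist-motion stats per window).
def make_samples(center, n=30):
    return center + 0.3 * rng.standard_normal((n, 4))

centers = {a: rng.uniform(0, 3, size=4) for a in activities}
X = np.vstack([make_samples(c) for c in centers.values()])
y = np.repeat(activities, 30)

# Nearest-centroid classifier: label a window by the closest activity mean.
centroids = {a: X[y == a].mean(axis=0) for a in activities}

def classify(window_features):
    dists = {a: np.linalg.norm(window_features - c) for a, c in centroids.items()}
    return min(dists, key=dists.get)

pred = classify(centers["typing"])  # classify a window of wrist/EMG features
print(pred)
```

A production model would replace the centroids with a learned classifier, but the structure — featurize each window, assign an activity label — is the same.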
Reinforcement Learning for Functional Electrical Stimulation
We are leveraging deep reinforcement learning to rapidly optimize stimulation parameters for functional electrical stimulation. We are using the optimized stimulation to produce isolated finger movements and functional grasps that both assist and rehabilitate stroke and spinal-cord-injury patients. This project is currently funded by the University of Utah, with additional support provided by Ripple Neuro.
Controlling Adaptive Sports Equipment by Thought
We are developing neural interface technology that can be used to provide spinal-cord-injury patients with dexterous control over adaptive sports equipment, including skis and sailboats. This project integrates brain-computer interfaces for intuitive and dexterous control with vibrotactile feedback systems to provide closed-loop feedback. This project is currently funded by the University of Utah in collaboration with Jeff Rosenbluth, Tetradapt, and the TRAILS foundation.
Shared Human-Machine Control of Prostheses
We recently developed a “self-aware” prosthesis that is capable of autonomously interacting with nearby objects. We are now measuring the functional and psychological impact of shared human-machine control of prostheses – in which the human operator works synergistically with an autonomous computer controller. This project is currently funded by the National Science Foundation in collaboration with David Warren. Additional support has been provided by the University of Utah.
Software Engineering 2.0 for Brain-Computer Interfaces
We previously demonstrated that training data plays a critical role in the run-time performance of brain-computer interfaces. We are now taking a software engineering 2.0 approach to brain-computer interfaces, in which we optimize the training data itself, rather than only the model, to improve the performance of deep-learning algorithms. This project is currently funded by the National Institutes of Health. Additional funding has been provided by Facebook Reality Labs.