

Cognitive neural prosthetic control of a robotic limb


Subject EGS controlling a robotic limb with decoded intention signals

We develop systems and algorithms to decode subjects' dexterous movement intentions from cognitive-level motor control areas of cortex. To support this work, we record the action potentials of individual neurons, as well as continuously sampled local field potentials, from intracortical microelectrode arrays. Trajectories, grasp patterns, and higher-order movement intentions, such as goals and decisions, encoded in these areas can be transformed into real-time movements of external devices such as a cursor on a computer screen or a multi-degree-of-freedom robotic arm. By decoding the subjects' intentions from these high-level motor signals, we hypothesize that paralyzed persons will be able to attain control with the brain-machine interface more easily and more rapidly.
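As a minimal sketch of this kind of decoding (not the lab's actual pipeline), binned spike counts can be mapped to a 2D cursor velocity with ridge regression. All shapes, firing rates, and the regularization weight below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed setup: 96 recording channels, 500 bins of spike counts,
# and a 2D velocity target. Data are simulated for illustration.
n_bins, n_units = 500, 96
true_W = rng.normal(size=(n_units, 2))
rates = rng.poisson(5.0, size=(n_bins, n_units)).astype(float)
velocity = rates @ true_W + rng.normal(scale=0.5, size=(n_bins, 2))

# Ridge regression: W = (X^T X + lambda*I)^-1 X^T Y
lam = 1.0
XtX = rates.T @ rates + lam * np.eye(n_units)
W_hat = np.linalg.solve(XtX, rates.T @ velocity)

# Decode a new bin of activity into a 2D velocity command for the cursor
new_rates = rng.poisson(5.0, size=(1, n_units)).astype(float)
v_cmd = new_rates @ W_hat
```

In a real system the decoder is recalibrated frequently and often replaced by a recursive estimator, but the core idea of a learned linear readout from population activity is the same.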

Research in Cognitive Neuroengineering


Example of a hybrid system that blends decoded user intent with machine vision and intelligent robotic algorithms.  From Katyal et al. 2014.

Traditional brain-machine interfaces (BMIs) link channels of neural activity to individual degrees of freedom (DOFs) in robotic prostheses. However, current neural recording technology yields far fewer DOFs than robotic prostheses provide, and low-level motor signals become inefficient and difficult to use as the number of DOFs grows. Instead, we decode high-level cognitive intent from the posterior parietal cortex (PPC) for execution by a sensor-coupled robotic control system. Improving BMI technology in this way requires novel methods for decoding cognitive intents, parsing the physical environment to provide contextual actions, and controlling the robotic prosthesis in real time to navigate dynamic environments. We have partnered with colleagues at the Johns Hopkins University Applied Physics Laboratory (JHUAPL) to develop machine vision and learning systems, and with colleagues at Keck Hospital of USC to provide surgical and medical expertise. These institutions have a strong history of working together on BMI technologies: Caltech, Keck, and JHUAPL were all part of the original cross-disciplinary team that initiated the clinical trials still ongoing at Caltech. By combining fast, intuitive neural decoding with robust machine vision and control, this project capitalizes on both the deep insights of cognitive neuroscience and the most advanced capabilities in robotic intelligence to develop a BMI that realizes the promise of independence for persons with tetraplegia.
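One common way to combine decoded intent with machine-vision assistance is a weighted blend of the two velocity commands. The sketch below is a hypothetical illustration of that idea; the blending weight `alpha`, the proportional assistance gain, and all names are assumptions, not the system from Katyal et al. 2014:

```python
import numpy as np

def blend_command(decoded_v, hand_pos, target_pos, alpha=0.6, gain=1.0):
    """Shared control: weighted sum of the user's decoded velocity and an
    autonomous correction toward a vision-detected object (illustrative)."""
    assist_v = gain * (np.asarray(target_pos, float) - np.asarray(hand_pos, float))
    return alpha * np.asarray(decoded_v, float) + (1.0 - alpha) * assist_v

# The user pushes rightward while vision steers toward an object at (0.5, 0.5, 0)
cmd = blend_command(decoded_v=[1.0, 0.0, 0.0],
                    hand_pos=[0.0, 0.0, 0.0],
                    target_pos=[0.5, 0.5, 0.0])
# 0.6*[1,0,0] + 0.4*[0.5,0.5,0] = [0.8, 0.2, 0.0]
```

Choosing `alpha` trades user authority against robotic assistance; in practice it can be adapted based on confidence in the decoded intent.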

BMI Learning


Cognitive strategy to learn a new mapping of activity to target locations using existing patterns of activity in a BMI task.  From Hwang et al. 2013.

Behavioral performance improves along with changes in neural activity; however, both the nature and the constraints of learning in brain-machine interfaces are still poorly understood. By perturbing the mapping between the neural activity and 2D movement signals, we explore how neurons learn to compensate for errors in brain-machine interface tasks in human tetraplegic participants. In particular, we are exploring how conscious shifts in cognitive strategy along with implicit learning mechanisms come together to improve control performance.
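A standard way to perturb the mapping between neural activity and 2D movement is to rotate the decoder's output, so the cursor moves off-target until the subject compensates. The sketch below illustrates this visuomotor-rotation idea with an assumed 45° perturbation; it is a toy model, not the experimental protocol itself:

```python
import numpy as np

def rotate(v, theta_deg):
    """Rotate a 2D vector by theta degrees (counterclockwise)."""
    t = np.deg2rad(theta_deg)
    R = np.array([[np.cos(t), -np.sin(t)],
                  [np.sin(t),  np.cos(t)]])
    return R @ np.asarray(v, float)

intended = np.array([1.0, 0.0])      # decoded movement toward a rightward target
perturbed = rotate(intended, 45.0)   # perturbed decoder: cursor heads 45° off-target

# To compensate, the subject must learn to re-aim by -45 degrees;
# applying the inverse rotation restores the intended direction.
compensated = rotate(perturbed, -45.0)
```

Whether subjects achieve this compensation by consciously re-aiming or by implicit adaptation of neural activity patterns is exactly the question the paragraph above describes.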

Computational mechanisms of action suppression


Frontal-subthalamic circuitry. A. The hyperdirect, indirect and direct basal ganglia pathways (Schroll and Hamker, Frontiers in Systems Neuroscience). B. Three putatively different hyperdirect pathways for three kinds of STN-mediated pause.

An area of particular interest is the study of action suppression. Action suppression occurs in at least 3 ways: (i) action selection, in which one suppresses competing motor programs when selecting an action, (ii) decision conflict, in which one broadly suppresses the response set when presented with conflicting information, to allow time to make the decision, and (iii) outright stopping, in which one completely suppresses the action when it is rendered inappropriate. The basal ganglia (BG) and, in particular, the subthalamic nucleus (STN) have been functionally implicated in each of these contexts, but in association with distinct frontal areas (e.g. primary motor, pre-frontal). How and whether these different types of action suppression map to distinct frontal-BG networks, and how these distinct functions relate neuroanatomically, remain incompletely understood. In collaboration with UCLA and UCSD, we are working on understanding how frontal cortex-basal ganglia (BG) circuits mediate these 3 action suppression functions. Our lab contributes to developing neuro-computational models of action suppression that can inform and be constrained by electrophysiological data collected from Parkinson's Disease (PD) patients at UCLA. These models will push forward knowledge by (i) helping assess whether a single architecture can account for all 3 types of "pausing" action (both neurally and behaviorally) or whether we need separate architectures, (ii) testing long-standing theories on how the pause mechanisms are implemented in STN and (iii) allowing us to simulate brain perturbations (disease or stimulation) and to model the effects on neuronal responses, on the interaction between cortical/subcortical regions, and on behavior, such as reaction time and accuracy.
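One simple neuro-computational account of an STN-mediated pause is a drift-diffusion model in which conflict raises the decision threshold, trading speed for accuracy. The sketch below illustrates that idea with assumed parameters; it is a didactic toy, not one of the models constrained by the UCLA data:

```python
import numpy as np

def simulate_rt(drift, threshold, dt=0.002, noise=1.0, seed=0, max_t=5.0):
    """One drift-diffusion trial: evidence x drifts toward a bound; the
    time to reach it is the simulated reaction time (illustrative model)."""
    rng = np.random.default_rng(seed)
    x, t = 0.0, 0.0
    while abs(x) < threshold and t < max_t:
        x += drift * dt + noise * np.sqrt(dt) * rng.normal()
        t += dt
    return t

# An STN-like "pause" signal under conflict is modeled as a raised threshold,
# which should slow responses on average.
rts_low = [simulate_rt(drift=1.0, threshold=1.0, seed=s) for s in range(100)]
rts_high = [simulate_rt(drift=1.0, threshold=2.0, seed=s) for s in range(100)]
mean_low = float(np.mean(rts_low))
mean_high = float(np.mean(rts_high))
```

Simulated "perturbations" (e.g. changing the drift or threshold) then let one predict how disease or stimulation would alter reaction-time distributions, which is the spirit of point (iii) above.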

Somatosensation through electrical stimulation of cortex


Hand representation for intracortical microstimulation in a non-human primate.  From Klaes et al. 2014.

The variety and consistency of percepts that can be elicited in humans through intracortical stimulation remain an open question. To address this, we deliver intracortical electrical stimulation to primary somatosensory cortex to generate sensory percepts. First, we are creating a map of elicited responses by exploring and evaluating the different sensations and their locations on the body, generated through many different configurations of electrical stimuli. Second, we are working on classifying the stimulus parameters that create a clear distinction between different sensory percepts, each of which has a unique and significant role in providing feedback during neural prosthetic control. Finally, this project seeks to understand the basic science underlying the neural properties of primary somatosensory cortex in the context of these stimulation percepts. Understanding these fundamental properties could help us predict the sensory outcome of stimulation and deliver relevant sensory information to patients.
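The stimulus space being explored is typically parameterized by the amplitude, phase width, frequency, and train length of charge-balanced biphasic pulses. The helpers below sketch that parameterization with illustrative numbers; they are not the clinical settings used in the study:

```python
def charge_per_phase_nC(amplitude_uA, phase_width_us):
    """Charge per phase, Q = I * t, in nanocoulombs.
    (uA * us = picocoulombs; divide by 1000 for nC.)"""
    return amplitude_uA * phase_width_us * 1e-3

def train_duration_ms(n_pulses, frequency_hz):
    """Duration of a pulse train delivered at a fixed rate."""
    return n_pulses / frequency_hz * 1000.0

# Example (illustrative values): a 60 uA pulse with 200 us phases
# delivers 12 nC per phase; 100 pulses at 300 Hz last about 333 ms.
q = charge_per_phase_nC(60, 200)
dur = train_duration_ms(n_pulses=100, frequency_hz=300)
```

Charge per phase matters because safety limits and percept intensity both scale with delivered charge, so mapping percepts usually means sweeping these few parameters systematically.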

Memory functions in human PPC


A memory task used by the Rutishauser lab for human single neuron recordings.  From Kaminski et al. 2017.

Memories allow us to use previous experiences when making decisions. The medial temporal lobe (MTL) is essential for episodic memories, but how memories influence decision making is poorly understood. The posterior parietal cortex (PPC) is thought to support memory-based decision making by integrating memory-based information provided by the MTL. Interestingly, memory-based processing seems to be especially strong in AIP, a region that is commonly associated with grasping behaviors and that has been implanted in human subjects as part of a BMI clinical trial. In collaboration with Ueli Rutishauser's lab at Cedars-Sinai, we are recording single neurons in AIP while tetraplegic patients perform a new/old recognition memory task to better understand memory coding in PPC.

Awareness and Intent


Proposed map of intentions to reach, saccade, and grasp within the posterior parietal cortex.  From Andersen and Buneo 2002.

Neural prosthetics research has typically assumed that decoded neural signals directly reflect the conscious intentions of the user. We tackle this assumption head-on by looking at the relationship between neural signals and conscious intent in subjects participating in a brain-machine interface clinical trial. We find evidence that neural activity recorded during task performance can reflect implicit computations that support task performance but are not directly tied to the participant's awareness. We have developed sophisticated algorithms that can incorporate knowledge of how both implicit and explicit motor intentions are coded in the neural population.

Action Observation


Observed grasping action used by the Orban lab.  From Abdollahi et al. 2013

The ability to discriminate the actions of others is fundamental to a diverse set of social interactions; however, basic knowledge of the neural circuits that enable action observation lags behind other areas of visual science. Non-human primate single-neuron studies have focused on premotor cortex and lateral occipito-temporal cortex, whereas human fMRI studies of action observation also activate the PPC. This neuroimaging work suggests a distributed coding scheme, with different classes of action processed in different subregions of PPC, similar to object recognition in IT (e.g. FFA specialized for faces, EBA for body parts, PPA for places). In collaboration with Guy Orban's lab at the University of Parma, we are using the rare opportunity to record single neurons from human PPC to investigate selectivity for actions. This work helps us better understand visual processing of observed actions and may also clarify how actions are encoded in human PPC.

Closing the loop for bidirectional brain-machine interfaces


Schematic of a bidirectional brain-machine interface.  From Andersen et al. 2014.

Even with advances in cognitive control of neural prosthetics, the lack of sensory percepts during dexterous manipulation likely limits the performance of brain-machine interfaces. Incorporating the principles from our other projects focused on developing control and feedback systems, we aim to close the loop in neural prosthetics to develop a bidirectional BMI. We will provide cognitive-level control of a robotic limb and hand while simultaneously delivering feedback about its position, movement direction, and forces when interacting with other objects. Such a system will allow us to explore learning effects in both the decoding and encoding of task-related brain signals. Finally, we are developing advanced signal processing techniques to support simultaneous stimulation and recording. The natural and intuitive combination of integrated control and somatosensation more closely mimics the physiological sensory-motor loop, thus enhancing the utility of BMIs for paralyzed users.
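One cycle of such a bidirectional loop, decode a command, move the effector, sense contact, and encode force as a stimulation amplitude, can be sketched as below. Every mapping here (the constant decoded velocity, the contact rule, the linear force-to-amplitude encoding and its limits) is an illustrative placeholder, not the lab's system:

```python
import numpy as np

def encode_force_to_stim(force_N, max_force=10.0, max_amp_uA=80.0):
    """Linearly map sensed contact force onto a stimulation amplitude,
    clipped to an assumed safe range (illustrative limits)."""
    return float(np.clip(force_N / max_force, 0.0, 1.0) * max_amp_uA)

hand_pos = np.zeros(2)
stim_amp = 0.0
for step in range(10):
    decoded_v = np.array([0.1, 0.0])            # stand-in for the decoder output
    hand_pos = hand_pos + decoded_v             # move the robotic effector
    force = 2.0 if hand_pos[0] >= 0.5 else 0.0  # stand-in contact sensor
    stim_amp = encode_force_to_stim(force)      # feedback stimulation command
```

Running decoding and encoding in the same loop is also why the paragraph above mentions signal processing for simultaneous stimulation and recording: stimulation artifacts would otherwise corrupt the very signals being decoded.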

Ultrasound functional imaging in non-human primates


Ultrafast Doppler vascular map. 2-dimensional power Doppler coronal image of an NHP brain's vasculature around parietal cortex.

High-density neural interfaces require the ability to observe large-scale patterns of neural activity with high spatiotemporal resolution. Ideally, to facilitate their application in research and potential clinical studies, such interfaces should be non-invasive or minimally invasive. Current methods for monitoring neural activity, including electrophysiology, optical imaging, and functional MRI, fall short of these criteria. Recently, functional ultrasound (fUS) was introduced as a revolutionary technology for imaging neural activity non-invasively, offering an order-of-magnitude improvement in spatial and temporal resolution over functional MRI (100 µm and < 10 ms) using transducers that can be mounted on freely moving animals. In collaboration with the Shapiro lab at Caltech and the Tanter lab at the Institut Langevin (ESPCI) in Paris, France, we are developing a novel technique to perform real-time functional ultrasound-guided electrophysiology (FUGE) and pharmacological intervention in non-human primates (NHPs).
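A common processing chain behind power Doppler images like the one above is to stack ultrafast frames, suppress slow tissue motion ("clutter") with an SVD filter, and sum the remaining signal power per pixel. The sketch below illustrates this on simulated data; the frame count, image size, and cutoff rank are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n_frames, h, w = 200, 16, 16

# Simulated data: a strong low-rank "tissue motion" component plus a
# weak, spatially incoherent "blood" signal (illustrative stand-ins).
tissue = np.outer(np.sin(np.linspace(0, 3, n_frames)), rng.normal(size=h * w))
blood = 0.1 * rng.normal(size=(n_frames, h * w))
frames = tissue + blood

# SVD clutter filter: discard the largest singular components, which
# capture the coherent tissue signal, keeping the blood signal.
U, s, Vt = np.linalg.svd(frames, full_matrices=False)
cutoff = 5
filtered = U[:, cutoff:] @ np.diag(s[cutoff:]) @ Vt[cutoff:, :]

# Power Doppler image: mean signal power over frames at each pixel,
# which is proportional to local blood volume.
power_doppler = (filtered ** 2).mean(axis=0).reshape(h, w)
```

Because the hemodynamic signal rides on top of tissue echoes orders of magnitude stronger, the quality of this clutter rejection largely determines the sensitivity of the functional map.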

Reference Frame


Task for determining hand and eye centered coordinate frames.  From Bremner and Andersen 2014.

A decoding algorithm, coupled with the training paradigm, embodies the assumptions we make about a brain area. For example, modern decoding algorithms typically assume that the user's intentions are encoded relative to the controlled effector, independent of factors such as eye or head position. We have studied the reference frames of intentions across a range of brain areas, demonstrating that intentions can be encoded in a variety of ways depending on the specifics of the cortical area. We explore the reference frames of intention coding in paralyzed human subjects to ensure that decoding algorithms are properly calibrated to the reference frame of the recorded neural populations.
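A basic version of a reference-frame analysis regresses a unit's firing rate on the target position expressed in each candidate frame and compares the fits. The sketch below simulates an eye-centered unit and recovers that label; the tuning model, noise level, and trial counts are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n_trials = 200
target = rng.uniform(-10, 10, n_trials)   # target position (1D, illustrative)
eye = rng.uniform(-10, 10, n_trials)      # eye position, varied across trials
hand = rng.uniform(-10, 10, n_trials)     # hand position, varied across trials

# Simulated unit linearly tuned to the target relative to the eye
rate = 2.0 * (target - eye) + rng.normal(scale=1.0, size=n_trials)

def r_squared(x, y):
    """Variance explained by a linear fit of y on x (with intercept)."""
    X = np.column_stack([x, np.ones_like(x)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid.var() / y.var()

r2_eye = r_squared(target - eye, rate)    # eye-centered model
r2_hand = r_squared(target - hand, rate)  # hand-centered model
```

Because a decoder calibrated in the wrong frame would misread intent whenever gaze moves, this comparison directly informs how the clinical decoders should be parameterized.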

State-estimator in the Posterior Parietal Cortex (PPC)


Model of state estimation in the posterior parietal cortex from Mulliken et al. 2008.

Humans and animals usually make decisions and select actions based on noisy sensory information and incomplete knowledge of the environment. To avoid instabilities due to these factors, the brain uses an internal forward model to predict how ongoing actions affect the state of the body and the environment. Predictions are integrated with the incoming sensory information to update the estimate of the next state. Despite many years of research, little is known about where and how the brain implements this state-estimation process. We hypothesize that parietal cortex contains an internal estimate of the movement kinematics that is updated via perceptual feedback. To test this hypothesis, we record neural activity from the parietal cortex while humans with tetraplegia and NHPs control a cursor using different effectors (humans use their head and animals their hand) to perform a variety of movements (e.g., center-out movements to peripheral targets, obstacle-avoidance movements, etc.). By delaying the visual feedback between the effector and cursor position, we explore how the parietal cortex, as well as other brain areas (e.g., M1, PMd) that are implanted with multi-electrode arrays, respond and adapt to this perturbation. So far, we have found that immediately after applying the perturbation in NHPs, there is a significant drop in signal fidelity in area 5d. As the session progresses, the signal strength returns to its original level. We interpret these results as evidence that area 5d contains an adaptive internal estimate of the current state of the hand, which compensates for the delayed visual feedback. Currently, we are testing how human PPC responds to perturbations and how other types of perturbations (e.g., visuomotor rotations) affect PPC and other brain areas.
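The forward-model-plus-feedback idea described above is commonly formalized as a Kalman filter: predict the next state from the motor command, then correct the prediction with noisy visual feedback. The 1D sketch below is a generic illustration with assumed noise parameters, not the lab's analysis:

```python
import numpy as np

def kalman_step(x_est, P, u, z, q=0.01, r=0.25):
    """One cycle of state estimation (illustrative 1D Kalman filter).
    Predict: the forward model integrates the motor command u.
    Update: blend the prediction with observed cursor position z."""
    x_pred = x_est + u
    P_pred = P + q
    K = P_pred / (P_pred + r)        # Kalman gain: trust in the observation
    x_new = x_pred + K * (z - x_pred)
    P_new = (1.0 - K) * P_pred
    return x_new, P_new

rng = np.random.default_rng(0)
x_true, x_est, P = 0.0, 0.0, 1.0
for _ in range(50):
    u = 0.1                               # constant motor command
    x_true += u                           # true hand/cursor position
    z = x_true + rng.normal(scale=0.5)    # noisy visual feedback
    x_est, P = kalman_step(x_est, P, u, z)
```

Delaying `z` relative to the true state, as in the experiments above, forces the estimator to lean more heavily on its forward-model prediction, which is one interpretation of the adaptation observed in area 5d.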