Grants and Contributions:

Title:
Interdisciplinary studies of spatial motion estimation and motor planning
Agreement Number:
RGPIN
Agreement Value:
$125,000.00
Agreement Date:
May 10, 2017 -
Organization:
Natural Sciences and Engineering Research Council of Canada
Location:
Quebec, CA
Reference Number:
GC-2017-Q1-02635
Agreement Type:
Grant
Report Type:
Grants and Contributions
Additional Information:

Grant or Award spanning more than one fiscal year. (2017-2018 to 2022-2023)

Recipient's Legal Name:
Green, Andrea (Université de Montréal)
Program:
Discovery Grants Program - Individual
Program Purpose:

Whether we’re running to catch a ball or turning to reach for a coffee cup, our ability to interact with the environment depends critically on knowing our motion and orientation in space. As we move, the central nervous system (CNS) combines information from multiple sensory sources (e.g., vestibular, somatosensory, visual) to construct estimates of our motion that we use to control posture, navigate, perform voluntary actions, and maintain clear visual perception. This poses computational challenges because individual sensors often provide ambiguous motion information and different tasks require different motion representations. These representations, in turn, contribute to behavior through a range of different mechanisms. The goal of my research program is to study how self-motion estimates are computed and how they are used in the planning and control of motor behavior.

Our interactions with the environment (e.g., reaching to an object) are often executed with the body in motion. However, we still know surprisingly little about how self-motion estimates contribute to voluntary actions. Similarly, how we compute the types of body- and world-centered motion representations used for tasks such as reaching, postural control and navigation remains poorly understood. In this application, I am requesting the renewal of my NSERC Discovery Grant to continue ongoing human behavioral and computational modeling studies aimed at addressing these questions. Project 1 explores the mechanisms by which vestibular signals contribute to reach planning and execution. One set of experiments addresses how the CNS compensates for both the spatial displacement of the limb and the additional forces imposed on it by unexpected body motion. It tests the hypothesis that the processing of vestibular signals for online reach execution takes into account knowledge of limb biomechanics. A second set of experiments studies how the processing of vestibular signals for reaching is influenced by objects in the environment other than the reach goal, which may become obstacles and influence the reach path we choose. Project 2 develops computational models to explore how the CNS integrates sensory signals to create the types of motion estimates required for behaviors such as reaching, postural control and navigation. It builds on our existing theoretical framework for how brainstem-cerebellar circuits compute self-motion estimates, incorporating recent physiological findings and exploring hypotheses for how these circuits create novel body- and world-centered motion representations. The aim is to generate precise predictions for neural response properties that can be tested experimentally.
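
To give a concrete sense of the kind of multisensory integration Project 2 refers to, the sketch below shows a standard textbook model of reliability-weighted (maximum-likelihood) cue combination. It is an illustrative example only, not the grant's brainstem-cerebellar framework; the function name (fuse_cues) and all cue values and noise levels are hypothetical.

```python
# Minimal sketch of reliability-weighted cue combination, a standard model of
# multisensory fusion. Illustrative only; NOT the grant's actual model.
# All numbers below (cue readings and noise levels) are hypothetical.

import numpy as np


def fuse_cues(estimates, sigmas):
    """Combine independent, unbiased cues by inverse-variance weighting.

    estimates : per-cue self-motion estimates (e.g., head velocity in deg/s)
    sigmas    : per-cue noise standard deviations (same units)
    Returns the fused estimate and its standard deviation.
    """
    estimates = np.asarray(estimates, dtype=float)
    weights = 1.0 / np.asarray(sigmas, dtype=float) ** 2  # reliability = 1/variance
    fused = np.sum(weights * estimates) / np.sum(weights)
    fused_sigma = np.sqrt(1.0 / np.sum(weights))  # fused estimate is more reliable than any single cue
    return fused, fused_sigma


if __name__ == "__main__":
    # Hypothetical single-trial readings of head rotation velocity (deg/s):
    # (value, noise sd) for vestibular, visual (optic flow), and somatosensory cues.
    cues = {"vestibular": (12.0, 2.0), "visual": (9.0, 4.0), "somatosensory": (11.0, 6.0)}
    values, noise = zip(*cues.values())
    estimate, sd = fuse_cues(values, noise)
    print(f"Fused head-velocity estimate: {estimate:.2f} deg/s (sd {sd:.2f})")
```

In this simple scheme, noisier cues receive proportionally less weight, so the fused estimate leans toward the most reliable sensor while achieving lower variance than any single cue; the modeling proposed in Project 2 concerns how such estimates might actually be computed and transformed by brainstem-cerebellar circuits, which this generic sketch does not attempt to capture.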

Ultimately, this work will provide new insights into the brain mechanisms that allow us to interact with our environment as we move. It will have applications in the treatment of balance and spatial disorientation problems, as well as in robotics, neural prosthetics, and space travel.