Brain to World: Interacting with the Real World
What principles does the human brain use to interact with a complex and dynamic world? For instance, how do we catch an object with our hands, or quickly perceive the source of a sound? Can we understand these principles well enough to reproduce them in a computer simulation? In a robot?
To answer these questions, Canada Research Chair Dinesh Pai is developing computational models to investigate multisensory perception and the control of movement in biological and engineered systems. His models encompass both the external physical world we perceive and interact with, and the internal systems of our bodies concerned with movement and sensing. The models are designed for fast response, exploiting precomputation, learning, and memory. They are also useful in robotics and in interactive virtual environments with realistic graphics, sounds, and haptics (touch).
As part of his research, Pai develops algorithms for large-scale simulations of the musculoskeletal system, including feedback from sense organs and contact with external objects. These simulations will underpin models that help develop new treatments for neuromuscular conditions such as spinal cord injury and stroke.
Pai's research is producing more realistic computer animations of human movement, better human-computer interfaces, and better robots. Its influence on biomedical computing should not be underestimated, given its important role in building comprehensive models from medical imaging and efficient simulation.