At birth, a child’s sensory experience is very limited. Throughout development, however, children’s ability to perceive the world around them grows at an incredible rate, allowing them to learn a host of new skills. Just how this perceptual development takes place is one area of research in our lab.
One perceptual ability that is learned throughout development is perceptual binding: the ability to take two or more pieces of sensory information and integrate them into a single percept. In the visual realm, face perception is a common example. When you see a face, you don’t see two eyes, two ears, a nose, and a mouth; you simply ‘see’ the face as a whole. We are not born knowing that these elements make up a face, though, so how do we come to know this? This leads to our first developmental research question:
- What is the developmental trajectory of perceptual binding, and how does the developing brain ‘learn’ to integrate sensory inputs both within sensory modalities (e.g. vision or audition) and across sensory modalities (e.g. audiovisual)?
One possible mechanism that our lab is currently investigating is statistical learning. Even though we may not be consciously aware of it, our sensory systems are continuously monitoring our environment and building a sense of how things usually are. Sensory systems can then use this information to predict what the environment will be like in the future, allowing us to perceive the world more efficiently. This process is called statistical learning, and it is one of the most powerful forms of learning during development. Our lab’s research questions concerning statistical perceptual learning include:
- What pieces of sensory information drive the ability to learn how to associate auditory and visual information?
- Does the capacity for statistical learning change throughout development?
- Can we harness the power of statistical learning to improve an individual’s ability to perceive the world?
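The core idea behind statistical learning can be sketched computationally. In this toy example, an observer simply tallies how often auditory and visual events co-occur and then uses those tallies to predict the most likely visual event given a sound. This is only an illustrative sketch of the general principle; the stimuli, pairings, and function names here are invented and do not describe the lab’s actual experiments or models.

```python
from collections import Counter, defaultdict

def learn_cooccurrence(stream):
    """Tally how often each sound co-occurs with each sight."""
    counts = defaultdict(Counter)
    for sound, sight in stream:
        counts[sound][sight] += 1
    return counts

def predict_sight(counts, sound):
    """Predict the visual event most frequently paired with this sound."""
    return counts[sound].most_common(1)[0][0]

# A sensory stream in which "bark" reliably co-occurs with "dog"
# and "meow" with "cat" (hypothetical data for illustration).
stream = ([("bark", "dog")] * 8 + [("bark", "cat")] * 2
          + [("meow", "cat")] * 9 + [("meow", "dog")] * 1)

counts = learn_cooccurrence(stream)
print(predict_sight(counts, "bark"))  # -> dog
print(predict_sight(counts, "meow"))  # -> cat
```

After exposure to the stream, the observer has never been told the pairings explicitly, yet the accumulated co-occurrence statistics alone are enough to associate each sound with its usual visual partner, which is the essence of statistical learning.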
For peer-reviewed articles related to this line of research, see:
Lowe, M. X., Stevenson, R. A., Wilson, K. E., Ouslis, N. E., Barense, M. D., Cant, J. S., & Ferber, S. (In Press). Sensory Processing Patterns Predict the Integration of Ensemble Statistics with Items Held in Visual Working Memory. Journal of Experimental Psychology: Human Perception and Performance.
Altieri, N. A., Stevenson, R. A., Wallace, M. T., & Wenger, M. J. (2015). Learning to associate auditory and visual stimuli: Behavioral and neural mechanisms. Brain Topography, 28(3), 479-493.
Stevenson, R. A., Siemann, J. K., Schneider, B. C., Eberly, H. E., Woynaroski, T. G., Camarata, S. M., & Wallace, M. T. (2014). Arrested development of audiovisual speech perception in autism spectrum disorders. Journal of Autism and Developmental Disorders, 44(6), 1470-1477.
Schlesinger, J., Stevenson, R. A., & Wallace, M. T. (2014). Effects of multisensory training on pitch perception of a pulse oximeter. Anesthesia & Analgesia, 118(6), 1249-1253.
Stevenson, R. A., Wilson, M. M., Powers, A. R., & Wallace, M. T. (2013). The effects of visual training on multisensory temporal processing. Experimental Brain Research, 225(4), 479-489.