Yan Gai, Ph.D.
Assistant Professor of Biomedical Engineering
Saint Louis University
Training & Education
2010-2014, Postdoc, University of Wisconsin-Madison
2007-2010, Postdoc, New York University
2007, Ph.D. Bioengineering, Syracuse University
2002, M.E. Electrical Engineering, South China University of Technology
1999, B.S. Electrical Engineering, South China University of Technology
Our research combines behavioral, electrophysiological, and computational approaches to study functions and mechanisms of the mammalian auditory pathways in speech perception and sound localization.
Our first project involves simulating cochlear-implant hearing with a noise-vocoding technique. We design novel speech-processing algorithms and present the processed speech to human listeners. By varying the algorithms, we aim to identify the sound features that are important for a given task and to propose better processing strategies for the hearing-prosthetics industry.
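The noise-vocoding technique described above can be sketched in a few lines: the speech signal is split into frequency bands, each band's temporal envelope is extracted, and the envelopes modulate band-limited noise carriers that are then summed. The channel count, frequency edges, and filter settings below are illustrative assumptions, not the lab's actual parameters.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def noise_vocode(x, fs, n_channels=8, f_lo=100.0, f_hi=7000.0):
    """Minimal noise-vocoder sketch for simulating cochlear-implant hearing.

    Band-pass the input into n_channels, extract each band's envelope,
    use the envelope to modulate band-limited noise, and sum the channels.
    All parameter choices here are illustrative defaults.
    """
    # Logarithmically spaced channel edges (roughly cochlea-like spacing)
    edges = np.logspace(np.log10(f_lo), np.log10(f_hi), n_channels + 1)
    rng = np.random.default_rng(0)
    out = np.zeros(len(x), dtype=float)
    for lo, hi in zip(edges[:-1], edges[1:]):
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        band = sosfiltfilt(sos, x)           # analysis band
        env = np.abs(hilbert(band))          # temporal envelope
        noise = rng.standard_normal(len(x))
        carrier = sosfiltfilt(sos, noise)    # band-limited noise carrier
        out += env * carrier
    # Scale the output to match the input's RMS level
    out *= np.sqrt(np.mean(x**2) / np.mean(out**2))
    return out
```

Because only the slowly varying envelopes survive, the vocoded speech retains intelligibility cues while discarding fine spectral structure, which is what makes it a useful acoustic model of cochlear-implant hearing for normal-hearing listeners.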
We record brain signals with electroencephalography (EEG) while subjects actively perform the tasks or passively learn. The goal is to identify the brain signals that correspond to the subject's performance. We then examine single-neuron responses to the same sounds at subcortical levels to see whether the psychophysical behavior can be explained.
We also construct a complex-systems model that includes the auditory pathways from the inner ear all the way to the midbrain. Certain manipulations are easy to implement in a model but not in physiological or behavioral experiments. Together, these approaches aim to answer the following questions:
- What features of sound are important for a given auditory task?
- What neural mechanisms carry out the task?
- Are some of the features overlooked by hearing-aid or cochlear-implant strategies?
Dr. Gai teaches two junior-level courses in the areas of medical devices and signals, and one graduate-level course in neuroengineering and imaging.