I am working on literally reverse-engineering the brain, beginning with the auditory pathway. I will show real-time demonstrations (movies) of the various representations of speech and music that are computed in the cochlea, cochlear nucleus, superior olive, and inferior colliculus, synchronized with the input sounds. I will also demonstrate the world's first real-time high-resolution 240-tap, 10-octave, 44 kHz-sampling cochlear model, implemented on a multi-FPGA board in a PC. In addition, I will demonstrate the work of an Interval colleague, Dr. John Woodfill, on real-time high-resolution stereo vision.
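To make the "240-tap, 10-octave" figures concrete, here is a minimal sketch of a cascade-style cochlear filterbank of the kind the talk describes: 240 resonant second-order stages with log-spaced center frequencies spanning 10 octaves, run at 44.1 kHz, with each stage's output serving as one cochlear channel. This is an illustrative assumption, not the actual FPGA implementation; the top frequency (`F_MAX`), the resonance `Q`, and the biquad form are all choices made here for the sketch.

```python
import numpy as np

FS = 44100        # sample rate in Hz (the talk cites 44 kHz sampling)
N_TAPS = 240      # number of cochlear channels, per the talk
N_OCTAVES = 10    # frequency span of the model, per the talk
F_MAX = 16000.0   # assumed top center frequency (not stated in the talk)

def center_frequencies(n_taps=N_TAPS, n_octaves=N_OCTAVES, f_max=F_MAX):
    """Log-spaced center frequencies, high to low, like the basilar membrane."""
    octaves = np.linspace(0.0, n_octaves, n_taps)
    return f_max * 2.0 ** (-octaves)

def biquad(x, b, a):
    """Direct-form II second-order filter; a must be normalized (a[0] == 1)."""
    y = np.zeros_like(x)
    w1 = w2 = 0.0
    for n in range(len(x)):
        w0 = x[n] - a[1] * w1 - a[2] * w2
        y[n] = b[0] * w0 + b[1] * w1 + b[2] * w2
        w2, w1 = w1, w0
    return y

def cochlea_cascade(x, fs=FS, q=4.0):
    """Filter x through a cascade of resonant low-pass stages.

    The signal passes through every stage in turn, so each tap sees the
    accumulated filtering of all higher-frequency stages before it --
    a rough analogue of the traveling wave on the basilar membrane.
    Returns a list of N_TAPS channel outputs."""
    channels = []
    y = np.asarray(x, dtype=float)
    for fc in center_frequencies():
        w0 = 2.0 * np.pi * fc / fs
        alpha = np.sin(w0) / (2.0 * q)
        cw = np.cos(w0)
        # resonant low-pass biquad coefficients (RBJ cookbook form)
        b = np.array([(1 - cw) / 2, 1 - cw, (1 - cw) / 2])
        a = np.array([1 + alpha, -2 * cw, 1 - alpha])
        b, a = b / a[0], a / a[0]
        y = biquad(y, b, a)
        channels.append(y)
    return channels
```

A hardware version would pipeline these stages across the FPGA fabric so all 240 run every sample period; the point of the sketch is only the channel layout and the cascade topology.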
About the speaker:
Lloyd Watts holds a B.Sc. in Engineering Physics from Queen's University, an M.Sc. in Electrical Engineering from Simon Fraser University, and a Ph.D. in Electrical Engineering from the California Institute of Technology, where he studied with Professor Carver Mead. He has worked at Microtel Pacific Research in Burnaby, B.C., Synaptics in San Jose, and Arithmos in Santa Clara, and is currently employed at Interval Research Corporation in Palo Alto. His research has concentrated on understanding the computations of the human auditory pathway, and on implementing and visualizing those computations in real time in the least expensive medium he can find that will get the job done. For more information, see his website.