Speaker: Tom Dean
Affiliation: Stanford, Wu Tsai Neurosciences Institute & Brown University, Department of Computer Science
Date: Tuesday, April 9, 2020
Title: Biological Blueprints for Human Inspired AI: Part II (SLIDES) (VIDEO)
Talking Points:
on the mapping between biological and artificial neural networks – focusing on what is lost and gained in translation
toward a complete embodied cognitive neural architecture for end-to-end training – what to include / what to leave out
core architectural components – basal ganglia (BG), hippocampus (HPC), prefrontal (PFC), motor and sensory cortex
anatomy and physiology of the basal ganglia, cortical and subcortical connections, action orchestration rather than selection
inhibitory and excitatory signaling pathways involving gap junctions, ion channels, transcription and neuromodulation
block diagram separates functional units and defines processing stages, recurrent connections and gated transmission
Hochreiter & Schmidhuber's long short-term memory (LSTM) models the striatum and working memory in frontal cortex
hippocampal formation, episodic memory and its expanding role in facilitating generalization and flexible relational memory
different network topologies and specialized cell types supporting pattern separation, completion and integration
HPC and cerebellum (CB) exhibit similar physiology and cytology, with the CB generally cast in the role of procedural memory
differentiable neural computer (DNC) model was roughly based on the hippocampus and supports similar indexing features (a content-based addressing sketch follows this list)
cortical columns, repeated general processing units specialized for particular functions, anterior-posterior dichotomy
multiple layers versus multiple processing levels, feedback, long-distance reciprocal connections, Fuster's hierarchy
component learning: CB supervised, BG reinforcement (extrinsic motivation), and PFC unsupervised (intrinsic motivation)
life-long learning, transfer learning, catastrophic interference, sample complexity, grounding, perception-action cycle
if we built a complete architecture from these components, how could we use it to implement a programmer's apprentice?
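To make the gated-transmission point above concrete, here is a minimal NumPy sketch of a single LSTM step in the spirit of Hochreiter & Schmidhuber's model; the gating of what is written to and read from the cell state is the rough analogue of striatal gating of working memory discussed in the talk. The dimensions, weights and variable names are purely illustrative and are not drawn from any of the talk materials.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def lstm_step(x, h_prev, c_prev, W, b):
        """One LSTM step: input, forget and output gates decide what enters,
        persists in, and is read out of the cell state -- a rough analogue of
        striatal gating of working-memory contents in frontal cortex."""
        z = W @ np.concatenate([x, h_prev]) + b      # all four gate pre-activations at once
        i, f, o, g = np.split(z, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o) # gates in [0, 1]
        g = np.tanh(g)                               # candidate cell update
        c = f * c_prev + i * g                       # gated write to the cell state
        h = o * np.tanh(c)                           # gated read-out
        return h, c

    # toy dimensions, purely illustrative
    x_dim, h_dim = 3, 4
    rng = np.random.default_rng(0)
    W = rng.normal(scale=0.1, size=(4 * h_dim, x_dim + h_dim))
    b = np.zeros(4 * h_dim)
    h, c = np.zeros(h_dim), np.zeros(h_dim)
    h, c = lstm_step(rng.normal(size=x_dim), h, c, W, b)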
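The indexing feature attributed to the DNC above is content-based addressing: a read key is compared against every row of an external memory matrix and the result is an attention-weighted read, loosely analogous to hippocampal pattern completion from a partial cue. The sketch below is a simplified, illustrative version of that mechanism, not the full DNC read head; slot counts, widths and the sharpness parameter beta are arbitrary.

    import numpy as np

    def content_read(memory, key, beta):
        """Content-based addressing: score each memory row against the read key
        by cosine similarity, sharpen with strength beta, and return the
        attention-weighted read vector along with the weights."""
        norms = np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8
        sims = memory @ key / norms                  # cosine similarity per slot
        weights = np.exp(beta * sims)
        weights /= weights.sum()                     # softmax over memory slots
        return weights @ memory, weights

    # toy memory with 5 slots of width 4, purely illustrative
    rng = np.random.default_rng(1)
    M = rng.normal(size=(5, 4))
    # a noisy version of slot 2 as the cue: the read weights concentrate on that slot
    read_vec, w = content_read(M, key=M[2] + 0.05 * rng.normal(size=4), beta=5.0)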
Resources: Below you'll find an assortment of course-related resources:
An introduction to the concepts, methods and technologies covered in this course, including basic neural network components and topics relating to cognitive and systems neuroscience;
The calendar listing the presentation day and topic of discussion for each of the student-guided discussion sections — five primary to start with and eight additional depending on interest;
A description of the first five discussion sections including suggestions for presentation and resources in the form of technical papers, tutorials and recorded lectures;
A short primer on how to run a student-guided discussion section, including suggestions for topic preparation, audiovisual materials and reading assignments;
A document that is part autobiography and part history of my involvement in artificial intelligence and computational neuroscience over a period spanning more than forty years;