Introduction: Part II

Speaker: Tom Dean
Affiliation: Stanford, Wu Tsai Neurosciences Institute & Brown University, Department of Computer Science
Date: Tuesday, April 9, 2020
Title: Biological Blueprints for Human Inspired AI: Part II (SLIDES) (VIDEO)

Talking Points:

  1. on the mapping between biological and artificial neural networks – focusing on what is lost and gained in translation

  2. toward a complete embodied cognitive neural architecture for end-to-end training – what to include / what to leave out

  3. core architectural components – basal ganglia (BG), hippocampus (HPC), prefrontal cortex (PFC), motor and sensory cortex

  4. anatomy and physiology of the basal ganglia, cortical and subcortical connections, action orchestration rather than action selection

  5. inhibitory and excitatory signaling pathways involving gap junctions, ion channels, transcription and neuromodulation

  6. block diagram separates functional units and defines processing stages, recurrent connections and gated transmission

  7. Hochreiter & Schmidhuber's long short-term memory (LSTM) models striatal gating of working memory in frontal cortex (see the gating sketch after this list)

  8. hippocampal formation, episodic memory and expanding role in facilitating generalization and flexible relational memory

  9. different network topologies and specialized cell types supporting pattern separation, completion and integration (a pattern-separation sketch appears after this list)

  10. HPC and cerebellum (CB) exhibit similar physiology and cytology, with the CB generally cast in the role of procedural memory

  11. differentiable neural computer (DNC) model was roughly based on the hippocampus and supports similar indexing features (see the content-based addressing sketch after this list)

  12. cortical columns, repeated general processing units specialized for particular functions, anterior-posterior dichotomy

  13. multiple layers versus multiple processing levels, feedback, long-distance reciprocal connections, Fuster's hierarchy

  14. component learning: CB supervised, BG reinforcement (extrinsic motivation), and PFC unsupervised (intrinsic motivation) (the three learning signals are sketched after this list)

  15. life-long learning, transfer learning, catastrophic interference, sample complexity, grounding, perception-action cycle

  16. if we built a complete architecture from these components, how could we use it to implement a programmer's apprentice?
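
The gating analogy in point 7 can be made concrete with a minimal LSTM cell. The sketch below is plain NumPy; the comments pairing each gate with a striatal/prefrontal working-memory role are an illustrative analogy, not a mapping taken from the slides.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LSTMCell:
    """Minimal LSTM cell (Hochreiter & Schmidhuber, 1997) in plain NumPy."""

    def __init__(self, n_in, n_hid, seed=0):
        rng = np.random.default_rng(seed)
        # Input, forget, output, and candidate weights stacked into one matrix.
        self.n_hid = n_hid
        self.W = rng.normal(0.0, 0.1, (4 * n_hid, n_in + n_hid))
        self.b = np.zeros(4 * n_hid)

    def step(self, x, h, c):
        z = self.W @ np.concatenate([x, h]) + self.b
        i, f, o, g = np.split(z, 4)
        i = sigmoid(i)        # input gate: what gets written into memory
                              # (by analogy, a striatal "Go"/update signal)
        f = sigmoid(f)        # forget gate: what maintained content is cleared
        o = sigmoid(o)        # output gate: what is read out to downstream areas
        g = np.tanh(g)        # candidate content to store
        c = f * c + i * g     # cell state ~ actively maintained working memory
        h = o * np.tanh(c)    # gated readout
        return h, c

# Usage: run a short random sequence through the cell.
cell = LSTMCell(n_in=8, n_hid=16)
h = c = np.zeros(16)
for x in np.random.default_rng(1).normal(size=(5, 8)):
    h, c = cell.step(x, h, c)
```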
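
The pattern separation named in point 9 can be illustrated with a toy dentate-gyrus-style expansion: project inputs into a much larger code and keep only the most active units (k-winners-take-all). The dimensions and sparsity level below are arbitrary choices for the demo, not parameters from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
expansion = rng.normal(size=(200, 20))      # 20-d input -> 200-d expanded code

def separate(x, k=10):
    """Keep only the k most active units of the expanded code (k-WTA)."""
    code = expansion @ x
    sparse = np.zeros_like(code)
    top = np.argsort(code)[-k:]
    sparse[top] = code[top]
    return sparse

def overlap(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8)

x1 = rng.normal(size=20)
x2 = x1 + 0.5 * rng.normal(size=20)         # a noisy variant of x1

print(overlap(x1, x2))                      # overlap in input space
print(overlap(separate(x1), separate(x2)))  # typically much lower after separation
```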
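
For point 11, the "indexing" feature of the differentiable neural computer is its content-based addressing: a key is compared against every memory row and the read is a sharpened, similarity-weighted mixture, which behaves like soft pattern completion. The sketch below keeps only that mechanism and omits the DNC's usage and temporal-link machinery.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def content_read(memory, key, beta=5.0):
    """Similarity-weighted read over memory rows (cosine similarity, sharpened by beta)."""
    norms = np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8
    weights = softmax(beta * (memory @ key) / norms)
    return weights @ memory, weights

# A partial cue retrieves (a soft blend dominated by) the closest stored pattern.
memory = np.array([[1.0, 0.0, 1.0, 0.0],
                   [0.0, 1.0, 0.0, 1.0],
                   [1.0, 1.0, 0.0, 0.0]])
cue = np.array([0.9, 0.0, 0.8, 0.0])        # noisy, partial version of row 0
read_vector, weights = content_read(memory, cue)
```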
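
Point 14's division of labor can be summarized as three different learning signals applied to the same toy linear model: an explicit error for the cerebellum, a reward-prediction error for the basal ganglia, and an unsupervised Hebbian rule for cortex. The mapping of rule to region follows the talking point; the particular update equations (delta rule, TD-style error, Oja's rule) are standard textbook choices used here only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.1, 4)        # shared toy linear model
x = rng.normal(size=4)             # an input pattern
lr = 0.1

# 1. Supervised (cerebellum-style): an explicit teaching signal, delta rule.
target = 1.0
error = target - w @ x
w_supervised = w + lr * error * x

# 2. Reinforcement (basal ganglia-style): a scalar reward drives learning
#    through a reward-prediction error (a bare-bones TD-like signal,
#    analogous to phasic dopamine).
reward = 1.0
rpe = reward - w @ x
w_reinforcement = w + lr * rpe * x

# 3. Unsupervised (cortex-style): no teacher; Hebbian learning with decay
#    (Oja's rule), which extracts structure from the inputs alone.
y = w @ x
w_unsupervised = w + lr * y * (x - y * w)
```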

Resources: Below you'll find an assortment of course-related resources: