Seminar: Representations of Meaning

Course number: Linguist 236 / Psychology 236c
Meetings: Thu 2:15-5:05 pm, 100-101K
Email:
Discussion: Piazza discussion site
Instructor: Noah D. Goodman
Office hours: By appointment
Office: 420-356
Instructor: Christopher Potts
Office hours: By appointment
Office: 460-101


  Date | Topic | Lead | Reading/Reference
Apr 4
  1. Typed lambda calculi and possible worlds models
  Lead: c
  1. Lewis, David. 1970. General semantics. Synthese 22(1): 18-67.
  2. Katz, Fred M. and Katz, Jeffrey J. 1977. Is necessity the mother of intension? The Philosophical Review 86(1): 70-96.
  3. Partee, Barbara H. 1973. Some transformational extensions of Montague Grammar. Journal of Philosophical Logic 2(4): 509-534.
  4. Montague, Richard. 1973. The proper treatment of quantification in ordinary English. In Jaakko Hintikka, Julius Matthew Emil Moravcsik, and Patrick Suppes, eds., Approaches to Natural Language, 221-242. Dordrecht: D. Reidel.
  5. Barker, Chris and Shan, Chung-chieh. 2008. Donkey anaphora is in-scope binding. Semantics and Pragmatics 1(1): 1-46.
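For orientation before the readings: the compositional regime they share can be miniaturized by treating typed lambda terms as Python closures and collapsing possible-worlds intensions to one extensional model. A minimal sketch in the spirit of Montague's PTQ; the entities and lexicon below are invented for illustration, not drawn from the papers.

```python
# One-place predicates (type <e,t>) modeled as sets of entities.
student = {"alice", "bob"}
smokes  = {"alice", "bob", "carol"}

# Generalized quantifiers (type <<e,t>,<<e,t>,t>>) as curried functions.
def every(P):
    return lambda Q: all(x in Q for x in P)

def some(P):
    return lambda Q: any(x in Q for x in P)

# Function application mirrors the syntax tree:
# [[every student smokes]] = every(student)(smokes)
print(every(student)(smokes))    # True: every student smokes
print(some(student)({"dana"}))   # False: no student is dana
```

The point of the typing discipline is visible even at this scale: a quantified noun phrase is not an entity but a function over predicates, which is why it composes with the verb phrase rather than the other way around.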
Apr 11
  1. Natural logic and natural language semantics
  2. Varieties of situation semantics
  Lead: c
  1. MacCartney, Bill and Manning, Christopher D. 2009. An extended model of natural logic. In Proceedings of the Eighth International Conference on Computational Semantics, 140-156. Tilburg, The Netherlands: ACL.
  2. Kratzer, Angelika. 2007. Situations in natural language semantics. In Edward N. Zalta, ed., Stanford Encyclopedia of Philosophy. Stanford, CA: CSLI.
  3. Maienborn, Claudia. 2011. Event semantics. In Claudia Maienborn, Klaus von Heusinger, and Paul Portner, eds., Semantics: An International Handbook of Natural Language Meaning, 802–829. Berlin: Mouton de Gruyter.
  4. Chierchia, Gennaro and Turner, Raymond. 1988. Semantics and property theory. Linguistics and Philosophy 11(3): 261-302.
  5. Kratzer, Angelika. 2002. Facts: particulars or information units? Linguistics and Philosophy 25(5-6): 655-670.
Apr 18
  1. Grounded language understanding
  Lead: n
  1. Kwiatkowski, Tom, Zettlemoyer, Luke, Goldwater, Sharon, and Steedman, Mark. 2011. Lexical generalization in CCG grammar induction for semantic parsing. In Proceedings of the 2011 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning.
  2. Matuszek, Cynthia, Herbst, Evan, Zettlemoyer, Luke, and Fox, Dieter. 2012. Learning to parse natural language commands to a robot control system. In Proceedings of the 13th International Symposium on Experimental Robotics (ISER).
  3. Chen, David L. and Mooney, Raymond J. 2011. Learning to interpret natural language navigation instructions from observations. In Proceedings of the AAAI Conference on Artificial Intelligence (AAAI).
  4. Zettlemoyer, Luke S. and Collins, Michael. 2007. Online learning of relaxed CCG grammars for parsing to logical form. In Proceedings of the 2007 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning, 678-687. Prague: ACL.
  5. Kwiatkowski, Tom, Zettlemoyer, Luke, Goldwater, Sharon, and Steedman, Mark. 2010. Inducing probabilistic CCG grammars from logical form with higher-order unification. In Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP).
  6. Matuszek, Cynthia, FitzGerald, Nicholas, Zettlemoyer, Luke, Bo, Liefeng, and Fox, Dieter. 2012. A joint model of language and perception for grounded attribute learning. In Proceedings of the International Conference on Machine Learning (ICML).
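The semantic-parsing papers in this unit learn mappings from sentences to executable logical forms; a hand-written toy version of the target mapping (the grammar, commands, and form names here are invented, not from any of the systems) shows what the learned parsers are approximating.

```python
import re

# Pattern -> logical-form builders (all names illustrative).
RULES = [
    (re.compile(r"go to the (\w+)"),   lambda m: f"goto({m.group(1)})"),
    (re.compile(r"pick up the (\w+)"), lambda m: f"grasp({m.group(1)})"),
    (re.compile(r"put the (\w+) on the (\w+)"),
     lambda m: f"place({m.group(1)}, {m.group(2)})"),
]

def parse(command):
    """Map a command string to a logical form, or None if no rule applies."""
    for pattern, build in RULES:
        m = pattern.fullmatch(command)
        if m:
            return build(m)
    return None

print(parse("go to the kitchen"))         # goto(kitchen)
print(parse("put the cup on the table"))  # place(cup, table)
```

The research problem, of course, is that hand-written rules do not scale or generalize; the CCG induction papers replace this lookup with grammars learned from (sentence, logical form) or (sentence, world) pairs.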
Apr 25
  1. Grounded language understanding II
  Lead: n
  1. Tellex, Stefanie, Kollar, Thomas, Dickerson, Steven, Walter, Matthew R., Banerjee, Ashis Gopal, Teller, Seth, and Roy, Nicholas. 2011. Approaching the symbol grounding problem with probabilistic graphical models. AI Magazine 32(4): 64-76.
  2. Liang, Percy, Jordan, Michael, and Klein, Dan. 2011. Learning dependency-based compositional semantics. In Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies, 590-599. Portland, OR: ACL.
  3. Tellex, Stefanie, Thaker, Pratiksha, Deits, Robin, Simeonov, Dimitar, Kollar, Thomas, and Roy, Nicholas. 2012. Toward information theoretic human-robot dialog. In Robotics: Science and Systems.
  4. Chen, David L., Kim, Joohyun, and Mooney, Raymond J. 2010. Training a multilingual sportscaster: using perceptual context to learn language. Journal of Artificial Intelligence Research 37: 397-435.
  5. Siskind, Jeffrey Mark. 2001. Grounding the lexical semantics of verbs in visual perception using force dynamics and event logic. Journal of Artificial Intelligence Research 15: 31-90.
May 2
  1. Stochastic lambda calculus
  Lead: n
  1. Goodman, Noah D. 2012. Grounding lexical meaning in core cognition. Extract of a grant proposal submitted to ONR. (Please do not circulate.)
  2. Gerstenberg, Tobias and Goodman, Noah D. 2012. Ping Pong in Church: productive use of concepts in human probabilistic inference. In Proceedings of the Thirty-Fourth Annual Conference of the Cognitive Science Society.
  3. Goodman, Tenenbaum, and O'Donnell. Probabilistic Models of Cognition (using Church).
  4. Freer, Cameron, Roy, Dan, and Tenenbaum, Josh. 2012. Towards common-sense reasoning via conditional simulation: legacies of Turing in artificial intelligence. In Turing's Legacy (ASL Lecture Notes in Logic).
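Church embeds random primitives and conditioning directly in a Scheme-like lambda calculus. As a rough orientation for readers who have not seen it, its query/condition idiom can be emulated in Python with rejection sampling; the model, names, and numbers below are invented for illustration, not taken from the readings.

```python
import random

def flip(p=0.5):
    """Church-style weighted coin flip."""
    return random.random() < p

def rejection_query(model, query, condition, n=20000):
    """Estimate E[query(s) | condition(s)] by discarding rejected samples."""
    accepted = [query(s) for s in (model() for _ in range(n)) if condition(s)]
    return sum(accepted) / len(accepted)

def model():
    # Two independent fair coins.
    return {"a": flip(), "b": flip()}

random.seed(0)
# P(a | a or b) is analytically 2/3; the estimate should land near it.
est = rejection_query(model, lambda s: s["a"], lambda s: s["a"] or s["b"])
print(round(est, 2))
```

The stochastic-lambda-calculus point is that `model` is an ordinary lambda term whose evaluation is a random process, so conditioning is defined over program executions rather than over a fixed state space.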
May 9
  1. Distributional approaches to word meanings
  Lead: c
  1. Harris, Zellig. 1954. Distributional structure. Word 10(2-3): 146-162.
  2. Turney, Peter D. and Pantel, Patrick. 2010. From frequency to meaning: vector space models of semantics. Journal of Artificial Intelligence Research 37: 141-188.
  3. Collobert, Ronan and Weston, Jason. 2008. A unified architecture for natural language processing: deep neural networks with multitask learning. In Proceedings of the 25th International Conference on Machine Learning, 160-167. New York: ACM.
  4. Maas, Andrew L., Daly, Raymond E., Pham, Peter T., Huang, Dan, Ng, Andrew Y., and Potts, Christopher. 2011. Learning word vectors for sentiment analysis. Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics, Portland, OR: ACL.
  5. Saxe, Andrew M., McClelland, James L., and Ganguli, Surya. Learning hierarchical categories in deep neural networks.
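Harris's distributional hypothesis, words in similar contexts have similar meanings, is easy to demonstrate at toy scale before reading the survey papers. A count-based sketch with cosine similarity; the three-sentence corpus is made up for illustration.

```python
from collections import Counter
from math import sqrt

corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "stocks rose on the news",
]

def context_vector(word, window=2):
    """Count co-occurring tokens within a fixed window around `word`."""
    vec = Counter()
    for sent in corpus:
        toks = sent.split()
        for i, t in enumerate(toks):
            if t == word:
                for j in range(max(0, i - window), min(len(toks), i + window + 1)):
                    if j != i:
                        vec[toks[j]] += 1
    return vec

def cosine(u, v):
    dot = sum(u[k] * v[k] for k in u)
    nu = sqrt(sum(x * x for x in u.values()))
    nv = sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

# "cat" and "dog" share contexts; "stocks" mostly does not.
print(cosine(context_vector("cat"), context_vector("dog")))
print(cosine(context_vector("cat"), context_vector("stocks")))
```

Turney and Pantel's survey is essentially about what happens to this recipe at scale: weighting schemes instead of raw counts, dimensionality reduction, and the choice of context.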
May 16
  1. No class, in honor of the first Academy Awards
May 23
  1. Entailment in vector-space models
  2. Notes on (Socher's) RNN-based models of semantic composition
  Lead: n, c
  1. Mitchell, Jeff and Lapata, Mirella. 2010. Composition in distributional models of semantics. Cognitive Science 34(8): 1388-1429.
  2. Baroni, Marco, Bernardi, Raffaella, Do, Ngoc-Quynh, and Shan, Chung-chieh. 2012. Entailment above the word level in distributional semantics. In Proceedings of the 13th Conference of the European Chapter of the Association for Computational Linguistics, 23–32. Avignon, France: ACL.
  3. Socher, Richard, Huval, Brody, Manning, Christopher D., and Ng, Andrew Y. 2012. Semantic compositionality through recursive matrix-vector spaces. In Proceedings of the 2012 Conference on Empirical Methods in Natural Language Processing, 1201-1211. Stroudsburg, PA: ACL.
  4. Grefenstette, Edward, Sadrzadeh, Mehrnoosh, Clark, Stephen, Coecke, Bob, and Pulman, Stephen. 2011. Concrete sentence spaces for compositional distributional models of meaning. In Proceedings of the 9th International Conference on Computational Semantics, 125-134. Portland, OR: ACL.
  5. Socher, Richard, Pennington, Jeffrey, Huang, Eric H., Ng, Andrew Y., and Manning, Christopher D. 2011. Semi-supervised recursive autoencoders for predicting sentiment distributions. In Proceedings of the 2011 Conference on Empirical Methods in Natural Language Processing, 151-161. Edinburgh, UK: ACL.
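Two of the simplest composition functions compared by Mitchell and Lapata (2010), vector addition and pointwise multiplication, reduce to one line each. The vectors below are toy values chosen purely for illustration.

```python
# Toy word vectors (values invented for illustration).
adj  = [0.2, 0.8, 0.1]   # e.g. "red"
noun = [0.5, 0.4, 0.9]   # e.g. "car"

# Additive composition: p = u + v
additive = [a + n for a, n in zip(adj, noun)]

# Multiplicative composition: p = u * v (pointwise product)
multiplicative = [a * n for a, n in zip(adj, noun)]

print(additive)
print(multiplicative)
```

The RNN-based models in the readings replace these fixed functions with learned, syntax-driven ones, which is what makes questions about entailment between composed vectors tractable to study.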
May 30
  1. Discussion
  Lead: c, n

[Stanford S13 academic calendar]