Speech and Language Processing (3rd ed. draft)
Dan Jurafsky and James H. Martin

Here's our September 21, 2021 draft! This is just an update draft, fixing bugs and filling in various missing sections (more on transformers, including for MT; various updated algorithms, such as for dependency parsers; etc.). Expect another release later this fall with drafts of some of the 3 missing chapters.

We are really grateful to all of you for finding bugs and offering great suggestions!

Individual chapters are below; here is a single pdf of all the chapters in the Sep 21, 2021 draft of the book-so-far.

As always, typos and comments are very welcome (just email slp3edbugs@gmail.com and let us know the date on the draft)!
(Due to reorganizing, expect some missing LaTeX cross-references throughout the pdfs; no need to report those missing refs.)

Feel free to use the draft slides in your classes.

When will the whole book be finished?
Don't ask. We're still shooting for before the end of 2021 for the 3 remaining chapters (Intro, Contextual Embeddings, Semantic Parsing) plus assorted missing sections, but we'll see; and then the publishing process of course takes time.

And if you need last year's draft chapters, they are here.

Chapter | Slides | Relation to 2nd ed.
1: Introduction | | [Ch. 1 in 2nd ed.]
2: Regular Expressions, Text Normalization, Edit Distance | 2: Text Processing [pptx] [pdf]; 2: Edit Distance [pptx] [pdf] | [Ch. 2 and parts of Ch. 3 in 2nd ed.]
3: N-gram Language Models | 3: N-grams [pptx] [pdf] | [Ch. 4 in 2nd ed.]
4: Naive Bayes and Sentiment Classification | 4: Naive Bayes + Sentiment [pptx] [pdf] | [new in this edition]
5: Logistic Regression | 5: LR [pptx] [pdf] | [new in this edition]
6: Vector Semantics and Embeddings | 6: Vector Semantics [pptx] [pdf] | [new in this edition]
7: Neural Networks and Neural Language Models | 7: Neural Networks [pptx] [pdf] | [new in this edition]
8: Sequence Labeling for Parts of Speech and Named Entities | 8: POS/NER Intro only [pptx] [pdf] | [expanded from Ch. 5 in 2nd ed.]
9: Deep Learning Architectures for Sequence Processing | |
10: Machine Translation | | [newly written for this edition; earlier MT was Ch. 25 in 2nd ed.]
11: Transfer Learning with Contextual Embeddings and Pre-trained Language Models | | [new in this edition]

12: Constituency Grammars | | [Ch. 12 in 2nd ed.]
13: Constituency Parsing | | [expanded from Ch. 13 in 2nd ed.]
14: Dependency Parsing | | [new in this edition]

15: Logical Representations of Sentence Meaning | |
16: Computational Semantics and Semantic Parsing | |
17: Information Extraction | | [Ch. 22 in 2nd ed.]
18: Word Senses and WordNet | |
19: Semantic Role Labeling and Argument Structure | | [expanded from parts of Chs. 19 and 20 in 2nd ed.]
20: Lexicons for Sentiment, Affect, and Connotation | 20: Affect [pptx] [pdf] | [new in this edition]

21: Coreference Resolution | | [mostly newly written; some sections expanded from parts of Ch. 21 in 2nd ed.]
22: Discourse Coherence | | [mostly new for this edition]

23: Question Answering | | [mostly newly written; a few sections on classic algorithms expanded from parts of Ch. 23 in 2nd ed.]
24: Chatbots and Dialogue Systems | 24: Dialog [pptx] [pdf] | [mostly new; parts expanded from Ch. 24 in 2nd ed.]
25: Phonetics | | [Ch. 7 in 2nd ed.]
26: Automatic Speech Recognition and Text-to-Speech | | [mostly newly written; expanded from some parts of Chs. 8 and 9 in 2nd ed.]
Appendix Chapters (will be just on the web)
A: Hidden Markov Models | |
B: Spelling Correction and the Noisy Channel | |
C: Statistical Constituency Parsing | | [Ch. 14 in 2nd ed.]