Announcements

Sep 28, 2016: Exam times and Homework 1

The times for the midterm and final exam are AM, not PM as erroneously stated on the syllabus. The first homework assignment has been posted and is due on October 5.

Sep 22, 2016: Labs

Occasionally we will post links to "labs" which supplement the day's lecture. These labs feature code and output produced by the course staff to illustrate a concept. For instance, Lab 2 (under the Lectures tab) shows you how we generated the bias-variance decomposition example in lecture 2. Feel free to read through the lab to improve your understanding and to try your hand at recreating or modifying our examples.
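
For a rough idea of what a lab looks like, below is a minimal R sketch of a bias-variance style simulation. It is not the actual Lab 2 code: the true function, the noise level, and the choice of polynomial fits are illustrative assumptions.

    # Minimal sketch (not the actual Lab 2 code): estimate the squared bias and
    # variance of polynomial fits of increasing degree at a single test point x0.
    set.seed(1)
    f     <- function(x) sin(2 * x)   # "true" regression function (an assumption)
    x0    <- 0.5                      # test point at which we decompose the error
    n     <- 50                       # training set size
    sigma <- 0.3                      # noise standard deviation
    degrees <- 1:10
    nsim    <- 500

    preds <- matrix(NA, nrow = nsim, ncol = length(degrees))
    for (s in 1:nsim) {
      x <- runif(n, 0, 2)
      y <- f(x) + rnorm(n, sd = sigma)
      for (d in seq_along(degrees)) {
        fit <- lm(y ~ poly(x, degrees[d]))
        preds[s, d] <- predict(fit, newdata = data.frame(x = x0))
      }
    }

    bias2    <- (colMeans(preds) - f(x0))^2   # squared bias at x0
    variance <- apply(preds, 2, var)          # variance of the fitted value at x0
    test_mse <- bias2 + variance + sigma^2    # expected test MSE = bias^2 + variance + noise

    plot(degrees, test_mse, type = "b", ylim = c(0, max(test_mse)),
         xlab = "Polynomial degree (flexibility)", ylab = "Error at x0")
    lines(degrees, bias2, type = "b", col = "blue")    # bias^2 falls as flexibility grows
    lines(degrees, variance, type = "b", col = "red")  # variance rises as flexibility grows
    legend("top", legend = c("Expected test MSE", "Bias^2", "Variance"),
           col = c("black", "blue", "red"), lty = 1, bty = "n")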


Meeting time and recorded lectures

Stats 202 meets MWF 9:30-10:20 am in Skilling 80.

All lectures will be recorded on video by the Stanford Center for Professional Development and posted on their site.

Lecture slides will be posted on this site (see the Lectures link on the left).


Course description

Stats 202 is an introduction to Data Mining. By the end of the quarter, students will:

  • Understand the distinction between supervised and unsupervised learning and be able to identify appropriate tools to answer different research questions.
  • Become familiar with basic unsupervised procedures including clustering and principal components analysis.
  • Become familiar with the following regression and classification algorithms: linear regression, ridge regression, the lasso, logistic regression, linear discriminant analysis, K-nearest neighbors, splines, generalized additive models, tree-based methods, and support vector machines.
  • Gain a practical appreciation of the bias-variance tradeoff and apply model selection methods based on cross-validation and bootstrapping to a prediction challenge.
  • Analyze a real dataset of moderate size using R (a brief sketch of this kind of workflow follows this list).
  • Develop the computational skills for data wrangling, collaboration, and reproducible research.
  • Be exposed to other topics in machine learning, such as missing data, prediction using time series and relational data, non-linear dimensionality reduction techniques, web-based data visualizations, anomaly detection, and representation learning.
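
To give a flavor of the R workflows listed above, here is a minimal sketch combining a supervised fit with an unsupervised analysis. It uses the built-in mtcars and iris datasets purely for illustration; the course homework and Kaggle competition will use different data.

    # Supervised learning: linear regression of fuel economy on weight and horsepower.
    fit <- lm(mpg ~ wt + hp, data = mtcars)
    summary(fit)                       # coefficients, standard errors, R^2

    # Unsupervised learning: PCA and k-means clustering of the iris measurements.
    X   <- scale(iris[, 1:4])          # standardize the four numeric columns
    pca <- prcomp(X)                   # principal components analysis
    km  <- kmeans(X, centers = 3, nstart = 20)
    plot(pca$x[, 1:2], col = km$cluster,
         xlab = "PC1", ylab = "PC2")   # clusters displayed in the first two PCs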

Prerequisites

Introductory courses in statistics or probability (e.g., Stats 60), linear algebra (e.g., Math 51), and computer programming (e.g., CS 105).


Communication

The vast majority of questions about homework, the lectures, or the course should be asked on our Piazza forum, as others will benefit from the responses. You can join the Piazza forum using the link www.piazza.com/stanford/fall2016/stats202. We strongly encourage students to respond to one another's questions!

Questions from which others cannot benefit can be emailed to the staff mailing list stats202-aut1617-staff@lists.stanford.edu.

Personal staff email addresses should only be used for sensitive matters (e.g., concerns about specific course staff).


Staff and office hours

Consult this table for up-to-date office hour information. For online office hours, we provide persistent meeting links which will be active at the advertised office hour times. Upon clicking the link, you will have the option of joining the meeting by phone, browser, or BlueJeans app.

Role        Name              Office hours                               Location
Instructor  Guenther Walther  M, Th 10:30-11:30 am (or by appointment)   Sequoia 135
TA          Kelvin Guu        Th 5:30-7:30 pm                            Gates 256
TA          Jiyao Kou         F 11 am-1 pm                               Sequoia 233
TA          Evan Patterson    M, Tu 3-4 pm                               BlueJeans meeting
TA          Amir Sepehri      Tu 2:15-4:15 pm                            Littlefield 334
TA          Matteo Sesia      Tu 8:30-10:30 am                           Sequoia 207
TA          Qian Zhao         M 5:30-7:30 pm                             Sequoia 220

Textbook

The only textbook required is An Introduction to Statistical Learning with Applications in R by Gareth James, Daniela Witten, Trevor Hastie, and Robert Tibshirani (Springer, 1st ed., 2013). The book is available at the Stanford Bookstore and free online through the Stanford Libraries.

We may occasionally assign optional supplementary readings from The Elements of Statistical Learning by Hastie, Tibshirani, and Friedman (Springer, 2nd ed.).

In our lecture notes, we use the abbreviations ISL for An Introduction to Statistical Learning and ESL for The Elements of Statistical Learning.


Exams

(If you are an online SCPD student, please see the SCPD info page for remote exam instructions and timing.)

  • Midterm exam: Monday, October 31, 9:30-10:20 am (in our normal classroom).
  • Final exam: Tuesday, December 13, 8:30-11:30 am (in a room TBD).

If you cannot take the exams on these dates, you will need to take this class in a different quarter. There will be no alternative exam dates except for official university business, such as certain athletic commitments. If you do better on the final than on the midterm, the final supersedes the midterm.


Homework

There will be 7 graded homework assignments, due on Wednesdays at the start of class. An ungraded assignment (Homework 0) will help you install and become familiar with the tools used in this course. The homework assignments and staff solutions will be posted on this website and will be accessible by enrolled students (see the Homework link on the left).

After attempting homework problems on an individual basis, you may discuss a homework assignment with up to two classmates. However, you must write up your own solutions individually and explicitly indicate with whom (if anyone) you discussed the homework problems at the top of your homework solutions. In your solutions, please show your work and include all relevant code written. Please also keep in mind the university honor code.

This quarter, we will be using the Gradescope online submission and scoring system for all homework submission. Gradescope will send a Stats 202 enrollment notification to your Stanford email address. If you have not received such a notification by Thursday Sep. 29, please contact the course staff via the staff mailing list.

Your problem sets should be submitted as PDF or image files through Gradescope. Here are some tips for scanning and submitting through Gradescope.

Regrade requests should be submitted through Gradescope within one week of receiving your grade. Please read the posted solutions and review the relevant course material before sending a request, and specify (1) the part(s) of the homework you believe were graded incorrectly and (2) why you deserve additional credit. We will typically regrade the entire homework when any regrade is requested, and the resulting score may be higher or lower than the original.

Late homework will not be accepted, but the lowest homework score will be ignored.


Kaggle competition

An important part of the class will be an in-class prediction challenge hosted by Kaggle. This competition will allow you to apply the concepts learned in class and develop the computational skills to analyze data in a collaborative setting.

To learn more about the competition see the link on the left.


Grading

  • Homework: 35% (lowest score dropped).
  • Midterm: 20% (superseded by the final if your final score is higher; see the sketch after this list).
  • Final: 40%.
  • Kaggle competition: 5% (based on satisfactory participation).
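
The sketch below illustrates how these weights might combine in R. It assumes that "the final supersedes the midterm" means the final exam score replaces the midterm score in the weighted average when it is higher; that interpretation is our assumption here, so check with the course staff for the official rule.

    # Hypothetical illustration of the grading weights (not official policy).
    course_grade <- function(hw, midterm, final, kaggle) {
      # hw: vector of homework scores in [0, 1]; the lowest score is dropped.
      hw_avg <- mean(sort(hw)[-1])
      # Assumption: if the final score is higher, it replaces the midterm score.
      midterm_used <- max(midterm, final)
      0.35 * hw_avg + 0.20 * midterm_used + 0.40 * final + 0.05 * kaggle
    }

    # Example: a stronger final than midterm.
    course_grade(hw = c(0.90, 0.85, 0.70, 0.95, 0.90, 0.88, 0.92),
                 midterm = 0.75, final = 0.90, kaggle = 1)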

Tentative outline

Day        Topic                                      Chapters          Homework
Mon 9/26   Class logistics, HW 0                                        HW 0 out
Wed 9/28   Supervised and unsupervised learning       2                 HW 1 out
Fri 9/30   Principal components analysis              10.1, 10.2, 10.4  HW 0 due
Mon 10/03  Clustering                                 10.3, 10.5
Wed 10/05  Linear regression                          3.1-3.3           HW 1 due, HW 2 out
Fri 10/07  Linear regression                          3.3-3.6
Mon 10/10  Classification, logistic regression        4.1-4.3
Wed 10/12  Linear discriminant analysis               4.4-4.5           HW 2 due, HW 3 out
Fri 10/14  Classification lab                         4.6
Mon 10/17  Cross-validation                           5.1
Wed 10/19  The bootstrap                              5.2-5.3           HW 3 due, HW 4 out
Fri 10/21  Regularization                             6.1, 6.5
Mon 10/24  Shrinkage                                  6.2
Wed 10/26  Shrinkage lab                              6.6               HW 4 due
Fri 10/28  Dimension reduction                        6.3, 6.7
Mon 10/31  Midterm exam
Wed 11/02  Splines                                    7.1-7.4           HW 5 out
Fri 11/04  Smoothing splines, GAMs, local regression  7.5-7.7
Mon 11/07  Non-linear regression lab                  7.8
Wed 11/09  Decision trees                             8.1, 8.3.1-8.3.2  HW 5 due, HW 6 out
Fri 11/11  Bagging, random forests, boosting          8.2, 8.3.3-8.3.4
Mon 11/14  Support vector machines                    9.1-9.2
Wed 11/16  Support vector machines                    9.3-9.5           HW 6 due, HW 7 out
Fri 11/18  Support vector machines lab                9.6
Mon 11/21  Thanksgiving
Wed 11/23  Thanksgiving
Fri 11/25  Thanksgiving
Mon 11/28  Non-linear dimensionality reduction
Wed 11/30  Wavelets                                                     HW 7 due
Fri 12/02  Data scraping, data wrangling
Mon 12/05  Web visualizations
Wed 12/07  Final review                               All chapters      Kaggle deadline
Fri 12/09  Final review                               All chapters
Tue 12/13  Final exam