CS294S: Building the World’s Best Virtual Assistant for Mobile Devices and the Internet of Things.

CS294S is a research project course to explore how consumers will interact with software in the future. While Alexa, Siri, and Google Now can only handle simple commands today, the virtual assistant of the future will handle sentences such as “open the blinds when the alarm goes off”, “find a tennis partner in the afternoon”, and “remind me to call mom when I finish playing this VR game”. Ultimately, the goal is to develop technology that lets end-users “program” using natural language.

Best project

Speaker Recognition (Quinlan Jung, Michael Xing, Prasad Kawthekar, Junwon Park)

Understanding who is speaking is a crucial part of a multi-user virtual assistant. This project implements a multi-user speech interface that recognizes the speaker and maps each command to that speaker's virtual assistant. Applications include customization, parental controls, and meeting transcription, as well as the ability for a VA to sit passively in a conversation and react when called.
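
A common approach to this problem (not necessarily the one the team used) is to compare a voice embedding of the utterance against embeddings of enrolled users by cosine similarity, and route the command to the closest match above a threshold. A minimal sketch, with all names and the threshold value hypothetical:

```python
import math

def cosine(a, b):
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def identify(embedding, enrolled, threshold=0.7):
    # enrolled: {user_name: reference_embedding}.
    # Returns the closest enrolled user, or None if no one is similar enough.
    best = max(enrolled, key=lambda u: cosine(embedding, enrolled[u]))
    return best if cosine(embedding, enrolled[best]) >= threshold else None
```

In practice the embeddings would come from a trained speaker-verification model; the threshold trades off false accepts against false rejects.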

Best video

Data Visualization for Virtual Assistants (Michael Fischer)

Voice, the primary interaction modality of commercial VAs such as Echo and Google Home, is a great form of input, but not the best output modality for all answers. This project explores how a virtual assistant can produce semantically rich data visualizations on a large screen.

Virtual Assistant Infrastructure

SNER: Specific (Snorkel) Named Entity Recognition for Almond (Brian Higgins)

Named Entity Recognition is a well-studied problem in the NLP literature, but the commonly used 5 classes (Person, Organization, Location, Misc, Other) are not sufficient for virtual assistants. This project explores the quick creation of a dataset to train a state-of-the-art recognizer for a new class of entities.
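
The Snorkel approach replaces hand-labeled data with heuristic labeling functions whose votes are combined into weak labels. A toy sketch of that idea for a hypothetical "device" entity class (the class, gazetteer, and heuristics below are illustrative, not the project's actual code):

```python
ABSTAIN, DEVICE = -1, 1

# Small seed gazetteer of known device names (illustrative).
KNOWN_DEVICES = {"thermostat", "security camera", "smart lock"}

def lf_gazetteer(span, sentence):
    # Vote DEVICE if the candidate span appears in the seed gazetteer.
    return DEVICE if span.lower() in KNOWN_DEVICES else ABSTAIN

def lf_control_verb(span, sentence):
    # Vote DEVICE if the span is the object of a device-control verb.
    verbs = ("turn on", "turn off", "unlock", "set")
    s = sentence.lower()
    return DEVICE if any(f"{v} the {span.lower()}" in s for v in verbs) else ABSTAIN

LABELING_FUNCTIONS = [lf_gazetteer, lf_control_verb]

def weak_label(span, sentence):
    # Majority vote over the labeling functions that do not abstain.
    votes = [lf(span, sentence) for lf in LABELING_FUNCTIONS]
    votes = [v for v in votes if v != ABSTAIN]
    return max(set(votes), key=votes.count) if votes else ABSTAIN
```

The weak labels produced this way can then train a conventional sequence tagger, which is the appeal of the approach: new entity classes without hand annotation.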

Mango: Location-Aware Assistant (Derin Dutz)

Location is a central concept in a mobile virtual assistant. As people move about, the virtual assistant moves with them, knows where they are, and reacts accordingly. This project explores the infrastructure and APIs to provide continuous tracking and support numerous use cases.

M-Almond: Automating Multi-Party Tasks with End-User Programmable Virtual Assistants (Rakesh Ramesh, Giovanni Campagna, Silei Xu, Michael Fischer)

How does a doctor get access to a patient's IoT information? How can you monitor your parents' security camera while they are away? M-Almond lets users specify these tasks in natural language and automatically asks for permission, all without divulging personal information to a third party.

Domain Specific Assistants

ABM (Almond Bike Market): A Marketplace Virtual Assistant Augmented by Collaborative Information Sharing (Luke Chen, Silei Xu)

Every Stanford student has a checklist of tasks for their first days on campus, and high on that checklist is buying a bike. ABM improves the experience of buying and selling bikes with a marketplace that can be queried in natural language, and the ability to connect buyers and sellers through a virtual assistant.

Annie: a Social Virtual Assistant for Shooting (Alex Leishman, Megan Wilson)

Can a virtual assistant help with the sport of shooting? This project explores a virtual assistant that uses computer vision to let people track their shooting performance at ranges with regulation targets, and share it with friends.

VIRA: A Virtual Assistant for Gamers (Ikechi Akujobi, Matthew Chen, Chris Salguero)

Gaming, and specifically gaming in Virtual Reality, is all about immersion. Can we use a voice virtual assistant to help us with tasks, such as searching for walkthroughs online, that would otherwise interrupt the immersion?

NutriCoach: Tracking Nutrition with A Virtual Assistant (Katy Shi, Trina Sarkar, Ana Caro Mexia, Shawn Fenerin)

This project helps users manage their diet and find foods that satisfy their dietary restrictions. It provides a natural language interface to log meals, and it automatically learns recommendations for upcoming meals based on calorie and protein budgets as well as user preferences.
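
As a toy illustration of budget-based recommendation (hypothetical data and logic, not NutriCoach's actual code), the assistant might filter meals by the remaining calorie budget and prefer the highest-protein option:

```python
def recommend(meals, calories_left):
    # Keep only meals that fit the remaining calorie budget,
    # then pick the one with the most protein (None if nothing fits).
    candidates = [m for m in meals if m["calories"] <= calories_left]
    return max(candidates, key=lambda m: m["protein"], default=None)

meals = [
    {"name": "grilled chicken salad", "calories": 450, "protein": 38},
    {"name": "veggie burrito", "calories": 700, "protein": 22},
    {"name": "greek yogurt", "calories": 150, "protein": 15},
]
```

A real recommender would also weigh user preferences and meal history, as the project description suggests.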

Milo: Assistant for Anxiety (Clay Jones, Kevin Rakestraw, Andrew Declerck)

Many people, including students, are affected by chronic anxiety and anxiety attacks. This project helps them track and visualize their behavior over time, recorded from a wearable device (Fitbit) and through answers to questions. We perform an exploratory study in which one user is followed for 45 days, and report on the behavioral correlations we observe.

Virtual Assistants for Patients and Doctors (Jason Liu, Stephanie Palocz, Albert Chu)

More than 40% of Americans suffer from a chronic disease, and about 1 in 12 suffers from asthma. Managing these diseases requires constant monitoring and consistent compliance with medication. This project explores the use of a virtual assistant based on M-Almond to let the doctor and the patient share information automatically, both data collected from IoT devices and manual input from the patient.

Course Design

A Project Based Course.

CS294S is a project course designed to give students their first research experience. The development of future intelligent systems requires expertise in diverse areas, such as AI, HCI, programming languages, distributed systems, networking, and security. Students interested in any of the above topics are invited to attend. This class is intended for undergraduate and graduate students who have taken at least two Computer Science courses.

Students, in groups of 1 to 3, can propose their own projects or choose among suggested topics. Students have access to the open-source Almond virtual assistant infrastructure, which can be used to prototype a natural language interface in just one day.

Students are required to come to campus for this course, as in-class participation and after-class project meetings are required. Students can take this course multiple times for credit. CS 294S can be taken to fulfill the CS 194 senior project requirement. Students can sign up for CS 294W if they wish to fulfill their writing requirement as well.



Grading

  • Class Participation: 15%
  • Homework 1: 5%
  • Project: 80%


Project Proposal Guidelines

10 minutes + 5 minutes for questions. Your proposal should include:

  • What is the motivation of the project?
  • What is the state of the art?
  • What will you do?
  • What do you think you will learn at the end?
  • What will be your demo?
  • What will each of you do at each step? (Detailed schedule)

Final Discussion Guidelines

15 minutes + 3 minutes for questions. Your presentation should include:

  • What was the project at the beginning? (Quick refresher for the class)
  • What did you do?
  • What are your results?
  • What did you learn?
  • What would you do if you had more time? (Future work)

Final Report Guidelines

There is no page limit (minimum or maximum), but CS294W students must include a substantial written component. The final report is due June 13, electronically, to cs294s-spr1617-staff@lists.stanford.edu. Your report should include:

  • Introduction and motivation
  • Design and implementation
  • Results
  • What you learned
  • Related work
  • Future work
  • Conclusion

Research Areas & Project Topics

Here are some suggested project topics:

Home automation.

What would the conversation be like with a programmable virtual assistant? For example, “please make sure the heat is on by the time I come back from work every day”, or “have the TV play every Warriors game whenever I am home”.
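
Sentences like these can be viewed as trigger-condition-action rules. A minimal sketch of the rule engine they might compile to (all event, state, and action names here are hypothetical, not Almond's actual API):

```python
def fire(rules, event, state):
    # Return the actions of every rule whose trigger matches this event
    # and whose condition holds in the current state.
    return [action for trigger, condition, action in rules
            if trigger == event and condition(state)]

rules = [
    # "make sure the heat is on by the time I come back from work every day"
    ("leaving_work", lambda s: not s["heat_on"], "thermostat.turn_on"),
    # "have the TV play every Warriors game whenever I am home"
    ("warriors_game_start", lambda s: s["user_home"], "tv.play_game"),
]
```

The research question is then how to translate the natural-language sentence into such a rule reliably, and how to confirm the user's intent before it runs.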

Mobile virtual assistants

What if our virtual assistant is a home robot, a drone, or a bike? Can it go and get the delivery from Amazon when it arrives? Can it pick up laundry or pizza? Can it help make sure a kid is safe going to school?

Asthma patient care

A patient’s virtual assistant can help manage chronic diseases such as asthma. For example, an asthma patient should not run when the count for a certain kind of pollen is high, and the doctor should be informed if the reading on the patient's flow meter falls below a threshold.
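
Both examples are threshold rules the assistant could evaluate continuously. A minimal sketch, assuming hypothetical sensor readings and threshold values (the numbers below are placeholders, not medical guidance):

```python
def check_rules(pollen_count, flow_reading,
                pollen_threshold=200, flow_threshold=350):
    # Return the list of actions the assistant should take
    # given the current pollen count and peak-flow meter reading.
    actions = []
    if pollen_count > pollen_threshold:
        actions.append("warn patient: do not run today")
    if flow_reading < flow_threshold:
        actions.append("notify doctor: peak-flow reading below threshold")
    return actions
```

Real thresholds would be set per patient by the doctor, which is exactly the kind of doctor-patient sharing M-Almond is meant to support.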

Sharing-economy market maker

Uber enables drivers and riders to find each other with the help of GPS. There is a long tail of sharing-economy markets, such as buying and selling bicycles on campus or finding partners to play tennis. Can we create a flexible system that enables virtual assistants to help connect people across a diverse set of markets?

Virtual assistants for virtual reality (VR)

With no access to the keyboard or the mouse, speech is a natural way for VR users to interact with their virtual surroundings. What makes a good speech-based virtual assistant for VR? (Two HTC Vives are available for students to work with in this class).

API knowledge base

Almond has a repository to support semantic parsing, mapping natural language snippets into code. Can we scrape the web automatically or use crowdsourcing to populate this knowledge base?
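
At its simplest, such a knowledge base pairs natural-language templates with code snippets. A toy sketch of the mapping and a naive matcher (the templates and function names below are illustrative, not real Thingpedia entries):

```python
# Natural-language templates paired with ThingTalk-like pseudo-code.
# "$x" marks a slot to be filled from the utterance.
TEMPLATES = {
    "tweet $x": '@twitter.post(status=$x)',
    "get the latest xkcd": '@xkcd.get_comic()',
    "turn on the lights": '@hue.set_power(power=on)',
}

def parse(utterance):
    # Match the utterance against each template, binding the $x slot.
    for template, code in TEMPLATES.items():
        if "$x" in template:
            prefix = template.split("$x")[0]
            if utterance.startswith(prefix):
                return code.replace("$x", '"' + utterance[len(prefix):] + '"')
        elif utterance == template:
            return code
    return None
```

Scraping or crowdsourcing the knowledge base amounts to populating TEMPLATES at scale; a learned semantic parser then generalizes beyond exact template matches.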

Machine-learning algorithms for semantic parsing

How can we crowdsource data to enable machine learning? How can we improve the machine learning algorithms that translate natural language into code?

Complementing natural language with graphical interfaces

Graphical interfaces are useful when natural language processing fails and when there is a lot of information to display. How do we create multi-modal interfaces that allow users to easily move between speech and more traditional UIs or to use them together at the same time?


The course meets Tuesday and Thursday, from 10:30 AM to 11:50 AM in Gates 100.

This schedule is tentative and subject to change. Please pay attention to emails sent to the student list.

Date Topic
Tue April 4 Overview of the course. Overview of the Almond virtual assistant.
Thu April 6 Intro + Brainstorming
Tue April 11 Project discussions
Thu April 13 More ideas + in-class discussions. Homework 1 released.
Tue April 18 Mini Hackathon
Thu April 20 Project Proposal. Homework 1 due
Tue April 25 Project Proposal
Thu April 27 Project Proposal
Tue May 2 Student-led discussion: Getting Started with NLP
Thu May 4 Student-led discussion: NLP Semantic Parsing
Tue May 9 Student-led discussion: NLP Everything Else
Thu May 11 Student-led discussion: Speech
Tue May 16 Mini hackathon / work session
Thu May 18 Student-led discussion: Displays (big screens, VR)
Tue May 23 Student-led discussion: IoT
Thu May 25 Student-led discussion: Health
Tue May 30 Final project discussion
Thu June 1 Final project discussion
Tue June 6 Final project discussion
Mon June 12 (12:15-3:15 PM) Final Project Demo and Poster Session (during the scheduled Final Exam period)
Tue June 13 Final Report Due


  1. Almond: The Architecture of an Open, Crowdsourced, Privacy-Preserving, Programmable Virtual Assistant
    Giovanni Campagna, Rakesh Ramesh, Silei Xu, Michael Fischer, and Monica S. Lam.
    In Proceedings of the 26th World Wide Web Conference,
    Perth, Australia, April 2017.

Teaching Staff

Monica Lam

Instructor

Office hours: by appointment

Giovanni Campagna

Teaching Assistant

Office hours: Monday and Wednesday, 3 PM to 4 PM (starting Wed April 12th)