Course Project Reports for 2017

There were two options for the course project. Students either chose their own topic ("Final Project") or built models for reading comprehension on the SQuAD dataset ("Default Final Project").
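For reference, each SQuAD example pairs a context paragraph with questions whose answers are spans of that paragraph. A minimal sketch of reading one such record is below; the field names (`qas`, `answers`, `text`, `answer_start`) follow the published SQuAD v1.1 JSON layout, but the record itself is invented for illustration:

```python
# A minimal SQuAD-style record: each answer is a span of the context,
# located by its character offset (`answer_start`).
record = {
    "context": "The Stanford Question Answering Dataset (SQuAD) was released in 2016.",
    "qas": [
        {
            "question": "When was SQuAD released?",
            "answers": [{"text": "2016", "answer_start": 64}],
        }
    ],
}

for qa in record["qas"]:
    ans = qa["answers"][0]
    start = ans["answer_start"]
    end = start + len(ans["text"])
    # A model is scored on recovering exactly this span from the context.
    assert record["context"][start:end] == ans["text"]
```

Models built for the default project predict the start and end positions of such spans given the context and question.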

Prize Winners

Congratulations to our prize winners for their exceptional class projects!

Final Project Prize Winners

  1. Beating Atari with Natural Language Guided Reinforcement Learning by Alexander Antonio Sosa / Christopher Peterson Sauer / Russell James Kaplan
  2. Image-Question-Linguistic Co-Attention for Visual Question Answering by Shutong Zhang / Chenyue Meng / Yixin Wang
  3. Ruminating Neural Networks with Auto-regressive Attention Units by Jin Xie / Hao Sheng / Junzi Zhang
Outstanding Posters

Default Final Project Prize Winners

  1. Reading Comprehension on the SQuAD Dataset by Fnu Budianto
  2. Question Answering System with Bi-Directional Attention Flow by Fei Xia / Junjie Ke / Yolanda Wang
Outstanding Posters

Sponsor Prize

Audience Selection Prize

Final Projects

Project Name | Authors
Deep Almond: A Deep Learning-based Virtual Assistant | Rakesh Ramesh / Giovanni Campagna
Effective Word Representation for Named Entity Recognition | Wendi Liu / Eric Li / Tim Hsieh
Learning Effective Embeddings from Medical Notes | Sebastien Pierre Romain Dubois / Nathanael Romano
An Examination of the CNN/DailyMail Neural Summarization Task | Eduardo Torres Montano / Liezl Legaspi Puzon / Vincent Chen
Word Sense Disambiguation Using Skip-Gram and LSTM Models | Armin Justin Namavari / Tyler Otha Smith / Shrey Gupta
Sequential LSTM-based Encoder for NLI | Ankita Sharma / Yokila Arora
Looking for Low-proficiency Sentences in ELL Writing | Shayne Miel
Natural Language Inference with Attentive Neural Networks | Julien Juilliard Kawawa-Beaudan / Varun Kumar Vijay / Kenny Kin Fai Leung
Neural Image Captioning for Intelligent Vehicle-to-Passenger Communication | Fredrik Karl Anders Gustafsson
Neural Stance Detectors for Fake News Challenge | Shanshan Xu / Quan Zhou / Qi Zeng
Automated Essay Feedback | Sawyer Birnbaum / Noah Samuel Arthurs
Discourse Parsing via Weighted Bag-of-Words, Coattention Neural Encoder, and Coattentative Convolutional Neural Network | Alex Fu / Yang Yuan / Borui Wang
Speech recognition with DNN-LAS | Jack Jin / Pengda Liu / Geng Zhao
TrumpBot: Seq2Seq with Pointer Sentinel Model | Filip Zivkovic
Implementing A Neural Cache LSTM | Christina Wadsworth / Raphael Martin Palefsky-Smith
Natural Language Learning Supports Reinforcement Learning | Andrew Kyle Lampinen
From Vision to NLP: A Merge | Alisha Mangesh Rege / Payal Bajaj
Learning to Rank with Attentive Media Attributes | Yang Yang / Baldo Antonio Faieta
Summarizing Git Commits and GitHub Pull Requests Using Sequence to Sequence Neural Attention Models | Ali-Kazim Zaidi
Neural Networks in Predicting Myers Brigg Personality Type From Writing Style | Anthony Kai Kwang Ma / Gus Liu
Estimating High-Dimensional Temporal Distributions Application to Music & Language Generation | Adrien Descamps / Maxime Alexandre Voisin
“Is it true?” – Deep Learning for Stance Detection in News | Shruti Bhargava / Neel Vinod Rakholia
Abstract Meta-Concept Features for Text-Illustration | Ines Chami
Detecting Key Needs in Crisis | Emma Marriott / Tulsee Doshi / Jay H Patel
Towards Automatic Identification of Fake News: Headline-Article Stance Detection with LSTM Attention Models | John Merriman Sholar / Saachi Jain / Sahil Chopra
Image Titles - Variations on Show, Attend and Tell | Robert Konrad / Vincent Sitzmann / Timon Dominik Ruban
Fake News, Real Consequences: Recruiting Neural Networks for the Fight Against Fake News | Chris Coffey Proctor / Richard Lee Davis
Image Captioning with Pragmatics | Noam Weinberger / Nico Manuel Chaves / Reuben Harry Cohn-Gordon
What’s good for the goose is good for the GANder - Comparing Generative Adversarial Networks for NLP | Brendan Mooney Corcoran / Christina Jenny Hung
Implementation and Optimization of Differentiable Neural Computers | Carol Hsin
Cosine Siamese Models for Stance Detection | Delenn Tzu Chin / Kevin Chen / Akshay Kumar Agrawal
A Neural Chatbot with Personality | David Rey Morales / Huyen Thi Khanh Nguyen / Tessera Chin
Deep Classification and Generation of Reddit Post Titles | Rolland Wu He / Tyler Foster Chase / Will Qiu
Comment Abuse Classification with Deep Learning | Theodora Chu / Max Wang / Kylie Amanda Jue
Writing Style Conversion using Neural Machine Translation | Se Won Jang / Jesse Min / Mark Kwon
Abstractive Text Summarization using Attentive Sequence-to-Sequence RNNs | Abiel Gutierrez / Elliott Morgan Jobson
Neural Conversational Model with Mutual Information Ranking | Chenye Zhu / Harrison Chi-Wei Ho
Exploring the Effects of External Semantic Data on Word Embeddings | William Chun Ki Hang / Brian Hu Zhang / Zihua Liu
Music Composition using Recurrent Neural Networks | Axel Sly / Yuki Inoue / Nipun Agarwala
Detecting and Identifying Bias-Heavy Sentences in News Articles | Shreya Shankar / Nick Paul Hirning / Andy Shiangjuei Chen
Neural Joke Generation | He Ren / Quan Yang
Named Entity Recognition and Compositional Morphology for Word Representations in Chinese | Christopher Heung / Emily Ling / Cindy Shinying Lin
Tagging Patient Notes With ICD-9 Codes | Sandeep Ayyar / Oliver John Bear Don't Walk
GraphNet: Recommendation System Based on Language and Network Structure | Xin Li / Rex Ying / Yuanfang Li
“Unmatched” Attention for Natural Language Inference | Homero Gabriel Roman Roman / Vinson Luo / Alex Tamkin
Aiding Sentiment Evaluation with Social Network | Fan Yang / Pengfei Gao / Hao Yin
Reversing Dictionaries and Solving Crossword Clues with Deep Learning | Meena Chetty / Viraj R Mehta
Quora Question Duplication | Elkhan Dadashov / Sukolsak Sakshuwong / Katherine Yu
Automatic Code Completion | Lindsey Makana Kostas / Tara Gayatri Balakrishnan / Adam Palazzo Ginzberg
Deep Causal Inference for Average Treatment Effect Estimation of Poems Popularity | Derek Farren
Distributed representations of politicians | Bobbie Macdonald
Predicting Stock Movement through Executive Tweets | Mike Logan Jermann
Information Retrieval from Surgical Reports using Data Programming | Zeshan Mohammed Hussain / Hardie Hardie Cate / Elliott Jake Chartock
Stance Detection for the Fake News Challenge: Identifying Textual Relationships with Deep Neural Nets | Philipp Thun-Hohenstein / Ali Khalid Chaudhry / Darren Baker
Predicting Short Story Endings | Michael Mernagh
Comparing Deep Learning and Conventional Machine Learning for Authorship Attribution and Text Generation | Francisco Alejandro Romero / Gregory Charles Luppescu
Duplicate Question Pair Detection with Deep Learning | Travis Addair
Song Title Prediction with Bidirectional Recurrent Sequence Tagging | Ryan Holmdahl
Music Genre Classification by Lyrics using a Hierarchical Attention Network | Alex Tsaptsinos
Recurrent and Contextual Models for Visual Question Answering | Abhijit Sharang / Eric Chun Kai Lau
Identifying Nominals with No Head Match Co-references Using Deep Learning | Matthew David Stone / Ramnik Arora
Stance Detection for Fake News Identification | Atli Kosson / Eli Wang / Damian Mrowca
Transfer Learning on Stack Exchange Tags | Shifan Mao / Weiqiang Zhu / Jake Dong
Detecting Duplicate Questions with Deep Learning | Stuart Lee Sy / Christopher Tzong-Ran Yeh / Yushi Homma
Ensembling Insights for Baseline Text Models | Henry Richman Ehrenberg / Dan Iter
Deep Poetry: Word-Level and Character-Level Language Models for Shakespearean Sonnet Generation | Stanley Xie / Max Austin Chang / Ruchir Rastogi
Abstractive Summarization with Global Importance Scores | Vivian Hoang-Dung Nguyen / Shivaal Kaul Roy
Too Many Questions | Jeffrey Jialei Zhang / Ann He
Awkwardly: A Response Suggester | Kai-Chieh Huang / Quinlan Rachel Jung
Evaluating Generative Models for Text Generation | Raunaq Rewari / Prasad Kawthekar / Suvrat Bhooshan
Using Neural Networks to Predict Emoji Usage from Twitter Data | Connie Xiao Zeng / Luda Zhao
The Challenge of Fake News: Automated Stance Detection via NLP | Jeff T. Sheng / Evan Taylor Ragosa Rosenman
RNNs for Stance Detection between News Articles | Graham John Yennie / Jason Yu Chen / Joe Robert Johnson
“The Pope Has a New Baby!” Fake News Detection Using Deep Learning | Samir Bajaj
Transfer Learning: From a Translation Model to a Dense Sentence Representation with Application to Paraphrase Detection | Max Ferguson
Stance Detection for the Fake News Challenge with Attention and Conditional Encoding | Ferdinand Legros / Oskar Jonathan Triebe / Stephen Robert Pfohl
Language Dynamics analysis through Word2Vec Embeddings | Jeha Yang / Claire Louise Donnat
TL;DR: Improving Abstractive Summarization Using LSTMs | Sang Goo Kang / Samuel Kim
News Article Summarization with Attention-based Deep Recurrent Neural Networks | Chao Wang / Chang Yue / Yoyo Yu
Biomedical Named Entity Recognition Using Neural Networks | George Mavromatis
Lazy Prices: Vector Representations of Financial Disclosures and Market Outperformance | Alex Hanyu Lin / Victor Cheung / Kai Maroon Kuspa
Coreferent Mention Detection using Deep Learning | Aditya Barua / Piyush Sharma
Hybrid Word-Character Neural Machine Translation for Modern Standard Arabic | Siggi Kjartansson / Pamela Toman
Autoregressive Attention for Parallel Sequence Modeling | Jeremy Andrew Irvin / Dillon Anthony Laird
The ROUGE-AR: A Proposed Extension to the ROUGE Evaluation Metric for Abstractive Text Summarization | Sydney Chase Maples
Understanding and Predicting the Usefulness of Yelp Reviews | David Zhan Liu
Are Latent Sentence Vectors Cross-Linguistically Invariant? | Michael Hermann Hahn
Neural Review Ranking Models for Ads at Yelp | Florian Karl Hartl / Vishnu Purushothaman Sreenivasan
Melody-to-Chord using paired model and multi-task learning language modeling | Yen-Kai Huang / Wei-Ting Hsu / Mu-Heng Yang
Classifying Reddit comments by subreddit | Ian Tam
Dating Text From Google NGrams | Aashna Shroff / Kelsey Marie Josund / Akshay Rampuria
Universal Dependency Parser: A Single Parser for Many Languages on Arc-Swift | Frank Fan / Michelle Guo
Exploring Optimizations to Paragraph Vectors | Zoe Michelle Robert / Maya Thadaney Israni / Gabbi Samantha Fisher
Determining Entailment of Questions in the Quora Dataset | Albert Jia-Xiang Tung / Eric Yanmin Xu
Extending the Scope of Co-occurrence Embedding | Jack Mi / Yuetong Wang / Jiren Zhu
Smart Initialization Yields Better Convergence Properties in Deep Abstractive Summarization | Maneesh Dilip Apte / Casey Chu / Liam O'hart Kinney
LSTM Encoder-Decoder Architecture with Attention Mechanism for Machine Comprehension | Brian Magid Higgins / Eugene Jinyoung Nho
Deep Learning based Authorship Identification | Chen Qian / Tianchang He / Rao Zhang
Computational models for text summarization | Leo Michael Keselman / Ludwig Schubert
Multitask Learning and Extensions of Dynamic Coattention Network | Keven Wang / Xinyuan Huang
Predicting State-Level Agricultural Sentiment with Tweets from Farming Communities | Darren Hau / Swetava Ganguli / Jared Alexander Dunnmon / Brooke Elena Husic
“Nowcasting” County Unemployment Using Twitter Data | Thao Thi Thach Nguyen / Megha Bhushan Srivastava
Rationalizing Sentiment Analysis in Tensorflow | Henry John Neeb / Kevin Eugene Shaw / Aly Rachel Kane
Word2Vec using Character n-grams | Varsha Sankar / Radhika Pramod Patil / Deepti Sanjay Mahajan
Natural Language Inference for Quora Dataset | Kyu Koh Yoo / Muhammad Majid Almajid / Yang Wong
Logfile Failure Prediction using Recurrent and QuasiRecurrent Neural Networks | Isuru Umayangana Daulagala / Austin Jiao
Abstractive Text Summarization with Quasi-Recurrent Neural Networks | Jeffrey Kenichiro Hara / Sho Arora / Peter August Adelson
Tell Me What I See | Victor Valeriiovych Makoviichuk / Peter Lapko / Boris Borisovich Kovalenko
Backprop to the Future: A Neural Network Approach to Linguistic Change over Time | Dai Shen / Michael Xing / Eun Seo Jo

Default Final Projects

Project Name | Authors
Natural Language Processing with Deep Learning Reading Comprehension | Wissam Baalbaki / Dan Zylberglejd
Reading Comprehension | Vishakh Hegde
Reading Comprehension on SQuAD Dataset | Andrew Nicholas Declerck / Kevin Thomas Rakestraw
Dynamic Coattention with Sentence Information | Alex Heneghan Ruch
Reading Comprehension On SQuAD Using Tensorflow | Micah Daniel Silberstein / Chase Brandon / Michael Holloway
#SQuADGoals: A Comparative Analysis of Models for Closed Domain Question Answering | Eric Mutua Musyoka / Andrew Elijah Duffy / Dan Michael Shiferaw
Reading Comprehension with Coattention Encoders | Chan Lee / Jae Hyun Kim
Neural Question Answer Systems: Biased Perspectives and Joint Answer Predictions | Nick Paul Troccoli / Lucy L. Wang / Sam Redmond
An exploration of Approaches for the Stanford Question Answering Dataset | Charles Chen
Question Answering on the SQuAD Dataset | Moosa Hasan Zaidi / Nawaf Adel Alnaji
Question Answering | Jacob Elias Perricone / Blake Jennings
Question Answering on the SQuAD Dataset | Dangna Li
Neural networks for text-based answer extraction | Brian Russell Hicks
Dynamic Coattention Networks for Reading Comprehension | Hayk Tepanyan
Bifocal Perspectives for Machine Comprehension | Andrew Shao-Chong Lim / Adam Muhannad Abdulhamid / Pavitra Tirumanjanam Rengarajan
Building upon Multi-Perspective Matching for SQuAD | Louis Yvan Duperier / Yoann Le Calonnec
Recurrent Neural Networks and Machine Reading Comprehension | Hana Lee
Match LSTM based Question Answering | Hershed Tilak / Yangyang Yu / Michael Cannon Lowney
Exploration and Analysis of Three Neural Network Models for Question Answering | William Jiang
Question Answering on the SQuAD Dataset | Mudit Jain
Question Answering | Peeyush Agarwal
Harder Learning: Improving on Dynamic Co-Attention Networks for Question-Answering with Bayesian Approximations | Marcus Vincent Gomez / Brandon Bicheng Cui / Udai Baisiwala
Extractive Question Answering Using Match-LSTM and Answer Pointer | Jake Alexander Rachleff / Cameron John Van De Graaf / Alexander Haigh
SQuAD Question Answering Problem: A match-lstm implementation | Philippe Fraisse
Implementation and Improvement of Match-LSTM in Question-Answering System | Ben Zhang / Haomin Peng
Coattention-Based Multi-Perspective Matching Network for Machine Comprehension | Qiujiang Jin / Bowei Ma
Deep Question Answering using Bi-directional Attention-based LSTMs and Weighted Distribution Penalization | Akhil Prakash / Pranav Ananth Sriram / Vineet Ahluwalia
Deep Learning for Question Answering on the SQUAD | Reza Takapoui / Ramtin Keramati / Kian Kevin Katanforoosh
SQuAD Reading Comprehension with Attention | Austin Hou
An LSTM Attention-based Network for Reading Comprehension | Rafa Goissis Setra
Question Answering on the SQuAD Dataset | Brad Garrison Girardeau
Question Answering on SQuAD | Haque Muhammad Ishfaq / Chenjie Yang
Question Answering with Deep Learning | Tristan Drake Mcrae
Understanding Multi-Perspective Context Matching for Machine Comprehension | Cindy Catherine Orozco Bohorquez
Machine Comprehension Using Multi-Perspective Context Matching and Co-Attention | Tarun Gupta / Andrei Bajenov
Question Answering Using Bi-Directional RNN | Aojia Zhao / Simon Kim
Reading Comprehension with SQuAD | Archa Jain
Reading Comprehension | Clare Chen / Kevin Rui Luo
Exploring Different Matching/Attention Mechanism for Machine Comprehension Task on SQuAD Dataset | George Pakapol Supaniratisai / Nattapoom Asavareongchai
Implementation of Multi-Perspective Context Matching for Machine Comprehension | Wen Yau Aaron Loh
Question Answering with SQuAD: Variations on Multi-Perspective Context Matching | Jason Freeman / Raine Morgan Hoover
Reading Comprehension | Jason Liu / Christina Kao / Christopher Vo
A Convolution Network Approach to Machine Comprehension | Lisa Yan
Deep Coattention Networks for Reading Comprehension | Amy Lawson Bearman
Reading Comprehension | Ryan Patrick Burke / Leah Lynn Brickson / Alexandre Robicquet
Machine Comprehension using Dynamic Recurrent Neural Network and Gated Recurrent Unit | Yi-Hong Kuo / Hsin-Ya Lou / Hsiang-Yu Yang
Neural Network-based Question Answering System | Sriraman Madhavan / Sanyam Mehra / Kushaagra Goyal
Rolling Deep with the SQuAD: Question Answering | Tanuj Thapliyal / Dhruv Amin / Reid Westwood
Question Answering with Multi-Perspective Context Matching | Joey William Blackshaw Asperger
Exploration of Attention in Question Answering | Anthony Perez
Machine Comprehension with Exploration on Attention Mechanism | Chen Guo
Machine Comprehension with Modified Bi-Directional Attention Flow Network | Mingxiang Chen / Sijun He / Jiajun Sun
Reading Comprehension | Joris Van Mens / Nickolas Samuel Westman / Ilya Kuleshov
Random Coattention Forest for Question Answering | Yi-Chun Chen / Ting-Po Lee / Jheng-Hao Chen
Decoding Coattention Encodings for Question Answering | Qandeel Tariq / John Wang Clow / Alex Kolchinski
Multi-Perspective Context Matching for SQuAD Dataset | Huizi Mao / Xingyu Liu
Improving Match-LSTM for Machine Comprehension | Mike Yu / Kevin Foschini Moody / Dennis Xu
SQuAD Reading Comprehension Learning | Thomas Pascal Jean Ayoul / Sebastien Nathan Raphael Levy
Exploring Deep Learning Models for Machine Comprehension on SQuAD | Joseph Kuruvilla Charalel / Yifei Feng / Junjie Zhu
Reading Comprehension | Suraj Heereguppe Radhakrishna / Chiraag Sumanth / Jayanth Ramesh
Simple Dynamic Coattention Networks | Wenqi Wu
Question Answering Using Regularized Match-LSTM and Answer Pointer | Debnil Arindam Sur / Ellen Lindsey Blaine
Extending Match-LSTM | Keegan Rochard Mosley / Sebastian Goodman
Global Span Representation Model for Machine Comprehension on SQuAD | Sunmi Lee / Jaebum Lee
BiDAF Model for Question Answering | Genki Kondo / Ramon Tuason / Daniel Grazian
A new model for Machine Comprehension via multi-perspective context matching and bidirectional attention flow | Nima Hamidi / Amirata Ghorbani
Co-Attention with Answer-Pointer for SQuAD Reading Comprehension Task | Mindy Lea Yang / Tommy Fan / Chenyao Yu
Reading Comprehension on SQuAD | Meghana Vijay Rao / Brexton Pham / Zachary Davis Taylor
Coattention Answer-Pointer Networks for Question Answering | Yanshu Hong / Tian Zhao / Yiju Hou
Coattention Model for Question Answering | Marie Eve Vachovsky / Tina Ivy Vachovsky
Modular Sequence Attention Mix Model | Ahmed Hussain Jaffery / Kostya Sebov
Question Answering Using Match-LSTM and Answer Pointer | Brandon Chauloon Yang / Cindy Wang / Annie Hu
SQuAD Question Answering using Multi-Perspective Matching | Shloka Mitesh Desai / Sheema Usmani / Zach Daniel Maurer
Relevancy-Scaled Deep Co-Attentive Networks for Question Answering | Varun Abhijit Gupta / Orry Chris Despo / Nadav Aharon Hollander
A simple sequence attention model for machine comprehension | Marcello Mendes Hasegawa
Question Answering with Recurrent Span Representations | Timothy Man Hay Lee / Kevin Xinzhi Wu / John Louie
Question Answering on the SQuAD Dataset with Part-of-Speech Tagging | Nancy Xu / Philip Ken-Ka Hwang / Joan Creus-Costa
NQSotA Continuation Curriculum Learning with Question Answering on the SQuAD Dataset | Rahul Sunil Palamuttam / Luke Taylor Johnston / William Chen
CS 224N Assignment 4: Question Answering on SQuAD | Kevin Matthew Garbe / Aykan Ozturk / Huseyin Atahan Inan
Seq2seq-Attention Question Answering Model | Wenqi Hou / Yun Nie
Question Answering on the SQuAD Dataset Using Multi-Perspective Context Matching | Sam Edward Herbert Colbran / Stanislav Fort
Bidirectional Attention Flow Model for Reading Comprehension | Michael Painter / Bardia Beigi / Soroosh Hemmati
Machine Comprehension using SQuAD and Deep Learning | Josh Robert King / Filippo Ranalli / Ajay Uday Mandlekar
Machine Comprehension with MMLSTM and Clustering | Frank Anthony Cipollone / Zachary Barnes / Tyler Romero
Dynamic Coattention Networks with Encoding Maxout | Morgan Lee Tenney / Thaminda Mewan Edirisooriya / Hansohl Eliott Kim
Implementation and New Variants Exploration of the Multi-Perspective Context Matching Deep Neural Network Model for Machine Comprehension | Yutong Li / Prateek Murgai
Attention-based Recurrent Neural Networks for Question Answering | Dapeng Hong / Billy Wan
Co-Dependent Attention on SQuAD | Siyue Wu / Fabian Chan / Xueyuan Mei
Start and End Interactions in Bidirectional Attention Flow for Reading Comprehension | Sean Thomas Rafferty / Ted Harbin Li
Implementing Multi-Perspective Context Matching for the SQuAD Task in TensorFlow | Christopher Michael Pesto
CAESAR: Context-Awareness Enabled and Summary-Attentive Reader | Kshitiz Tripathi / Long-Huei Chen / Mario Sergio Rodriguez
Ensemble Learning For Machine Comprehension: Bidirectional Attention Flow Models | Divya Shree Saini / Stephen Ou / William Adams Du
An End-to-End Neural Architecture for Reading Comprehension | Ned Joseph Danyliw / Miguel Sebastian Camacho-Horvitz / Meredith Noelani Burkle
Neural Methods for Question Answering | Ayush Kanodia / Manik Dhar / Pratyaksh Sharma
Implementation and Analysis of Match-LSTM for SQuAD | Michael Graczyk
Multiple Turn Comprehension for the BiDirectional Attention Flow Model | Thomas Liu
Question Answering on the SQuAD Dataset | Do-Hyoung Park / Vihan Sankaran Lakshman
Answering SQuAD | Faraz Waseem / Atishay Jain
Game, Set, Match-LSTM: Question Answering on SQuAD | Eric Osemen Ehizokhale / Ian Narciso Torres
Bidirectional LSTM-RNN with Bi-Attention for reading comprehension | Guoxi Xu / Bera Shi / Ziyi Yang
Filter-Context Dynamic Coattention Networks for Question Answering | Yangxin Zhong / Jian Huang / Peng Yuan
Question Answering System using Dynamic Coattention Networks | Adeline Emily Wong / Bojiong Ni / James Shi
Machine Question and Answering | Diana Dan Khanh Le / Malina Jiang / Joseph Chang
Question Answering on the SQuAD Dataset | Saghar Hosseinisianaki / Yun Yun Li
Learning Reading Comprehension with Neural Nets | Jason Huang / Li Cai / Charles Huyi
Convolutional Encoding in Bidirectional Attention Flow for Question Answering | Daniel Roy Miller
Incorporating Part-of-Speech tags and Named Entities into Match-LSTM | Raunak Kasera / Dilsher Ahmed
Reading Comprehension with Deep Learning | Atticus Reed Geiger / Dylan Ziqing Liu
Unilateral Multi-Perspective Matching for Machine Comprehension | Max Calvin Schorer / Jason Wang / Sigberto Viesca
Question Answering on the SQuAD Dataset | Xin Jin / Milind Mukesh Rao / Abbas Kazerouni
Reading Comprehension on the Stanford Question Answering Dataset | Shashwat Udit
Machine Comprehension for SQuAD dataset | Vikas R Bahirwani / Erika Debra Menezes
Machine Comprehension with Deep Learning on SQuAD dataset | Neha Gupta / Yianni Dimitrios Laloudakis / Yash Vyas