**What is deep learning?**

Deep learning is a type of machine learning and an approach to artificial intelligence. By building complex representations of the world in terms of simpler ones, deep learning techniques have achieved state-of-the-art results in computer vision, speech recognition, natural language processing, clinical applications, and other areas — promising (and threatening) to transform society.

**What is this course?**

This graduate course introduces students to the theory and practice of deep learning. By the end of the course, students should understand fundamental concepts and foundational work in the field, know how to choose between and implement different deep learning models to solve substantive problems, and be able to evaluate and critique work that uses deep learning techniques.

**What is this webpage?**

This webpage provides details for this course, including a schedule. The course syllabus contains course policies and other information. Announcements during lecture provide other updates.

**Key links, locations, and times**

Course syllabus

Paper presentation schedule and rubric

Project rubric

Assignment submissions: link

Textbook: *Deep Learning*. Goodfellow, Bengio, and Courville. MIT Press, First Edition

Lecture location: Woodruff Memorial Research Building, 4004

Lecture time: Mondays and Wednesdays, 4:00pm–5:15pm

**Instructor**

Name: Matthew Reyna

Email address: matthew.a.reyna@emory.edu

Office hours: Tuesdays, 1:30pm–3:00pm; Thursdays, 9:30am–11:00am; or by appointment

Office location: Woodruff Memorial Research Building, 4119

**Teaching assistant**

Name: Hejie Cui

Email address: hejie.cui@emory.edu

Office hours: Fridays, 1:00pm–4:00pm

Office location: Mathematics and Science Center E308 (The Computer Lab)

**Prerequisites**

Previous coursework in multivariate calculus, linear algebra, probability theory or statistics, and machine learning (CS534 or equivalent). Proficiency with numerical computing in Python and mathematical typesetting in LaTeX.

**Grading**

Students will be evaluated on periodic homework (30% of the course grade, lowest dropped) and quizzes (10%, lowest dropped), presentations and discussions of papers (30%), and a semester project (30%).
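As an illustration of how these weights combine, here is a short Python sketch of the grade computation described above; the scores and function names are hypothetical, not part of the course materials.

```python
def average_drop_lowest(scores):
    """Mean of the scores after dropping the single lowest one."""
    kept = sorted(scores)[1:] if len(scores) > 1 else scores
    return sum(kept) / len(kept)

def course_grade(homework, quizzes, presentations, project):
    # Weights from the grading policy: homework 30% (lowest dropped),
    # quizzes 10% (lowest dropped), presentations 30%, project 30%.
    return (0.30 * average_drop_lowest(homework)
            + 0.10 * average_drop_lowest(quizzes)
            + 0.30 * presentations
            + 0.30 * project)

grade = course_grade(
    homework=[85, 92, 78, 90, 88],   # lowest score (78) is dropped
    quizzes=[70, 95, 90],            # lowest score (70) is dropped
    presentations=91,
    project=89,
)
print(grade)
```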

For the semester projects, each student (or each pair of students) will present a 3-minute “lightning” talk about their project during class on either Wednesday, April 22 or Monday, April 27 and submit a 4-page paper about their project by 4:00pm on Monday, April 27. The project rubric describes the expectations for the project presentations and papers. The slides and paper must be submitted on this webpage or by email before **Tuesday, April 28 at 9:00am ET**.

Talks should highlight the background, goals, achievements, and challenges of the project. (Shorter talks are often harder to prepare than longer ones.) Papers should describe the project at greater length and provide adequate details about the methods and results. Papers must use the IEEE conference template in LaTeX and be between 4 and 4.5 pages long, including the title, author list, and abstract but excluding the references. (Shorter papers are often harder to write than longer ones.) Previous guidance from the homework assignments and paper presentations will be helpful, and questions are welcome.
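As a starting point for the paper, a minimal skeleton using the IEEE conference template might look like the following; the section names are illustrative, and the exact class options and author-block commands should be checked against the official IEEEtran distribution.

```latex
\documentclass[conference]{IEEEtran}
\usepackage{graphicx}

\title{Project Title}
\author{\IEEEauthorblockN{Student Name}
\IEEEauthorblockA{Department of Computer Science \\ Emory University}}

\begin{document}
\maketitle

\begin{abstract}
A brief summary of the project's goals, methods, and results.
\end{abstract}

\section{Introduction}
Background and goals of the project.

\section{Methods}
Models, data, and training details.

\section{Results}
Quantitative and qualitative findings.

\section{Discussion}
Achievements, challenges, and future work.

\bibliographystyle{IEEEtran}
\bibliography{references}
\end{document}
```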

**Lectures**

This schedule is subject to change.

**Monday, January 13, 2020**

Introduction, linear algebra, probability, numerical computing. Slides, Homework 1 and solutions, and Homework 2.

**Wednesday, January 15, 2020**

Basics of machine learning. Slides.

**Monday, January 20, 2020**

MLK Holiday. No class.

**Wednesday, January 22, 2020**

Deep feedforward networks. Slides.

**Monday, January 27, 2020**

Deep feedforward networks (continued). Slides and Homework 3 and partial solutions.

**Wednesday, January 29, 2020**

Regularization and optimization. Slides.

**Monday, February 3, 2020**

Convolutional neural networks (CNNs). Slides.

**Wednesday, February 5, 2020**

CNNs (continued). Slides.

**Monday, February 10, 2020**

CNNs (continued). Slides, Homework 4 and partial solutions, Homework 5, and paper about saddle points.

**Wednesday, February 12, 2020**

Guest lecture about RNNs. Slides.

**Monday, February 17, 2020**

Homework discussion. No slides. Paper about pointer networks.

**Wednesday, February 19, 2020**

RNNs (continued). Slides and paper about ImageNet.

**Monday, February 24, 2020**

LSTMs and seq2seq. Slides and paper about retinal fundus.

**Wednesday, February 26, 2020**

seq2seq (continued). Slides and paper about SMILY.

**Monday, March 2, 2020**

Project meetings. Paper about focal loss.

**Wednesday, March 4, 2020**

Project meetings. Homework 6 and partial solutions, paper about LSTMs, and paper about reinforcement learning.

**Monday, March 9, 2020**

Spring break. No class.

**Wednesday, March 11, 2020**

Spring break. No class.

**Monday, March 16, 2020**

Extended spring break. No class. Lecture from Ian Goodfellow about generative adversarial networks (GANs).

**Wednesday, March 18, 2020**

Extended spring break. No class. Lecture from Nicholas Carlini about adversarial examples and a quiz (source file).

**Monday, March 23, 2020**

Attention and transformers. Slides.

**Wednesday, March 25, 2020**

Reinforcement learning. Slides.

**Monday, March 30, 2020**

Midterm project presentations.

**Wednesday, April 1, 2020**

Midterm project presentations (continued).

**Monday, April 6, 2020**

Midterm project presentations (continued). Paper about NLP and paper about clinical NLP.

**Wednesday, April 8, 2020**

Markov models, linear factor models, and graphical models. Slides and paper about BERT.

**Monday, April 13, 2020**

Homework discussion. No slides. Paper about DeepBeat and paper about GANs.

**Wednesday, April 15, 2020**

Autoencoders. Slides and paper about reinforcement learning with Atari.

**Monday, April 20, 2020**

Variational autoencoders. Slides and paper about reward modeling.

**Wednesday, April 22, 2020**

Final project presentations.

**Monday, April 27, 2020**

Final project presentations (continued).
