CS 584: Deep Learning

What is deep learning?
Deep learning is a type of machine learning and an approach to artificial intelligence. By building complex representations of the world in terms of simpler ones, deep learning techniques have achieved state-of-the-art results in computer vision, speech recognition, natural language processing, clinical applications, and other areas — promising (and threatening) to transform society.

What is this course?
This graduate course introduces students to the theory and practice of deep learning. By the end of the course, students should understand fundamental concepts and foundational work in the field, know how to choose between and implement different deep learning models to solve substantive problems, and be able to evaluate and critique work that uses deep learning techniques.

What is this webpage?
This webpage provides details for this course, including a schedule. The course syllabus contains course policies and other information. Announcements during lecture provide other updates.

Key links, locations, and times
Course syllabus
Paper presentation schedule and rubric
Project rubric
Assignment submissions: link
Textbook: Deep Learning. Goodfellow, Bengio, and Courville. MIT Press, First Edition
Lecture location: Woodruff Memorial Research Building, 4004
Lecture time: Mondays and Wednesdays, 4:00pm–5:15pm

Instructor
Name: Matthew Reyna
Email address: matthew.a.reyna@emory.edu
Office hours: Tuesdays, 1:30pm–3:00pm; Thursdays, 9:30am–11:00am; or by appointment
Office location: Woodruff Memorial Research Building, 4119

Teaching assistant
Name: Hejie Cui
Email address: hejie.cui@emory.edu
Office hours: Fridays, 1:00pm–4:00pm
Office location: Mathematics and Science Center E308 (The Computer Lab)

Prerequisites
Previous coursework in multivariate calculus, linear algebra, probability theory or statistics, and machine learning (CS 534 or equivalent). Proficiency with numerical computing in Python and mathematical typesetting in LaTeX.

Grading
Students will be evaluated on periodic homework (30% of the course grade, lowest dropped), quizzes (10%, lowest dropped), presentations and discussions of papers (30%), and a semester project (30%).

Semester project
For the semester projects, each student (or each pair of students) will present a 3-minute “lightning” talk about their project during class on either Wednesday, April 22 or Monday, April 27, and submit a 4-page paper about their project by 4:00pm on Monday, April 27. The project rubric describes the expectations for the project presentations and papers. The slides and paper must be submitted on this webpage or by email before Tuesday, April 28 at 9:00am ET.

Talks should highlight the background, goals, achievements, and challenges of the project. (Shorter talks are often harder to prepare than longer ones.) Papers should describe the project at greater length and provide adequate details about the methods and results. Papers must use the IEEE conference template in LaTeX and be between 4 and 4.5 pages long, including the title, author list, and abstract but excluding the references. (Shorter papers are often harder to write than longer ones.) Previous guidance from the homework assignments and paper presentations will be helpful, and questions are welcome.

Schedule
This schedule is subject to change.

  1. Monday, January 13, 2020
    Introduction, linear algebra, probability, numerical computing. Slides, Homework 1 and solutions, and Homework 2.
  2. Wednesday, January 15, 2020
    Basics of machine learning. Slides.
  3. Monday, January 20, 2020
    Martin Luther King Jr. Day. No class.
  4. Wednesday, January 22, 2020
    Deep feedforward networks. Slides.
  5. Monday, January 27, 2020
    Deep feedforward networks (continued). Slides and Homework 3 and partial solutions.
  6. Wednesday, January 29, 2020
    Regularization and optimization. Slides.
  7. Monday, February 3, 2020
    Convolutional neural networks (CNNs). Slides.
  8. Wednesday, February 5, 2020
    CNNs (continued). Slides.
  9. Monday, February 10, 2020
    CNNs (continued). Slides, Homework 4 and partial solutions, Homework 5, and paper about saddle points.
  10. Wednesday, February 12, 2020
    Guest lecture about recurrent neural networks (RNNs). Slides.
  11. Monday, February 17, 2020
    Homework discussion. No slides. Paper about Pointer networks.
  12. Wednesday, February 19, 2020
    RNNs (continued). Slides and paper about ImageNet.
  13. Monday, February 24, 2020
    LSTMs and seq2seq. Slides and paper about retinal fundus.
  14. Wednesday, February 26, 2020
    seq2seq (continued). Slides and paper about SMILY.
  15. Monday, March 2, 2020
    Project meetings. Paper about focal loss.
  16. Wednesday, March 4, 2020
    Project meetings. Homework 6 and partial solutions, paper about LSTMs, and paper about reinforcement learning.
  17. Monday, March 9, 2020
    Spring break. No class.
  18. Wednesday, March 11, 2020
    Spring break. No class.
  19. Monday, March 16, 2020
    Extended spring break. No class. Lecture from Ian Goodfellow about generative adversarial networks (GANs).
  20. Wednesday, March 18, 2020
    Extended spring break. No class. Lecture from Nicholas Carlini about adversarial examples and a quiz (source file).
  21. Monday, March 23, 2020
    Attention and transformers. Slides.
  22. Wednesday, March 25, 2020
    Reinforcement learning. Slides.
  23. Monday, March 30, 2020
    Midterm project presentations.
  24. Wednesday, April 1, 2020
    Midterm project presentations (continued).
  25. Monday, April 6, 2020
    Midterm project presentations (continued). Paper about NLP and paper about clinical NLP.
  26. Wednesday, April 8, 2020
    Markov models, linear factor models, and graphical models. Slides and paper about BERT.
  27. Monday, April 13, 2020
    Homework discussion. No slides. Paper about DeepBeat and paper about GANs.
  28. Wednesday, April 15, 2020
    Autoencoders. Slides and paper about reinforcement learning with Atari.
  29. Monday, April 20, 2020
    Variational autoencoders. Slides and paper about reward modeling.
  30. Wednesday, April 22, 2020
    Final project presentations.
  31. Monday, April 27, 2020
    Final project presentations (continued).