Program Outline

The ten-day program will be structured as follows.

Weekdays

09:00 - 10:15    Lecture 1

10:15 - 10:45    Break

10:45 - 12:00    Lecture 2

12:00 - 13:30    Lunch break

13:30 - 14:45    Lecture 3

15:30 - 16:45    Daily review session (led by Lawrence Carin or Jun Zhu)

In the evenings, from 7 to 8 pm, we will have special-topic lectures on applications of deep learning, as well as coding “boot camps” on the use of deep learning tools such as TensorFlow.

Weekends

Shanghai excursion

Course summary from July 25th to August 3rd

July 25th Tuesday

Lawrence Carin

Introduction to machine learning and deep learning. These lectures will cover basic ideas in model building, including unsupervised, supervised, and semi-supervised models. They will introduce students to concepts in statistical model design and in optimization-based approaches. Basic and widely used deep models will be reviewed. These overview lectures aim to position students for the remainder of the Summer School.

Evening Session: Welcome banquet

July 26th Wednesday

John Paisley

Develop a detailed understanding of latent Dirichlet allocation (LDA), along with factor analysis. Discussion of topic-model analysis of a corpus. Introduce stochastic and streaming variational Bayes (VB), which extend VB to large-scale problems and big data. Demonstrate how stochastic gradient descent allows scaling to large data sets, and how variational methods allow consideration of uncertainty.

Evening Session: Tutorial on TensorFlow (Kevin Liang and David Carlson)

July 27th Thursday

Xiaolin Hu

Introduction to neural networks and multi-layered perceptrons. Review of the convolutional neural network (CNN). Discussion of optimization-based model learning, with detailed discussion of backpropagation. Detailed review of TensorFlow, with discussion of its application to CNNs. Discussion of applying TensorFlow to vision and image analysis.

Evening Session: Continued tutorial on TensorFlow for deep learning (Kevin Liang and David Carlson)

July 28th Friday

Finale Doshi-Velez

Introduction to the basics of reinforcement learning (RL). Discussion of how recent deep learning image models have been integrated into RL to achieve state-of-the-art results in many fields. Discussion of multiple applications of RL.

Evening Session: Ice cream social

July 29th Saturday

Excursion in Shanghai

July 30th Sunday

Excursion in Shanghai

July 31st Monday

Minlie Huang

Neural network methods applied to natural language processing (NLP). Discussion of word embeddings, and models of phrases, sentences, and documents. Advanced topics from deep learning for NLP, including recurrent neural networks such as long short-term memory (LSTM) networks.

Evening Session: Machine learning for health analysis (Maciej Mazurowski)

August 1st Tuesday

Liwei Wang (morning lectures: 9-10:15am and 10:45am-noon)

Introductory lectures on learning theory, including VC theory, generalization error bounds, empirical processes. Discussion of the theory of learning, and of model generalization.

Jun Zhu (afternoon lecture: 1:30-2:45pm)

Development of new methods for generative models, with an introduction to generative adversarial networks (GANs). The basics of GANs will be developed, along with applications to problems in image analysis and synthesis.

Evening Session: Deep learning for health analysis (Maciej Mazurowski)

August 2nd Wednesday

David Carlson

Learning methods based on optimization approaches that scale to big data. This includes stochastic gradient descent (SGD) and recent methods that improve upon it, including spectral descent and the development of preconditioners. Discussion of online learning for streaming big data. Also, discussion of recent model developments in which such learning methods are employed, such as the autoencoder and the variational autoencoder (VAE).
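As a preview of the kind of material covered in this lecture, here is a minimal sketch of stochastic gradient descent on a least-squares linear regression problem; the data, step size, and epoch count are illustrative choices, not part of the course material.

```python
# Minimal SGD sketch: fit w in y = X @ w by taking a gradient step
# on one randomly chosen example at a time (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = X @ w_true + small noise
n, d = 1000, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.01 * rng.normal(size=n)

w = np.zeros(d)   # parameter estimate
lr = 0.01         # step size (learning rate)
for epoch in range(20):
    for i in rng.permutation(n):           # visit examples in random order
        # Gradient of the per-example loss 0.5 * (x_i . w - y_i)^2
        grad = (X[i] @ w - y[i]) * X[i]
        w -= lr * grad

print("recovered w close to w_true:", np.allclose(w, w_true, atol=0.05))
```

Because each step uses only a single example, the cost per update is independent of the data-set size, which is what makes SGD attractive for big data; the preconditioning and spectral-descent methods discussed in the lecture refine how the raw gradient is rescaled before the step.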

Evening Session: Review of Deep Learning Research at Microsoft (David Wipf)

August 3rd Thursday

David Wipf

Relationship between iterative algorithms and deep networks. Discussion of learned optimization algorithms, and of learning to learn. The lectures will also provide a review of industry-scale deep learning and computer vision.