Lecture 4: Optimization

Lecture 4 covers optimization algorithms used to minimize the loss functions discussed in the previous lecture. We introduce the core gradient descent algorithm and compare numerical and analytical approaches to computing gradients. We discuss extensions to the basic gradient descent algorithm, including stochastic gradient descent (SGD) and momentum. We also discuss more advanced first-order optimization algorithms such as AdaGrad, RMSProp, and Adam, and briefly touch on second-order optimization.
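As a concrete illustration of the numerical-vs-analytic comparison mentioned above, here is a minimal NumPy sketch (not from the lecture slides) of a centered-difference gradient check; the function name numerical_gradient and the step size h = 1e-5 are illustrative choices.

import numpy as np

def numerical_gradient(f, w, h=1e-5):
    # Estimate the gradient of scalar-valued f at w with centered differences.
    grad = np.zeros_like(w)
    for i in range(w.size):
        old = w.flat[i]
        w.flat[i] = old + h
        f_plus = f(w)
        w.flat[i] = old - h
        f_minus = f(w)
        w.flat[i] = old                      # restore the original value
        grad.flat[i] = (f_plus - f_minus) / (2 * h)
    return grad

# Check against an analytic gradient: for f(w) = sum(w**2), grad f(w) = 2w.
w = np.random.randn(5)
num = numerical_gradient(lambda x: np.sum(x ** 2), w)
print(np.max(np.abs(num - 2 * w)))           # tiny, on the order of 1e-10

This is the standard reason to prefer analytic gradients in practice: the numerical version needs two loss evaluations per parameter, so it is far too slow for real networks and serves only as a debugging check.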
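The update rules themselves are short. Below is a hedged NumPy sketch of single steps of SGD, SGD with momentum, and Adam; the function names and default hyperparameters (lr, rho, beta1, beta2, eps) are common textbook values, not necessarily the ones used in the lecture.

import numpy as np

def sgd_step(w, dw, lr=1e-2):
    # Vanilla gradient descent: move against the gradient.
    return w - lr * dw

def momentum_step(w, v, dw, lr=1e-2, rho=0.9):
    # Keep a decaying running sum ("velocity") of past gradients.
    v = rho * v + dw
    return w - lr * v, v

def adam_step(w, m, v, dw, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    # First moment: momentum-like running mean of gradients.
    m = beta1 * m + (1 - beta1) * dw
    # Second moment: per-parameter running mean of squared gradients.
    v = beta2 * v + (1 - beta2) * dw ** 2
    # Bias correction (t counts steps starting from 1).
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

AdaGrad and RMSProp can be read off from adam_step: AdaGrad keeps only an undecayed sum of squared gradients in the denominator, and RMSProp adds the decaying average (beta2) while dropping the momentum term and bias correction.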

Slides: http://myumi.ch/v2xAr
_________________________________________________________________________________________________

Computer vision has become ubiquitous in our society, with applications in search, image understanding, apps, mapping, medicine, drones, and self-driving cars. At the core of many of these applications are visual recognition tasks such as image classification and object detection. Recent developments in neural network approaches have significantly improved the performance of these cutting-edge visual recognition systems. This course is a deep dive into the details of neural network-based deep learning methods for computer vision. During this course, students will learn to implement, train, and debug their own neural networks and gain a detailed understanding of the latest research in computer vision. We will cover learning algorithms, neural network architectures, and practical engineering tricks for training and fine-tuning networks for visual recognition tasks.

Course website: http://myumi.ch/Bo9Ng

Course leader: Justin Johnson http://myumi.ch/QA8Pg
