
Learner reviews and feedback for Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization by DeepLearning.AI

4.9
63,493 ratings

Course Overview

In the second course of the Deep Learning Specialization, you will open the deep learning black box to understand the processes that drive performance and generate good results systematically. By the end, you will know the best practices for setting up train, dev, and test sets and analyzing bias/variance when building deep learning applications; be able to use standard neural network techniques such as initialization, L2 and dropout regularization, hyperparameter tuning, batch normalization, and gradient checking; implement and apply a variety of optimization algorithms, such as mini-batch gradient descent, Momentum, RMSprop, and Adam, and check for their convergence; and implement a neural network in TensorFlow. The Deep Learning Specialization is our foundational program that will help you understand the capabilities, challenges, and consequences of deep learning and prepare you to participate in the development of leading-edge AI technology. It provides a pathway for you to gain the knowledge and skills to apply machine learning to your work, level up your technical career, and take the definitive step into the world of AI.
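As a rough illustration of the optimization algorithms named above (Momentum, RMSprop, Adam), here is a minimal NumPy sketch of a single Adam update step; the function name and interface are my own, not the course's assignment code.

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a parameter array w (illustrative sketch only)."""
    m = beta1 * m + (1 - beta1) * grad           # momentum-style first-moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2      # RMSprop-style second-moment estimate
    m_hat = m / (1 - beta1 ** t)                 # bias correction for early steps (t starts at 1)
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)  # scaled parameter update
    return w, m, v
```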

Top Reviews

AA

Oct 22, 2017

Assignment in week 2 could not tell the difference between 'a-=b' and 'a=a-b' and marked the former as incorrect even though they are the same and gave the same output. Other than that, a great course
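A note on the point this review raises: in Python/NumPy, `a -= b` and `a = a - b` usually print the same result but are not strictly identical, since the former modifies the existing array in place while the latter rebinds `a` to a new array (which matters if another variable still references the original, and in-place subtraction can also fail when dtypes differ). A minimal sketch of the difference, assuming ordinary NumPy float arrays:

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([0.5, 0.5, 0.5])
alias = a            # a second name bound to the same underlying array

a -= b               # in-place subtraction: the shared array is modified
print(alias)         # [0.5 1.5 2.5] -- the alias sees the change

a = np.array([1.0, 2.0, 3.0])
alias = a
a = a - b            # builds a brand-new array and rebinds 'a' to it
print(alias)         # [1. 2. 3.] -- the alias still holds the old values
```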

AM

Oct 8, 2019

I really enjoyed this course. Many details are given here that are crucial for gaining experience, along with tips on things that look easy at first sight but are important for a faster ML project implementation.


Reviews 3026 - 3050 of 7,283 for Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization

By QIQING

Sep 25, 2017

Thanks to Andrew for sharing his experience in this field.

By Sebastian F

Sep 14, 2017

Hands-on learning! Interesting lessons and easy-to-follow code.

By Rabiya M

Dec 6, 2023

Extraordinary Course with exceptional learning points. Great!!

By KUSHAGRA S

Dec 16, 2021

Great! Though the last assignment was a bit difficult, it's okay.

By Maya S

Sep 26, 2020

Amazing course! Well built and explained, thank you very much!

By Satyam S

Aug 28, 2020

Really helped me build my fundamentals about hyperparameters.

By YANSKY

Jul 28, 2020

This course is excellent! Every topic is explained very clearly.

By Muhammad U

Jul 17, 2020

Really helps you understand what is going on behind the scenes.

By Anirban G

Jul 5, 2020

Excellent basic introduction to hyperparameter optimization.

By sathya p

Jun 16, 2020

Well done guys. Great work indeed.

Hope to see more courses from you.

By Harshit S

May 29, 2020

Good way to present the concept and then explain the intuition

By Zhi L

Mar 17, 2020

Thanks to Andrew and the team for providing such a great course!

By 靳文彬

Mar 9, 2020

Andrew is a great teacher, the whole series is almost perfect!

By John G

Feb 16, 2020

Great overview of optimizing networks and intro to TensorFlow.

By Alexandre F

Jan 19, 2020

Great overview of the rules of thumb for optimizing DL NN tuning.

By Apolo T A B

Oct 29, 2019

Good course, but the notebooks were much better than the lectures!

By dyfbobby

Jun 16, 2019

Further understanding of deep NN construction and optimization.

By Nelson F

Jun 2, 2019

As expected from Andrew Ng's previous courses, just excellent!

By Binjer

May 25, 2019

The explanations are very detailed, in-depth, and beginner-friendly. As your understanding of the ML field grows, you will also develop a deeper appreciation of the material. Many thanks to everyone who contributed to this course.

By Sabarish K

May 21, 2019

Dismantled the Neural Nets into several understandable blocks.

By amravi s

May 11, 2019

Well planned and an excellent method of teaching. Loved the course.

By Gyuho S

Mar 31, 2019

Awesome. It is getting serious and more fun as it gets Deeper!

By jackytu256

Mar 10, 2019

Great course to deeply understand the meaning of deep learning.

By Vital

Mar 3, 2019

Easy, thanks for good suggestions and interesting information.