
Learner reviews and feedback for Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization by DeepLearning.AI

4.9
63,493 ratings

About the Course

In the second course of the Deep Learning Specialization, you will open the deep learning black box to understand the processes that drive performance and generate good results systematically. By the end, you will learn the best practices to set up train/dev/test sets and analyze bias/variance for building deep learning applications; be able to use standard neural network techniques such as initialization, L2 and dropout regularization, hyperparameter tuning, batch normalization, and gradient checking; implement and apply a variety of optimization algorithms, such as mini-batch gradient descent, Momentum, RMSprop, and Adam, and check for their convergence; and implement a neural network in TensorFlow. The Deep Learning Specialization is our foundational program that will help you understand the capabilities, challenges, and consequences of deep learning and prepare you to participate in the development of leading-edge AI technology. It provides a pathway for you to gain the knowledge and skills to apply machine learning to your work, level up your technical career, and take the definitive step in the world of AI.
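
(For a flavor of the optimization material: Adam, covered alongside Momentum and RMSprop, combines both with bias correction. Below is a minimal NumPy sketch of a single Adam update step; the function name, defaults, and variable names are illustrative, not the course's assignment code.)

    import numpy as np

    def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
        # First moment: exponentially weighted average of gradients (Momentum).
        m = beta1 * m + (1 - beta1) * grad
        # Second moment: exponentially weighted average of squared gradients (RMSprop).
        v = beta2 * v + (1 - beta2) * grad ** 2
        # Bias correction compensates for the zero initialization of m and v early on.
        m_hat = m / (1 - beta1 ** t)
        v_hat = v / (1 - beta2 ** t)
        # Step scaled by the corrected moments; eps guards against division by zero.
        w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
        return w, m, v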

Top Reviews

AA

Oct 22, 2017

The assignment in week 2 could not tell the difference between 'a -= b' and 'a = a - b' and marked the former as incorrect, even though they gave the same output. Other than that, a great course.
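
(Worth noting: the two forms are not always interchangeable in NumPy, which may be why an autograder distinguishes them. 'a -= b' updates the existing array in place, while 'a = a - b' allocates a new array and rebinds the name. A minimal sketch, with illustrative variable names:)

    import numpy as np

    a = np.zeros(3)
    alias = a            # a second name for the same underlying buffer
    b = np.ones(3)

    a -= b               # in-place: mutates the buffer that 'alias' also sees
    print(alias)         # [-1. -1. -1.]

    a = np.zeros(3)
    alias = a
    a = a - b            # allocates a fresh array and rebinds 'a'
    print(alias)         # [0. 0. 0.]  (the original buffer is untouched)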

AM

Oct 8, 2019

I really enjoyed this course. Many details are given here that are crucial for gaining experience, along with tips on things that look easy at first sight but are important for faster ML project implementation.


3226 - 3250 of 7,283 reviews for Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization

By 陈沁然

Mar 10, 2018

The lessons are quite good, but sometimes the voice is too low.

By Pavel K

Mar 3, 2018

As useful as the previous "Course 1". Strongly recommend!

By Abdallah M

Jan 3, 2018

Amazing course and an amazing instructor. I learned a lot

By Shiloh T S

Dec 3, 2017

Great dive into the art/science of hyperparameter tuning.

By Yang X

Nov 26, 2017

Thank you, Andrew! You make me love deep learning so much!

By Gultekin B

Nov 2, 2017

Well-prepared assignments, excellent! Thanks a lot again

By James T S

Oct 19, 2017

Superb course that includes valuable practical knowledge.

By Chris F

Sep 17, 2017

Everything comes together in the programming assignments!

By Sun J

Aug 24, 2017

Very good coverage of the deep learning technical details.

By Joseph M

Aug 23, 2017

A brief introduction to TensorFlow is included at the end.

By Umer S

Aug 5, 2024

Great content, great community of learners and helpers!

By Srinivas C

Jul 10, 2022

Gives a clear understanding of regularization techniques.

By Yağız S

Apr 15, 2021

That's a really awesome course; I'd recommend it to everyone.

By Furong S

Jan 16, 2021

Great course. There are some grading errors to be fixed.

By farzad

Dec 11, 2020

Easy to follow and a lot of valuable information. Thanks

By Akshaykumar S

Jul 11, 2020

Great topic coverage and exercises. Loved every aspect!

By Sanjeev K Y

Jun 28, 2020

The specialization is an absolute delight for deep learning.

By VIJAYARAGHAVAN V

Jun 4, 2020

Excellent course; a wonderful experience learning DNNs.

By Arvind K V

May 13, 2020

I really understand the nuts and bolts of neural networks.

By Nguyen T H

Apr 24, 2020

Great lessons that explain the basic knowledge of ML. Thanks.

By Bhanuprakash B

Mar 25, 2020

Best course to get clear with concepts of Deep Learning.

By dennis w

Jan 7, 2020

Andrew Ng is the best!!! He makes everything so clear.

By Hisham R

Nov 29, 2019

Excellent course. Covered and explained topics very well.

By VISHESH S

Nov 20, 2019

Amazing experience and once again amazing teaching style

By Alfredo P

Nov 17, 2019

I continue enjoying the journey. We'll see in the future