DeepLearning.AI

Natural Language Processing with Attention Models

In Course 4 of the Natural Language Processing Specialization, you will:

a) Translate complete English sentences into Portuguese using an encoder-decoder attention model,
b) Build a Transformer model to summarize text,
c) Use T5 and BERT models to perform question-answering.

By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, and created tools to translate languages and summarize text!

Learners should have a working knowledge of machine learning and intermediate Python, including experience with a deep learning framework (e.g., TensorFlow, Keras), as well as proficiency in calculus, linear algebra, and statistics. Please make sure that you’ve completed Course 3 - Natural Language Processing with Sequence Models - before starting this course.

This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and the co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper.
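The encoder-decoder and Transformer models the course covers are all built on scaled dot-product attention. As a rough orientation (not the course's own code, which uses the Trax library), a minimal NumPy sketch of that mechanism might look like:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d)) V.

    Q: (n_queries, d), K: (n_keys, d), V: (n_keys, d_v).
    """
    d = Q.shape[-1]
    # Similarity of each query to each key, scaled to keep softmax well-behaved.
    scores = Q @ K.T / np.sqrt(d)
    # Numerically stable softmax over the keys axis.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    # Each output row is a weighted average of the value vectors.
    return weights @ V

# Toy example: 2 queries attending over 3 key/value pairs, dimension 4.
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (2, 4)
```

In the encoder-decoder translation model from the course, the queries come from the decoder and the keys/values from the encoder; in Transformer self-attention, all three come from the same sequence.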

Topics: Model Evaluation, Fine-tuning
Intermediate course

Featured Reviews

ND

5.0 · Reviewed on Sep 23, 2021

It's a great way to get started with state-of-the-art NLP techniques, following the recommended papers is extremely useful.

RJ

4.0 · Reviewed on Sep 28, 2020

Not up to expectations. Needs more explanation on some topics. Some were difficult to understand, examples might have helped!!

EG

5.0 · Reviewed on Mar 5, 2023

Very well-designed and organized. Instructors are excellent. Some slides could be better; it would be great if that were fixed.

QD

4.0 · Reviewed on Nov 3, 2022

The content is great, but it would be even better with a more in-depth treatment of the material rather than a very quick crash course.

SB

5.0 · Reviewed on Nov 20, 2020

The course is a very comprehensive one and covers all state-of-the-art techniques used in NLP. It's quite an advanced-level course, and good Python coding skills are a must.

DB

5.0 · Reviewed on Jan 24, 2023

I learned a lot from this course, and the ungraded and graded problems are relevant to understanding and knowing how to build a transformer or a reformer from scratch.

MS

4.0 · Reviewed on Oct 2, 2020

Good course, covers everything I guess. The only downside for me is the Trax portion; I would've preferred it on TF maybe, but still, great job.

OS

5.0 · Reviewed on Mar 17, 2024

I would give this class 6 stars if possible. Topics were well explained and at a good pace. Thanks to the team @DLAI.

MN

4.0 · Reviewed on Mar 27, 2021

The course covers cutting edge content and the exercises are well paced. Found the transformer lessons a bit difficult to understand.

JH

5.0 · Reviewed on Oct 4, 2020

Could the instructors maybe make a video explaining the ungraded lab? That would be useful. Other students also find the LSH attention layer ungraded labs difficult to understand. Thanks.

AM

5.0 · Reviewed on Oct 12, 2020

Great course! I really enjoyed extensive non-graded notebooks on LSH attention. Some content was pretty challenging, but always very rewarding! Thank you!

WZ

5.0 · Reviewed on Dec 28, 2023

This NLP specialization is very well designed. I refreshed the AI material I learned at school years ago, and learned new things here.

All Reviews

Showing 20 of 263

Xu Ouyang · 1.0 · Reviewed on Sep 26, 2020
Lucas Fernandes · 2.0 · Reviewed on Sep 27, 2020
Shikhin Mehrotra · 1.0 · Reviewed on Sep 28, 2020
Konstantinos Krommydas · 2.0 · Reviewed on Oct 5, 2020
Boris Kabakov · 1.0 · Reviewed on Sep 25, 2020
Ryan Baten · 2.0 · Reviewed on Oct 6, 2020
Vincent Fritsch · 1.0 · Reviewed on Nov 27, 2020
Ravi Shankar Karedla · 3.0 · Reviewed on Oct 6, 2020
D. Refaeli · 1.0 · Reviewed on Mar 22, 2021
Eitan Israeli · 2.0 · Reviewed on Oct 2, 2020
Han-Chung Lee · 2.0 · Reviewed on Oct 4, 2020
Jeremy Ong Chun Hooi · 5.0 · Reviewed on Oct 5, 2020
Paul Jay Ledbetter III · 1.0 · Reviewed on Nov 3, 2020
Muhammad Maiz Ghauri · 1.0 · Reviewed on Dec 5, 2020
Brooke Fujita · 1.0 · Reviewed on Nov 8, 2020
Siddharth Shukla · 1.0 · Reviewed on Sep 19, 2021
Logan Markewich · 3.0 · Reviewed on Apr 15, 2021
Jesús Díaz Martín · 2.0 · Reviewed on Nov 11, 2020
Jorge Antonio Chan-Lau · 3.0 · Reviewed on Oct 28, 2020
Raviteja Reddy Ganta · 3.0 · Reviewed on Oct 14, 2020