DeepLearning.AI

Natural Language Processing with Attention Models

In Course 4 of the Natural Language Processing Specialization, you will: a) Translate complete English sentences into Portuguese using an encoder-decoder attention model, b) Build a Transformer model to summarize text, c) Use T5 and BERT models to perform question-answering.

By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, and created tools to translate languages and summarize text! Learners should have a working knowledge of machine learning, intermediate Python including experience with a deep learning framework (e.g., TensorFlow, Keras), as well as proficiency in calculus, linear algebra, and statistics. Please make sure that you've completed Course 3 - Natural Language Processing with Sequence Models - before starting this course.

This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and a co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper.
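The attention mechanism at the heart of this course's encoder-decoder and Transformer models can be illustrated with a short sketch. This is not course material, just a minimal NumPy implementation of scaled dot-product attention (softmax(QK^T / sqrt(d_k)) V) with hypothetical toy inputs:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # (n_q, n_k) similarity scores
    scores -= scores.max(axis=-1, keepdims=True)    # subtract row max for stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax: each row sums to 1
    return weights @ V                              # attention-weighted sum of values

# Toy example: 2 queries attending over 3 key/value pairs of dimension 4.
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (2, 4): one output vector per query
```

Each output row is a convex combination of the value vectors, weighted by how strongly the corresponding query matches each key; the Transformer stacks many such attention layers with learned Q, K, V projections.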

Skills: Recurrent Neural Networks (RNNs), Embeddings
Intermediate level course

Featured Reviews

ND

5.0 · Reviewed on Sep 23, 2021

It's a great way to get started with state-of-the-art NLP techniques, following the recommended papers is extremely useful.

RJ

4.0 · Reviewed on Sep 28, 2020

Not up to expectations. Needs more explanation on some topics. Some were difficult to understand, examples might have helped!!

QD

4.0 · Reviewed on Nov 3, 2022

The content is great, but it will be even better if we have a more in-depth understanding of the knowledge rather than a very quick crash course.

EG

5.0 · Reviewed on Mar 5, 2023

Very well-designed and organized. Instructors are excellent. Some slides could be better; it would be great if that were fixed.

MN

4.0 · Reviewed on Mar 27, 2021

The course covers cutting-edge content and the exercises are well paced. Found the transformer lessons a bit difficult to understand.

SB

5.0 · Reviewed on Dec 31, 2020

One of the best courses I have ever taken. The course provides in-depth learning of transformers from the creators of Transformers.

MS

4.0 · Reviewed on Oct 2, 2020

Good course, covers everything I guess; the only downside for me is the Trax portion. I would've preferred if it was on TF maybe, but still a great job.

AT

4.0 · Reviewed on Oct 14, 2020

Great course content, but go for this only if you have done the previous courses and have some background knowledge; otherwise you won't be able to relate.

DB

5.0 · Reviewed on Jan 24, 2023

I learned a lot from this course, and the ungraded and graded problems are relevant to understanding and knowing how to build a transformer or a reformer from scratch

DS

5.0 · Reviewed on Apr 28, 2023

The course is so great, you should definitely check it out; it will give you deep insight into natural language processing.

VG

4.0 · Reviewed on Mar 9, 2024

It could have been better if the Transformers library from Hugging Face were explored more, and if topics like Vision Transformers and the use of Transformers for computer vision were covered.

LL

5.0 · Reviewed on Jun 22, 2021

This course is brilliant, covering SOTA models such as the Transformer and BERT. It would be better to have a Capstone Project. And entire projects can be downloaded easily.

All Reviews

Showing 20 of 263

Xu Ouyang · 1.0 · Reviewed on Sep 26, 2020
Lucas Fernandes · 2.0 · Reviewed on Sep 27, 2020
Shikhin Mehrotra · 1.0 · Reviewed on Sep 28, 2020
Konstantinos Krommydas · 2.0 · Reviewed on Oct 5, 2020
Boris Kabakov · 1.0 · Reviewed on Sep 25, 2020
Ryan Baten · 2.0 · Reviewed on Oct 6, 2020
Vincent Fritsch · 1.0 · Reviewed on Nov 27, 2020
Ravi Shankar Karedla · 3.0 · Reviewed on Oct 6, 2020
D. Refaeli · 1.0 · Reviewed on Mar 22, 2021
Eitan Israeli · 2.0 · Reviewed on Oct 2, 2020
Han-Chung Lee · 2.0 · Reviewed on Oct 4, 2020
Jeremy Ong Chun Hooi · 5.0 · Reviewed on Oct 5, 2020
Paul Jay Ledbetter III · 1.0 · Reviewed on Nov 3, 2020
Muhammad Maiz Ghauri · 1.0 · Reviewed on Dec 5, 2020
Brooke Fujita · 1.0 · Reviewed on Nov 8, 2020
Siddharth Shukla · 1.0 · Reviewed on Sep 19, 2021
Logan Markewich · 3.0 · Reviewed on Apr 15, 2021
Jesús Díaz Martín · 2.0 · Reviewed on Nov 11, 2020
Jorge Antonio Chan-Lau · 3.0 · Reviewed on Oct 28, 2020
Raviteja Reddy Ganta · 3.0 · Reviewed on Oct 14, 2020