
Learner reviews and feedback for Generative AI Language Modeling with Transformers by IBM

4.5
118 ratings

About the Course

This course provides a practical introduction to using transformer-based models for natural language processing (NLP) applications. You will learn to build and train models for text classification using encoder-based architectures like Bidirectional Encoder Representations from Transformers (BERT), and explore core concepts such as positional encoding, word embeddings, and attention mechanisms. The course covers multi-head attention, self-attention, and causal language modeling with GPT for tasks like text generation and translation. You will gain hands-on experience implementing transformer models in PyTorch, including pretraining strategies such as masked language modeling (MLM) and next sentence prediction (NSP). Through guided labs, you’ll apply encoder and decoder models to real-world scenarios. This course is designed for learners interested in generative AI engineering and requires prior knowledge of Python, PyTorch, and machine learning. Enroll now to build your skills in NLP with transformers!
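
For readers who want a concrete sense of the attention mechanism the overview refers to, here is a minimal sketch in plain PyTorch (not taken from the course materials; the function name, tensor shapes, and toy inputs are illustrative assumptions) of scaled dot-product self-attention with an optional causal mask of the kind used in GPT-style language modeling:

import torch
import torch.nn.functional as F

def scaled_dot_product_attention(query, key, value, mask=None):
    # query, key, value: tensors of shape (batch, seq_len, d_model)
    d_k = query.size(-1)
    # Similarity of every query position to every key position, scaled by sqrt(d_k)
    scores = torch.matmul(query, key.transpose(-2, -1)) / d_k ** 0.5
    if mask is not None:
        # A causal (lower-triangular) mask hides future tokens, as in GPT
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = F.softmax(scores, dim=-1)           # attention weights
    return torch.matmul(weights, value), weights  # weighted sum of the values

# Toy usage: one "sentence" of 4 tokens with 8-dimensional embeddings,
# using the same tensor as query, key, and value (self-attention)
x = torch.randn(1, 4, 8)
causal_mask = torch.tril(torch.ones(4, 4))
out, attn = scaled_dot_product_attention(x, x, x, mask=causal_mask)
print(out.shape, attn.shape)  # (1, 4, 8) and (1, 4, 4)

Multi-head attention, also covered in the course, runs several such attention computations in parallel over learned lower-dimensional projections and concatenates the results.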

Top reviews

RR

Oct 10, 2024

Once again, great content and not that great documentation (printable cheat sheets, no slides, etc.). Documentation is essential to review course content in the future. Alas!

AB

Dec 29, 2024

This course gives me a wide picture of what transformers can be.

Filter by:

1 - 23 of 23 Reviews for Generative AI Language Modeling with Transformers

By Ohad H

Feb 2, 2025

The narration is poor. Instead of an expert lecturer, a narrator reads the text without understanding its meaning. Many fundamental terms are left unexplained.

By raul v r

Oct 11, 2024

Once again, great content and not that great documentation (printable cheat sheets, no slides, etc.). Documentation is essential to review course content in the future. Alas!

By Deleted A

Oct 22, 2024

It is an excellent specialisation, except the pace of the speaker is very fast. It is difficult to understand, and it sounds very artificial.

By Alexandre T

Feb 22, 2025

Fantastic class but it takes WAY MORE TIME than what is reported, unless you just don't do the labs or only read them casually at a high level. Going in-depth in the labs and doing the necessary work to understand all key concepts and code will easily take you 3-4x more time, depending on your current level of expertise. Example: a lab of 30 minutes has a length of 15 A4 pages when you print it. Now imagine all these pages contain key notions & code. Superb class, but the required time is highly underestimated (like most of the IBM Generative AI Engineering certification).

By vikky b

Nov 17, 2024

Need assistance from humans, which seems lacking; a coach can give guidance, but not to the extent of a human touch.

By XUETING W

Dec 2, 2024

Good content but I truly cannot understand...

By Makhlouf M

Apr 6, 2025

Great course! Clear explanations, solid structure, and just the right mix of theory and hands-on content. Thanks to Dr. Joseph Santarcangelo, Fateme Akbari, and Kang Wang for making complex concepts so accessible. Really enjoyed it and learned a lot about transformers and GenAI!

By Robert R

Sep 1, 2025

I loved this course. It is very informative and has a lot of examples. It will take some time to master all this information.

By José L C C

Sep 5, 2025

Excellent IBM course, a good opportunity to learn about AI.

By LO W

Nov 10, 2024

Got more familiar with transformers and their application in language.

By Ana A B

Dec 30, 2024

This course gives me a wide picture of what transformers can be.

By Muhammad A

Jan 17, 2025

Exceptional course and all the labs are industry related

By Harshit R

Jul 25, 2025

Honestly, I learnt a lot.

By 329_SUDIP C

Dec 2, 2024

Nice Course

By Purva T

Jul 26, 2024

good.

By Francesco D G

Dec 15, 2024

Maybe a little chaotic. Slides should be available.

By David C

Jul 27, 2025

Some labs are outdated, the contents are rushed and the assessments are inadequate. Nevertheless, it provides a good-enough broad and general picture.

By Mykola K

Apr 23, 2025

The course is interesting and challenging. The lab assignments should be divided into more parts. There's too much code to grasp in a single lab session, making it difficult to follow the task. A major drawback is the extremely long training time of the model in the lab work. For example, BERT took over an hour to train. During that time, it's easy to lose interest in continuing the course. Either the model needs to be simplified to train faster, or the performance of the environment running the Jupyter Notebook should be significantly improved.

By Kareem A

Sep 11, 2025

The course is too mechanical and jumps directly into deep topics without a smooth introduction to the background or concepts. It is hard to follow the sequence of ideas.

By Chenhao D

Sep 1, 2025

Too much detail squeezed into a short time.

By Conor J C

Jul 29, 2025

Found this a very difficult course to understand. I do not recommend this at all if you are not already a highly experienced professional in the field. Very badly explained, and impossible to keep up with. Way too much emphasis on technical lingo that, to the untrained ear, goes right over someone's head. Continuously had to google terms, but it did not help in making sense of it. Found it very off-putting to continue the course. Do not recommend at all.

By Mohammad D

Mar 26, 2025

It's one of the worst courses I've seen. I couldn't understand anything from their explanation and I had to resort to external resources to understand the topic (and I am already someone with ML background).

By Ethan K

Aug 27, 2025

This course is soooo boring. It feels like it's written by robots for robots. I want to see humans teaching the material and making it understandable, interesting, and relatable. This is just AI slop.