Learner reviews and feedback for Transformer Models and BERT Model by Google Cloud
Course Overview
Showing 25 of 27 reviews for Transformer Models and BERT Model
By Swastik N
•Jun 29, 2023
This course should never have been published to Coursera in this state; at best it should have been a free YouTube video. What could be done better:
- Make it a full course: start by introducing sequence modelling and why an attention mechanism is used over something like an LSTM.
- Introduce concepts such as tokenization, vectorization, etc.
- Describe the Transformer model in detail: take a simple 1-encoder / 1-decoder block and explain how it works on a toy example dataset.
- Teach an implementation of the toy example in code.
- Introduce BERT with a clear statement of what it aims to solve.
- Then cover the material discussed in the current course.
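The review above asks for a coded toy example of the attention mechanism. Purely as an illustration (not course material), here is a minimal NumPy sketch of scaled dot-product attention, the core operation inside a Transformer block; the shapes and the random toy "dataset" are assumptions for demonstration only.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of each query to each key
    # Numerically stable row-wise softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# Toy input: 3 tokens, each with a 4-dimensional embedding
np.random.seed(0)
X = np.random.randn(3, 4)

# Self-attention: queries, keys, and values all come from the same input
out, w = scaled_dot_product_attention(X, X, X)
print(out.shape)        # (3, 4): one context vector per token
print(w.sum(axis=-1))   # each row of attention weights sums to 1
```

In a full Transformer, X would first be projected through learned weight matrices into separate Q, K, and V; the sketch skips that step to keep the core computation visible.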
By Ankit M
•Sep 13, 2023
The videos do not properly explain how to set up a Google Cloud account or help with running the lab. Since it's just a couple of videos, I expected some solid lab work, but Google makes no effort to help learners set up their lab environment and get things running.
By A B
•Dec 21, 2023
The introduction is too quick and shallow, with no other material offered to make up for it. Also, it forces you to use Google Cloud.
By SRIKARA S
•Jan 4, 2025
The course needs to cover the 'why' and 'how' in addition to the 'what' of the topics. The content lacks depth of explanation, making it useful for little more than gaining familiarity with a bunch of technical terms. In fact, the AI coach in Coursera provided excellent explanations of complex topics such as the encoder, decoder, and attention, which the training video itself fails to provide.
By Minseok S
•Feb 19, 2025
The explanation of the Transformer model is too short; I expected a detailed one on the Transformer architecture. It is also not easy to follow the notebook, and I am still struggling to configure it.
By Naman A
•Jul 17, 2023
The course was amazing and gave me a good overview of the BERT model and concepts like encoding and decoding, but it is not for beginners :>
By Peter F
•Aug 25, 2024
I'm none the wiser.
By Waheed A
•Jun 25, 2024
The course's lab was highly interactive and effectively reinforced the material by providing hands-on experience with real-life datasets. The quiz was well-designed to test understanding and retention, making sure that key concepts were grasped thoroughly.
By Karen K A E
•Oct 24, 2025
Excellent course. The explanation was clear, practical, and well structured. The instructor has a strong command of the topic and makes complex concepts easy to understand. I fully recommend it to anyone who wants to go deeper and apply what they learn right away.
By Joaquin S
•Jan 31, 2025
Concise, challenging, thought-provoking. This course is an immersive look into the inner workings of Transformer models and the BERT model.
By Riddhimaan R
•Jul 22, 2025
It is a short but very effective video. The content is crisp and easy to understand if you have a decent understanding of neural networks.
By Nabih S
•Jun 25, 2023
A very clear and detailed explanation of Transformers, with a practical example of training a BERT model.
By Javi22 C
•Jul 18, 2025
Nice course
By Mr. M H H
•Nov 7, 2023
good
By Wingyan C
•Mar 13, 2024
Excellent and concise presentation of Transformer and BERT models. The course designer may consider adding programming assignments to illustrate the concepts and to reinforce student learning.
By Kashish B
•Mar 22, 2025
The lab no longer works end to end. I was able to run it until we started building the classification model; the TensorFlow code no longer compiles.
By Kian M L
•Oct 29, 2023
I need to use my own GCP account to run the lab. Otherwise, a very good introduction to get going with Transformers.
By Naveen K
•May 14, 2025
I am looking for free resources for experimentation, or a lighter model that can run on my laptop.
By 이성재
•Jul 21, 2025
Is this all?
By Andreas J
•Apr 17, 2025
The explanation isn't all that useful. It distills the basics of BERT into a very short video that mentions many details, but not in enough depth to understand them from the explanation provided. The course should either stick to higher-level information suited to a short video, or go into more detail and take the time needed to explain the concepts.
By NAYEEM I
•Sep 25, 2023
Personally, I don't think this qualifies as a short course; it is more like a superficial walkthrough video. It could have been a better short course. A better option might be to combine all the introductory courses from Google Cloud. Also, I wouldn't insist on a certificate for this course.
By Aniket P
•Feb 27, 2025
The course content was good, but the instructor sounded like a robot. I know instructors use notes to teach, but this one was just reading them out loud without providing any further explanation.
By Arjun V
•Sep 26, 2023
Very quick and not very detailed; it gives an overall view, if that is what it is meant to do.
By Tianhao Z
•Jan 16, 2024
The first lecture was OK, but there is not much detail about how the self-attention and FFN layers are constructed, or why. The second part is just hard to follow without any instructions on how to get the exact setup, and in the end, pulling everything from TF Hub is really just for illustration, without much to learn.
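The review above notes that the construction of the self-attention and FFN sublayers is not explained. For readers left wondering, the position-wise feed-forward network in a Transformer block is just two linear transformations with a ReLU in between, applied independently at each sequence position. A minimal NumPy sketch, with dimensions chosen arbitrarily for illustration:

```python
import numpy as np

def position_wise_ffn(x, W1, b1, W2, b2):
    """FFN(x) = max(0, x W1 + b1) W2 + b2, applied per position."""
    return np.maximum(0, x @ W1 + b1) @ W2 + b2

rng = np.random.default_rng(1)
d_model, d_ff, seq_len = 8, 32, 5  # toy sizes; real models use e.g. 512 and 2048

# Random weights stand in for learned parameters
W1, b1 = rng.standard_normal((d_model, d_ff)), np.zeros(d_ff)
W2, b2 = rng.standard_normal((d_ff, d_model)), np.zeros(d_model)

x = rng.standard_normal((seq_len, d_model))  # one embedding per position
y = position_wise_ffn(x, W1, b1, W2, b2)
print(y.shape)  # (5, 8): output has the same shape as the input
```

Because the same weights are applied at every position, the sublayer expands each token's representation to `d_ff` dimensions, applies the nonlinearity, and projects it back to `d_model`, leaving the sequence shape unchanged.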
By Gary S
•Dec 14, 2024
This course uses far too much jargon, without defining terms and without providing simple examples featuring neural-net architecture and matrix algebra. A complete waste of time.