
Learner Reviews & Feedback for Generative AI with Large Language Models by DeepLearning.AI

4.8
3,521 ratings

About the Course

In Generative AI with Large Language Models (LLMs), you'll learn the fundamentals of how generative AI works and how to deploy it in real-world applications. By taking this course, you'll learn to:

- Deeply understand generative AI, describing the key steps in a typical LLM-based generative AI lifecycle, from data gathering and model selection to performance evaluation and deployment
- Describe in detail the transformer architecture that powers LLMs, how they're trained, and how fine-tuning enables LLMs to be adapted to a variety of specific use cases
- Use empirical scaling laws to optimize the model's objective function across dataset size, compute budget, and inference requirements
- Apply state-of-the-art training, tuning, inference, tools, and deployment methods to maximize the performance of models within the specific constraints of your project
- Discuss the challenges and opportunities that generative AI creates for businesses after hearing stories from industry researchers and practitioners

Developers who have a good foundational understanding of how LLMs work, as well as the best practices behind training and deploying them, will be able to make good decisions for their companies and more quickly build working prototypes. This course will support learners in building practical intuition about how to best utilize this exciting new technology.

This is an intermediate course, so you should have some experience coding in Python to get the most out of it. You should also be familiar with the basics of machine learning, such as supervised and unsupervised learning, loss functions, and splitting data into training, validation, and test sets. If you have taken the Machine Learning Specialization or Deep Learning Specialization from DeepLearning.AI, you'll be ready to take this course and dive deeper into the fundamentals of generative AI.

Top Reviews

AB

Nov 1, 2023

Very insightful, in-depth, and well-explained course that provides a solid explanation of the technical aspects, economic considerations, and project lifecycle of AI/LLM-powered solutions

SM

Oct 12, 2024

Pretty good overview for Product Managers and leaders who are interested in learning about Generative AI with hands-on labs that are not too detailed, yet help you develop the intuition.

Filter by:

651 - 675 of 843 Reviews for Generative AI with Large Language Models

By Maciej J

Jan 9, 2024

Awesome!

By David G G G

Jun 29, 2023

Amazing!

By akula j

Sep 17, 2024

helpful

By Abdullah B

Mar 20, 2024

Perfect

By Aminah N

Dec 19, 2024

useful

By Vipul C H

Nov 30, 2023

thanks

By Praveen H

Sep 24, 2023

superb

By Justin H

Sep 1, 2023

Brutal

By Николай Б

Jul 30, 2023

Great

By zaidiabbas786 A

Apr 21, 2025

teert

By Adarsh51

Mar 2, 2025

Nice!

By Egies R F

Feb 23, 2025

good

By Simone L

Aug 21, 2023

Super

By mehmet o

Aug 5, 2023

great

By SUBHADEEP C

Oct 25, 2025

good

By Pooja S K

Sep 21, 2025

Good

By Afiga

Sep 11, 2025

Good

By ABEER H M

Aug 27, 2024

Thank you

By Khaoula E

Mar 30, 2024

good

By Buri B

Mar 3, 2024

nice

By Nivrutti R P

Feb 25, 2024

good

By zed a

Jan 24, 2024

good

By Padma M

Dec 10, 2023

good

By Fraz

Dec 10, 2023

All the instructors were good and the delivery was mostly excellent; however, the course was a bit too short and could be improved in several ways. There were very few quizzes in the video lectures, and the ones that were present were too easy or obvious (they did not require much thinking). There should be good, quality quizzes in most video lessons, similar to the OG ML course by Andrew Ng. The inline quizzes in videos help "reinforce" the learning in humans. This is proven by the research yet to be carried out :D Another aspect that I did not like was the Jupyter notebooks used to run the exercises: all solutions were already provided, and it does not help in learning the concepts if all we have to do is press Shift+Enter and merely observe the code and results. Actual learning requires some trial and error as part of the exercises; once again, the OG ML course by Andrew Ng did a good job of accomplishing this with Octave exercises.

By Deleted A

Nov 2, 2023

A delightful and very up-to-date (most of the references have been published in the last 2 years) overview of LLMs with hands-on lab sessions in Python. Prompt engineering, zero/one/few-shot inference, instruction fine-tuning (FT), parameter-efficient FT (PEFT), Low-Rank Adaptation (LoRA), RL from human feedback, program-aided language (PAL) models, retrieval-augmented generation (RAG), etc. In short, everything you need to know about the state of the art in LLMs in 2023. There are a couple of things that disappointed me, though. The first is that, unlike other Coursera courses, there isn't any discussion forum to exchange ideas with other students or post questions. The second is that there isn't any clear contact (either from the course's instructors or from Coursera) to ask questions regarding problems with the AWS platform when working on the labs.
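To give a rough idea of the kind of PEFT/LoRA lab exercise this review describes, here is a minimal sketch using Hugging Face's transformers and peft libraries. It is not the course's actual lab code; the FLAN-T5 checkpoint and the hyperparameter values are assumptions chosen only for illustration.

# Minimal LoRA fine-tuning setup (illustrative sketch, not the course's lab code).
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
from peft import LoraConfig, TaskType, get_peft_model

model_name = "google/flan-t5-base"   # assumed checkpoint; any seq2seq model works similarly
tokenizer = AutoTokenizer.from_pretrained(model_name)
base_model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# LoRA freezes the base weights and trains small low-rank adapter matrices
# injected into the attention projections, so only a tiny fraction of the
# parameters is updated during fine-tuning.
lora_config = LoraConfig(
    r=16,                        # rank of the low-rank update matrices (assumed value)
    lora_alpha=32,               # scaling factor applied to the adapter output (assumed value)
    target_modules=["q", "v"],   # T5 attention projection module names
    lora_dropout=0.05,
    bias="none",
    task_type=TaskType.SEQ_2_SEQ_LM,
)
peft_model = get_peft_model(base_model, lora_config)
peft_model.print_trainable_parameters()   # typically well under 1% of all parameters

The adapted model can then be trained with the usual transformers Trainer; at inference time the adapter weights can either be merged into the base model or kept separate and swapped per task.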