IBM

Generative AI Engineering with LLMs Specialization

Advance your ML career with Gen AI and LLMs. Master the essentials of Gen AI engineering and large language models (LLMs) in just 3 months.

Sina Nazeri
Fateme Akbari
Wojciech 'Victor' Fulmyk

Instructors: Sina Nazeri

12,036 already enrolled

Included with Coursera Plus

Get in-depth knowledge of a subject

4.5 (286 reviews)

Intermediate level

Recommended experience

12 weeks to complete at 4 hours a week

Flexible schedule
Learn at your own pace

What you'll learn

  • In-demand, job-ready skills in gen AI, NLP apps, and large language models in just 3 months.

  • How to tokenize and load text data to train LLMs and deploy Skip-Gram, CBOW, Seq2Seq, RNN-based, and Transformer-based models with PyTorch.

  • How to employ frameworks and pre-trained models such as LangChain and Llama for training, developing, fine-tuning, and deploying LLM applications.

  • How to implement a question-answering NLP system by preparing, developing, and deploying NLP applications using RAG.

Details to know

Shareable certificate

Add to your LinkedIn profile

Taught in English

See how employees at top companies are mastering in-demand skills

Logos of Petrobras, TATA, Danone, Capgemini, P&G, and L'Oreal

Advance your subject-matter expertise

  • Learn in-demand skills from university and industry experts
  • Master a subject or tool with hands-on projects
  • Develop a deep understanding of key concepts
  • Earn a career certificate from IBM

Specialization - 7 course series

What you'll learn

  • Differentiate between generative AI architectures and models, such as RNNs, transformers, VAEs, GANs, and diffusion models

  • Describe how LLMs, such as GPT, BERT, BART, and T5, are applied in natural language processing tasks

  • Implement tokenization to preprocess raw text using NLP libraries like NLTK, spaCy, BertTokenizer, and XLNetTokenizer

  • Create an NLP data loader in PyTorch that handles tokenization, numericalization, and padding for text datasets (a minimal sketch follows below)
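
As an illustration of the data-loader objective above, here is a minimal PyTorch sketch that tokenizes a toy corpus, numericalizes it against a small vocabulary, and pads each batch. The corpus, vocabulary, and whitespace tokenizer are placeholder assumptions, not course materials.

```python
# Minimal NLP data loader sketch: tokenization, numericalization, and padding.
import torch
from torch.nn.utils.rnn import pad_sequence
from torch.utils.data import DataLoader, Dataset

corpus = ["the cat sat on the mat", "dogs chase cats"]

# Build a word-level vocabulary; index 0 is reserved for padding.
vocab = {"<pad>": 0, "<unk>": 1}
for sentence in corpus:
    for token in sentence.split():
        vocab.setdefault(token, len(vocab))

class TextDataset(Dataset):
    def __init__(self, sentences):
        self.sentences = sentences

    def __len__(self):
        return len(self.sentences)

    def __getitem__(self, idx):
        # Tokenize (whitespace split) and numericalize.
        tokens = self.sentences[idx].split()
        ids = [vocab.get(t, vocab["<unk>"]) for t in tokens]
        return torch.tensor(ids, dtype=torch.long)

def collate(batch):
    # Pad variable-length sequences to the longest sequence in the batch.
    return pad_sequence(batch, batch_first=True, padding_value=vocab["<pad>"])

loader = DataLoader(TextDataset(corpus), batch_size=2, collate_fn=collate)
for batch in loader:
    print(batch.shape)  # (batch_size, max_seq_len)
```

In a real exercise the whitespace split would typically be replaced by a library tokenizer such as NLTK, spaCy, or BertTokenizer, as listed in the objectives above.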

Skills you'll gain

Large Language Modeling, Generative AI, Natural Language Processing, Data Processing, Prompt Engineering, Data Pipelines, Artificial Intelligence, Text Mining, Deep Learning, PyTorch (Machine Learning Library)

What you'll learn

  • Explain how one-hot encoding, bag-of-words, embeddings, and embedding bags transform text into numerical features for NLP models

  • Implement Word2Vec models using CBOW and Skip-gram architectures to generate contextual word embeddings (see the sketch after this list)

  • Develop and train neural network-based language models using statistical N-Grams and feedforward architectures

  • Build sequence-to-sequence models with encoder–decoder RNNs for tasks such as machine translation and sequence transformation
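
The Word2Vec item above mentions the CBOW architecture; the sketch below is one minimal way to express it in PyTorch. The vocabulary size, embedding dimension, window size, and random indices are arbitrary stand-ins for real training data.

```python
# Compact CBOW sketch: average the context-word embeddings, predict the center word.
import torch
import torch.nn as nn

class CBOW(nn.Module):
    def __init__(self, vocab_size, embed_dim=100):
        super().__init__()
        self.embeddings = nn.Embedding(vocab_size, embed_dim)
        self.linear = nn.Linear(embed_dim, vocab_size)

    def forward(self, context_ids):
        # context_ids: (batch, 2 * window) -> average the context embeddings
        averaged = self.embeddings(context_ids).mean(dim=1)
        return self.linear(averaged)  # logits over the vocabulary

vocab_size = 5000
model = CBOW(vocab_size)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# One illustrative training step on random indices standing in for real data.
context = torch.randint(0, vocab_size, (32, 4))   # two context words on each side
center = torch.randint(0, vocab_size, (32,))
loss = criterion(model(context), center)
loss.backward()
optimizer.step()
```

Skip-gram reverses the direction: the center word's embedding is used to predict each context word.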

Skills you'll gain

Natural Language Processing, PyTorch (Machine Learning Library), Artificial Neural Networks, Data Ethics, Statistical Methods, Feature Engineering, Text Mining, Generative AI, Large Language Modeling, Deep Learning

What you'll learn

  • Explain the role of attention mechanisms in transformer models for capturing contextual relationships in text

  • Describe the differences in language modeling approaches between decoder-based models like GPT and encoder-based models like BERT

  • Implement key components of transformer models, including positional encoding, attention mechanisms, and masking, using PyTorch (a small example follows this list)

  • Apply transformer-based models for real-world NLP tasks, such as text classification and language translation, using PyTorch and Hugging Face tools
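
As a small illustration of the attention and masking components listed above, here is a hedged PyTorch sketch of scaled dot-product attention with a causal (decoder-style) mask; the shapes and sizes are arbitrary.

```python
# Scaled dot-product attention with an optional mask.
import math
import torch

def scaled_dot_product_attention(q, k, v, mask=None):
    # q, k, v: (batch, seq_len, d_k)
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)      # (batch, seq, seq)
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = torch.softmax(scores, dim=-1)
    return weights @ v, weights

batch, seq_len, d_k = 2, 5, 16
q = torch.randn(batch, seq_len, d_k)
k = torch.randn(batch, seq_len, d_k)
v = torch.randn(batch, seq_len, d_k)

# Causal mask: position i may only attend to positions <= i, the masking
# scheme used by GPT-style decoder models mentioned above.
causal_mask = torch.tril(torch.ones(seq_len, seq_len))
output, attn = scaled_dot_product_attention(q, k, v, mask=causal_mask)
print(output.shape, attn.shape)  # torch.Size([2, 5, 16]) torch.Size([2, 5, 5])
```

Encoder-based models such as BERT drop the causal mask and attend in both directions, which is the key contrast with decoder-based models described above.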

Skills you'll gain

PyTorch (Machine Learning Library), Large Language Modeling, Natural Language Processing, Text Mining, Generative AI, Applied Machine Learning

What you'll learn

  • Sought-after, job-ready skills businesses need for working with transformer-based LLMs in generative AI engineering

  • How to perform parameter-efficient fine-tuning (PEFT) using methods like LoRA and QLoRA to optimize model training (see the sketch below)

  • How to use pretrained transformer models for language tasks and fine-tune them for specific downstream applications

  • How to load models, run inference, and train models using the Hugging Face and PyTorch frameworks
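
The PEFT item above can be sketched with the Hugging Face `peft` and `transformers` libraries roughly as follows. The DistilBERT checkpoint and its `q_lin`/`v_lin` target modules are assumptions for illustration; adjust them to whichever base model the labs actually use.

```python
# LoRA sketch: wrap a pretrained classifier so only low-rank adapters train.
from transformers import AutoModelForSequenceClassification, AutoTokenizer
from peft import LoraConfig, get_peft_model

model_name = "distilbert-base-uncased"  # assumed checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
base_model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

lora_config = LoraConfig(
    r=8,                                # rank of the low-rank update matrices
    lora_alpha=16,                      # scaling factor
    lora_dropout=0.05,
    target_modules=["q_lin", "v_lin"],  # DistilBERT query/value projections (assumed)
    task_type="SEQ_CLS",
)

model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()      # only the LoRA adapters are trainable

# Inference works exactly as with the base model.
inputs = tokenizer("LoRA keeps most of the weights frozen.", return_tensors="pt")
logits = model(**inputs).logits
```

QLoRA follows the same pattern but loads the base model in 4-bit quantized form before attaching the adapters.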

Skills you'll gain

PyTorch (Machine Learning Library), Generative AI, Performance Tuning, Natural Language Processing, Large Language Modeling, Prompt Engineering

What you'll learn

  • In-demand generative AI engineering skills in fine-tuning LLMs that employers are actively seeking

  • Instruction tuning and reward modeling using Hugging Face, plus understanding LLMs as policies and applying RLHF techniques

  • Direct preference optimization (DPO) with partition function and Hugging Face, including how to define optimal solutions to DPO problems (a sketch of the DPO loss follows this list)

  • Using proximal policy optimization (PPO) with Hugging Face to build scoring functions and tokenize datasets for fine-tuning
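
To make the DPO item above concrete, the sketch below computes the standard DPO loss in plain PyTorch from per-response log-probabilities. The random tensors are placeholders for sums of per-token log-probs from a policy and a frozen reference model; course labs may instead use a ready-made Hugging Face trainer.

```python
# DPO loss sketch: push the policy to prefer the "chosen" response over the
# "rejected" one, relative to a frozen reference model.
import torch
import torch.nn.functional as F

def dpo_loss(policy_chosen_logps, policy_rejected_logps,
             ref_chosen_logps, ref_rejected_logps, beta=0.1):
    # Implicit rewards: beta-scaled log-ratios of policy vs. reference.
    chosen_rewards = beta * (policy_chosen_logps - ref_chosen_logps)
    rejected_rewards = beta * (policy_rejected_logps - ref_rejected_logps)
    # Negative log-sigmoid of the reward margin, averaged over the batch.
    return -F.logsigmoid(chosen_rewards - rejected_rewards).mean()

batch = 4
loss = dpo_loss(
    policy_chosen_logps=torch.randn(batch, requires_grad=True),
    policy_rejected_logps=torch.randn(batch, requires_grad=True),
    ref_chosen_logps=torch.randn(batch),
    ref_rejected_logps=torch.randn(batch),
)
loss.backward()  # gradients flow only into the policy log-probabilities
print(loss.item())
```

PPO-based RLHF, by contrast, keeps an explicit reward model and scoring function in the loop rather than folding preferences directly into the loss.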

Skills you'll gain

Large Language Modeling, Generative AI, Reinforcement Learning, Natural Language Processing, Performance Tuning, Prompt Engineering

What you'll learn

  • In-demand, job-ready skills businesses seek for building AI agents using RAG and LangChain in just 8 hours

  • How to apply the fundamentals of in-context learning and advanced prompt engineering to improve prompt design

  • Key LangChain concepts, including tools, components, chat models, chains, and agents

  • How to build AI applications by integrating RAG, PyTorch, Hugging Face, LLMs, and LangChain technologies (see the sketch after this list)
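
As a rough illustration of in-context prompting and LangChain chains, here is a hedged sketch using `langchain-core` with an OpenAI chat model. The package split, model name, and API-key setup are assumptions; the course labs may wire the same chain structure to a different provider.

```python
# Few-shot prompt template piped into a chat model with LangChain's
# expression syntax. Assumes langchain-core, langchain-openai, and an
# OPENAI_API_KEY in the environment.
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_openai import ChatOpenAI

prompt = ChatPromptTemplate.from_messages([
    ("system", "You classify customer messages as 'complaint' or 'praise'."),
    # One in-context example to steer the model.
    ("human", "The package arrived two weeks late."),
    ("ai", "complaint"),
    ("human", "{message}"),
])

llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)  # assumed model name

# Chain: prompt -> chat model -> plain-string output.
chain = prompt | llm | StrOutputParser()
print(chain.invoke({"message": "Support resolved my issue in minutes!"}))
```

Agents build on the same pieces by letting the model decide which registered tools to call instead of following a fixed chain.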

Skills you'll gain

Natural Language Processing, Prompt Engineering, Generative AI, LLM Application, Artificial Intelligence, Large Language Modeling, Generative AI Agents

What you'll learn

  • Gain practical experience building your own real-world generative AI application to showcase in interviews

  • Create and configure a vector database to store document embeddings and develop a retriever to fetch relevant segments based on user queries (see the sketch after this list)

  • Set up a simple Gradio interface for user interaction and build a question-answering bot using LangChain and a large language model (LLM)
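
A hedged sketch of the capstone pieces described above: a Chroma vector store with an embedding model, a retriever, and a minimal Gradio interface. The packages (`langchain-chroma`, `langchain-huggingface`, `gradio`), the embedding checkpoint, and the toy documents are assumptions; the real project wires the retriever into a full LangChain question-answering chain with an LLM.

```python
# Vector store + retriever + Gradio front end for a mini RAG demo.
import gradio as gr
from langchain_chroma import Chroma
from langchain_core.documents import Document
from langchain_huggingface import HuggingFaceEmbeddings

docs = [
    Document(page_content="LoRA fine-tunes low-rank adapter matrices."),
    Document(page_content="RAG retrieves documents to ground LLM answers."),
]

embeddings = HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2")
vectordb = Chroma.from_documents(docs, embeddings)          # store embeddings
retriever = vectordb.as_retriever(search_kwargs={"k": 2})   # fetch top-2 chunks

def answer(question):
    # Retrieve the most relevant chunks; a full bot would pass them, together
    # with the question, to an LLM chain to generate the final answer.
    context = retriever.invoke(question)
    return "\n\n".join(d.page_content for d in context)

demo = gr.Interface(fn=answer, inputs="text", outputs="text",
                    title="Mini RAG retriever demo")

if __name__ == "__main__":
    demo.launch()
```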

Skills you'll gain

User Interface (UI), Generative AI, Natural Language Processing, Prompt Engineering, Database Management Systems, Data Storage Technologies, Document Management, LLM Application

Earn a career certificate

Add this credential to your LinkedIn profile, resume, or CV. Share it on social media and in your performance review.

Instructors

Sina Nazeri
IBM
2 Courses • 50,739 learners

Fateme Akbari
IBM
4 Courses • 27,585 learners

Wojciech 'Victor' Fulmyk
IBM
8 Courses • 83,012 learners

Offered by

IBM

Why people choose Coursera for their career

Felipe M.
Learner since 2018
"To be able to take courses at my own pace and rhythm has been an amazing experience. I can learn whenever it fits my schedule and mood."

Jennifer J.
Learner since 2020
"I directly applied the concepts and skills I learned from my courses to an exciting new project at work."

Larry W.
Learner since 2021
"When I need courses on topics that my university doesn't offer, Coursera is one of the best places to go."

Chaitanya A.
"Learning isn't just about being better at your job: it's so much more than that. Coursera allows me to learn without limits."
Coursera Plus

Open new doors with Coursera Plus

Unlimited access to 10,000+ world-class courses, hands-on projects, and job-ready certificate programs - all included in your subscription

Advance your career with an online degree

Earn a degree from world-class universities - 100% online

Join over 3,400 global companies that choose Coursera for Business

Upskill your employees to excel in the digital economy

Frequently asked questions