Selecting the appropriate architecture for a large language model (LLM) application is a critical decision for any technical team, influencing cost, performance, and security. The course "Design, Compare and Analyze LLM Architectures" is tailored for engineers, architects, and technical leads involved in these pivotal "build vs. buy" assessments, offering a structured approach to designing and justifying system architectures. Learners will sharpen their visual communication skills by creating sequence diagrams that illustrate the trade-offs between synchronous and asynchronous processing flows. The course also emphasizes strategic analysis of deployment options, comparing self-hosting an open-source model with using a managed API. Key skills include calculating Total Cost of Ownership (TCO), evaluating latency, and understanding data privacy implications, enabling participants to make informed, business-focused recommendations. By the end of this course, you will be able to confidently design, defend, and document your architectural choices to any stakeholder.

Design, Compare and Analyze LLM Architectures

Instructor: LearningMate
Access provided by New York State Department of Labor
What you'll learn
Design and justify LLM architectures by modeling system flows and analyzing self-hosting vs. managed API trade-offs.
Skills you'll gain
Details to know

Build domain expertise
- Learn new concepts from industry experts
- Gain a foundational understanding of a subject or tool
- Develop job-relevant skills with hands-on projects
- Earn a shareable career certificate

There are 2 modules in this course
This module introduces the critical role of visual modeling in system design. You will discover why sequence diagrams are essential for comparing synchronous and asynchronous architectures and learn the fundamental components of these diagrams. Finally, you will practice creating your own diagram to clarify complex LLM application flows and prevent the kinds of costly misunderstandings that plague projects without clear visual documentation.
What's included
2 videos, 1 reading, 1 assignment
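The synchronous vs. asynchronous trade-off that the module's sequence diagrams capture can also be sketched in a few lines of code. This is an illustrative mock only, not course material: the `mock_llm_call` functions and their delays are stand-ins for a real LLM API.

```python
import asyncio
import time

def mock_llm_call(prompt: str, delay: float = 0.1) -> str:
    """Stand-in for a blocking LLM API call (hypothetical)."""
    time.sleep(delay)
    return f"response to: {prompt}"

async def mock_llm_call_async(prompt: str, delay: float = 0.1) -> str:
    """Stand-in for a non-blocking LLM API call (hypothetical)."""
    await asyncio.sleep(delay)
    return f"response to: {prompt}"

def handle_sync(prompts):
    # Synchronous flow: each call blocks, so total latency is the sum.
    return [mock_llm_call(p) for p in prompts]

async def handle_async(prompts):
    # Asynchronous flow: calls overlap, so total latency is roughly
    # that of the slowest single call.
    return await asyncio.gather(*(mock_llm_call_async(p) for p in prompts))

prompts = ["a", "b", "c"]

t0 = time.perf_counter()
sync_results = handle_sync(prompts)
sync_elapsed = time.perf_counter() - t0

t0 = time.perf_counter()
async_results = asyncio.run(handle_async(prompts))
async_elapsed = time.perf_counter() - t0

print(f"sync: {sync_elapsed:.2f}s, async: {async_elapsed:.2f}s")
```

In a sequence diagram, the synchronous flow shows each request/response pair completing before the next begins, while the asynchronous flow shows the requests fanning out in parallel; this timing difference is exactly what the diagram makes visible to stakeholders.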
This module tackles the crucial "build vs. buy" decision that every technical leader faces. You will delve into the complex trade-offs between self-hosting an open-source LLM and using a managed API. The focus is on conducting a rigorous, business-aware analysis that balances Total Cost of Ownership (TCO), performance benchmarks, and critical data privacy considerations.
What's included
2 videos, 1 reading, 2 assignments
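A first-order version of the module's TCO comparison can be sketched numerically. All figures below (token volume, per-token price, GPU rate, ops overhead) are invented assumptions for illustration, not numbers from the course:

```python
def managed_api_monthly_cost(tokens_per_month: float, price_per_1k_tokens: float) -> float:
    """Usage-based cost of a managed API: scales linearly with traffic."""
    return tokens_per_month / 1000 * price_per_1k_tokens

def self_host_monthly_cost(gpu_hourly_rate: float, gpus: int, ops_monthly: float) -> float:
    """Fixed cost of self-hosting: GPU rental plus engineering/ops overhead."""
    hours_per_month = 730  # average hours in a month
    return gpu_hourly_rate * gpus * hours_per_month + ops_monthly

# Hypothetical scenario: 50M tokens/month at $0.002 per 1K tokens vs.
# 2 GPUs at $2/hour plus $3,000/month of ops time.
api_cost = managed_api_monthly_cost(50_000_000, 0.002)   # $100/month
hosted_cost = self_host_monthly_cost(2.0, 2, 3_000)      # $5,920/month

# Break-even volume: token traffic at which API spend matches the fixed
# self-hosting cost. Below this, the managed API is cheaper.
break_even_tokens = hosted_cost / 0.002 * 1000
print(f"API: ${api_cost:,.0f}/mo, self-host: ${hosted_cost:,.0f}/mo")
print(f"break-even at {break_even_tokens:,.0f} tokens/month")
```

The shape of the result is the point: managed APIs have near-zero fixed cost but linear growth with usage, while self-hosting is a large fixed cost that amortizes at high volume. A full TCO analysis would also weigh latency targets and data privacy constraints, which can dominate the pure cost comparison.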
Earn a career certificate
Add this credential to your LinkedIn profile, resume, or CV. Share it on social media and in your performance review.
¹ Some assignments in this course are graded by AI. For these assignments, your data will be used in accordance with the Coursera Privacy Notice.