The course “HDFS Architecture and Programming” offers a comprehensive understanding of the Hadoop Distributed File System (HDFS) architecture, components, and advanced programming techniques. You will gain practical experience in setting up and configuring Hadoop for Java development, while mastering key concepts such as file and directory CRUD operations, data compression, and serialization. By the end of the course, you will be proficient in using HDFS to handle large-scale data processing, enabling you to build scalable, high-availability solutions.

HDFS Architecture and Programming
This course is part of the Big Data Processing Using Hadoop specialization.
Access provided by the Coursera Learning Team
What you will learn
Understand HDFS architecture, components, and how it ensures scalability and availability for big data processing.
Learn to configure Hadoop for Java programming and perform file CRUD operations using HDFS APIs.
Master advanced HDFS programming concepts like compression, serialization, and working with specialized file structures like Sequence and Map files.
Skills you will gain
Details to know

Add to your LinkedIn profile
9 assignments
See how employees at top companies master in-demand skills

Build expertise in a specific domain
- Learn new concepts from industry experts
- Gain a foundational understanding of a subject or tool
- Develop job-relevant skills through hands-on projects
- Earn a shareable career certificate

This course has 4 modules
This course provides a comprehensive understanding of Hadoop Distributed File System (HDFS) architecture and its key components. Students will gain hands-on experience with HDFS, learning how to set up Java programming environments and configure Hadoop. The course covers essential topics such as the HDFS programming model, file and directory CRUD operations, and compression techniques. You will also explore serialization, deserialization, and specialized file structures like Sequence and Map Files. By the end of the course, you will be equipped to leverage HDFS for scalable, highly available big data solutions.
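To give a flavor of the serialization topics covered, the sketch below mimics the two-method pattern of Hadoop's `org.apache.hadoop.io.Writable` interface using only the JDK: each record writes its fields to a `DataOutput` and reads them back from a `DataInput` in the same order. The `Writable` interface and `WordCount` record here are illustrative stand-ins, not Hadoop's own classes.

```java
import java.io.*;

// JDK-only stand-in for Hadoop's Writable interface: the real
// org.apache.hadoop.io.Writable declares these same two methods.
interface Writable {
    void write(DataOutput out) throws IOException;
    void readFields(DataInput in) throws IOException;
}

// Illustrative record: a (word, count) pair.
class WordCount implements Writable {
    String word;
    int count;

    public void write(DataOutput out) throws IOException {
        out.writeUTF(word);   // length-prefixed modified-UTF-8 string
        out.writeInt(count);  // 4-byte big-endian int
    }

    public void readFields(DataInput in) throws IOException {
        word = in.readUTF();  // must mirror write() field order exactly
        count = in.readInt();
    }
}

public class WritableDemo {
    public static void main(String[] args) throws IOException {
        WordCount original = new WordCount();
        original.word = "hadoop";
        original.count = 42;

        // Serialize to an in-memory buffer (in HDFS this would be a stream).
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        original.write(new DataOutputStream(buf));

        // Deserialize into a fresh object.
        WordCount restored = new WordCount();
        restored.readFields(new DataInputStream(
                new ByteArrayInputStream(buf.toByteArray())));

        System.out.println(restored.word + "=" + restored.count);
    }
}
```

Because both methods walk the fields in the same order, the round trip reconstructs the record exactly; Hadoop uses this compact, schema-free encoding for keys and values on disk and over the network.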
What's included
2 readings
In this module, we will cover the working model and architecture of Hadoop Distributed File System (HDFS) 1.0, along with the capabilities and limitations of that architecture.
What's included
6 videos, 4 readings, 3 assignments
In this module, we will cover HDFS programming concepts, the HDFS API, and the steps to write an HDFS client program that performs CRUD (Create, Read, Update, and Delete) operations on files.
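A client program of the kind this module describes can be sketched as follows. This is a minimal illustration, assuming the Hadoop client jars are on the classpath and `fs.defaultFS` points at a running NameNode; the path `/user/demo/hello.txt` is purely illustrative, and it will not run without a Hadoop installation.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Sketch of a minimal HDFS CRUD client (illustrative, not the
// course's reference implementation).
public class HdfsCrudSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration(); // reads core-site.xml etc.
        FileSystem fs = FileSystem.get(conf);
        Path file = new Path("/user/demo/hello.txt");

        // Create: write a new file (true = overwrite if it exists).
        try (FSDataOutputStream out = fs.create(file, true)) {
            out.write("hello hdfs\n".getBytes(StandardCharsets.UTF_8));
        }

        // Read: stream the file contents back.
        try (FSDataInputStream in = fs.open(file);
             BufferedReader reader = new BufferedReader(
                     new InputStreamReader(in, StandardCharsets.UTF_8))) {
            System.out.println(reader.readLine());
        }

        // Update: HDFS files are write-once, so "update" means append
        // (requires dfs.support.append to be enabled on the cluster).
        try (FSDataOutputStream out = fs.append(file)) {
            out.write("appended line\n".getBytes(StandardCharsets.UTF_8));
        }

        // Delete: remove the file (false = non-recursive).
        fs.delete(file, false);
        fs.close();
    }
}
```

Note the asymmetry with a local filesystem: HDFS has no in-place overwrite of existing bytes, so the update step is expressed as an append rather than a seek-and-write.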
What's included
6 videos, 5 readings, 3 assignments
In this module, we will cover HDFS advanced programming concepts, such as CRUD on directories, compression, serialization and deserialization, and file-based data structures like sequence files.
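The file-based data structures mentioned above can be illustrated with a short SequenceFile round trip. This sketch assumes the Hadoop client jars are on the classpath; with a default (local) configuration it writes to the local filesystem, and against a cluster it writes to HDFS. The file name `pairs.seq` is illustrative.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Text;

// Sketch: writing and reading a SequenceFile of (Text, IntWritable)
// key/value records (illustrative, not the course's reference code).
public class SequenceFileSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Path path = new Path("pairs.seq");

        // Write records; BLOCK compression groups many records per
        // compressed block, which usually gives the best ratio.
        try (SequenceFile.Writer writer = SequenceFile.createWriter(conf,
                SequenceFile.Writer.file(path),
                SequenceFile.Writer.keyClass(Text.class),
                SequenceFile.Writer.valueClass(IntWritable.class),
                SequenceFile.Writer.compression(
                        SequenceFile.CompressionType.BLOCK))) {
            writer.append(new Text("alpha"), new IntWritable(1));
            writer.append(new Text("beta"), new IntWritable(2));
        }

        // Read the records back in insertion order, reusing the same
        // key/value objects to avoid per-record allocation.
        try (SequenceFile.Reader reader = new SequenceFile.Reader(conf,
                SequenceFile.Reader.file(path))) {
            Text key = new Text();
            IntWritable value = new IntWritable();
            while (reader.next(key, value)) {
                System.out.println(key + "\t" + value);
            }
        }
    }
}
```

A MapFile, also covered in this module, is essentially a sorted SequenceFile plus an index, which adds keyed lookup on top of this sequential access pattern.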
What's included
6 videos, 5 readings, 3 assignments
Earn a career certificate
Add this certificate to your LinkedIn profile, resume, or CV. Share it on social media and in performance reviews.
Instructor
