Fundamentals of Hadoop
In this free course, study the components of Apache Hadoop and learn about big data. Also see how to use HDFS, MapReduce, Sqoop, Pig, Oozie, Hive and much more.
In this short course, you will be introduced to the components and tools of Apache Hadoop. Learn how to store and process large datasets ranging in size from gigabytes to petabytes. The course covers the architecture of the Hadoop Distributed File System (HDFS), data processing using MapReduce, and importing and exporting data using Sqoop. It also includes a section of practical knowledge and hands-on activities.
What You Will Learn In This Free Course
Hadoop Components and Tools
In this module, you will be introduced to Apache Hadoop and its components. You will learn the HDFS architecture and perform data processing using MapReduce. You will also learn how to import and export data using Sqoop, create and execute Pig Latin scripts to perform ETL (extract, transform, load) operations, and create and use Oozie workflows.
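To give a feel for the MapReduce model covered in this module, here is a minimal sketch of the map, shuffle, and reduce phases as plain Python functions. This is an illustrative simulation of the programming model only, not the Hadoop Java API; the function names and the word-count task are chosen for the example.

```python
from collections import defaultdict

def map_phase(documents):
    """Map: emit a (word, 1) pair for every word in every document."""
    for doc in documents:
        for word in doc.lower().split():
            yield (word, 1)

def shuffle_phase(pairs):
    """Shuffle: group all intermediate values by key, as Hadoop does
    between the map and reduce stages."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Reduce: combine the grouped values; here, sum the counts per word."""
    return {word: sum(counts) for word, counts in grouped.items()}

docs = ["hadoop stores big data", "mapreduce processes big data"]
counts = reduce_phase(shuffle_phase(map_phase(docs)))
print(counts["big"])     # 2
print(counts["hadoop"])  # 1
```

In a real Hadoop job, the map and reduce functions run in parallel across the cluster and the shuffle is handled by the framework; the logic of each phase, however, is the same as in this sketch.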
Course assessment