Administrator Training for Apache Hadoop, Kuala Lumpur, Malaysia

Administrator Training For Apache Hadoop Course Description

Duration: 4 days (32 hours), RM 3,200

This four-day administrator training for Apache Hadoop provides participants with a comprehensive understanding of all the steps necessary to operate and maintain a Hadoop cluster. From installation and configuration through load balancing and tuning, this Galactic Solutions training course is the best preparation for the real-world challenges faced by Hadoop administrators.

Galactic Solutions

B4-2-40, Parklane OUG, Jalan 1/152, Taman OUG,
58200 Kuala Lumpur, Malaysia
Telephone: +60 3-7773 2231
E-mail: sales@galacticsolutions.com.my

Intended Audience For This Administrator Training For Apache Hadoop Course

  • System administrators and others responsible for managing Apache Hadoop clusters in production or development environments.

Administrator Training For Apache Hadoop Course Objectives

  • The internals of YARN, MapReduce, and HDFS
  • Determining the correct hardware and infrastructure for your cluster
  • Proper cluster configuration and deployment to integrate with the data center
  • How to load data into the cluster from dynamically generated files using Flume and from RDBMSs using Sqoop
  • Configuring the FairScheduler to provide service-level agreements for multiple users of a cluster
  • Best practices for preparing and maintaining Apache Hadoop in production
  • Troubleshooting, diagnosing, tuning, and solving Hadoop issues

Administrator Training For Apache Hadoop Course Outline

        1. The Case for Apache Hadoop
          1. Why Hadoop?
          2. Core Hadoop Components
          3. Fundamental Concepts
        2. HDFS
          1. HDFS Features
          2. Writing and Reading Files
          3. NameNode Memory Considerations
          4. Overview of HDFS Security
          5. Using the NameNode Web UI
          6. Using the Hadoop File Shell
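
To give a taste of the Hadoop File Shell covered in this module, a few common commands run against a live cluster (the paths and the file name are illustrative):

```shell
# Create a directory tree in HDFS and copy a local file into it
hadoop fs -mkdir -p /user/alice/data
hadoop fs -put sales.csv /user/alice/data/

# List the directory (shows permissions, replication factor, and size)
hadoop fs -ls /user/alice/data

# Stream the file's blocks back from the DataNodes to stdout
hadoop fs -cat /user/alice/data/sales.csv
```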
        3. Getting Data into HDFS
          1. Ingesting Data from External Sources with Flume
          2. Ingesting Data from Relational Databases with Sqoop
          3. REST Interfaces
          4. Best Practices for Importing Data
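
To illustrate the Flume ingestion topic above, here is a minimal agent configuration that watches a spooling directory and lands events in HDFS; the agent name, directory, and NameNode address are invented for the sketch:

```properties
# flume.conf: one agent wiring a spooldir source to an HDFS sink
agent1.sources  = src1
agent1.channels = ch1
agent1.sinks    = sink1

# Source: pick up files dropped into a local spool directory
agent1.sources.src1.type     = spooldir
agent1.sources.src1.spoolDir = /var/log/incoming
agent1.sources.src1.channels = ch1

# Channel: buffer events in memory between source and sink
agent1.channels.ch1.type     = memory
agent1.channels.ch1.capacity = 10000

# Sink: write events into date-partitioned HDFS directories
agent1.sinks.sink1.type                  = hdfs
agent1.sinks.sink1.hdfs.path             = hdfs://namenode:8020/flume/events/%Y-%m-%d
agent1.sinks.sink1.hdfs.fileType         = DataStream
agent1.sinks.sink1.hdfs.useLocalTimeStamp = true
agent1.sinks.sink1.channel               = ch1
```

The equivalent Sqoop ingestion from an RDBMS is a single command along the lines of `sqoop import --connect jdbc:mysql://dbhost/sales --table orders --target-dir /user/alice/orders` (connection details again illustrative).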
        4. YARN and MapReduce
          1. What Is MapReduce?
          2. Basic MapReduce Concepts
          3. YARN Cluster Architecture
          4. Resource Allocation
          5. Failure Recovery
          6. Using the YARN Web UI
          7. MapReduce Version 1
        5. Planning Your Hadoop Cluster
          1. General Planning Considerations
          2. Choosing the Right Hardware
          3. Network Considerations
          4. Configuring Nodes
          5. Planning for Cluster Management
        6. Hadoop Installation and Initial Configuration
          1. Deployment Types
          2. Installing Hadoop
          3. Specifying the Hadoop Configuration
          4. Performing Initial HDFS Configuration
          5. Performing Initial YARN and MapReduce Configuration
          6. Hadoop Logging
        7. Installing and Configuring Hive, Impala, and Pig
          1. Hive
          2. Impala
          3. Pig
        8. Hadoop Clients
          1. What is a Hadoop Client?
          2. Installing and Configuring Hadoop Clients
          3. Installing and Configuring Hue
          4. Hue Authentication and Authorization
        9. Cloudera Manager
          1. The Motivation for Cloudera Manager
          2. Cloudera Manager Features
          3. Express and Enterprise Versions
          4. Cloudera Manager Topology
          5. Installing Cloudera Manager
          6. Installing Hadoop Using Cloudera Manager
          7. Performing Basic Administration Tasks Using Cloudera Manager
        10. Advanced Cluster Configuration
          1. Advanced Configuration Parameters
          2. Configuring Hadoop Ports
          3. Explicitly Including and Excluding Hosts
          4. Configuring HDFS for Rack Awareness
          5. Configuring HDFS High Availability
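
As a concrete example of the rack awareness and high-availability topics in this module, two configuration fragments; the nameservice, NameNode IDs, and script path are illustrative:

```xml
<!-- core-site.xml: topology script mapping each host/IP to a rack -->
<property>
  <name>net.topology.script.file.name</name>
  <value>/etc/hadoop/conf/topology.sh</value>
</property>

<!-- hdfs-site.xml: one logical nameservice backed by two NameNodes -->
<property>
  <name>dfs.nameservices</name>
  <value>mycluster</value>
</property>
<property>
  <name>dfs.ha.namenodes.mycluster</name>
  <value>nn1,nn2</value>
</property>
```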
        11. Hadoop Security
          1. Why Hadoop Security Is Important
          2. Hadoop’s Security System Concepts
          3. What Kerberos Is and How it Works
          4. Securing a Hadoop Cluster with Kerberos
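
The Kerberos material in this module comes down to a small but consequential configuration change; a sketch of the relevant core-site.xml fragment (a working setup also needs a KDC, principals, and keytabs):

```xml
<!-- core-site.xml: replace "simple" (trusted usernames) with Kerberos -->
<property>
  <name>hadoop.security.authentication</name>
  <value>kerberos</value>
</property>
<!-- Enable service-level authorization checks as well -->
<property>
  <name>hadoop.security.authorization</name>
  <value>true</value>
</property>
```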
        12. Managing and Scheduling Jobs
          1. Managing Running Jobs
          2. Scheduling Hadoop Jobs
          3. Configuring the FairScheduler
          4. Impala Query Scheduling
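
To make the FairScheduler topic concrete, a minimal allocation file giving a production queue three times the weight of an ad-hoc queue (queue names and figures are illustrative):

```xml
<!-- fair-scheduler.xml -->
<allocations>
  <queue name="production">
    <weight>3.0</weight>
    <!-- Guaranteed minimum share, satisfied before weights apply -->
    <minResources>10000 mb,10 vcores</minResources>
  </queue>
  <queue name="adhoc">
    <weight>1.0</weight>
  </queue>
</allocations>
```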
        13. Cluster Maintenance
          1. Checking HDFS Status
          2. Copying Data Between Clusters
          3. Adding and Removing Cluster Nodes
          4. Rebalancing the Cluster
          5. Cluster Upgrading
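
The maintenance tasks listed above map onto a handful of standard commands; a sketch run against a live cluster (cluster addresses are illustrative):

```shell
# Cluster-wide capacity and DataNode status
hdfs dfsadmin -report

# Check filesystem health: missing, corrupt, or under-replicated blocks
hdfs fsck /

# Copy data between clusters with a distributed copy job
hadoop distcp hdfs://clusterA:8020/data hdfs://clusterB:8020/backup/data

# Rebalance block placement; threshold is max % deviation in disk usage
hdfs balancer -threshold 5
```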
        14. Cluster Monitoring and Troubleshooting
          1. General System Monitoring
          2. Monitoring Hadoop Clusters
          3. Troubleshooting Hadoop Clusters
          4. Common Misconfigurations