
HDP Operations: Administration Foundations Training


What is HDP Operations: Administration Foundations training all about?

The HDP Operations: Administration Foundations training is recommended for system administrators who are responsible for the installation, management, and configuration of the Hortonworks Data Platform (HDP). The training also provides in-depth knowledge of, and hands-on practice with, Apache Ambari as the operational management platform for HDP.

Schedule

Contact us to customize this class with your preferred dates, times, and location.
You can call us at 1-800-961-0337 or chat with our representative.

What are the course objectives for HDP Operations: Administration Foundations training?
  • Installing the Hortonworks Data Platform (HDP).
  • Working with the HDP environment, Hadoop, and big data.
  • Managing Ambari users and groups along with Hadoop services.
  • Managing HDFS storage.
  • Configuring HDFS Transparent Data Encryption, the YARN ResourceManager, and HDFS storage.
  • Modifying cluster nodes and configuring the YARN Capacity Scheduler.
  • Configuring rack awareness and high availability for YARN and HDFS.
  • Protecting, backing up, and monitoring a cluster.
Who should attend HDP Operations: Administration Foundations training?

This training is intended for IT administrators and operators who want to enhance their skills in installing and managing the Hortonworks Data Platform (HDP) on Linux using Apache Ambari.

What are the prerequisites for HDP Operations: Administration Foundations training?

The recommended prerequisite for this course is experience working with the Hortonworks Data Platform. Fundamental knowledge of the Linux environment is also needed.

What is the course outline for HDP Operations: Administration Foundations training?
  • 1. DAY 1
  • Describe Apache Hadoop
  • Summarize the Purpose of the Hortonworks Data Platform Software Frameworks
  • List Hadoop Cluster Management Choices
  • Describe Apache Ambari
  • Identify Hadoop Cluster Deployment Options
  • Plan for a Hadoop Cluster Deployment
  • Perform an Interactive HDP Installation using Apache Ambari
  • Install Apache Ambari
  • Describe the Differences Between Hadoop Users, Hadoop Service Owners, and Apache Ambari Users
  • Manage Users, Groups and Permissions
  • Identify Hadoop Configuration Files
  • Summarize Operations of the Web UI Tool
  • Manage Hadoop Service Configuration Properties Using the Apache Ambari Web UI
  • Describe the Hadoop Distributed File System (HDFS)
  • Perform HDFS Shell Operations
  • Use WebHDFS (see the sketch after this day's labs)
  • Protect Data Using HDFS Access Control Lists (ACLs)
  • LABS:

  • Setting Up the Environment
  • Installing HDP
  • Managing Ambari Users and Groups
  • Managing Hadoop Services
  • Using HDFS Storage
  • Using WebHDFS
  • Using HDFS Access Control Lists
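
As a taste of the Day 1 material, the following minimal sketch lists an HDFS directory through the WebHDFS REST API from Python. The host name, the port (50070, the HDP 2.x NameNode UI default), the user, and the path are illustrative assumptions, and the cluster is assumed to use simple (non-Kerberos) authentication.

```python
import requests

# Hypothetical NameNode address; HDP 2.x exposes WebHDFS on the NameNode UI port 50070.
NAMENODE = "http://namenode.example.com:50070"

# LISTSTATUS is a standard WebHDFS operation; user.name only applies with simple auth.
resp = requests.get(
    f"{NAMENODE}/webhdfs/v1/user/admin",
    params={"op": "LISTSTATUS", "user.name": "admin"},
    timeout=30,
)
resp.raise_for_status()

# Each FileStatus entry reports the entry name, its type (FILE/DIRECTORY), and its size.
for status in resp.json()["FileStatuses"]["FileStatus"]:
    print(status["type"], status["pathSuffix"], status["length"])
```

The same listing is available from the shell with hdfs dfs -ls /user/admin; the REST form is useful when no Hadoop client is installed on the calling machine.
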
  • 2. DAY 2
  • Describe HDFS Architecture and Operation
  • Manage HDFS using Ambari Web, NameNode and DataNode UIs
  • Manage HDFS using Command-line Tools
  • Summarize the Purpose and Benefits of Rack Awareness
  • Configure Rack Awareness
  • Summarize Hadoop Backup Considerations
  • Enable and Manage HDFS Snapshots
  • Copy Data Using DistCP
  • Use Snapshots and DistCP Together (see the sketch after this day's labs)
  • Summarize the Purpose and Operation of HDFS Centralized Caching
  • Configure HDFS Centralized Cache
  • Define and Manage Cache Pools and Cache Directives
  • Identify HDFS NFS Gateway Use Cases
  • Recall HDFS NFS Gateway Architecture and Operation
  • Install and Configure an HDFS NFS Gateway
  • Configure an HDFS NFS Gateway Client
  • LABS:

  • Managing HDFS Storage
  • Managing HDFS Quotas
  • Configuring Rack Awareness
  • Managing HDFS Snapshots
  • Using DistCP
  • Configuring HDFS Storage Policies
  • Configuring HDFS Centralized Cache
  • Configuring an NFS Gateway
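
As a companion to the Day 2 snapshot and DistCp labs, the sketch below drives the standard HDFS and Hadoop command-line tools from Python. It assumes it runs on a cluster node as a suitably privileged user, that the /data directory exists, and that nn1/nn2 are hypothetical NameNode hostnames.

```python
import subprocess

def run(cmd):
    """Run a CLI command and fail loudly if it returns a non-zero exit code."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# Enable snapshots on the directory, then create a named, read-only snapshot.
run(["hdfs", "dfsadmin", "-allowSnapshot", "/data"])
run(["hdfs", "dfs", "-createSnapshot", "/data", "backup1"])

# Copy the frozen snapshot to a second (hypothetical) cluster with DistCp.
run([
    "hadoop", "distcp",
    "hdfs://nn1.example.com:8020/data/.snapshot/backup1",
    "hdfs://nn2.example.com:8020/backups/data",
])
```

Copying from the .snapshot path rather than the live directory keeps the source files from changing underneath DistCp while the copy runs.
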
  • 3. DAY 3
  • Describe YARN Resource Management
  • Summarize YARN Architecture and Operation
  • Identify and Use YARN Management Options
  • Summarize YARN Response to Component Failure
  • Understand the Basics of Running Simple YARN Applications
  • Summarize the Purpose and Operation of the YARN Capacity Scheduler
  • Configure and Manage YARN Queues (see the sketch after this day's labs)
  • Control Access to YARN Queues
  • Summarize the Purpose and Operation of YARN Node Labels
  • Describe the Process used to Create Node Labels
  • Describe the Process Used to Add, Modify and Remove Node Labels
  • Configure Queues to Access Node Label Resources
  • Run Test Jobs to Confirm Node Label Behavior
  • LABS:

  • Managing YARN Using Ambari
  • Managing YARN Using CLI
  • Running Sample YARN Applications
  • Setting Up for Capacity Scheduler
  • Managing YARN Containers and Queues
  • Managing YARN ACLs and User Limits
  • Working with YARN Node Labels
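
The Day 3 queue topics can also be inspected programmatically. The sketch below reads the Capacity Scheduler queue hierarchy from the ResourceManager REST API; the host name, the default port 8088, and the cluster running the Capacity Scheduler are assumptions for illustration.

```python
import requests

# Hypothetical ResourceManager address; 8088 is the default RM web UI port.
RM = "http://rm.example.com:8088"

info = requests.get(f"{RM}/ws/v1/cluster/scheduler", timeout=30).json()
root = info["scheduler"]["schedulerInfo"]   # top-level (root) queue information

# Print each top-level queue with its configured and currently used capacity.
for queue in root["queues"]["queue"]:
    print(queue["queueName"],
          f"capacity={queue['capacity']}%",
          f"used={queue['usedCapacity']}%")
```

Queue capacities themselves are changed in capacity-scheduler.xml (normally through the Ambari web UI, as in the labs); this endpoint is read-only and simply confirms what is currently in effect.
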
  • 4. DAY 4
  • Summarize the Purpose of NameNode HA
  • Configure NameNode HA Using Ambari
  • Summarize the Purpose of ResourceManager HA
  • Configure ResourceManager HA using Apache Ambari
  • Identify Reasons to Add, Replace and Delete Worker Nodes
  • Demonstrate How to Add a Worker Node
  • Configure and Run the HDFS Balancer
  • Decommission and Re-commission a Worker Node
  • Describe the Process of Moving a Master Component
  • Summarize the Purpose and Operation of Apache Ambari Metrics
  • Describe the Features and Benefits of the Apache Ambari Dashboard
  • Summarize the Purpose and Benefits of Apache Ambari Blueprints
  • Recall the Process Used to Deploy a Cluster Using Ambari Blueprints (see the sketch after this day's labs)
  • Recall the Definition of an HDP Stack and Interpret its Version Number
  • View the Current Stack and Identify Compatible Apache Ambari Software Versions
  • Recall the Types of Methods and Upgrades Available in HDP
  • Describe the Upgrade Process, Restrictions and Pre-upgrade Checklist
  • Perform an Upgrade Using the Apache Ambari Web UI
  • LABS:

  • Configuring NameNode HA
  • Configuring Resource Manager HA
  • Adding, Decommissioning and Re-commissioning a Worker Node
  • Configuring Ambari Alerts
  • Deploying an HDP Cluster Using Ambari Blueprints
  • Performing an HDP Upgrade – Express
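
Much of the Day 4 material revolves around Ambari, which exposes everything it manages through a REST API. The sketch below lists the services in a cluster and any registered blueprints; the server address, the default admin/admin credentials, and the cluster name "hdp" are illustrative assumptions.

```python
import requests

AMBARI = "http://ambari.example.com:8080/api/v1"   # hypothetical Ambari server
AUTH = ("admin", "admin")                          # default credentials; change these in practice
HEADERS = {"X-Requested-By": "ambari"}             # header Ambari expects on API requests

# List the services deployed in the cluster named "hdp".
services = requests.get(f"{AMBARI}/clusters/hdp/services",
                        auth=AUTH, headers=HEADERS, timeout=30).json()
for item in services["items"]:
    print("service:", item["ServiceInfo"]["service_name"])

# List any blueprints registered with this Ambari server.
blueprints = requests.get(f"{AMBARI}/blueprints",
                          auth=AUTH, headers=HEADERS, timeout=30).json()
for item in blueprints["items"]:
    print("blueprint:", item["Blueprints"]["blueprint_name"])
```

The same API underlies Ambari Blueprints: a blueprint and a host mapping are POSTed to the server, which then deploys the cluster without the interactive wizard.
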
FAQs

The course runs for four days.

HDP certifications are hands-on, performance-based exams in which candidates are required to complete a set of tasks. Before taking the exam, you must run a speed test to confirm that your machine meets the recommended requirements, such as 20/5 Mbps (download/upload) bandwidth and 8-16 GB of RAM.

The exam is conducted on Hortonworks Data Platform 2.6 installed with Ambari 2.6 and Hortonworks Data Platform 3.0 managed with Ambari 2.7.

HDP Certified Professionals receive a digital badge that can be proudly displayed on a resume, social profiles, email signatures, etc.

You should not use a VPN during the exam. A VPN routes traffic through your corporate network and out of the corporate firewall, which can interfere with your connection to the exam environment; it may also increase latency and cause technical issues.

4 Days | $2800
Rating: 4 (241 Ratings) | 1353 Learners
