
Flume and Sqoop for Ingesting Big Data

Engineers who want to port data from legacy data stores to HDFS
    • Learning Style
      Self-Paced Learning
    • Difficulty
      Intermediate
    • Course Duration
      2 Hours
Start FREE Subscription Trial
Get started with our Learn Subscription Plan that includes this course, PLUS:

  • 328 high-impact technical, end-user, and learning & business management courses
  • 100% online self-paced courses
  • Course completion certificates
  • Live tech support and a personal Learning Concierge
  • 7-Day FREE Trial, then billed $24.99 every month until canceled
  • Start FREE Trial
Purchase As Individual Course
  • Self-Paced Online Content
  • Attend Course Any Day or Any Time
  • Reports & Statistics
  • Certificate Upon Completion
  • Now only $50.00 (regular price $70.00)
  • Enroll Now
Purchase For Teams
Team Pricing Available - Request A Quote Today!

  • Group Discounts & Private Training Available
  • Free Learning Management Center
  • Group Reporting & Tracking
  • Author / Publish Your Own Courses
  • Request Team Enrollment

Import data: Flume and Sqoop play complementary roles in the Hadoop ecosystem. They transport data from sources that hold or produce it, such as local file systems, HTTP, MySQL, and Twitter, into data stores like HDFS, HBase, and Hive. Both tools ship with built-in connectors and shield users from the complexity of moving data between these systems.

Flume: Flume agents transport data produced by streaming applications into data stores like HDFS and HBase.
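As a sketch of what a Flume agent looks like, the following hypothetical configuration wires a spooling-directory source through a memory channel into an HDFS sink. The agent name, directory, and cluster address are placeholder assumptions, not taken from the course:

```properties
# Hypothetical agent "a1": spooling-directory source -> memory channel -> HDFS sink
a1.sources = src1
a1.channels = ch1
a1.sinks = sink1

# Source: watch a local directory for newly arriving files
a1.sources.src1.type = spooldir
a1.sources.src1.spoolDir = /var/log/incoming
a1.sources.src1.channels = ch1

# Channel: buffer events in memory between source and sink
a1.channels.ch1.type = memory
a1.channels.ch1.capacity = 10000

# Sink: write events into HDFS, bucketed by date
a1.sinks.sink1.type = hdfs
a1.sinks.sink1.hdfs.path = hdfs://namenode:8020/flume/events/%Y-%m-%d
a1.sinks.sink1.hdfs.fileType = DataStream
a1.sinks.sink1.hdfs.useLocalTimeStamp = true
a1.sinks.sink1.channel = ch1
```

An agent configured this way would typically be launched with `flume-ng agent --name a1 --conf-file a1.properties`.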

Sqoop: Use Sqoop to bulk-import data from a traditional RDBMS into Hadoop storage layers such as HDFS or Hive.
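A minimal Sqoop bulk import from MySQL into HDFS might look like the command below. The connection string, credentials file, table, and target directory are illustrative placeholders:

```shell
# Hypothetical bulk import of a MySQL table into HDFS
sqoop import \
  --connect jdbc:mysql://dbhost:3306/retail \
  --username report --password-file /user/hadoop/.dbpass \
  --table orders \
  --target-dir /data/orders \
  --num-mappers 4
```

Sqoop splits the table across the configured number of mappers and writes the rows as files under the target directory.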

Course Objective:

Practical implementations for a variety of sources and data stores:

  • Sources: Twitter, MySQL, Spooling Directory, HTTP
  • Sinks: HDFS, HBase, Hive

Flume features:

Flume agents, Flume events, event bucketing, channel selectors, interceptors

Sqoop features:

Sqoop import from MySQL, incremental imports using Sqoop jobs
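Incremental imports can be captured as a saved Sqoop job so that repeated runs only pull new rows. A sketch, assuming an auto-increment key column (job name, table, and column are placeholders):

```shell
# Hypothetical saved Sqoop job for incremental append imports
sqoop job --create orders_incr -- import \
  --connect jdbc:mysql://dbhost:3306/retail \
  --table orders \
  --target-dir /data/orders \
  --incremental append \
  --check-column order_id \
  --last-value 0

# Each execution imports only rows whose order_id exceeds the stored last-value,
# which the job metastore updates automatically after a successful run
sqoop job --exec orders_incr
```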

Audience:

  • Engineers building an application with HDFS/HBase/Hive as the data store
  • Engineers who want to port data from legacy data stores to HDFS

Prerequisite:

  • Knowledge of HDFS is a prerequisite for the course
  • HBase and Hive examples assume basic understanding of HBase and Hive shells
  • Most examples run against HDFS, so you'll need a working HDFS installation
More Information
Lab Access No
Learning Style Self-Paced Learning
Difficulty Intermediate
Course Duration 2 Hours
Language English