MCSA: Data Engineering with Azure (MS-MCSA-DE with Azure)

Earning this certification is the first step on your path to becoming a Data Management and Analytics Microsoft Certified Solutions Expert (MCSE).
  • Learning Style: Virtual Classroom
  • Difficulty: Intermediate
  • Course Duration: 6 Days
  • SATV Value: 6 Days
Pricing
About Individual Course:
  • The individual course plan gives you access to this course.
$4,095.00 / Seat

About this course:

Earning an MCSA: Data Engineering with Azure demonstrates knowledge relevant to the design and building of analytics and operational solutions on Azure as well as the implementation of big data engineering workflows on HDInsight. It is the first step on your path to becoming a Data Management and Analytics Microsoft Certified Solutions Expert (MCSE).

This boot camp prepares you for the following exams:

Course Objectives:

After completing this course, students will be able to: 

  • Deploy HDInsight Clusters.
  • Authorize Users to Access Resources.
  • Load Data into HDInsight.
  • Troubleshoot HDInsight.
  • Implement Batch Solutions.
  • Design Batch ETL Solutions for Big Data with Spark.
  • Analyze Data with Spark SQL.
  • Analyze Data with Hive and Phoenix.
  • Describe Stream Analytics.
  • Implement Spark Streaming Using the DStream API.
  • Develop Big Data Real-Time Processing Solutions with Apache Storm.
  • Build Solutions that use Kafka and HBase.
  • Describe common architectures for processing big data using Azure tools and services.
  • Describe how to use Azure Stream Analytics to design and implement stream processing over large-scale data.
  • Describe how to include custom functions and incorporate machine learning activities into an Azure Stream Analytics job.
  • Describe how to use Azure Data Lake Store as a large-scale repository of data files.
  • Describe how to use Azure Data Lake Analytics to examine and process data held in Azure Data Lake Store.
  • Describe how to create and deploy custom functions and operations, integrate with Python and R, and protect and optimize jobs.
  • Describe how to use Azure SQL Data Warehouse to create a repository that can support large-scale analytical processing over data at rest.
  • Describe how to use Azure SQL Data Warehouse to perform analytical processing, how to maintain performance, and how to protect the data.
  • Describe how to use Azure Data Factory to import, transform, and transfer data between repositories and services.

Audience:

  • The primary audience for this course is data engineers (IT professionals, developers, and information workers) who plan to implement big data engineering workflows on Azure.

Prerequisites:

  • Programming experience using R, and familiarity with common R packages.
  • Knowledge of common statistical methods and data analysis best practices.
  • Basic knowledge of the Microsoft Windows operating system and its core functionality.
  • Working knowledge of relational databases.
  • A good understanding of Azure data services.

Suggested prerequisite course:

More Information
Brand: Microsoft
Lab Access: No
Technology: Microsoft
Learning Style: Virtual Classroom
Difficulty: Intermediate
Course Duration: 6 Days
Language: English
SATV Value: 6 Days
VPA Eligible: Yes
Sales Support

Sales (866) 991-3924

Mon-Fri. 8am-6pm CST


Why QuickStart

Turn Training Into A Personalized Learning Experience


  • Problem Solving through ExpertConnect & Peer-To-Peer Learning
  • Find The Quickest Path To Learn With Career Paths
  • Access All Courses With Master Subscription
  • Manage Your Team With Learning Analytics
  • Virtual Classroom Training & Self-Paced Learning
  • Integrate With Your LMS Through APIs