DP-3014 Implementing a Machine Learning Solution with Azure Databricks

€695.00

________________________________________________________________

Would you like to take this course in an in-person or live online (telepresence) format?

Contact us by email: info@nanforiberica.com, by phone: +34 91 031 66 78, or by WhatsApp: +34 685 60 05 91

________________________________________________________________

A hands-on course to learn how to use Apache Spark, Delta Lake, and SQL Warehouses on Azure Databricks

Introduction

Azure Databricks is a cloud-based data analytics service that makes running Apache Spark easy and scalable. With Azure Databricks, you can create and run interactive notebooks that allow you to explore, transform, and visualize data, as well as build machine learning applications and workflows. Additionally, Azure Databricks integrates with other Azure services, such as Azure Data Factory, Azure Data Lake Storage, Azure Synapse Analytics, and Azure Machine Learning, to create complete and robust data solutions.

In this training program, you will learn how to use Azure Databricks to perform data analysis with Apache Spark, Delta Lake, and SQL Warehouses. The program is aimed at data professionals, analysts, data scientists, and data engineers who want to take advantage of Azure Databricks for their projects. The program has an estimated duration of 115 hours in teletraining mode and is made up of the following modules:

  • Explore Azure Databricks: In this module, you will become familiar with the Azure Databricks user interface, learn how to create and manage clusters, and run notebooks with different languages and libraries.
  • Use Apache Spark in Azure Databricks: In this module, you will learn the basics of Apache Spark, the most popular distributed processing framework for large-scale data analysis. You will learn how to use Spark APIs to read, write, and transform structured and unstructured data, as well as how to use Spark SQL to query data with SQL.
  • Use Delta Lake in Azure Databricks: In this module, you will learn how to use Delta Lake, a storage layer that brings reliability, performance, and quality to data in Azure Databricks. You will learn how to create and manage Delta tables, perform ACID-compliant read and write operations, and use the schema evolution, auditing, and time travel features (see the sketch after this list).
  • Use SQL Warehouses in Azure Databricks: In this module, you will learn how to use SQL Warehouses, the SQL-optimized compute resources in Azure Databricks for data warehousing and analytics workloads. You'll learn how to create SQL Warehouses, connect to them from Azure Databricks notebooks and SQL clients, query data with SQL, and use the built-in visualization tools.
  • Run Azure Databricks Notebooks with Azure Data Factory: In this module, you will learn how to use Azure Data Factory, a data integration service that allows you to create and orchestrate data workflows. You'll learn how to create and run Azure Data Factory pipelines that invoke Azure Databricks notebooks, and monitor and debug the results.
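
As a taste of the hands-on work in the Apache Spark and Delta Lake modules, below is a minimal PySpark sketch, assuming it runs in an Azure Databricks notebook (where a SparkSession is already available); the source path /mnt/raw/sales.csv and the column names are hypothetical placeholders, not part of the course materials.

    # Minimal sketch: read data with Spark, transform it, write it as a Delta
    # table, query it with Spark SQL, and read an earlier version (time travel).
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()  # already provided in Databricks notebooks

    # Read a structured CSV file into a DataFrame (hypothetical mounted path)
    raw_df = spark.read.option("header", "true").csv("/mnt/raw/sales.csv")

    # Transform: cast the amount column to a number and aggregate by region
    sales_by_region = (
        raw_df.withColumn("amount", F.col("amount").cast("double"))
              .groupBy("region")
              .agg(F.sum("amount").alias("total_amount"))
    )

    # Write the result as a managed Delta table (ACID writes, schema enforcement)
    sales_by_region.write.format("delta").mode("overwrite").saveAsTable("sales_by_region")

    # Query the Delta table with Spark SQL
    spark.sql("SELECT region, total_amount FROM sales_by_region ORDER BY total_amount DESC").show()

    # Time travel: read the table as of an earlier version
    spark.sql("SELECT * FROM sales_by_region VERSION AS OF 0").show()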

Goals

Upon completion of this training program, you will be able to:

  • Create and configure an Azure Databricks environment
  • Run and share interactive notebooks with different languages and libraries
  • Use Apache Spark to read, write, and transform structured and unstructured data
  • Use Spark SQL to query data with SQL
  • Use Delta Lake to create and manage reliable, performant, high-quality tables
  • Use SQL Warehouses to query and analyze data with SQL
  • Use Azure Data Factory to create and run data pipelines that invoke Azure Databricks notebooks

Prerequisites

To follow this training program, it is recommended to have the following knowledge and skills:

  • Basic knowledge of data analysis and statistics
  • Basic programming knowledge in Python, Scala or R
  • Basic knowledge of SQL
  • Basic knowledge of Azure and its data services
  • Access to an Azure subscription and Azure Databricks workspace

Assessment

To evaluate your learning, you will complete a series of hands-on exercises and a final project consisting of building a data solution with Azure Databricks. The hands-on exercises will be completed in Azure Databricks notebooks and will be self-graded. The final project will be carried out in a real Azure environment and will be evaluated according to criteria established by the instructor.

Certification

If you successfully complete the training program, you will receive a certificate of achievement attesting to your knowledge and skills in Azure Databricks. The certificate will be issued in digital format and will include a unique verification code.
