
Databricks Certified Data Engineer Associate - Preparation





Published 11/2022
MP4 | Video: h264, 1280x720 | Audio: AAC, 44.1 KHz, 2 Ch
Genre: eLearning | Language: English | Duration: 20 lectures (2h 18m) | Size: 626.7 MB


Preparation course for Databricks Data Engineer Associate certification exam

What you'll learn
Understand how to use Databricks Lakehouse Platform and its tools
Build ETL pipelines using Apache Spark SQL and Python
Process data incrementally in batch and streaming mode
Orchestrate production pipelines
Understand and follow best security practices in Databricks
Requirements
Basic SQL knowledge will be required
Basic Python programming experience will be beneficial, but not necessary
Description
If you are interested in becoming a Certified Data Engineer Associate from Databricks, you have come to the right place! I am here to help you prepare for this certification exam.
By the end of this course, you should be able to:
Understand how to use the Databricks Lakehouse Platform and its tools, and the benefits of doing so, including
Data Lakehouse (architecture, descriptions, benefits)
Data Science and Engineering workspace (clusters, notebooks, data storage)
Delta Lake (general concepts, table management and manipulation, optimizations)
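As a rough illustration of the Delta Lake topics above, typical Spark SQL for table management, history, time travel, and optimization looks like the following sketch (the `sales` table name is made up for illustration):

```sql
-- Create a managed table (Delta is the default table format in Databricks)
CREATE TABLE IF NOT EXISTS sales (id INT, amount DOUBLE, ts TIMESTAMP);

-- Every write is versioned in the transaction log; inspect it with:
DESCRIBE HISTORY sales;

-- Time travel: query an earlier version of the table
SELECT * FROM sales VERSION AS OF 1;

-- Compact small files and co-locate data on a column for faster reads
OPTIMIZE sales ZORDER BY (id);
```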
Build ETL pipelines using Apache Spark SQL and Python, including
Relational entities (databases, tables, views)
ELT (creating tables, writing data to tables, cleaning data, combining and reshaping tables, SQL UDFs)
Python (facilitating Spark SQL with string manipulation and control flow, passing data between PySpark and Spark SQL)
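The ELT items above (creating tables from queries, cleaning data, views, SQL UDFs) might look like this sketch; the database, table, and function names are placeholders, not from the course:

```sql
-- CTAS: create a cleaned table directly from a query against raw data
CREATE TABLE clean_orders AS
SELECT order_id, CAST(amount AS DOUBLE) AS amount
FROM raw_orders
WHERE amount IS NOT NULL;

-- A view reshaping/aggregating the cleaned data
CREATE VIEW order_totals AS
SELECT order_id, SUM(amount) AS total
FROM clean_orders
GROUP BY order_id;

-- A SQL UDF encapsulating reusable business logic
CREATE FUNCTION apply_tax(amount DOUBLE)
RETURNS DOUBLE
RETURN amount * 1.2;

SELECT order_id, apply_tax(total) AS total_with_tax FROM order_totals;
```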
Incrementally process data, including
Structured Streaming (general concepts, triggers, watermarks)
Auto Loader (streaming reads)
Multi-hop Architecture (bronze-silver-gold, streaming applications)
Delta Live Tables (benefits and features)
Build...
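As a hedged sketch of the multi-hop (bronze/silver) pattern combining Delta Live Tables SQL with Auto Loader's `cloud_files` streaming reads; the source path and table names here are hypothetical:

```sql
-- Bronze: ingest raw JSON files incrementally with Auto Loader
CREATE OR REFRESH STREAMING LIVE TABLE orders_bronze
AS SELECT * FROM cloud_files("/mnt/raw/orders", "json");

-- Silver: cleaned records, with a data-quality expectation that
-- drops rows missing an order_id
CREATE OR REFRESH STREAMING LIVE TABLE orders_silver
(CONSTRAINT valid_id EXPECT (order_id IS NOT NULL) ON VIOLATION DROP ROW)
AS SELECT order_id, CAST(amount AS DOUBLE) AS amount
FROM STREAM(LIVE.orders_bronze);
```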
