In this talk, given at WeAreDevelopers' Python Day, we will introduce how to use the popular cloud service Databricks to host Apache Spark applications for distributed data processing, in combination with Apache Airflow, an orchestration framework for ETL batch workflows.
After a brief exploration of the Databricks Workspace and the fundamentals of Airflow, we will take a deeper look at the functionality Databricks provides within Airflow for orchestrating its workspace. Afterwards, we will find out how to extend and customize that functionality to manage virtually every aspect of the Databricks Workspace from Airflow.
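To give a flavor of the kind of orchestration covered, below is a minimal sketch of an Airflow DAG that submits a one-time notebook run to Databricks via the apache-airflow-providers-databricks package. The cluster settings, notebook path, and connection id are illustrative placeholders, not taken from the talk.

from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import DatabricksSubmitRunOperator

with DAG(
    dag_id="databricks_notebook_run",  # illustrative name
    start_date=datetime(2022, 1, 1),
    schedule_interval=None,            # triggered manually for this example
    catchup=False,
) as dag:
    # Submits a one-time run to the Databricks Jobs API; the json payload
    # follows the runs/submit endpoint and is passed through as-is.
    run_notebook = DatabricksSubmitRunOperator(
        task_id="run_notebook",
        databricks_conn_id="databricks_default",  # Airflow connection to the workspace
        json={
            "new_cluster": {
                "spark_version": "11.3.x-scala2.12",
                "node_type_id": "i3.xlarge",
                "num_workers": 2,
            },
            "notebook_task": {"notebook_path": "/Shared/example_notebook"},
        },
    )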
This talk does not require any prior knowledge of Databricks, Spark, or Airflow, but it does assume familiarity with the fundamentals of the Python programming language, especially object-oriented programming and REST API requests. The actual distributed data processing with Apache Spark itself is not the focus of this talk.
About Alan Mazankiewicz
Alan completed his Master's degree in Information Engineering and Management at the Karlsruhe Institute of Technology in 2020 before starting his career as a Machine Learning Engineer at inovex GmbH in Cologne, Germany. He has (co-)authored two scientific papers in the area of machine learning, published in major journals and conference proceedings, and is a regular contributor to the open source community.