Demos

Introduction

Watch a quick introduction to Laktory, the open-source ETL framework, and learn how you can leverage its pipeline model to efficiently build and deploy dataframe-centric pipelines to Databricks or other data platforms.

Lakehouse As Code

A mini-series about setting up an end-to-end Databricks Lakehouse using Laktory.

  1. Unity Catalog: Setting the foundation of a Unity Catalog, including user management, schemas and volumes.

  2. Workspace: Configuring a workspace with clusters, warehouses and secrets.

  3. Pipeline Job: Declaring a Laktory pipeline and deploying it as a Databricks Job.

  4. Delta Live Tables: Declaring a Laktory pipeline and deploying it as a Delta Live Tables pipeline.

  5. AI/BI Dashboard: How to easily create an AI/BI Dashboard and deploy it to multiple workspaces.

Overview

Watch a hands-on demo showing how to use Laktory to build a scalable data pipeline. It covers:

  • Declaring a data pipeline backed by Polars dataframes and running it locally
  • Changing the backend to Spark dataframes and running it on a Databricks cluster
  • Deploying the pipeline as a Databricks Job and as a Delta Live Tables pipeline using the Laktory CLI
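To make the backend-switching idea above concrete, here is a minimal, hypothetical sketch of a YAML pipeline declaration. All names, paths, and schema keys below are illustrative assumptions, not taken from this page; consult the Laktory pipeline documentation for the actual model.

```yaml
# Hypothetical Laktory-style pipeline declaration (illustrative only).
# The same node definitions are intended to run unchanged on either
# backend: Polars for local runs, Spark on a Databricks cluster.
name: stock-prices                 # illustrative pipeline name
dataframe_backend: POLARS          # assumed key; switch to SPARK for Databricks
nodes:
  - name: brz_stock_prices         # bronze layer: raw ingestion
    source:
      path: ./data/stock_prices.json   # illustrative local path
  - name: slv_stock_prices         # silver layer: cleaned view
    source:
      node_name: brz_stock_prices  # reads the upstream node's output
    transformer:
      nodes:
        - sql_expr: SELECT symbol, open, close FROM {df}  # illustrative transform
```

Declaring the pipeline as data rather than code is what lets the same definition be deployed as a Databricks Job or a Delta Live Tables pipeline without rewriting the transformations.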

Requests

You may request a live demo from the Okube website.