Databricks CI/CD TL;DR

Databricks CI/CD

/tldr: No more “it works in my notebook”

Databricks Asset Bundles · Git-Folder Sync · Unity Catalog · 2025

2025 LAW

Databricks Asset Bundles (DABs)
= The only acceptable CI/CD in 2025

Only Two Ways to Do It Right

Databricks Asset Bundles (DABs)

100% IaC · Jobs + DLT + UC · GitHub Actions / Azure DevOps

Legacy (Dead)

databricks-cli + Repos + manual runs

DABs Project Structure (2025 Standard)

my-project/
├── databricks.yml                 # ← DABs config (environments!)
├── src/
│   ├── jobs/
│   │   └── etl_job.py
│   ├── pipelines/
│   │   └── silver_dlt.py
│   └── models/
│       └── churn_model.sql
├── resources/
│   └── unity-catalog/
│       └── grants.yaml
└── tests/
    └── test_silver.py
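
A cheap pre-commit or CI guard can catch a broken layout before `databricks bundle deploy` does. A minimal sketch (the required paths mirror the tree above; the helper itself is hypothetical, not part of the DABs tooling):

```python
from pathlib import Path

# Paths every project in this layout should contain (mirrors the tree above).
REQUIRED = ["databricks.yml", "src", "resources", "tests"]

def check_layout(root: str) -> list[str]:
    """Return the required paths that are missing under the project root."""
    return [p for p in REQUIRED if not (Path(root) / p).exists()]
```

Run it as the first CI step and fail fast if the list is non-empty.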
            

databricks.yml — The One File That Rules All Environments

bundle:
  name: churn-pipeline

targets:
  dev:
    mode: development
    default: true
    workspace:
      host: https://adb-123456.7.azuredatabricks.net
  prod:
    mode: production
    workspace:
      host: https://adb-987654.3.azuredatabricks.net

resources:
  jobs:
    churn_etl:
      name: "churn-etl-${bundle.target}"
      tasks:
        - task_key: run_etl
          python_wheel_task:
            package_name: my_project
            entry_point: etl
      schedule:
        quartz_cron_expression: "0 0 5 * * ?"
        timezone_id: UTC
  pipelines:
    silver_pipeline:
      name: "silver-churn-${bundle.target}"
      target: silver
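
The target name is substituted into every resource name, so dev and prod deploys never collide in the workspace. A pure-Python illustration of how the names resolve per target (illustration only, not the actual bundle engine):

```python
# Illustration only: how a target-suffixed resource name resolves per target.
def resource_name(template: str, target: str) -> str:
    """Substitute the deploy target into a name template (simplified)."""
    return template.replace("${bundle.target}", target)

print(resource_name("churn-etl-${bundle.target}", "dev"))   # churn-etl-dev
print(resource_name("churn-etl-${bundle.target}", "prod"))  # churn-etl-prod
```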
            

GitHub Actions — Deploy in 30 Seconds

name: Deploy to Databricks

on:
  push:
    branches: [ main, develop ]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - uses: databricks/setup-cli@main

      - name: Deploy to dev
        if: github.ref == 'refs/heads/develop'
        run: databricks bundle deploy -t dev
        env:
          DATABRICKS_TOKEN: ${{ secrets.DEV_TOKEN }}  # example secret name

      - name: Deploy to prod
        if: github.ref == 'refs/heads/main'
        run: databricks bundle deploy -t prod
        env:
          DATABRICKS_TOKEN: ${{ secrets.PROD_TOKEN }}
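
The branch-to-target mapping is the whole deployment strategy: `develop` → dev, `main` → prod, everything else → no deploy. Sketched in Python (a hypothetical helper; the workflow itself expresses this with `if:` conditions):

```python
# Mirrors the workflow's branch conditions — hypothetical helper, not a real API.
BRANCH_TO_TARGET = {
    "refs/heads/develop": "dev",
    "refs/heads/main": "prod",
}

def deploy_target(ref: str):
    """Return the bundle target for a pushed ref, or None to skip deployment."""
    return BRANCH_TO_TARGET.get(ref)
```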
            

Add This → Never Break Prod Again

# tests/test_silver.py
import chispa
from pyspark.sql import SparkSession

# Import the transform under test (adjust the module path to your project).
from src.pipelines.silver_dlt import silver_transform

# `spark` is a pytest fixture — define it in conftest.py or use a plugin
# such as pytest-spark.
def test_clean_data(spark: SparkSession):
    input_df = spark.createDataFrame([...])      # raw input rows
    output_df = silver_transform(input_df)
    expected_df = spark.createDataFrame([...])   # expected cleaned rows
    chispa.assert_df_equality(output_df, expected_df)
            
dbt + Great Expectations + chispa = bulletproof
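
Conceptually, `chispa.assert_df_equality` checks that two DataFrames agree row by row and fails with a readable diff. A plain-Python sketch of that idea (not chispa's actual implementation), using lists of tuples in place of DataFrames:

```python
def assert_rows_equal(actual: list, expected: list) -> None:
    """Row-by-row equality check in the spirit of chispa's assert_df_equality."""
    assert len(actual) == len(expected), (
        f"row count mismatch: {len(actual)} != {len(expected)}"
    )
    for i, (a, e) in enumerate(zip(actual, expected)):
        assert a == e, f"row {i}: {a} != {e}"
```

The real thing also compares schemas and offers options such as ignoring row order; this sketch only shows why a dedicated assertion beats `assert df1.collect() == df2.collect()` — you get the first offending row, not a wall of tuples.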

FINAL ANSWER:

Databricks Asset Bundles + GitHub Actions
= Real CI/CD for Databricks

No notebooks in prod.
No manual clicks.
No exceptions.

Databricks Asset Bundles (DABs) • Unity Catalog • Runtime 15+ • The only way in 2025