Databricks DBU Pricing Explained: Every SKU and Rate

A complete reference for understanding Databricks Units. What a DBU actually measures, how rates differ by SKU, and step-by-step examples showing how to calculate your monthly DBU cost.

What is a DBU?

A Databricks Unit (DBU) is a normalized measure of processing capability. It abstracts away the underlying hardware differences so Databricks can price compute consistently across different instance types and cloud providers. When you run a cluster, each node consumes a certain number of DBUs per hour based on its instance type. Your Databricks platform bill is the total DBUs consumed multiplied by the per-DBU rate for your workload type.
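That billing formula can be sketched in a few lines of Python (the function name and parameters are illustrative, not part of any Databricks API):

```python
def databricks_platform_cost(dbu_per_node_hour, num_nodes, hours, rate_per_dbu):
    """Estimate the Databricks platform charge (excludes cloud provider costs)."""
    # Total DBUs consumed: per-node DBU rate x node count x runtime hours
    total_dbus = dbu_per_node_hour * num_nodes * hours
    # Platform bill: DBUs consumed x the per-DBU rate for the workload's SKU
    return total_dbus * rate_per_dbu
```

For example, a 4-node cluster of i3.2xlarge instances (2.0 DBU/hr each) running 66 hours a month on Jobs Compute ($0.15/DBU) comes out to `databricks_platform_cost(2.0, 4, 66, 0.15)`, or $79.20.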

Importantly, DBU charges are only the Databricks portion of your bill. You also pay your cloud provider (AWS, Azure, or GCP) for the underlying virtual machines, storage, and network transfer. The DBU charge is essentially the fee for the Databricks platform layer: managed Spark, Delta Lake, Unity Catalog, notebooks, workflows, and SQL warehouses.

Complete DBU Rate Reference

| SKU | Rate/DBU | Description |
|---|---|---|
| Jobs Light | $0.07 | Lightweight, non-interactive batch jobs |
| Jobs Compute | $0.15 | Standard batch and scheduled pipelines |
| Delta Live Tables Core | $0.20 | Declarative ETL with core features |
| Delta Live Tables Pro | $0.25 | Declarative ETL with CDC, expectations |
| SQL Classic | $0.22 | Classic SQL warehouse endpoints |
| SQL Pro | $0.55 | Advanced SQL with query profiling |
| SQL Serverless | $0.70 | Fully managed SQL, instant startup |
| All-Purpose Compute | $0.40 | Interactive notebooks and exploration |
| Model Training | $0.65 | ML model training workloads |
| Model Inference | $0.07 | Real-time model serving endpoints |

Rates shown for AWS. Azure and GCP rates may vary slightly. Verify current rates on the Databricks pricing page.
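For cost-estimation scripts, the AWS rates above can be captured in a simple lookup table (values mirror this article; always verify against the current Databricks pricing page before relying on them):

```python
# AWS list rates per DBU, as shown in the table above.
# These change over time; verify on the Databricks pricing page.
DBU_RATES_AWS = {
    "Jobs Light": 0.07,
    "Jobs Compute": 0.15,
    "Delta Live Tables Core": 0.20,
    "Delta Live Tables Pro": 0.25,
    "SQL Classic": 0.22,
    "SQL Pro": 0.55,
    "SQL Serverless": 0.70,
    "All-Purpose Compute": 0.40,
    "Model Training": 0.65,
    "Model Inference": 0.07,
}
```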

Instance Type to DBU Mapping

Each cloud instance type consumes a specific number of DBUs per hour. Larger instances consume more DBUs. Here are common AWS instance types and their DBU consumption rates:

| Instance Type | vCPUs | Memory | DBU/hr |
|---|---|---|---|
| i3.xlarge | 4 | 30.5 GB | 1.0 |
| i3.2xlarge | 8 | 61 GB | 2.0 |
| i3.4xlarge | 16 | 122 GB | 4.0 |
| i3.8xlarge | 32 | 244 GB | 8.0 |
| m5.xlarge | 4 | 16 GB | 0.75 |
| m5.2xlarge | 8 | 32 GB | 1.5 |
| r5.xlarge | 4 | 32 GB | 1.0 |
| p3.2xlarge (GPU) | 8 | 61 GB | 5.5 |
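The same mapping can be expressed in code to compute a cluster's aggregate DBU burn rate (dictionary and helper are illustrative, using the figures from the table above):

```python
# DBU/hr consumption per node for common AWS instance types (from the table above).
INSTANCE_DBU_PER_HOUR = {
    "i3.xlarge": 1.0,
    "i3.2xlarge": 2.0,
    "i3.4xlarge": 4.0,
    "i3.8xlarge": 8.0,
    "m5.xlarge": 0.75,
    "m5.2xlarge": 1.5,
    "r5.xlarge": 1.0,
    "p3.2xlarge": 5.5,  # GPU instance
}

def cluster_dbu_per_hour(instance_type: str, num_nodes: int) -> float:
    """Aggregate DBU/hr for a homogeneous cluster of the given instance type."""
    return INSTANCE_DBU_PER_HOUR[instance_type] * num_nodes
```

A 4-node i3.2xlarge cluster, for instance, burns `cluster_dbu_per_hour("i3.2xlarge", 4)` = 8.0 DBU/hr, the figure used in the worked example below.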

Step-by-Step Cost Calculation Example

Scenario: Daily ETL Pipeline on AWS

| Item | Value |
|---|---|
| Cluster configuration | 4x i3.2xlarge nodes |
| DBU/hr per node (i3.2xlarge) | 2.0 DBU/hr |
| Total DBU/hr (4 nodes) | 8.0 DBU/hr |
| Runtime per day | 3 hours |
| Workdays per month | 22 |
| Total DBU/month | 8.0 x 3 x 22 = 528 DBU |
| Jobs Compute rate | $0.15/DBU |
| Databricks cost | 528 x $0.15 = $79.20/mo |
| AWS cost (i3.2xlarge: $0.624/hr) | 4 x $0.624 x 3 x 22 = $164.74/mo |
| Total monthly cost | $79.20 + $164.74 = $243.94/mo |
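The full scenario can be reproduced as a short script (the $0.624/hr figure is the on-demand i3.2xlarge price used above; on-demand EC2 prices vary by region and over time):

```python
# Reproduce the daily ETL pipeline scenario above.
NODES = 4
DBU_PER_NODE_HR = 2.0      # i3.2xlarge DBU consumption
HOURS_PER_DAY = 3
WORKDAYS_PER_MONTH = 22
JOBS_COMPUTE_RATE = 0.15   # $/DBU, Jobs Compute SKU
EC2_RATE = 0.624           # $/hr on-demand i3.2xlarge (region-dependent)

# Databricks platform charge
total_dbus = NODES * DBU_PER_NODE_HR * HOURS_PER_DAY * WORKDAYS_PER_MONTH
databricks_cost = total_dbus * JOBS_COMPUTE_RATE

# Cloud provider (EC2) charge for the same runtime
aws_cost = NODES * EC2_RATE * HOURS_PER_DAY * WORKDAYS_PER_MONTH

total_cost = databricks_cost + aws_cost
print(f"{total_dbus:.0f} DBU -> Databricks ${databricks_cost:.2f} "
      f"+ AWS ${aws_cost:.2f} = ${total_cost:.2f}/mo")
```

Running this prints 528 DBU, $79.20 Databricks, $164.74 AWS, and $243.94 total, matching the table. Note that the Databricks fee is only about a third of the total; the EC2 bill dominates at this rate tier.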

Frequently Asked Questions

What does DBU stand for?
DBU stands for Databricks Unit. It is a normalized unit of processing capability that measures compute consumption on the Databricks platform. Think of it as compute currency: different workloads consume DBUs at different rates, and each DBU type costs a different amount.
How are DBUs calculated?
DBU consumption depends on two factors: the instance type running your cluster and the workload type. Each instance type has a defined DBU-per-hour rate. For example, an i3.2xlarge on AWS consumes 2.0 DBU/hour for Jobs Compute. Your total DBU consumption is: DBU/hour x number of nodes x hours running.
Why do different workloads have different DBU rates?
Databricks charges different rates for different features and SLA levels. Jobs Compute at $0.15/DBU is the cheapest mainstream SKU because it runs non-interactive batch workloads. All-Purpose Compute at $0.40/DBU is more expensive because it supports interactive notebooks with immediate results. SQL Pro and Model Training carry premium rates because they include advanced query optimization and GPU support, respectively.