Independent pricing guide. Not affiliated with Databricks, Inc. Rates verified April 2026.

Databricks Pricing on AWS

AWS is the most popular cloud for Databricks deployments and typically offers the lowest total cost. This guide covers the full DBU rate table, recommended EC2 instances, and realistic monthly cost examples.

Why AWS Is the Default Choice

Databricks was built on AWS first and the platform has the deepest feature set on this cloud. AWS offers the widest selection of instance types, the most mature spot market (60% to 70% savings on worker nodes), and Graviton ARM instances that deliver 15% to 20% better price-performance than x86 equivalents. Most Databricks benchmarks and documentation use AWS instance types as the reference.

For organizations without a strong Azure or GCP commitment, AWS is typically the lowest-cost option for Databricks workloads.

AWS DBU Rate Table

Premium tier rates on AWS. Enterprise tier pricing is available through Databricks sales.

| Compute Type | DBU Rate | Best For |
|---|---|---|
| Jobs Light Compute | $0.07 | Lightweight automated jobs |
| Jobs Compute | $0.15 | Production ETL pipelines |
| Delta Live Tables Core | $0.20 | Declarative ETL |
| Delta Live Tables Pro | $0.25 | ETL with CDC and expectations |
| All-Purpose Compute | $0.40 | Interactive notebooks, dev work |
| SQL Classic | $0.22 | Classic SQL warehouses |
| SQL Pro | $0.55 | SQL with query profiling |
| SQL Serverless | $0.70 | Fully managed SQL (includes compute) |
| Model Training | $0.65 | ML model training |
| Model Serving | $0.07 | Real-time model endpoints |

Recommended EC2 Instances

Common instance types for Databricks clusters on AWS with their DBU consumption rates and hourly costs.

| Instance | vCPU | Memory | $/hr (On-Demand) | DBU/hr | Total $/hr* |
|---|---|---|---|---|---|
| m5.xlarge | 4 | 16 GB | $0.192 | 0.75 | $0.30 |
| m5.2xlarge | 8 | 32 GB | $0.384 | 1.50 | $0.61 |
| i3.xlarge | 4 | 30.5 GB | $0.312 | 1.00 | $0.46 |
| i3.2xlarge | 8 | 61 GB | $0.624 | 2.00 | $0.92 |
| i3.4xlarge | 16 | 122 GB | $1.248 | 4.00 | $1.85 |
| r5.xlarge | 4 | 32 GB | $0.252 | 1.00 | $0.40 |
| r5.2xlarge | 8 | 64 GB | $0.504 | 2.00 | $0.80 |
| p3.2xlarge | 8 | 61 GB | $3.060 | 5.50 | $3.89 |

*Total includes EC2 on-demand + Jobs Compute DBU rate ($0.15/DBU). Actual costs vary by workload type and region. US East (N. Virginia) pricing shown.
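The footnote's formula can be sketched in a few lines. The instance figures below are copied from the table above; the rate constant assumes Jobs Compute at $0.15/DBU.

```python
# Sketch: reproduce the "Total $/hr" column from the table above.
JOBS_COMPUTE_RATE = 0.15  # $/DBU, Jobs Compute rate from the DBU table

# (on-demand EC2 $/hr, DBU/hr) for a few instance types from the table
INSTANCES = {
    "m5.xlarge": (0.192, 0.75),
    "i3.2xlarge": (0.624, 2.00),
}

def total_hourly_cost(instance: str, dbu_rate: float = JOBS_COMPUTE_RATE) -> float:
    """EC2 on-demand cost plus Databricks DBU charges, per hour."""
    ec2_hourly, dbu_per_hour = INSTANCES[instance]
    return ec2_hourly + dbu_per_hour * dbu_rate

for name in INSTANCES:
    print(f"{name}: ${total_hourly_cost(name):.2f}/hr")
```

Running an All-Purpose cluster instead changes only the `dbu_rate` argument ($0.40/DBU), which is why interactive clusters cost noticeably more per hour on identical hardware.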

Monthly Cost Examples

| Scenario | Configuration | DBU Cost | AWS Cost | Total/mo | Notes |
|---|---|---|---|---|---|
| Solo Data Engineer | 1x m5.xlarge, 6 hrs/day, 22 days, spot workers | $20 | $15 | $35 | Minimal setup for development and light ETL |
| Startup Data Team (3 people) | 3x i3.xlarge, 8 hrs/day, 22 days, spot workers | $79 | $55 | $134 | Small-scale production pipelines |
| Mid-Size ETL Platform | 5x i3.2xlarge, 12 hrs/day, 22 days, spot workers | $396 | $277 | $673 | Multiple production pipelines |
| Analytics Team (20 users) | SQL Pro warehouse, Medium size, 10 hrs/day | $1,936 | $880 | $2,816 | Business intelligence and reporting |
| Enterprise ML Platform | 8x p3.2xlarge, 10 hrs/day, 22 days, on-demand | $6,292 | $5,386 | $11,678 | GPU training clusters, model serving |
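These examples all follow the same arithmetic, which can be generalized into a small estimator. The 66% spot discount below is an assumption in the 60% to 70% range quoted elsewhere in this guide; actual spot prices fluctuate.

```python
# Sketch: generalize the monthly cost examples above.
# The spot discount applies only to the EC2 portion, not to DBUs.
def monthly_cost(n_instances, hours_per_day, days, ec2_hourly,
                 dbu_per_hour, dbu_rate, spot_discount=0.0):
    """Return (dbu_cost, aws_cost) in dollars for a month of usage."""
    instance_hours = n_instances * hours_per_day * days
    dbu_cost = instance_hours * dbu_per_hour * dbu_rate
    aws_cost = instance_hours * ec2_hourly * (1 - spot_discount)
    return dbu_cost, aws_cost

# Mid-Size ETL Platform: 5x i3.2xlarge, 12 hrs/day, 22 days, spot workers
dbu, aws = monthly_cost(5, 12, 22, ec2_hourly=0.624, dbu_per_hour=2.00,
                        dbu_rate=0.15, spot_discount=0.66)
print(f"DBU ${dbu:.0f} + AWS ${aws:.0f} = ${dbu + aws:.0f}/mo")
# → DBU $396 + AWS $280 = $676/mo (vs. $673 above; spot prices vary)
```

Note that the DBU portion is fixed for a given cluster size and schedule; only the AWS portion responds to spot pricing and reservations.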

AWS-Specific Optimizations

Spot Instances

Use spot instances for worker nodes to save 60% to 70% on EC2 costs. Keep the driver node on-demand for reliability. Spark handles spot interruptions gracefully by redistributing work to remaining nodes. Best for ETL jobs, batch processing, and ML training. Not recommended for streaming or interactive notebooks.
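The on-demand-driver, spot-worker pattern is expressed through the `aws_attributes` block of a Databricks Clusters API request. A minimal sketch, assuming illustrative names and sizes (the cluster name, runtime version, and worker counts are placeholders):

```python
# Sketch of a Databricks Clusters API payload: driver stays on-demand,
# workers run on spot. `first_on_demand: 1` pins the first node launched
# (the driver) to on-demand capacity; SPOT_WITH_FALLBACK falls back to
# on-demand when spot capacity is unavailable.
cluster_spec = {
    "cluster_name": "etl-spot-workers",          # illustrative
    "spark_version": "14.3.x-scala2.12",         # illustrative
    "node_type_id": "i3.xlarge",
    "autoscale": {"min_workers": 2, "max_workers": 8},
    "aws_attributes": {
        "first_on_demand": 1,
        "availability": "SPOT_WITH_FALLBACK",
        "spot_bid_price_percent": 100,
    },
}
```

With this spec, losing a spot worker costs some recomputation but never kills the cluster, since the driver is never on spot.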

Graviton Instances

AWS Graviton instances (Graviton2: m6g, r6g, c6g families; Graviton3: m7g, r7g, c7g) offer 15% to 20% better price-performance than x86 equivalents. Databricks fully supports Graviton for Spark workloads. Lower per-hour cost plus slightly lower DBU consumption rates make Graviton the best default choice for non-GPU workloads.

Reserved Instances / Savings Plans

For clusters running 12+ hours daily, AWS Reserved Instances (1-year or 3-year) or Compute Savings Plans can reduce EC2 costs by 30% to 40%. These apply to the cloud infrastructure portion of your bill. Combine with Databricks committed-use discounts for maximum savings on both sides.
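The break-even math is straightforward. A minimal sketch, assuming a 35% discount (within the 30% to 40% range above) on an i3.2xlarge running 12 hours a day; remember the discount applies only to the EC2 side, never to DBUs:

```python
# Sketch: on-demand vs. committed (RI / Savings Plan) EC2 cost for the
# always-on portion of a fleet. The 35% discount is an assumption.
ON_DEMAND = 0.624            # i3.2xlarge $/hr from the instance table
DISCOUNT = 0.35              # assumed 1-year commitment discount
hours_per_month = 12 * 30    # cluster running 12 hrs/day

od_cost = ON_DEMAND * hours_per_month
sp_cost = ON_DEMAND * (1 - DISCOUNT) * hours_per_month
print(f"on-demand ${od_cost:.0f}/mo vs. committed ${sp_cost:.0f}/mo "
      f"(saves ${od_cost - sp_cost:.0f})")
# → on-demand $225/mo vs. committed $146/mo (saves $79)
```

Since commitments bill whether or not instances run, only size them to your baseline usage and leave bursty capacity on spot or on-demand.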

AWS Marketplace

Purchase Databricks through AWS Marketplace to apply spend against your Enterprise Discount Program (EDP) commitments. Billing consolidates to your AWS invoice. No price difference versus buying direct, but simplifies procurement and can offset existing AWS commitments.

Frequently Asked Questions

Is Databricks cheapest on AWS?
Generally yes. AWS has the lowest DBU rates across most workload types and the widest selection of instance types, including cost-effective Graviton processors. AWS also has the deepest spot instance market, which can reduce cloud infrastructure costs by 60% to 70%. Azure is typically 10% to 20% more expensive for equivalent configurations.
Can I buy Databricks through AWS Marketplace?
Yes. Purchasing through AWS Marketplace lets you apply Databricks spend against your AWS Enterprise Discount Program (EDP) commitments. Billing is consolidated on your AWS invoice. There is no price difference versus buying direct from Databricks, but the billing simplification and EDP credit benefits make it attractive for organizations with existing AWS commitments.
What are the best EC2 instances for Databricks?
For general ETL: i3.xlarge or i3.2xlarge (good balance of compute and local SSD for Delta caching). For memory-intensive workloads: r5.xlarge or r5.2xlarge. For cost-optimized batch jobs: m5.xlarge with spot pricing. For ML training: p3.2xlarge or g5.xlarge for GPU workloads. Graviton-based instances (m6g, r6g) offer 15% to 20% better price-performance.
Do I pay AWS directly for the infrastructure?
Yes. Databricks on AWS runs in your own AWS account. You pay AWS directly for EC2 instances, EBS volumes, S3 storage, and data transfer. Databricks bills you separately for DBU consumption. This two-bill model gives you full visibility into infrastructure costs and lets you apply AWS Reserved Instances or Savings Plans to the compute portion.