Metadata: Optimizing Databricks for Smarter Scale with Zencore

Zencore brings smarter scaling strategies to Metadata’s Databricks environment.

METADATA CUSTOMER SPOTLIGHT
 

Optimizing Databricks for Smarter Scale


Project Location:
San Francisco, CA
Industry:
Marketing Technology
Use Case:
Data Pipeline Optimization & Cost Efficiency on Databricks
Website:
metadata.io

Metadata, a committed Databricks customer, wanted to optimize its implementation to control costs while improving overall performance. Facing economic headwinds, the company sought expert guidance to make Databricks more cost-effective. Metadata engaged Zencore, whose cloud optimization expertise helped it extract more business value without compromising stability, clearing the way to expand its Databricks usage in the future.

About Metadata

Metadata is a leading marketing automation platform built for B2B marketers. The platform uses AI and data to automate paid campaign execution, helping companies generate more pipeline with less manual effort. By integrating with key data sources, Metadata optimizes targeting, creative, and budget allocation to maximize ROI across channels.

Project Challenges

Rising Databricks Costs Threatened Scalability

As Metadata expanded its data pipelines, Databricks costs started to climb — putting pressure on the company’s cloud budget.

When Databricks’ account team approached Metadata to discuss expanding the platform’s footprint, Metadata’s engineering leaders made one thing clear: cost was the primary blocker. They needed a proven, efficient optimization strategy to make further investment viable.

Solution

Zencore Delivers Targeted Cost Optimization for Databricks

Databricks introduced Metadata to Zencore to help evaluate optimization opportunities. Zencore's team conducted a deep-dive assessment of Metadata's Databricks implementation and made several recommendations, including:
  • Implementing job clusters instead of all-purpose clusters to optimize resource allocation
  • Leveraging Graviton instances on AWS, which offer lower-cost DBUs
  • Additional configuration and workload adjustments to enhance efficiency
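To illustrate the first two recommendations, a workload can be moved from an always-on all-purpose cluster to an ephemeral job cluster on Graviton nodes by declaring a `new_cluster` in the job definition. The sketch below builds a Databricks Jobs API-style payload; the job name, notebook path, runtime version, and node type are illustrative assumptions, not Metadata's actual configuration:

```python
# Hypothetical sketch of a Databricks job definition using an ephemeral job
# cluster on AWS Graviton (ARM) instances. All names and versions are
# illustrative, not taken from Metadata's environment.
job_spec = {
    "name": "nightly-etl",
    "tasks": [
        {
            "task_key": "transform",
            "notebook_task": {"notebook_path": "/Jobs/nightly_etl"},
            # "new_cluster" makes this a job cluster: it is created when the
            # run starts and terminated when it ends, so compute is billed at
            # the lower Jobs Compute DBU rate rather than the all-purpose rate.
            "new_cluster": {
                "spark_version": "13.3.x-scala2.12",
                "node_type_id": "m6g.xlarge",  # Graviton instance family
                "num_workers": 2,
            },
        }
    ],
}
```

Because the cluster exists only for the duration of the run, there is no idle all-purpose cluster accruing DBUs between jobs, which is where much of the savings in this kind of migration typically comes from.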

By implementing these recommendations, Metadata improved cost efficiency without sacrificing ETL workload stability and moved forward with Databricks as a long-term solution.


“The recommendations from Zencore were exactly the right balance between cost savings and technical architecture optimizations. They allowed us to lower our DBU pricing and we are on the right track to expand our Databricks usage while maintaining reasonable expenses.”

Francisco Martin | Director of Data Engineering | Metadata

Results

20% Cost Savings with Minimal Effort

After making targeted changes, including shifting to job clusters with Graviton instances, Metadata’s Director of Data Engineering reported:

  • ~20% reduction in Databricks spend for existing jobs
  • Improved ETL workload stability
  • Faster time-to-value, with optimizations implemented with minimal effort
WANT TO DISCUSS YOUR DATABRICKS ENVIRONMENT?

Let's Talk