Pricing

$0.10

per hour of a Spark task

  • Unlimited Users
  • Unlimited Applications
  • Logs & Metrics Monitoring
  • Single Sign On
  • VPN and VPC Peering Support
  • 3rd Party Integrations
  • Slack & Community Support
  • 14-Day Free Trial

A single pay-as-you-go plan

A fair pricing mechanism

Competing data platforms base their fee on server uptime and charge you whether those servers are actually being used by Spark or sitting idle. Up to 80% of a typical bill comes from wasted compute infrastructure.

At Data Mechanics, we charge when you do real work, not when your servers are idle. This means we’re incentivized to make your infrastructure as efficient as possible and cut down the waste, so that you can reduce your cloud bill and its environmental impact.

Pricing mechanics

Frequently Asked Questions

We’ve designed our Pay-As-You-Go plan with your interests in mind, so let us be transparent about it.

How do you compute the duration of our Spark tasks?

We export the Spark event logs of each application running on the platform and sum the duration of all the Spark tasks. It’s the same information that you can see in the Spark UI, reported by Spark with millisecond accuracy.
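As an illustration, here is a minimal Python sketch of how these durations can be summed from an uncompressed event log. This is a simplified example, not our production billing code, and the file path is hypothetical; it relies on the standard Spark event log format, where each completed task appears as a SparkListenerTaskEnd event with launch and finish timestamps in epoch milliseconds.

```python
import json

def total_task_hours(event_log_path):
    """Sum the duration of all Spark tasks recorded in an event log file."""
    total_ms = 0
    with open(event_log_path) as f:
        for line in f:
            event = json.loads(line)
            # Each completed task is logged as a SparkListenerTaskEnd event,
            # whose "Task Info" block carries launch/finish times in epoch ms.
            if event.get("Event") == "SparkListenerTaskEnd":
                info = event.get("Task Info", {})
                total_ms += info["Finish Time"] - info["Launch Time"]
    return total_ms / (1000 * 60 * 60)

if __name__ == "__main__":
    hours = total_task_hours("spark-eventlog.json")  # hypothetical file path
    print(f"Total Spark task time: {hours:.2f} h -> fee at $0.10/h: ${hours * 0.10:.2f}")
```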

What do I pay if I don’t run any Spark command?

If you don’t run any Spark command, there is no Data Mechanics fee. This can happen if you run a pure Python/Scala application, whether from a notebook or submitted through our API.

You will still incur costs from the cloud provider though, so it’s a good idea to shut down a notebook once you’re done with your work, so that all the pods are destroyed and the Kubernetes cluster can autoscale down.

How can I track my Data Mechanics costs?

As soon as an application finishes, our dashboard provides you with this information. For recurring applications (we call them "jobs"), you can also track the evolution of your Data Mechanics costs over time, along with other key metrics. Finally, at the end of each month, we produce a billing report with detailed information and a useful cost attribution breakdown.

Are you more expensive or cheaper than this other service?


If you're currently on another Spark platform, you're probably using a small number of static cluster and Spark configurations for most of your workloads. You're likely to suffer from resource overprovisioning, long periods of idleness, and parallelization issues, as in the graph below.

When these problems happen, other Spark platforms will charge you for the total server uptime, including the wasted compute time. At Data Mechanics, not only do we charge you only for the compute time actually used, we also tune your configurations automatically and continuously for each of your Spark applications to eliminate the waste altogether.

Some of our customers have reduced their costs by over 50% since they migrated to our platform.

Can I use my free cloud credits?

At the end of the month, you get a bill from the cloud provider and a bill from Data Mechanics. Your cloud credits will apply to the cloud provider bill, which makes up the larger portion of your total costs.

Can I set a quota to limit my Data Mechanics expenses?

You have control over the autoscaling behavior of the Kubernetes cluster and of each Spark application. For example, you can set a maximum size for the cluster and for each Spark application, as sketched below.
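As an illustration, here is a sketch of capping the size of a single application using standard Spark settings. This is a simplified example rather than a Data Mechanics-specific API; the application name and the exact limits are placeholders.

```python
from pyspark.sql import SparkSession

# Illustrative sketch: bound how large one Spark application can grow,
# using standard Spark configuration keys.
spark = (
    SparkSession.builder
    .appName("quota-capped-app")  # hypothetical application name
    # Let the application scale up and down, but never beyond 10 executors.
    .config("spark.dynamicAllocation.enabled", "true")
    .config("spark.dynamicAllocation.shuffleTracking.enabled", "true")  # required for dynamic allocation on Kubernetes
    .config("spark.dynamicAllocation.maxExecutors", "10")
    # Bound the footprint of each executor.
    .config("spark.executor.cores", "4")
    .config("spark.executor.memory", "8g")
    .getOrCreate()
)
```

On the cluster side, the equivalent knob is the maximum node count of the autoscaling Kubernetes node pool, which puts a hard ceiling on the underlying cloud spend.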

Ready to get started?
