Lead Time for Changes

DORA metrics are a set of four diagnostic measures of the productivity, velocity, and efficiency of your software development teams. Of the four metrics, lead time for changes encompasses the widest scope, accounting for the total duration of every step between a commit and its release to production.

DORA metrics, also known as DevOps Research and Assessment metrics, are designed to measure your developers’ productivity throughout the software development lifecycle. By measuring the four DORA metrics, engineering leaders can gain a better understanding of where their developers meet, exceed, or fall short of production expectations, and locate exactly where improvements need to be made.

In this article, we’ll discuss the DORA metric lead time for changes, how to measure it, and how to improve your lead time with various tools.

What is DORA lead time for changes?

Google defines lead time for changes as the amount of time it takes a commit to get into production. For larger projects with multiple commits before a merge, you can measure lead time for changes by finding:

  • The time of the first commit
  • The time of the merge
  • The number of commits that took place between those times

Essentially, the clock starts once a piece of code has been committed, and runs until that code has been deployed and is available in production.
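As a rough illustration, here is a minimal sketch of that calculation in Python. The timestamps are made up for the example; in practice, the commit time would come from your version control system and the deployment time from your CI/CD or release tooling.

```python
from datetime import datetime

# Hypothetical timestamps for a single change. In practice, the commit time
# would come from your version control system and the deployment time from
# your CI/CD or release tooling.
first_commit_time = datetime.fromisoformat("2024-05-01T09:15:00")
production_deploy_time = datetime.fromisoformat("2024-05-03T14:40:00")

# Lead time for changes: the clock starts at the commit and stops when the
# change is live in production.
lead_time = production_deploy_time - first_commit_time
print(f"Lead time for this change: {lead_time}")  # 2 days, 5:25:00
```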

A shorter lead time for changes typically means your software engineers are operating at a high level: they are able to write high-quality code quickly, submit it properly and according to standards, and deploy that code without issue. 

On the other hand, longer lead times can sometimes indicate that there are obstacles in your team’s way, from inefficiencies in process to increased cognitive load on developers working with complex technologies. 

Why is DORA lead time for changes important?

Understanding your lead time for changes is important for measuring and optimizing your team’s velocity. Some practical applications for lead time for changes include:

  • Setting expectations for new feature releases
  • Planning launches and communicating with product and marketing teams about release dates
  • Planning work for each sprint 
  • Managing incidents and estimating how long it will take to resume normal service

Working with lead time for changes in this way can help you shorten work cycles, which in turn can improve other DORA metrics like deployment frequency and mean time to recovery for incidents. A shorter lead time for changes also typically corresponds with more value delivered to the customer: if code is deployed quickly and regularly, end users get fixes and improvements sooner.

A shorter lead time for changes also boosts developer satisfaction. When you establish a baseline estimate for how long something should take, developers are motivated to meet or beat that estimate, which encourages a culture of continuous improvement and productivity.

DORA metrics also pair speed with a counter metric, change failure rate, which discourages speed for speed’s sake: developers are incentivized to beat estimated completion times with high-quality work, not just fast work.

Measuring DORA lead time for changes

The DORA working group identified four categories of lead time performers:

  1. Elite performers: Lead time is less than one hour
  2. High performers: Lead time is between one day and one week
  3. Medium performers: Lead time is between one week and one month
  4. Low performers: Lead time is more than six months
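As a simple illustration, the thresholds above can be turned into a small classifier. This is only a sketch: the published bands leave gaps (for example, between one hour and one day), so values falling in those gaps are assigned to the adjacent lower-performing tier here, which is a simplification of the example rather than part of the DORA definition.

```python
from datetime import timedelta

def dora_lead_time_tier(lead_time: timedelta) -> str:
    """Bucket a measured lead time into a DORA performance tier.

    The boundaries follow the list above; lead times that fall between the
    published bands are assigned to the adjacent lower-performing tier,
    a simplification made for this sketch.
    """
    if lead_time < timedelta(hours=1):
        return "Elite"
    if lead_time < timedelta(weeks=1):
        return "High"
    if lead_time < timedelta(days=30):  # roughly one month
        return "Medium"
    return "Low"

print(dora_lead_time_tier(timedelta(days=2, hours=5)))  # High
```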

You might notice that these benchmarks align closely with agile sprint planning, where development teams plan work in two-week or one-month sprints. But even the best-laid plans are subject to change when new work emerges or incidents occur.

Gathering information on lead time can also be tedious and time-consuming, because it requires detailed knowledge of how individual development teams work. Differences in working style or standards across teams can further skew the mean lead time for changes across your entire organization.

Some common challenges teams face when attempting to measure their lead time for changes include:

  • Internal documentation is not clear enough to shepherd junior developers through complex tasks like resource provisioning
  • SREs may need to contact developers on other teams to resolve incidents, extending downtime 
  • Teams with higher workloads can take longer to review PRs
  • Teams with specialized knowledge spend more time resolving routine issues than building new features
  • As applications grow and mature, maintenance tasks increase in volume as a result of increased complexity

With this in mind, it’s important to remember that your lead time for features may be different from your lead time for incident fixes, which is why this metric is sometimes reported as an average across all changes.
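To make that concrete, here is a small sketch of aggregating per-change lead times by change type and overall. The change types and the lead time values are made-up assumptions for the example.

```python
from datetime import timedelta

# Hypothetical lead times collected per change, tagged by type of change.
lead_times = [
    ("feature", timedelta(days=3, hours=4)),
    ("feature", timedelta(days=1, hours=12)),
    ("incident_fix", timedelta(hours=2)),
    ("incident_fix", timedelta(hours=5)),
]

# Average lead time per change type.
by_type: dict[str, list[timedelta]] = {}
for change_type, lead_time in lead_times:
    by_type.setdefault(change_type, []).append(lead_time)

for change_type, values in by_type.items():
    average = sum(values, timedelta()) / len(values)
    print(f"{change_type}: average lead time {average}")

# Overall average across all changes, regardless of type.
overall = sum((lt for _, lt in lead_times), timedelta()) / len(lead_times)
print(f"overall: average lead time {overall}")
```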

Strategies to improve DORA metrics

Measuring your performance against any DORA metric can quickly become difficult for a variety of reasons, especially for large development teams. If teams are siloed, they may be using different standards or pushing code to production at different levels of quality. These kinds of differences mean work isn’t comparable across teams.

If you want to improve your lead time for changes metric, or any other DORA metric, here are some ideas for getting started:

  • Survey your developers: You won’t know what to fix without asking! Make sure you ask questions about pain points and consider all feedback you receive when making your final plan.
  • Set working agreements: Working agreements can help you standardize work across teams, making it easier to find your true mean lead time for changes. Be sure to consider your developers’ input when creating your working agreements.
  • Encourage thorough code testing: Many developers run code through automated tests and QA before submitting it to production, which speeds up the overall process. Set a standard among your teams for minimum code testing requirements.
  • Use an internal developer portal: An internal developer portal’s software catalog unifies information from all of the tools in your software development lifecycle, providing a single pane of glass into the health and efficiency of your entire pipeline. But what’s more, you can improve on these metrics through the portal by adding new elements to the software catalog or creating self-service actions. For instance, deployment frequency can be improved by optimizing the steps to bring a feature to production with self-service actions for scaffolding a new service or spinning up a developer environment.

After gathering all of this information, engineering leaders are still left with the task of monitoring productivity and output. Internal developer portals make it easy for:

  • Developers to find information and do what they need to deploy code
  • Platform engineers to standardize deployments and maintain standards
  • Managers to measure performance against DORA metrics

With everything relevant to your software development pipeline in one place, you can create baseline expectations for your teams using real data, build experiences in the portal to serve specific needs with golden paths, and track adherence to engineering standards with scorecards.

Use Port’s internal developer portal to measure DORA lead time

Port offers an open internal developer portal that you can customize according to your engineering organization's specific needs.

The internal developer portal is an ideal hub for centralizing your engineering metrics. Its software catalog aggregates all data tied to the software development lifecycle, including details on microservices, APIs, cloud assets, pull requests, compliance, and beyond, enabling you to identify and track all your DORA metrics.

By defining the thresholds you want to see for your average lead time, you can track it with the portal’s scorecards, which also lets you turn working agreements into measurable metrics. Team members can get tailored dashboards that incorporate those DORA metric scorecards. Real-time tracking makes working agreements actionable, allowing team members to monitor performance and stay aligned with expectations.

Then, you can drive continuous improvement by acting on the metrics within the portal, for example by using self-service actions to automate tasks in testing or deployment.

Want to learn more about internal developer portals and how they impact your lead time for changes? Port’s open demo is a great place to learn!

Let us walk you through the platform and catalog the assets of your choice.
