The best way to learn about the impact of caching on CI is to set up the same builds with the cache enabled and disabled, and compare the results over time. If you have a single Gradle build step that you want to enable caching for, it’s easy to compare the results using your CI system’s built-in statistical tools.
Measuring complex pipelines may require more work or external tools to collect and process measurements.
It’s important to distinguish the parts of the pipeline that caching has no effect on, for example, the time builds spend waiting in the CI system’s queue, or the time taken to check out source code from version control.
When using Develocity, you can use the Export API to access the necessary data and run your analytics.
Develocity provides much richer data compared to what can be obtained from CI servers.
For example, you can get insights into the execution of single tasks, how many tasks were retrieved from the cache, how long it took to download from the cache, the properties that were used to calculate the cache key and more.
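As a sketch of what such analytics might look like, the snippet below summarizes cache effectiveness across a set of builds. The record shape is hypothetical, invented for illustration; adapt the field names to the data you actually export from Develocity.

```python
# Sketch: summarizing cache effectiveness from exported build data.
# The record fields below (tasks_total, tasks_from_cache, cache_download_ms)
# are hypothetical placeholders, not the actual Export API schema.

builds = [
    {"id": "b1", "tasks_total": 120, "tasks_from_cache": 95, "cache_download_ms": 850},
    {"id": "b2", "tasks_total": 120, "tasks_from_cache": 10, "cache_download_ms": 90},
    {"id": "b3", "tasks_total": 118, "tasks_from_cache": 110, "cache_download_ms": 1200},
]

def cache_hit_rate(build):
    """Fraction of tasks whose outputs were retrieved from the cache."""
    return build["tasks_from_cache"] / build["tasks_total"]

rates = [cache_hit_rate(b) for b in builds]
avg_hit_rate = sum(rates) / len(rates)
print(f"average cache hit rate: {avg_hit_rate:.1%}")
```

Tracking a metric like this over time, rather than for a single build, is what reveals whether cache effectiveness is improving or regressing.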
When relying on your CI server’s built-in functions, you can use statistic charts if you use TeamCity for your CI builds. Most of the time, you will end up extracting data from your CI server via the corresponding REST API (see the Jenkins remote access API and the TeamCity REST API).
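A minimal sketch of such an extraction, using the Jenkins JSON API, might look like the following. The base URL and job names are placeholders; the `tree` query parameter is part of the standard Jenkins remote access API and limits the response to the fields you need.

```python
# Sketch: pulling per-build durations from the Jenkins remote access API
# so a cache-enabled job can be compared against a cache-disabled one.
# Host and job names are placeholders for illustration.
import json
from urllib.request import urlopen

def fetch_durations(base_url, job):
    url = f"{base_url}/job/{job}/api/json?tree=builds[number,duration,result]"
    with urlopen(url) as resp:
        data = json.load(resp)
    # duration is reported in milliseconds; keep successful builds only
    return [b["duration"] for b in data["builds"] if b["result"] == "SUCCESS"]

def average_ms(durations):
    return sum(durations) / len(durations)

# Example comparison (placeholder host and job names):
# cached = fetch_durations("https://ci.example.com", "app-cache-on")
# uncached = fetch_durations("https://ci.example.com", "app-cache-off")
# print(f"cache saves {average_ms(uncached) - average_ms(cached):.0f} ms on average")
```

The same approach works against the TeamCity REST API; only the endpoint and response shape differ.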
Typically, CI builds above a certain size include parallel sections to utilize multiple agents. With parallel pipelines you can measure the wall-clock time it takes for a set of changes to go from having been pushed to version control to being built, verified and deployed. The build cache’s effect in this case can be measured in the reduction of the time developers have to wait for feedback from CI.
You can also measure the cumulative time your build agents spend building a changeset, which gives you a sense of how much work your CI infrastructure has to perform. Here the cache’s effect is less money spent on CI resources, since you don’t need as many CI agents to build the same number of changes.
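The two metrics above can be sketched from the start and end timestamps of your pipeline stages. The stage data below is made up for illustration; in practice it would come from your CI server’s API.

```python
# Sketch: wall-clock time vs. cumulative agent time for a parallel pipeline.
# Stage timings are fabricated example values, in seconds since the
# pipeline was triggered.

stages = [  # (name, start, end)
    ("compile",      0,  300),
    ("unit-tests", 300,  720),
    ("int-tests",  300,  900),   # runs in parallel with unit-tests
    ("deploy",     900, 1020),
]

# Wall-clock time: how long a developer waits for feedback.
wall_clock = max(end for _, _, end in stages) - min(start for _, start, _ in stages)

# Cumulative agent time: how much work the CI infrastructure performs.
cumulative = sum(end - start for _, start, end in stages)

print(f"wall-clock: {wall_clock}s, cumulative agent time: {cumulative}s")
```

Note that the two numbers diverge as soon as stages run in parallel: the cache can shrink both, but only wall-clock time reflects the feedback delay developers actually experience.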