Automating dependency updates: the big debate
Updated: Apr 21
Someone recently told me about Renovate, and that sparked an interesting debate at Equal Experts over what turned out to be a hotter topic than I thought: automating dependency updates. To me the issue was crystal clear: do it! But then I heard some interesting arguments for the opposite view. As usual, I don't think there is a universally best approach; context will be the main determinant of your decision. Still, I thought I could offer some guidance to the wider community by addressing what seem to be the main concerns about automated dependency updates. There is real risk in automating dependencies, and it all comes down to a cost/benefit analysis. In my case, this is the way I see it:
New dependencies breaking things: this risk can be mostly (but never entirely) addressed by having a comprehensive build that tests your service against the new version and only accepts it if the build passes. There is also a counter-argument: by updating dependencies manually rather than automatically, you're likely to update less often; by updating less often you increase the amount of change you bring in when you do update; and more change per update means a higher risk of something breaking. So even if you worry about dependency updates breaking your system, your risk is still smaller when you update more often, which automation lets you do with minimal overhead.
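As a concrete sketch of that gating idea: if a tool like Renovate raises pull requests for updates, an ordinary CI workflow that runs the full test suite on every pull request is what decides whether the new version gets in. Here is a minimal GitHub Actions example; the job names and the `npm test` command are assumptions for illustration, not a prescription:

```yaml
# .github/workflows/ci.yml (hypothetical example)
# Runs on every pull request, including those opened by an update bot,
# so a dependency bump only merges if the build and tests pass.
name: ci
on:
  pull_request:

jobs:
  build-and-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: npm ci        # install the exact versions from the updated lockfile
      - run: npm test      # comprehensive test suite acts as the gate
```

The key point is that no special machinery is needed: whatever quality gate you already trust for human changes is the same gate that protects you from a bad automated update.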
Increased number of releases: while I think it is highly desirable to deploy to production every single version produced by your build pipeline, I don't think it's strictly necessary. You could have nightly builds that update your dependencies without immediately deploying the result to production. I tend to set up nightly update builds (for some services more than one, so as to update different sets of dependencies), but then I might deploy once a week or so; we just deploy whatever is latest from our pipeline.
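The "nightly builds for different sets of dependencies" arrangement can be expressed directly in a Renovate configuration, using its schedules and package grouping. A minimal sketch, assuming Renovate's standard `renovate.json` options (the package names and schedule windows here are illustrative, not a recommendation):

```json
{
  "extends": ["config:recommended"],
  "schedule": ["after 10pm and before 5am every weekday"],
  "packageRules": [
    {
      "matchPackageNames": ["eslint", "prettier"],
      "groupName": "dev tooling"
    },
    {
      "matchPackageNames": ["express"],
      "groupName": "runtime dependencies"
    }
  ]
}
```

Each `groupName` yields its own combined update PR, so different dependency sets land (and can fail) independently, while the `schedule` keeps the churn out of working hours.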
Updating with intent: I agree with the spirit of "you should update your dependencies for a reason, not just because"; a lot of waste in IT comes from doing things blindly. Here, however, I also think there is a risk/benefit analysis to be done. Software is in continuous flux: there are new vulnerabilities and bugs every day. I could wait until I hear about a particular vulnerability before updating to address it, or wait until I'm hit by a bug, or I could be proactive and always use the latest version; this way I'm likely to address issues before I even know about them. A similar argument can be made for performance improvements: a new version of a dependency may run faster or use less memory, so why wait until I have a performance issue to adopt it? Sure, a new version could bring bad news too, but that is less frequent and usually spotted and fixed relatively quickly.
All the above, however, rests on some implicit assumptions, such as the team having a healthy Continuous Integration / Deployment / Delivery pipeline. But what if you don't? For a good breakdown of the reasons that may make automating dependency updates not such a good idea, take a look at The Case Against Automatic Dependency Updates.
In summary, although automated dependency updates bring some risks, I do believe that under the right circumstances the benefits far outweigh them. Having said that, in the end you know your system better than anyone, and therefore no one is better placed than you to assess whether you can benefit from automated dependency updates.
What's your take?