I maintain a lot of libraries on GitHub. I'd guess about 20 that I'm actively watching, and access to another 20 that I can help with if pinged. Many of them are stable, low activity projects. They might get an occasional bug fix or new feature, but are mostly quiet. Scheduled dependency updates have overwhelmed that tranquility. I've now disabled scheduled updates, replacing them with a local update command.
Pinning development dependencies for these projects ensures that everyone has the same development environment, and that updates and issues can be addressed rather than coming as a surprise at an inopportune moment. But you do want to update those dependencies so you continue working with maintained versions. Enter scheduled updates.
We have three ecosystems that we pin: Python requirements files with pip-compile, pre-commit hooks, and GitHub Actions in CI workflows. All three have services to create monthly update PRs. Dependabot can be configured to create one PR per ecosystem: one for Python and one for Actions. And pre-commit.ci will create a PR for all hooks.
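For context, a grouped monthly Dependabot setup along those lines looks roughly like the following dependabot.yml. This is a sketch, not my actual config; the group names and the catch-all patterns are illustrative.

```yaml
version: 2
updates:
  # One grouped PR per month for pinned Python requirements.
  - package-ecosystem: "pip"
    directory: "/"
    schedule:
      interval: "monthly"
    groups:
      python-requirements:
        patterns: ["*"]
  # One grouped PR per month for actions used in workflows.
  - package-ecosystem: "github-actions"
    directory: "/"
    schedule:
      interval: "monthly"
    groups:
      github-actions:
        patterns: ["*"]
```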
Three PRs per project per month is really noisy. I dread the first of each month, when I wake up to approximately 60 PR notifications (3 ecosystems * 20 watched projects). Other people, such as occasional contributors, watch the projects as well. All these notifications that are irrelevant to them make it less likely they'll pay attention to the notifications that they could help with.
Addressing each PR requires a page load, scroll, then two clicks to merge. If any cause test failures, I now have to context switch into projects I haven't looked at in months, figure out the fixes (often just MyPy improvements), then commit, push, wait for tests, then merge. It's busy work.
For quiet projects, these PRs outnumber the PRs and commits made by actual people. Searching becomes hard when pages of closed PRs are dependency updates rather than actual fixes or features. Even for big or active projects, it's really noisy, and it makes projects look like they're getting active attention when they're actually stable and unchanged.
What if I just ignore the projects I'm not actively working on? pre-commit.ci is nice and will update an existing unmerged PR with the latest updates, so it doesn't litter the history as much. Dependabot will only do this if you don't use grouped updates, which would instead mean one PR per dependency per repo per month; with grouped PRs, it closes the old PR and opens a new one each time. And since GitHub sends notifications when commits are pushed to PRs, I end up with the same number of notifications either way.
For an application, especially with continuous deployments set up, scheduled updates probably make sense. You want to deploy with any bug fixes or security fixes as soon as possible. But for libraries, these dependencies are only running locally, as development environments. While new fixes and features are nice, they don't need immediate, constant attention.
I can run pip-compile and pre-commit locally to update those manually. But as far as I could find, there was no way to update Actions locally; only Dependabot provided that. So I wrote gha-update, a simple tool that reads all workflow files and uses the GitHub API to find the highest version of each action it finds. It updates pins to commit hashes, with tag names as comments for easy reference. With that written, I now had update commands I could run locally for each ecosystem.
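The rewriting half of that idea can be sketched in a few lines of Python. Everything here is illustrative, not gha-update's actual code: the regex, the pin_workflow function, and the pre-resolved RESOLVED mapping (with a placeholder hash) stand in for the step where the real tool queries the GitHub API for each action's newest tag and its commit hash.

```python
import re

# Hypothetical pre-resolved mapping from action name to (commit hash, tag).
# gha-update builds this by querying the GitHub API; here it is stubbed
# with a placeholder hash for illustration.
RESOLVED = {
    "actions/checkout": ("0123456789abcdef0123456789abcdef01234567", "v4"),
}

# Match a workflow line like "      - uses: owner/repo@ref".
USES_RE = re.compile(r"^(\s*-?\s*uses:\s*)([\w.-]+/[\w.-]+)@.*$")

def pin_workflow(text: str, resolved: dict[str, tuple[str, str]]) -> str:
    """Rewrite each matching uses: line to pin the action to a commit
    hash, keeping the tag name as a trailing comment."""
    out = []
    for line in text.splitlines():
        m = USES_RE.match(line)
        if m and m.group(2) in resolved:
            sha, tag = resolved[m.group(2)]
            line = f"{m.group(1)}{m.group(2)}@{sha} # {tag}"
        out.append(line)
    return "\n".join(out)
```

In practice the interesting work is filling in that mapping: listing each repository's tags, picking the highest version, and resolving it to a commit hash so the pin can't move even if the tag does.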
Now I can update a project's dependencies when I'm actively working on that project. If I come back to a project after a few months, I know first that the pins will give me a working environment, and if I need something new I can update right then. I managed to update every project before today, so for the first time in years I did not wake up to multiple pages of busy work notifications!
I use the following tox environments, each with an update label, so I can run each tool individually or all at once with tox run -m update.
[testenv:update-actions]
labels = update
deps = gha-update
commands = gha-update

[testenv:update-pre_commit]
labels = update
deps = pre-commit
skip_install = true
commands = pre-commit autoupdate -j4

[testenv:update-requirements]
labels = update
deps = pip-tools
skip_install = true
change_dir = requirements
commands =
    pip-compile build.in -q {posargs:-U}
    pip-compile docs.in -q {posargs:-U}
    pip-compile tests.in -q {posargs:-U}
    pip-compile typing.in -q {posargs:-U}
    pip-compile dev.in -q {posargs:-U}
I can also use all-repos to call this for every project, run tests, and create a PR or push to main. Or in the future I can run this with a scheduled GitHub workflow. But it will be on my schedule.