When the requested objects are present in swh-graph, we should be able to approximate the total runtime of vault cooking tasks, which I expect to be linear in the number of objects of each type.
How to do it:
1. use data in swh-scheduler's database to get the run time of cooking each root object
2. [[ https://docs.softwareheritage.org/devel/swh-graph/api.html#counting-results | use swh-graph ]] to compute the number of objects of each type (cnt + dir + rev should be enough) reachable from that root object
3. run a linear regression to obtain a model of the runtime as a function of the number of objects of each type
4. every time we get a cooking request, query the counts in swh-graph (as in step 2) and use the model to estimate the run time
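Steps 3 and 4 could be sketched roughly as below. This is only an illustration, not an actual implementation: the training data (object counts per root and observed runtimes) is made up here, and in practice would come from swh-scheduler's database and swh-graph's counting endpoints.

```python
import numpy as np

# Hypothetical training data (illustrative values only): for each past
# cooking task, the counts of reachable objects (cnt, dir, rev) and the
# observed runtime in seconds, as gathered in steps 1 and 2.
counts = np.array([
    [1000,  200,  50],
    [5000,  900, 300],
    [200,    40,  10],
    [8000, 1500, 500],
], dtype=float)
runtimes = np.array([12.0, 55.0, 3.0, 90.0])

# Step 3: least-squares linear regression, with an intercept column to
# account for per-task fixed overhead.
X = np.hstack([counts, np.ones((len(counts), 1))])
coef, *_ = np.linalg.lstsq(X, runtimes, rcond=None)

# Step 4: given the counts for a new cooking request (queried from
# swh-graph), estimate its runtime with the fitted model.
def estimate_runtime(n_cnt: int, n_dir: int, n_rev: int) -> float:
    return float(np.array([n_cnt, n_dir, n_rev, 1.0]) @ coef)
```

With a model like this, the estimate could be returned alongside the cooking request's status so users know what to expect.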
This would be a great UX improvement, as some gitfast/git-bare tasks can be really long.