Which Jobs Consume the Most? — Finding Your Storage Hogs
You have 200 backup jobs. Most are well-behaved — predictable sizes, reasonable retention. But somewhere in that list, there’s a job that silently consumes 40% of your total storage. A database dump that grows 10% monthly. A file server with 50,000 unchanged files that get re-read every night. A misconfigured Incremental that behaves like a Full.
In bconsole, you’d need to cross-reference `list jobs`, `list volumes`, and `list pools` to build this picture. In practice, nobody does — until storage runs out.
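If you wanted to build that picture by hand, you could query the catalog directly instead of cross-referencing bconsole output. The sketch below runs the aggregation against an in-memory SQLite stand-in; the `Job` and `Client` table and column names follow Bacula's catalog schema (`Job.Name`, `Job.ClientId`, `Job.JobBytes`, `Client.Name`), but the sample data and client names are fabricated for illustration.

```python
import sqlite3

# In-memory stand-in for the Bacula catalog. The real tables have many
# more columns; only the ones needed for the aggregation are modeled.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE Client (ClientId INTEGER PRIMARY KEY, Name TEXT);
CREATE TABLE Job (JobId INTEGER PRIMARY KEY, Name TEXT,
                  ClientId INTEGER, JobBytes INTEGER);
INSERT INTO Client VALUES (1, 'db01-fd'), (2, 'files01-fd');
INSERT INTO Job (Name, ClientId, JobBytes) VALUES
  ('db-dump',    1, 120 * 1073741824),
  ('db-dump',    1, 132 * 1073741824),
  ('fileserver', 2,  20 * 1073741824);
""")

# Total bytes written per (client, job), largest consumers first.
rows = con.execute("""
SELECT c.Name AS client, j.Name AS job,
       SUM(j.JobBytes) / 1073741824.0 AS gb
FROM Job j JOIN Client c ON c.ClientId = j.ClientId
GROUP BY c.Name, j.Name
ORDER BY gb DESC
""").fetchall()

for client, job, gb in rows:
    print(f"{client:12s} {job:12s} {gb:8.1f} GiB")
```

Against a real catalog you would point the same query at Bacula's PostgreSQL or MySQL database — but that is exactly the scripting this article argues nobody keeps up with.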
The 80/20 of backup storage
In most environments, 20% of jobs consume 80% of storage. Finding those top consumers is the single most effective step toward optimization. But you need a view that shows storage consumption per job and per client — not just a job history.
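To make the 80/20 claim concrete, here is a minimal sketch of the analysis: rank jobs by total storage and measure what share the top 20% holds. The job names and GB figures are fabricated sample data, not output from any real Director.

```python
# Hypothetical per-job storage totals in GB (fabricated numbers).
usage = {
    "db-dump": 480, "mail": 210, "fileserver": 120, "wiki": 40,
    "dns": 5, "proxy": 4, "monitoring": 3, "ldap": 2,
    "ntp": 1, "bastion": 1,
}

total = sum(usage.values())
ranked = sorted(usage.items(), key=lambda kv: kv[1], reverse=True)

# Share of total storage held by the top 20% of jobs.
top_n = max(1, len(ranked) // 5)
top_share = sum(gb for _, gb in ranked[:top_n]) / total

print(f"Top {top_n} of {len(ranked)} jobs hold {top_share:.0%} of storage")
```

With these sample numbers, 2 of 10 jobs account for roughly 80% of the total — fixing just those two is where the savings are.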
What changes when you can see it
When you visualize storage by job and client:
- The database dump stands out immediately
- The misconfigured Incremental becomes obvious
- You can prioritize: fix one job, save 200 GB
This isn’t about micromanagement. It’s about having the right view to make informed decisions.
Onesimus Pro — storage consumption at a glance
Onesimus Pro will break down pool usage per job and client. No scripting, no cross-referencing, no spreadsheets. Connect to your Director, and the answer will be right there.
This is a Pro feature on the roadmap. The Community edition gives you a modern management interface today — jobs, clients, schedules, and basic pool/volume status. Pro adds the storage intelligence layer.
This is question 3 of 5. Together with question 1, these are the visibility features planned for Onesimus Pro.