I don't think OP's proposed formula is far off the mark as a conservative estimate. Assume "Interval Time" means the interval between snapshots (e.g., T = 15 min), and "Time that Snapshot Exists" is exactly what it sounds like, essentially equivalent to the backup time (say B = 5 min). Then, averaged over a long enough period that the rate of disk writing can be treated as roughly constant, the wear and tear on the source drive increases by a percentage less than or equal to the ratio R = B/T.
If, on average, the fraction of write operations that produce only new data (and do not overwrite snapshotted data) is N, then the fractional increase in write operations on the source drive is R = (1 - N)B/T. So OP's estimate holds in the worst case (N = 0), but is an overestimate whenever N > 0.
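For anyone who wants to plug in their own numbers, here is a minimal Python sketch of that formula. The values for B, T, and N below are just the example assumptions from this thread, not measurements:

# Fractional increase in write activity due to snapshot copy-on-write overhead.
# B = time the snapshot exists (min), T = interval between snapshots (min),
# N = fraction of writes that only add new data (never overwrite snapshotted blocks).
def write_overhead(B, T, N=0.0):
    return (1.0 - N) * B / T

# Example values from this thread (assumptions, not measurements):
print(write_overhead(B=5, T=15, N=0.0))   # 0.333... -> ~33% more writes in the worst case
print(write_overhead(B=5, T=15, N=0.5))   # 0.166... -> ~17% if half the writes are new data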
The absolute worst case scenario is N = 0 and B = T (Reflect will not start a new backup until the previous one has finished). In this case R = 1, i.e., a 100% increase -- in other words, the write activity is effectively doubled and the drive's lifespan is cut in half. So, based on JP's use case, the drive might last 40 years instead of 80.
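And the same arithmetic for that absolute worst case, using the 80-year baseline quoted for JP's use case (the baseline is just the number from this thread, not a spec):

# Worst case: N = 0 and B = T, so R = (1 - 0) * B / T = 1 (write activity doubled).
B, T, N = 15, 15, 0.0
R = (1.0 - N) * B / T                     # = 1.0
baseline_years = 80                       # lifespan estimate quoted for JP's use case
adjusted_years = baseline_years / (1.0 + R)
print(R, adjusted_years)                  # 1.0 40.0 -> lifespan cut in half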
Edited to Add: OP posted their follow-up response at the same time that I was composing the above, but it seems like we came to the same conclusion.