archive data ... accumulate changes per file until threshold size
core
Group: Forum Members
Posts: 7, Visits: 18
Is it possible to create a backup job which accumulates changes in one file until a threshold size and then starts into another file?
For archive data that is essentially unchanged once written (e.g. just keeps growing like camera photos and movies), I would like to accumulate new data into Blu-Ray size chunks.  Then I would periodically burn one or more optical copies and move one copy off site.
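To illustrate the kind of accumulation I have in mind, here is a rough sketch of doing it outside Reflect: group files into numbered chunk directories, starting a new chunk once the running total would pass a size limit. The `chunk_files` name, the paths, and the 25 GB figure are all placeholders, not anything Reflect provides.

```shell
# chunk_files SRC DEST LIMIT_KB: hard-link files from SRC into
# DEST/chunk-N directories, starting a new chunk once adding the next
# file would push the running total past LIMIT_KB (~25 GB for Blu-Ray).
chunk_files() {
    src=$1; dest=$2; limit=$3
    chunk=1; used=0
    mkdir -p "$dest/chunk-$chunk"
    find "$src" -type f | sort | while read -r f; do
        size=$(du -k "$f" | cut -f1)          # file size in KB
        if [ "$used" -gt 0 ] && [ $((used + size)) -gt "$limit" ]; then
            chunk=$((chunk + 1)); used=0      # threshold reached: start new chunk
            mkdir -p "$dest/chunk-$chunk"
        fi
        ln "$f" "$dest/chunk-$chunk/"         # hard link; no data is copied
        used=$((used + size))
    done
}

# Example (illustrative paths): chunk_files /data/photos /archive/chunks $((25 * 1000 * 1000))
```

Because the chunks are hard links, already-written files never change, and each filled chunk directory is ready to burn as-is.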

I would expect the job to also not modify the past backup files that reached the threshold size.

On a related note, I've noticed jobs with full/differential/incremental will update older files.  That caught me by surprise; I don't know whether that is part of the consolidation process or something else.  It feels like that mutability unnecessarily complicates independently making a 2nd copy of backup sets.  I've updated my rsync recipe to handle the mutability, though it forced me to use "--backup-dir", "--delete", and some other options.
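A minimal sketch of that kind of recipe (the `replicate_set` name and the example paths are placeholders, not my exact command): mirror the backup folder with `--delete`, but divert anything rsync would overwrite or remove into a dated `--backup-dir` instead of losing it.

```shell
# replicate_set SRC DST BAKDIR: mirror SRC to DST; any destination file
# that would be overwritten or deleted is moved into BAKDIR first, so a
# rewritten older backup file never silently destroys the prior copy.
replicate_set() {
    rsync -a --delete --backup --backup-dir="$3" "$1/" "$2/"
}

# Example (illustrative paths):
#   replicate_set /mnt/backups/reflect /mnt/offsite/reflect \
#       "/mnt/offsite/superseded/$(date +%F)"
```

The dated backup-dir keeps one folder of superseded file versions per run, which can be pruned once the off-site copy is verified.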

"It does not do to leave a live dragon out of your calculations, if you live near him." -J.R.R. Tolkien
jphughan
Macrium Evangelist
Group: Forum Members
Posts: 7.3K, Visits: 52K
There is no way to have a backup written directly into an existing file as an update of that file, nor is there a way to store multiple backup "states" within a single file.  And in any case, setting a target size threshold would be problematic because Reflect won't always know in advance how large the backup will be.  Compression has varying levels of effectiveness on different types of data, and when CBT is not being used, Reflect doesn't even know upfront precisely how much data would need to be backed up in a Diff/Inc.  The "Looking for changes" phase tells Reflect which files have changed, but not how many blocks in each file have changed.  That determination occurs as Reflect scans those files while the backup progresses.

As for older backups being updated, that is either Incremental Merge (animation here) or Synthetic Fulls (animation here).  One or the other must be used in order to have an Incremental retention policy, since those are the only options for purging old Incrementals in a way that does not invalidate later backups in the Incremental chain.

If you don't want your old backups to be modified, simply disable your Incremental retention policy so that Reflect allows the Incrementals to grow unconstrained.  In that case, though, your only option would be for them to be purged as a result of a parent Diff or Full being purged, which means you'd be purging Incrementals in groups rather than one at a time.

If purging Incrementals as a group isn't acceptable for your purposes, then typically if you're replicating backups you'll want to disable Synthetic Fulls, which will be disabled anyway if you're using Diffs or if you set your retention policy in terms of time rather than number of backups.  In that setup, when a new Incremental runs, you'll have to replicate that new backup and an updated version of the oldest Incremental.  Again, that's the only way to purge old Incrementals without invalidating the rest of the chain, so it's not "unnecessarily" complicating anything.

Sort of bridging those two topics, Macrium does offer the standalone Consolidate.exe tool, with which you can create a Synthetic Full on demand; that might simplify archiving.  For example, if you had a Sunday Full and Incrementals for Monday through Saturday and you only wanted to store the Saturday state, you could use it to create a Synthetic Full of Saturday as a single file.  But a) you can't set a file size threshold, and b) you can't make Synthetic Fulls if you have a Diff anywhere in the set: in a set that contains Diffs, modifying the Full could invalidate other Diffs you might have, so that restriction exists as a safeguard.  Synthetic Fulls require that the set contain only a Full and Incrementals.

Edited 29 June 2020 3:09 AM by jphughan