  • DAR has an inconvenient restoration procedure: each incremental backup physically overwrites files from the previous step. So if a file changed 7 times, it would be extracted 7 times, and 6 of those copies would be wasted, overwritten by the 7th. Commented May 20, 2017 at 4:28
  • I would not recommend DAR, since it uses a nonstandard archive format. Also, have there been a sufficient number of successful incremental restores to verify usability? Note that GNU tar has advertised support for incremental backups since 1992, but still fails to restore non-trivial deltas. Commented May 12, 2020 at 8:02
  • @schily Using a committee-standardized format isn't the only relevant criterion for judging the quality of a backup program, and DAR's format is openly specified. I don't understand what you are getting at with your question: such a number would be hard to quantify for any backup program, and I don't see how GNU tar is relevant here. For example, star advertises compression support, but last time I checked it failed to report errors when compression was enabled during archive creation. Commented Jul 10, 2022 at 18:07
  • 1
  • Restic also has deduplication built in, so there is no need for hardlinks; if the data is the same, it will only be copied once. Commented Jun 6, 2023 at 19:21
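The restoration drawback described in the first comment can be sketched in a few lines. This is an illustrative model, not DAR's actual code: a restore applies the full backup and then each incremental layer in order, so a file that changed in every layer is written once per layer, and all but the last write is wasted work.

```python
# Illustrative model of layered incremental restore (not DAR's real code).
# Each layer maps path -> file content; later layers overwrite earlier ones.
layers = [
    {"notes.txt": "v1", "todo.txt": "a"},  # full backup
    {"notes.txt": "v2"},                   # incremental 1
    {"notes.txt": "v3", "todo.txt": "b"},  # incremental 2
]

restored = {}
writes = 0
for layer in layers:
    for path, content in layer.items():
        restored[path] = content  # overwrite whatever an earlier layer wrote
        writes += 1

print(restored)  # {'notes.txt': 'v3', 'todo.txt': 'b'}
print(writes)    # 5 writes to end up with 2 files: 3 writes were wasted
```

With 3 layers, 5 writes produce only 2 final files; the gap grows with the number of incrementals a changed file appears in.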
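The deduplication the last comment mentions can be sketched as content-addressed storage: chunks are keyed by a hash of their content, so identical data is stored once no matter how many files reference it. This is a minimal sketch in the style restic uses; the fixed chunk size and SHA-256 choice here are simplifying assumptions (restic actually uses content-defined chunking).

```python
import hashlib

store = {}  # chunk hash -> chunk bytes (stored at most once)
index = {}  # path -> ordered list of chunk hashes

def backup(path, data, chunk_size=4):
    """Split data into chunks and store each under its content hash."""
    hashes = []
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        h = hashlib.sha256(chunk).hexdigest()
        store.setdefault(h, chunk)  # a duplicate chunk is not stored again
        hashes.append(h)
    index[path] = hashes

def restore(path):
    """Reassemble a file from its chunk hashes."""
    return b"".join(store[h] for h in index[path])

backup("a.txt", b"heyaheya")
backup("b.txt", b"heyaheya")  # identical content: adds no new chunks
print(len(store))             # 1 unique chunk ("heya", referenced 4 times)
print(restore("b.txt"))       # b'heyaheya'
```

Two identical files (and even repeated data within one file) collapse to a single stored chunk, which is why no hardlink tricks are needed.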