@Kiwy's insistence in the comments that you could use Git to do this reminded me of a tool I'd seen a while ago called git-annex. While refreshing myself on what git-annex can do, I remembered coming across this post in the git-annex forums.
Synchronize large files (VM images)
Hi,
I'm thinking to use git-annex to synchronize my virtual machine
directory (Virtualbox) between 3 pc. It's quite big: more than 200GB
and some of the images are 40Gb in size.
The synchronization will be over a lan (obviously). It is already in
place with 2pc and unison but the configuration of the 3rd pc is
cumbersome. Does anybody have experiences with git-annex and such
amount of data?
Thanks in advance
Gabriele
To which the author of git-annex replied:
This volume of data should be no problem for git-annex.
The only catch would be if you're running those VM images and want to
sync them as they're changed. With git-annex, you'd need to git annex
unlock a file to allow it to be modified, and then git annex add it
back and commit changes made to it.
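To make that workflow concrete, here is a minimal sketch of the unlock/modify/commit cycle the author describes. The file name disk.vdi is just a placeholder:

    # Make the annexed image writable again (its content is normally read-only)
    git annex unlock disk.vdi

    # ... run the VM, which modifies disk.vdi ...

    # Re-add the modified content to the annex and record the change in git
    git annex add disk.vdi
    git commit -m "Update VM image"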
So it's just Git?
But be clear on this point: git-annex is not pure Git. It builds on the interface that Git provides but relies on a variety of backends to do the actual shuttling of data back and forth. Read the "How it works" page for more on this.
The contents of 'annexed' files are not stored in git, only the names of the files and some other metadata remain there.
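You can observe this directly in a repository: after adding a file to the annex, the working tree holds only a symlink whose target encodes the file's key, while the content itself sits under .git/annex/objects (target path abridged here for illustration):

    # After 'git annex add', the working-tree file becomes a symlink into
    # .git/annex/objects; only this pointer is what git itself tracks.
    git annex add bigfile.iso
    ls -l bigfile.iso   # -> .git/annex/objects/.../SHA256E-s...--<hash>.iso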
For more on how it handles the transferring of data, take a look at the section of the site titled "transferring data".
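In day-to-day use, a handful of commands drive those transfers. A typical round trip between clones might look like the following sketch, where the remote name laptop is an assumed placeholder:

    # Fetch the content of a file from whichever remote has it
    git annex get disk.vdi

    # Send a copy of the content to a specific remote
    git annex copy disk.vdi --to laptop

    # Free local space once another repository holds the content
    git annex drop disk.vdi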
Special remotes
The genius of git-annex's approach is its "special remotes". These allow backends to be plugged in, keeping the design modular. You can see the full list of the various special remotes here.
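As a rough example of what plugging one in looks like, setting up a special remote is a single initremote call; here a plain rsync server is used as the backing store, and the remote name and URL are placeholders:

    # Register an rsync special remote to hold annexed file contents
    git annex initremote mybackup type=rsync rsyncurl=server:/srv/annex encryption=none

    # Then send content to it like any other remote
    git annex copy . --to mybackup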