Does anyone have real experience using vv on projects dominated by giant binary files? My projects are generally non-programming, and I need to deal with humongous files (GBs); a single file can easily be 2-8 GB. These are big datasets, so I am wondering whether vv is reasonably efficient with binary files. Obviously I won't try to merge or diff these files; I mostly need it for bug tracking and versioning.
I am not sure whether Veracity can compress versions. That would be great.
All files stored in Veracity are compressed with zlib and won't be stored more than once. (Checking in identical files twice results in just one blob stored in the repository.)
The drawback to large binary files stored in a DVCS is the additional time needed to transfer the repository when cloning.
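Veracity's actual on-disk format is its own, but the general idea described above (zlib-compressed blobs stored once, keyed by content) can be sketched roughly like this. The hash choice, directory layout, and function names here are illustrative assumptions, not Veracity's implementation:

```python
import hashlib
import os
import zlib

def store_blob(store_dir: str, data: bytes) -> str:
    """Store data zlib-compressed, keyed by its content hash.
    Identical content hashes to the same key, so checking in the
    same file twice results in just one blob on disk."""
    key = hashlib.sha256(data).hexdigest()
    path = os.path.join(store_dir, key)
    if not os.path.exists(path):  # deduplication: blob already stored
        with open(path, "wb") as f:
            f.write(zlib.compress(data))
    return key

def load_blob(store_dir: str, key: str) -> bytes:
    """Read a blob back and decompress it."""
    with open(os.path.join(store_dir, key), "rb") as f:
        return zlib.decompress(f.read())
```

Storing the same content twice returns the same key and leaves only one compressed file in the store, which is the deduplication behavior described in the answer.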
answered Nov 26 '12 at 08:12
Ian Olsen ♦♦