But your classic CVS/SVN system has the obvious drawback of needing a full repository to work, and I'd really rather not have two copies of my 60 GB+ MP3 folder sitting on a machine somewhere. These systems also traditionally don't handle binary deltas very well.
With CVS/SVN you have one repository, and several working copies.
So the repository contains every file once, plus the whole history of every file. The working copy contains every file once, plus some additional data per file (usually roughly the size of the file itself).
Very roughly:
Let's assume our revision control system cannot store diffs of binary files efficiently (not really true, but let's keep it simple). Your collection is 60 GB of MP3 files.
If you have 10 revisions per file on average, and we neglect compression (because MP3s compress badly), your repository will be ca. 600 GB and your working copy ca. 120 GB.
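The arithmetic behind those numbers can be sketched out; a minimal back-of-the-envelope calculation, assuming full copies are stored for every binary revision (no deltas) and that the per-file bookkeeping in a working copy is about the size of the file itself:

```python
# Rough storage math for a centralized VCS (CVS/SVN) with binary files.
# Assumption: every revision of a binary file is stored in full (no deltas).
collection_gb = 60        # size of the MP3 collection
revisions_per_file = 10   # assumed average number of revisions per file

# The central repository stores every revision of every file.
repo_gb = collection_gb * revisions_per_file

# A working copy holds each file plus pristine/bookkeeping data
# of roughly the same size per file.
working_copy_gb = collection_gb * 2

print(repo_gb, working_copy_gb)  # 600 120
```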
So, Distributed Version Control starts sounding pretty good at this point.
In a distributed system, every working copy is essentially a repository; that means every working copy contains every file plus its full history.
Under the same assumptions as above, every copy will be ca. 600 GB.
The bottom line is that a distributed system will require more space than a centralized one.
EDIT:
Even if your question is more about a large number of binary files than about large binary files in version control, the following post might be interesting: Revisiting large binary files issue.
Best Answer
Subversion can be used for "designer" files/documents as well. It keeps track of versions, alterations and updates to files as you and your colleagues work on them.
You can organize your projects as a folder structure within a single repository, or split them across multiple repositories. There are some free client tools out there, like TortoiseSVN for Windows or SVNX for Mac OS X.
Here is a tutorial to Subversion written from a designer's perspective: Subversion Workflow For Designers