Hello there!
I'm in a bit of a dilemma here. I have a pretty nice backup solution at home: my originals are backed up to a Mac mini server hosting Time Machine. Two clients (so far) are Time Machined to the mini, and the mini itself gets weekly images (SuperDuper!) as well. So each computer has one original and one copy (all copies are accessible from the Mac mini server).
The crux is, how do I get this transferred in a nice way to a remote FTP server? I have a friend with whom I've set up a co-location deal, so I have access to his FTP server with enough space to house the "third copy". I would like a SuperDuper!-style incremental backup (with only 10 Mbps available, full backups of >400 GB would take days), but I'm not really sure how to do it.
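(Back-of-the-envelope: 400 GB is about 3,200,000 Mbit, and at 10 Mbps that's roughly 320,000 seconds, so around 3.7 days of a fully saturated uplink per full copy. Hence the need for incrementals.)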
These are the strategies I've already tried:
1) Transmit "folder sync": I don't really trust this to give me a good copy of, for example, the Time Machine bundles. It works in a pinch, but then I'd rather copy the files manually using FileZilla/Cyberduck.
2) ExpanDrive and SuperDuper!: ExpanDrive is a bit flaky, and mounting the drives doesn't always work, which makes SuperDuper! unusable. Also, I have to move a .dmg file TO the FTP server first and mount that image from the mounted ExpanDrive volume, since the file system on the FTP server isn't supported by SuperDuper! directly (some RAID stuff, I guess... not really sure why it doesn't work).
So, what should I try? Has anyone had similar experiences? I could probably write a Python script that checks "last modified" timestamps and transfers only the changed files over a Python FTP module (rough sketch below), but I'm not sure how, for example, the iPhoto files are perceived by Python. Having to re-transfer the whole iPhoto library every time I change a photo would be a bit messy as well.
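This is roughly the kind of thing I have in mind (a minimal sketch using the standard ftplib and mtime comparison against a JSON manifest; the host, credentials, and paths here are just placeholders, not my real setup):

import ftplib
import json
import os

# Placeholders -- real host, credentials, and paths would go here
FTP_HOST = "ftp.example.com"
FTP_USER = "user"
FTP_PASS = "password"
LOCAL_ROOT = "/Volumes/Backups"   # local tree to mirror
REMOTE_ROOT = "/third-copy"       # target directory on the FTP server
MANIFEST = os.path.expanduser("~/.ftp_sync_manifest.json")

def load_manifest():
    """Return {relative_path: mtime} from the last run, or {} on first run."""
    try:
        with open(MANIFEST) as f:
            return json.load(f)
    except (OSError, ValueError):
        return {}

def changed_files(manifest):
    """Yield (relative_path, mtime) for files new or modified since last run."""
    for dirpath, _dirnames, filenames in os.walk(LOCAL_ROOT):
        for name in filenames:
            full = os.path.join(dirpath, name)
            rel = os.path.relpath(full, LOCAL_ROOT)
            mtime = os.path.getmtime(full)
            if manifest.get(rel) != mtime:
                yield rel, mtime

def ensure_remote_dirs(ftp, rel):
    """Create any missing remote directories for a relative file path."""
    path = REMOTE_ROOT
    for part in os.path.dirname(rel).split(os.sep):
        if not part:
            continue
        path = path + "/" + part
        try:
            ftp.mkd(path)
        except ftplib.error_perm:
            pass  # directory already exists

def main():
    manifest = load_manifest()
    with ftplib.FTP(FTP_HOST) as ftp:
        ftp.login(FTP_USER, FTP_PASS)
        for rel, mtime in changed_files(manifest):
            ensure_remote_dirs(ftp, rel)
            remote = REMOTE_ROOT + "/" + rel.replace(os.sep, "/")
            with open(os.path.join(LOCAL_ROOT, rel), "rb") as f:
                ftp.storbinary("STOR " + remote, f)
            manifest[rel] = mtime
            print("uploaded", rel)
    with open(MANIFEST, "w") as f:
        json.dump(manifest, f)

if __name__ == "__main__":
    main()

My assumption is that os.walk would descend into bundles like the iPhoto library and Time Machine sparsebundles, since they're really just folders, so only the changed files inside them would re-upload rather than the whole bundle. But I haven't verified that, which is part of why I'm asking.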
// DL