I tried to keep this short and concise, and failed, so here's the long version.
I have a simple little website that I've built using RapidWeaver - it's hosted primarily on an external web server, and I've successfully been able to FTP files to it for a long time with no issue. That site links via RW's "iFrame" pages to folders on my local Mac, which I've set up via Apache to serve pages out to the web.
I know - pretty smart... but...
I set up a dynamic DNS account to track my local Mac on the off chance my IP would change - with Time Warner, that pretty much never happened. Now that Comcast cable has moved into town and is constantly changing my dynamic IP address, I can't keep the connections live for long. Once the IP changes, it takes a good while for the new address to propagate out from the dynamic DNS service, so the pages may not link up right for decent stretches at a time.
The point of all this is that I'm serving out client proofs - a few thousand individual files that can change at the drop of a hat due to client corrections and changes. Adding to the confusion, they're proofs for multiple clients in multiple cities, sometimes with nearly identical names.
What I want to do is this: I want a folder action set up so that any file (they're all .pdf files, if that helps) saved into a certain folder will automatically get uploaded to a corresponding folder on my hosted web space, replacing any older file with the same name that may have been uploaded in the past. I've been through Google searches and found a few options, none of which are working the way I think they should, so I'm getting confused. Is there a simple way to accomplish what I'm after? I'm sure there's someone out there with some incredible AppleScript Kung Fu that can do it, but I'm not that guy.
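For the trigger half of this, one non-AppleScript route I've seen sketched is a launchd agent with a WatchPaths key, which runs a program whenever the watched folder changes. This is only a sketch - the label, the watched folder, and the script path are all made-up placeholders you'd swap for your own:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- Hypothetical label; use your own reverse-DNS name -->
    <key>Label</key>
    <string>com.example.proofsync</string>

    <!-- Program to run whenever the watched folder changes;
         this path is a placeholder for your own upload script -->
    <key>ProgramArguments</key>
    <array>
        <string>/Users/you/bin/proofsync.sh</string>
    </array>

    <!-- Hypothetical proofs folder to watch -->
    <key>WatchPaths</key>
    <array>
        <string>/Users/you/ClientProofs</string>
    </array>
</dict>
</plist>
```

Saved to ~/Library/LaunchAgents and loaded with `launchctl load`, this would fire the script each time something lands in the folder, much like a Folder Action but without any AppleScript.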
I guess I basically want a backup script that moves only the new or changed files to an offsite server, which just happens to be my web server, while leaving alone the files that didn't change. Make sense?
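For the upload half, a sketch of what I mean (assuming the web host allows SSH/rsync - the folder path and hostname here are made up): rsync compares each local file against the server copy and only transfers new or changed ones, overwriting the stale server copy and leaving everything else untouched.

```shell
#!/bin/sh
# Hypothetical paths -- replace with your real proof folder and host.
SRC="$HOME/ClientProofs/"                 # local folder being watched
DEST="user@example.com:/var/www/proofs/"  # matching folder on the web host

# -a preserves timestamps/permissions (what rsync uses to skip
#    unchanged files), -z compresses during transfer.
# The include/exclude filters limit the sync to .pdf files while
# still descending into subfolders.
rsync -az \
    --include='*/' --include='*.pdf' --exclude='*' \
    "$SRC" "$DEST"
```

If the host only speaks FTP rather than SSH, this won't work as-is, but `lftp`'s `mirror -R` command does a similar changed-files-only upload over FTP.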