Re: Backing up sites - beware of Hostpro

by "Darrell King" <darrell(at)webctr.com>

 Date:  Sat, 11 Mar 2000 06:57:47 -0500
 To:  <hwg-techniques(at)hwg.org>
 References:  fourelephants
>>>It is NOT the webmaster's responsibility to do daily backups! The server
administrator should have a daily backup scheduled... If you are concerned
about 100% uptime, a better solution would be to set up multi-homed servers
(i.e., host your site(s) on two or more servers, possibly geographically
dispersed), where any changes are simultaneously written to all servers.
That way, if one server fails nobody knows (except the admin folks),
and you can always recover from one of the 'clones'.<<<

I am the one who wrote Shannon about working from local versions.  I run
about 30 sites that way, and perform occasional programming tasks for many
more where I keep local versions of my part of the work.

It may not be a "webmaster's responsibility", but if Shannon thinks it
advisable, then the question is "how" rather than "if".  I would imagine
"how" might involve a server-side application...perhaps just a Perl script
on a cron (I didn't look back through the notes to see if this was NT or
*nix, though)...??  If the script could go through the file list (an easily
editable text file?), tar up the indicated files from that list, and place
them in a predetermined place every day/week/whatever, then it would seem
her machine could FTP in, grab the archive, and disconnect easily enough
using a cron job or even a manual batch file activated by a mouse click?
Something along the lines of the sketch below.
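
For the *nix case, a minimal sketch of what that server-side script might
look like follows.  The list file and pickup directory paths are
hypothetical examples, not anything from Shannon's actual setup:

    #!/usr/bin/perl -w
    # Sketch only: tar up the files named in an editable list and drop
    # a dated archive in a pickup directory for later FTP download.
    # The list and pickup paths are placeholder examples.
    use strict;

    my $list   = '/home/site/backup-list.txt';   # one path per line
    my $pickup = '/home/site/pickup';

    my @t       = localtime;
    my $date    = sprintf '%04d%02d%02d', $t[5] + 1900, $t[4] + 1, $t[3];
    my $archive = "$pickup/site-backup-$date.tar.gz";

    open my $fh, '<', $list or die "Cannot read $list: $!";
    chomp(my @files = grep { /\S/ } <$fh>);       # ignore blank lines
    close $fh;

    # Hand the file names straight to tar; no shell involved.
    system('tar', 'czf', $archive, @files) == 0
        or die "tar exited with status $?";

A crontab line such as

    0 2 * * * /home/site/bin/nightly-backup.pl

would produce a fresh archive every night at 2 a.m.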

The key to the problem seems to be that some files are changed by the
server.  It is easy to keep local copies of files that only the design
team changes, but it shouldn't be that hard to put together a list of
just the server-altered files for a daily grab...??
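
On the grabbing side, the standard Net::FTP module keeps the
connect-grab-disconnect step short.  Again just a sketch: the host, login,
and directory are placeholders, and a real script would keep the password
somewhere safer than the source:

    #!/usr/bin/perl -w
    # Sketch only: log in, pull down any waiting archives, disconnect.
    # Host, account, password, and directory are placeholder values.
    use strict;
    use Net::FTP;

    my $ftp = Net::FTP->new('www.example.com')
        or die "Cannot connect: $@";
    $ftp->login('webmaster', 'secret') or die 'Login failed: ', $ftp->message;
    $ftp->binary;                                 # archives are binary
    $ftp->cwd('/home/site/pickup') or die 'cwd failed: ', $ftp->message;

    foreach my $file ($ftp->ls) {
        next unless $file =~ /\.tar\.gz$/;        # only grab the archives
        $ftp->get($file) or warn "get $file failed: ", $ftp->message;
    }
    $ftp->quit;

Run it from a cron job on her machine, or from a batch file tied to a
desktop shortcut on NT, and that covers the "mouse click" case too.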

D
