The Design of Software (CLOSED)

A public forum for discussing the design of software, from the user interface to the code architecture. Now closed.

The "Design of Software" discussion group has been merged with the main Joel on Software discussion group.

The archives will remain online indefinitely.

File backup solution

I run a rapidly growing service (http://comfypage.com).  To back up our users' photos, documents, etc., we've had a cron job zip up the relevant directories and FTP them to a separate server.  This process now takes over an hour and burns a lot of CPU time and bandwidth, not to mention disk space on the receiving server, since it's getting a full backup at least every 24 hours.
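Roughly, the current job looks like this (all paths, hostnames, and credentials below are placeholders):

#!/bin/sh
# Nightly full backup: zip the data directories, then push the archive over FTP.
DATE=`date +%Y%m%d`
zip -qr /tmp/backup-$DATE.zip /var/www/userdata
# Non-interactive FTP upload (-n suppresses auto-login so it can be scripted).
ftp -n backup.example.com <<EOF
user backupuser secretpassword
binary
put /tmp/backup-$DATE.zip backup-$DATE.zip
quit
EOF
rm /tmp/backup-$DATE.zip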

So, can anyone recommend an incremental backup service?  It must be able to handle several gigabytes of files and needs to be usable from either a shell or a PHP script.
Andrew Send private email
Sunday, July 06, 2008
 
 
I use Mozy: http://mozy.com/
Gili Send private email
Monday, July 07, 2008
 
 
Can't you just put everything on S3 and let them do the backups?
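If that route appeals, here's a sketch using the s3cmd command-line tool (the bucket name is made up):

# One-time setup to store AWS credentials:
s3cmd --configure

# Incremental sync: only new or changed files get uploaded.
s3cmd sync /var/www/userdata/ s3://comfypage-backups/userdata/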
Odysseus Send private email
Monday, July 07, 2008
 
 
Considering the amount of data and its future growth, I wouldn't consider any online backup solution.

Just install a second hard drive and back files up to that drive. Or better yet, get another server and copy files between the two servers over a private network.
Glitch
Monday, July 07, 2008
 
 
Glitch, did you even read the original post?
John Topley Send private email
Monday, July 07, 2008
 
 
<<Just install a second hard drive and back files up to that drive. Or better yet, get another server and copy files between the two servers over a private network.>>

One burglary or fire and you're done. Not so good.
anonymous
Monday, July 07, 2008
 
 
OK, if you must transfer files over the Internet, tar/ftp is not a good solution, because the same files are transmitted again and again every time.

I suggest using 'rsync' instead, as it only transmits changed files (use -z for compression).
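A minimal invocation along those lines (host and paths are placeholders; rsync runs over ssh by default):

# -a preserves permissions and timestamps, -z compresses data in transit.
rsync -az /var/www/userdata/ backup.example.com:/backups/userdata/

Adding --delete turns the copy into an exact mirror, but that also means accidental deletions propagate to the backup, so leave it off if the backup should retain removed files.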
Glitch
Monday, July 07, 2008
 
 
I've heard that it's a lot faster if you add a simple SCSI hard drive to, say, a co-lo server, have one for every box, and run your backup software that way. As you add new boxes, just bolt those on. Another option is to build a large RAID on a file-cluster server connected to your boxes: get a beefy box with some big switches and run your backup service over the network.
Ranger Send private email
Monday, July 07, 2008
 
 
@Ranger
D2D (disk-to-disk) backup is fine, but you still need to do D2T (disk-to-tape) so you can keep multiple generations. Just having one backup copy that is overwritten every night isn't much better than having none.
Odysseus Send private email
Tuesday, July 08, 2008
 
 
Slightly beyond what you need, but here's how to use rsync to generate incremental backups as well:

http://www.mikerubel.org/computers/rsync_snapshots/

Fully automatic incremental backups that users can access instantly without admin intervention are full of win.
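The heart of that technique is hard-linked snapshots: rotate the old copies with cp -al so unchanged files share disk space, then rsync over the newest one. A condensed sketch (directory names are placeholders; run it nightly from cron):

#!/bin/sh
# Rotate snapshots: the oldest falls off the end.
[ -d /backups/daily.3 ] && rm -rf /backups/daily.3
[ -d /backups/daily.2 ] && mv /backups/daily.2 /backups/daily.3
[ -d /backups/daily.1 ] && mv /backups/daily.1 /backups/daily.2
# cp -al makes a hard-link copy: near-instant, and it costs almost no disk.
[ -d /backups/daily.0 ] && cp -al /backups/daily.0 /backups/daily.1
# Only files changed since yesterday consume new space in daily.0.
rsync -a --delete /var/www/userdata/ /backups/daily.0/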
Tim Evans Send private email
Wednesday, July 09, 2008
 
 
It is a waste of CPU to zip photos.  Most image formats are already compressed, so zipping barely reduces their size.
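For what it's worth, if rsync -z ends up in the mix, newer rsync releases (3.0 and later, assuming they're available on both ends) can be told not to compress suffixes that are already compressed; the suffix list here is just illustrative:

# Skip recompressing already-compressed formats during transfer.
rsync -az --skip-compress=jpg/jpeg/png/gif/zip/gz \
    /var/www/userdata/ backup.example.com:/backups/userdata/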
XYZZY
Friday, July 18, 2008
 
 

This topic is archived. No further replies will be accepted.

 
Powered by FogBugz