
Server backup problem using Backup4all


jmdixon85 (Member, joined Oct 5, 2008, Cumbria, UK)
Surely this can't be normal:

BackupProblem.png

Over 10 hours to complete a backup? And that's just the "deleting ZIP files" part, as nothing much had changed on the server yesterday. (I had to cancel the backup so I could open the settings page to take screenshots.)

One CPU core is maxed out the whole time it's doing this (so I don't think the program is multi-threaded).

I'm thinking it could be because I have the "save storage space" option enabled:

BackupProblem2.png

Would I be better off choosing one of these options, or are they totally unrelated?

BackupProblem3.png

I am pretty much at a loss as to what some of these options really do.

The backup in question is the job for my 2TB data array with all my music, docs, etc. on it.

I can't put up with it taking 11+ hours, but I don't want to disable a feature that would stop me from getting back a deleted file.

i.e. if I delete a file off my data drive, I'd at least want it to stay on the backup drive for a few days in case it was deleted in error and I needed to retrieve it, if you get what I mean.
 
I think you'd probably be better off using different software to accomplish your backup.

If it were me, I'd probably use either 7zip, ImageX, or Robocopy.

For a true backup that would be stored someplace where space could be an issue, I would probably use 7zip and compress everything.
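Something like this, for example (hypothetical paths, assuming the command-line 7z.exe is on your PATH):

7z a -t7z -mx=9 E:\Backups\data-full.7z "D:\Data"

(a adds to an archive, -t7z picks the 7z format, and -mx=9 is maximum compression, which is slow but gives the smallest files.)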

For a backup where space is not necessarily an issue and you want the ability to restore rapidly if needed, ImageX with fast compression would probably be best (use the /split option so you don't end up with a single 2TB .wim file).
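Roughly like this (hypothetical paths; double-check the syntax against the WAIK documentation, I'm going from memory):

imagex /capture D: E:\Backups\data.wim "Data array" /compress fast
imagex /split E:\Backups\data.wim E:\Backups\data.swm 4096

The /capture line images the whole D: volume, and the /split line breaks the resulting .wim into .swm pieces of roughly 4096MB each.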

If you would like regular access to the backup, just use robocopy to copy everything over.
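For instance (hypothetical paths):

robocopy D:\Data E:\Backup /MIR /R:1 /W:1

/MIR mirrors the source, so repeat runs only copy what changed, and /R:1 /W:1 stop it retrying locked files for ages. Bear in mind a mirror also propagates deletions to the backup.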

Each of those options is probably better than that paid/shareware program you're trying to use, from the looks of it.
 
Cheers for the advice, but I don't think any of those options would let me recover deleted files if I needed to, which I have had to do a couple of times in the past.

For example:

The last time I needed to recover files was after I broke up with my partner. I deleted all the photos of us together, etc. Then we got back together after a few weeks, and she was rather angry at me for deleting all of our memories. Luckily the server came to the rescue, as the files were still in the backup. Saved us another argument :)
 
I would do one large initial backup (snapshot), then incrementals. Have the software run weekly (daily is overkill, you're not Citibank) and keep maybe a max of 3 copies of each file in the backup before deleting the oldest version. That way your regularly used files have a few versions to restore from, while all the other files (like music) are archived as a single copy, since they don't get changed.

Some important terminology you should understand if you're going to do this yourself and expect the data you want to be there to actually be there:

http://en.wikipedia.org/wiki/Backup#Data_repository_models
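If it helps make the full/incremental idea concrete, here's a rough batch sketch of that scheme using robocopy's archive-attribute switches (paths are placeholders, and this is only an illustration of the concept, not what Backup4all does internally; in practice the two halves would be separate scheduled tasks):

@echo off
rem Weekly full: copy everything, then clear the archive bits
robocopy D:\Data E:\Backup\full /E
attrib -A D:\Data\* /S

rem Daily incremental: /M copies only files whose archive bit is set
rem (i.e. changed since the last run) and clears the bit afterwards.
rem You'd want a date-stamped destination folder here; %date% usually
rem needs reformatting before it's a valid folder name.
robocopy D:\Data E:\Backup\incremental /E /M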
 
It's already set to do exactly that. I'm not keen on doing just a weekly backup, though; I lost all my data once and I don't intend for that to happen again. I know it's a bit overkill, but like the saying goes: better safe than sorry.

It's just this damn "deleting ZIP files" thing! It started at 1am this morning and was still going 17 hours later! :bang head

I was thinking of turning off "limit number of file versions" (currently set to 3).

But enabling "Automatically make full backup if all increments exceed xx percent of full backup".

I'm thinking this should still retain any deleted files for a while, and hopefully get rid of the "deleting ZIP files" problem? I mean, I know it's no quad Xeon, but c'mon!

And if I do enable the above option, what % should I set it at?
 
I'm not sure adjusting anything is going to magically make this faster. I suspect there's either a corrupt file, a series of corrupt/unreadable files, or some other flakiness causing your issue. Sorry I can't be of much help there; I only thought you didn't know your settings. Specific to that application, I couldn't even guess what's holding it back.

Have you tried running a backup on only one large data folder and timing it? Maybe add a folder and run it again, and so on, until you find whether there's a set of files causing the issue? It's a tedious way to find out, but I can't think of a better way of eliminating a cause.
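If you want to semi-automate that, a rough batch sketch like this could time a copy of each top-level folder so the slow one stands out (paths are placeholders; point it at a scratch area with enough free space):

@echo off
setlocal enabledelayedexpansion
for /d %%F in ("D:\Data\*") do (
    echo %%~nxF started at !time!
    robocopy "%%F" "E:\BackupTest\%%~nxF" /E >nul
    echo %%~nxF finished at !time!
)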

BTW, what are you backing up to? If it's USB 2.0 and you're moving a ton of data/files, it will be very slow. I'm restoring 450GB to a server right now with Paragon and it's going to take 5 hours over USB. If you're using an internal drive, could any other apps (web-based, perhaps) be using the drive and slowing the seek times?
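For rough numbers: USB 2.0 manages about 30MB/s in practice, so 450GB is roughly 460,800MB / 30MB/s ≈ 15,400 seconds, a bit over 4 hours, which lines up with that 5-hour estimate once you add overhead for lots of small files.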
 
Thanks for your input Pinky.

I just tried altering the settings to the ones I mentioned in my last post, and the backup is complete already. Not many files have changed since the last backup, so that makes sense.

I am slightly worried about a corrupt file, since I have stopped it mid "deleting ZIP files" a couple of times, so I am going to wipe the backup and start from scratch.

The backup disk is a 1TB SATA2 HDD attached to the same RAID controller, so overall speed is quite quick.

What % would you recommend for the "Automatically make full backup if all increments exceed xx percent of full backup" setting?

Other than that all seems well now. :)
 
jmdixon85 said:

What % would you recommend for the "Automatically make full backup if all increments exceed xx percent of full backup" setting?

Other than that all seems well now. :)

Good to hear things are coming around.

That setting puzzles me, not because of what it's trying to accomplish, but WHY it needs to accomplish it and whether it would ever come into play in your scenario. It probably won't. It seems aimed at high-volume situations where a lot of files change frequently or large chunks of the data being backed up get overwritten regularly, and I can't imagine it being useful in your setup. So if you insist on setting it, higher is better; too low and you risk full backup copies being made in addition to the larger existing snapshot. 51% or greater would probably be best. For example, with a 2TB full backup and the threshold at 51%, a fresh full would only be triggered once your increments totalled a bit over 1TB, and I doubt you change more than 5% of your files between backups.
 