[Box Backup] Restore & Compare Issue ** UPDATE **

Matt Brown boxbackup@fluffy.co.uk
Tue, 24 Apr 2007 10:19:15 +0100


-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1

>>> > Do you have any huge directories backed up?
>>> Only this one 3.5GB file (it's a 20GB SQL database file compressed
>>> to 3.5GB with gzip)
>
> By the way, I don't think this is a good idea. Because of the way  
> that gzip works, you will end up uploading about 3.5 GB every night  
> (replacing the entire file).
>
> Normally for an rsync-style backup I would recommend that you back  
> up uncompressed files instead. Box Backup does its own compression,  
> so you should not actually lose much space on the server. However,  
> I'm not sure whether it will be any more efficient on uploads  
> because it depends how much of that 20 GB file gets churned every  
> day. I'd say it's definitely worth a try.
>
>> Ok, I have now tried with two other files, one 8k in size and one
>> 11MB in size. Both were fetched without any TLS issues - therefore
>> it appears that BB has an issue restoring single files over a
>> certain size. The file I am trying to restore is 3.5GB; is there a
>> limit anywhere?
>
> There should not be, but it's possible that either there is a bug  
> with restoring large files (over 2 GB), or that the filesystem that  
> you're restoring to does not support large files.
>
> To eliminate the latter, could you try dd'ing a large file in the  
> restore directory? (Is that /tmp? Is it a ramdisk by any chance?)
>
> 	dd if=/dev/zero of=/tmp/bigfile bs=1M count=4k
>
> and check that it does actually produce a 4GB file?


Hi Chris,

Well, the file in question is an SQL backup produced via NT Backup on a
Windows 2003 Server and then rsync'd (via Cygwin) to a 2.5TB central
backup store (a Linux box with a large storage array), which results in
a 20GB file. This data store then needs an offsite backup, so I am
running the bbackup client to send it to the remote host. As I am
trying to keep the transfer small(ish) I gzip first, otherwise we would
be sending a 20GB file each night (and growing). 3.5GB a night is not
too much of a problem for us, as the pipe carrying the data is big
enough and quiet at night. My main concern is whether the gzip gets
corrupted when sending/restoring, as I do not like depending on
compressed data where I can help it.
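One way to catch a corrupted gzip before depending on it is to record a
checksum at compression time and verify both the checksum and the gzip
stream after transfer. A sketch of that workflow (the filenames here are
just placeholders for the real backup file; a Debian-patched gzip also
offers --rsyncable to keep nightly deltas smaller, though plain gzip
lacks that option):

```shell
# Example workflow for catching gzip corruption before a restore is
# trusted (filenames are placeholders, not the real paths)
echo "example data" > sqlbackup.bkf            # stand-in for the 20GB dump
gzip -c sqlbackup.bkf > sqlbackup.bkf.gz
md5sum sqlbackup.bkf.gz > sqlbackup.bkf.gz.md5

# after the file comes back from the remote host:
md5sum -c sqlbackup.bkf.gz.md5                 # catches transfer corruption
gzip -t sqlbackup.bkf.gz                       # verifies the gzip stream itself
```

If either check fails, the archive was damaged in transit or at rest and
should not be decompressed and used.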

I intend to send all files uncompressed and then take a snapshot each
night to the remote BB server (with the exception of the SQL backup
file).

I have tried changing the LCD to other paths, i.e. /data, /home and
/tmp, and still see the same issue. I tried /tmp to make sure this was
not a directory permissions issue.
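As a quick sanity check on the filesystem side of your suggestion, a
sparse file can confirm large-file support without actually writing 4 GB
of zeros as the dd test would (truncate and stat here are GNU coreutils,
which the Linux data store should have):

```shell
# Check that the restore target's filesystem accepts files over 4 GB
# without writing 4 GB of data (truncate creates a sparse file)
truncate -s 4G /tmp/bigfile
stat -c %s /tmp/bigfile      # should report 4294967296 bytes
rm /tmp/bigfile
```

If truncate fails with "File too large", the filesystem itself is the
limit; if it succeeds, the problem is more likely in the restore path.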

The only thing I can conclude so far is that it has something to do
with the size of the file :(

Regards

Matt Brown
-----BEGIN PGP SIGNATURE-----
Version: GnuPG v1.4.5 (Darwin)

iD8DBQFGLcuW6vWewLkSmagRAhQQAJ94sV/Elo1lkU0UNBm4veNpVY48owCfQB6E
dHzw/RriJ7L482pw/IWS/oc=
=pYFi
-----END PGP SIGNATURE-----