[H-GEN] Backup
Ewan Edwards
Edwards_Ewan_B at cat.com
Mon Jul 28 21:06:05 EDT 2003
[ Humbug *General* list - semi-serious discussions about Humbug and ]
[ Unix-related topics. Posts from non-subscribed addresses will vanish. ]
On Tuesday 29 July 2003 08:35 am, Tony Melia (DMS) wrote:
> I am looking at creating a simple backup to a remote share of my RH7 linux
> box. I tried tarring, but it stops at 2G, which I found was the limit (the
> remote share is via samba smbmount). What's the best way for me to do
> this, based on the fact that my system is about 4G of data. I have a 2G
> file limit, so the solution must allow me to split files, and it must be
> easy to 'unsplit' them in the event I need to restore. Also, I have
> directories I need to --exclude. Tar -cjvf works fine other than the 2G
> limit, although I saw posts on google suggesting not to use compression, as
> the whole archive could be lost if any part becomes corrupt.
>
> Suggestions please?
I have this very same problem and have wound up writing unnecessarily complex
scripts with a lot of "tar ... --exclude ..." commands.
The problem is in Samba. I vaguely recall that version 2.2.8 included a fix
for the 2Gb issue. At least I think it was 2.2.8 - you may need to check the
release notes.
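In the meantime, one common workaround is to stream tar's output through
split so that no single piece exceeds the limit, and rejoin the pieces with
cat when restoring. A minimal sketch, assuming GNU tar and coreutils split;
the paths and the exclude pattern below are made-up examples:

```shell
# Demo setup - stand-ins for the real source tree and the smbmounted share.
mkdir -p /tmp/backup-demo/src /tmp/backup-demo/out /tmp/backup-demo/restore
printf 'hello\n' > /tmp/backup-demo/src/file.txt

# Stream the archive through split so no piece exceeds the size limit
# (1m here for the demo; something like 1900m would keep each piece
# safely under a 2G limit). --exclude works as usual on the tar side.
tar -cf - -C /tmp/backup-demo --exclude='src/cache' src \
  | split -b 1m - /tmp/backup-demo/out/system.tar.

# To restore, concatenate the pieces in order and untar the stream.
# split's alphabetical suffixes (aa, ab, ...) keep the glob in order.
cat /tmp/backup-demo/out/system.tar.* | tar -xf - -C /tmp/backup-demo/restore
```

Because tar writes to a pipe rather than a file, it never hits the 2G
file-size limit itself; only split touches the share, in bounded pieces.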
--
* This is list (humbug) general handled by majordomo at lists.humbug.org.au .
* Postings to this list are only accepted from subscribed addresses of
* lists 'general' or 'general-post'. See http://www.humbug.org.au/