[H-GEN] Managing multiple child processes with perl

Michael Anthon michael at anthon.net
Mon May 13 02:23:30 EDT 2002


[ Humbug *General* list - semi-serious discussions about Humbug and     ]
[ Unix-related topics. Posts from non-subscribed addresses will vanish. ]

Here is one for the perl experts....

I want to speed up the nightly backup process on our main database
server.  At the moment the process (a perl script I wrote) goes something
like this:

1. Build a list of tablespaces that I want to back up
2. Build a list of data files that contain the tablespaces
3. For each tablespace, set it into backup mode (allowing a consistent copy
to be taken), make a copy of the tablespace datafiles onto a holding disk,
and then end backup mode on the tablespace (roughly as sketched after this
list)
4. Make a backup of the oracle control file onto the holding disk
5. Make a copy of other important oracle configuration files/directories
6. Using tar piped via gzip then dd [1], write the backups to tape
7. Using gzip, compress the files on the holding disk.
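
(Roughly speaking, step 3 looks like the fragment below.  The tablespace
names, datafile paths, holding disk path and the run_sql() helper are all
placeholders invented for the example; the real script has rather more
error handling.)

  # Placeholder data for the example -- the real lists come from
  # querying the database in steps 1 and 2.
  my @tablespaces = ('USERS', 'DATA01');
  my %datafiles   = (
      USERS  => ['/u01/oradata/users01.dbf'],
      DATA01 => ['/u02/oradata/data01.dbf'],
  );

  # Feed one SQL statement to sqlplus (helper invented for the example).
  sub run_sql {
      my ($sql) = @_;
      open(SQL, "| sqlplus -s '/ as sysdba'") or die "can't run sqlplus: $!\n";
      print SQL "$sql\n";
      close(SQL) or die "sqlplus failed: $?\n";
  }

  foreach my $ts (@tablespaces) {
      run_sql("ALTER TABLESPACE $ts BEGIN BACKUP;");
      foreach my $df (@{ $datafiles{$ts} }) {
          system("cp", $df, "/holding") == 0 or warn "copy of $df failed\n";
      }
      run_sql("ALTER TABLESPACE $ts END BACKUP;");
  }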

Currently stages 1-5 take about 30 minutes, then it's about 7 hours for
stage 6 and another 7 hours for stage 7 (it's about 32G of data in total).
I was initially hoping to gzip the files and then write them to tape,
however that takes too long and the tape changing monkeys were complaining
that it was interfering with their schedule... so I swapped it around.

This means that I am gzipping all 30G or so twice, which seems terribly
inefficient to me.  So... to get to my question: I want to find a way to
run the gzip processes in parallel (it's a dual CPU E250 running Solaris).
The only way I can see to do this is to fork the gzip processes, get the
PID of each one as it starts, then just watch for those processes to not
be there any more (using ps or something).  This seems a little... icky,
and I was hoping someone could advise me on a better way to manage it.  If
I can get the gzip processes running in parallel, it should shorten the
compression stage enough that I can compress only once and then use tar to
write the compressed files to tape.
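
To make the question concrete, the rough shape I have in mind is below.
It uses fork() and wait() rather than polling ps, which already feels a
bit less icky, but I'd still like to hear if there's a cleaner idiom.
The file list and the limit of two concurrent gzips are placeholders for
the sketch:

  #!/usr/bin/perl -w
  use strict;

  my @files    = glob("/holding/*");   # placeholder: files to compress
  my $max_kids = 2;                    # dual CPU, so two gzips at once
  my %kids;                            # pid => filename

  foreach my $file (@files) {
      # At the limit?  Block in wait() until one child exits.
      if (keys(%kids) >= $max_kids) {
          my $pid = wait();
          warn "gzip of $kids{$pid} returned $?\n" if $?;
          delete $kids{$pid};
      }

      my $pid = fork();
      die "fork failed: $!\n" unless defined $pid;
      if ($pid == 0) {
          # Child: become gzip for this one file.
          exec("gzip", $file) or die "exec gzip failed: $!\n";
      }
      $kids{$pid} = $file;             # parent remembers which pid is which
  }

  # Reap whatever is still running.
  while (keys %kids) {
      my $pid = wait();
      last if $pid == -1;
      warn "gzip of $kids{$pid} returned $?\n" if $?;
      delete $kids{$pid};
  }

The idea is that the parent never has more than two gzips going and just
blocks in wait() until one of them finishes, so there's no polling at all.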

Cheers
Michael

[1]  There is a good reason for this... if I try to use the built-in drive
compression it doesn't seem to fit on a single tape... and I have to use
pipes because some of the files are >2G in size.
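
For what it's worth, stage 6 boils down to a shell pipeline run from the
script, something like the line below (the holding directory and tape
device names here are made up; the real ones differ):

  # Holding directory and tape device names are placeholders.
  system("cd /holding && tar cf - . | gzip | dd of=/dev/rmt/0n bs=1024k") == 0
      or die "write to tape failed: $?\n";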
