[H-GEN] Managing multiple child processes with perl
Jason Henry Parker
jasonp at uq.net.au
Mon May 13 03:14:38 EDT 2002
Michael Anthon <michael at anthon.net> writes:
> This means that I am gzipping all 30G or so twice, which seems terribly
> inefficient to me. So.. to get to my questions. I am wanting to find a way
> to run the gzip processes in parallel (it's a dual CPU E250 running
> solaris). The only possible way to do this that I can see is to fork the
> gzip processes, getting the PID of each one as it starts, then just watch
> for those processes to not be there any more (using ps or something). This
> seems a little... icky and I was hoping someone could advise me on a better
> way to manage this.
Sure.  The parent process will receive a SIGCHLD signal when a child
exits, and can reap it with wait() or waitpid() rather than grovelling
through ps.  There is plenty of documentation on managing child
processes in Perl; see the perlipc and perlfork manpages; there is
probably also something in the Perl FAQ (perldoc -q fork).
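Something along these lines should do it -- a rough, untested sketch,
which assumes the filenames arrive on the command line and that two
gzips at once is the right number for your two CPUs:

    #!/usr/bin/perl -w
    use strict;

    my @files    = @ARGV;   # files still waiting to be gzipped (assumed)
    my $max_kids = 2;       # one gzip per CPU on the E250
    my %kids;               # pid => filename of each running gzip

    while (@files or %kids) {
        # Keep forking until $max_kids children are running.
        while (@files and keys(%kids) < $max_kids) {
            my $file = shift @files;
            my $pid  = fork;
            die "fork failed: $!" unless defined $pid;
            if ($pid == 0) {                  # child: become gzip
                exec 'gzip', $file or die "exec gzip: $!";
            }
            $kids{$pid} = $file;              # parent: remember the child
        }
        # Block until one of the children exits, then reap it.
        my $done = wait;
        last if $done == -1;                  # no children left
        warn "gzip $kids{$done} exited with status $?\n" if $?;
        delete $kids{$done};
    }

A blocking wait() like that means you never need a $SIG{CHLD} handler
at all; perlipc shows the handler-based approach if you'd rather the
parent keep doing other work while the children run.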
> [1] There is a good reason for this.... if I try to use the built in
> drive compression it doesn't seem to fit on a single tape... and I
> have to use pipes because some of the files are >2G in size.
You don't have large file support enabled? Hmm, okay.
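(As an aside, if you want to check the Perl side of that, Perl will
report whether it was built with largefile support:

    perl -V:uselargefiles

though whether your dump/tar tools cope with >2G files is a separate
question.)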
--
||----|---|------------|--|-------|------|-----------|-#---|-|--|------||
| "Noble sentiments require the delicate sting of an arrow |
| Not the rude bluntness of a two-by-four" jasonp at uq.net.au |
||--|--------|--------------|----|-------------|------|---------|-----|-|