[H-GEN] Managing multiple child processes with perl

Michael Anthon michael at anthon.net
Mon May 13 07:35:09 EDT 2002



<Jason wrote...>
> Sure.  The parent process will receive a SIGCHLD signal when a child
> exits.  There is a large amount of documentation on how to manage
> child processes in perl; see the perlipc and perlfork manpages; there
> is probably also something in the Perl FAQ (perldoc -q).

I was hoping it would be something like this.  System-level stuff is not
something I'm overly familiar with.  I actually read the perlipc and
perlfork man pages before I wrote the first email, but nothing jumped out
and made me say "oh... that's how you do it!".  I will read them again with
the SIGCHLD signal in mind; it might make more sense this time.
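
If I'm reading perlipc right, the pattern would be something like the
sketch below.  The job list, worker count and child command are all
invented for illustration; the point is that a blocking waitpid() in the
parent returns as soon as any child exits (which is when SIGCHLD is
delivered), giving the "hand the next job to whoever finishes first"
behaviour I'm after:

#!/usr/bin/perl -w
use strict;

my @jobs    = map { "job$_" } 1 .. 10;   # placeholder job names
my $maxkids = 4;                         # children to run at once
my $running = 0;

while (@jobs or $running) {
    # Keep the pool full while jobs remain and slots are free.
    while (@jobs and $running < $maxkids) {
        my $job = shift @jobs;
        defined(my $pid = fork) or die "fork: $!";
        if ($pid == 0) {
            # Child: this is where the real work on $job would happen.
            exec "sleep", "1" or die "exec: $!";   # placeholder command
        }
        $running++;
    }
    # Parent: block until some child (any child) exits, then loop
    # around and start the next job in its place.
    $running-- if waitpid(-1, 0) > 0;
}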

> You don't have large file support enabled?  Hmm, okay.

It was a while ago that I first wrote this.  In my tinkering I discovered
that Solaris ships several versions of all the major tools such as tar,
grep etc., AND that they all behave slightly differently from each other
and from the GNU implementations.  I find this rather annoying; however,
the ones in /usr/xpg4/bin seem to be closest in behaviour to what I am
used to.  I also noted that *some* of these can handle large files and
some can't.  As far as I can tell it is not a matter of "enabling" large
file support, more a case of ensuring you use the right version of a
program... I'm no Solaris expert so please correct me if I'm mistaken.
That said, I ended up piping stuff into the processes, but I can't recall
whether I did that initially and just left it that way, or whether there
was one of the tools I could not find a large-file version of (gzip
perhaps??)
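
For the record, piping into a process from Perl looks roughly like the
following.  The filenames are invented, and there is a caveat: Perl
itself also needs large file support compiled into its build to read a
file over 2GB on the input side; failing that, you'd pipe between
external programs instead:

#!/usr/bin/perl -w
use strict;

my $infile  = "dump.dat";      # invented names
my $outfile = "dump.dat.gz";

open(my $in, "<", $infile) or die "open $infile: $!";
# gzip reads from the pipe, so it never has to open the big file itself.
open(my $gz, "| gzip -c > $outfile") or die "start gzip: $!";

my $buf;
while (read($in, $buf, 64 * 1024)) {
    print $gz $buf or die "write to gzip: $!";
}
close($in);
close($gz) or die "gzip exited non-zero";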

<Ben wrote...>
> If, instead of giving the subshells one job to do and trying to work out
> which one will return first so that you can give it the next job, you
> give each one a list of jobs and let them all run at their own pace, you
> might be able to simplify the problem.

If I can't get the IPC stuff to work I may look at this option.  I need to
be a little careful because the files need to be compressed in order from
smallest to largest.  The reason is that if I try to compress the largest
first, I run out of space on the holding disk.  Still, it should work well
enough if I split the list of files by alternating from one list to the
next as I build it, going from smallest to largest files.
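
Roughly what I have in mind, with an invented directory and worker
count; sorting smallest-first and then dealing the files out like cards
keeps every worker's list in small-to-large order too:

#!/usr/bin/perl -w
use strict;

my @files   = glob("/holding/*.dump");   # invented path
my $workers = 3;

# Smallest first, so the holding disk fills as gradually as possible.
my @sorted = sort { -s $a <=> -s $b } @files;

# Deal the files out round-robin across the workers' lists.
my @lists;
push @{ $lists[ $_ % $workers ] }, $sorted[$_] for 0 .. $#sorted;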

Thanks for the pointers, gents.

Cheers
Michael




