[H-GEN] Managing multiple child processes with perl

Jason Henry Parker jasonp at uq.net.au
Tue May 14 18:17:34 EDT 2002

[ Humbug *General* list - semi-serious discussions about Humbug and     ]
[ Unix-related topics. Posts from non-subscribed addresses will vanish. ]

Michael Anthon <michael at anthon.net> writes:

> Thanks to those that gave me some pointers.  I struggled with this a bit
> today but eventually[1] got something that seems to work.  If anyone wants to
> look the test script I wrote is here
> http://tamsdev.pwcglobal.com.au/mca/multiprocessgzip.pl

) # Reset the list of data files
) @datafiles = ();
) # Get a list of files in the destination directory, smallest file first
) # to avoid problems with compressing the largest first and filling the drive
) @datafiles = qx%ls -l $dst_dir | tail +2 | sort -k

It's always fun to see someone using qx//.  I probably would have
written a few more lines of perl to replace it, but I'm guessing that
was cut-and-pasted from working code, so there's probably no problem.
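If you did want to drop the shell pipeline, a pure-perl sketch might look like this (files_by_size is a name I've made up; $dst_dir comes from the original script):

```perl
use strict;
use warnings;

# Hypothetical replacement for `ls -l | tail +2 | sort`: return the plain
# files in $dir sorted smallest-first, so the big ones compress last.
sub files_by_size {
    my ($dir) = @_;
    opendir my $dh, $dir or die "can't opendir $dir: $!";
    my @files = grep { -f "$dir/$_" } readdir $dh;
    closedir $dh;
    # -s gives the file size in bytes; sort ascending on it
    return sort { -s "$dir/$a" <=> -s "$dir/$b" } @files;
}
```

That also sidesteps any surprises from ls output formats or filenames with whitespace.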

)     #fork the process
)     if ($childpid = fork) {
)       #This is the parent process

)     } else {
)       #This is the child process

)     }

The only really egregious problem is that, as far as I can see, you're
not checking whether fork() fails.  This could become crucial at some
point: perl's fork() returns undef on failure (not the C library's -1),
so your else branch would run the "child" code in the parent process.
Test the return value with defined() before assuming you're the child.
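The full three-way check is a few lines (a sketch, not your script verbatim):

```perl
use strict;
use warnings;

# fork() has three outcomes: undef (failure, still in the parent),
# 0 (we are the child), or the child's pid (we are the parent).
my $childpid = fork;
if (!defined $childpid) {
    die "fork failed: $!";      # no child was created
} elsif ($childpid) {
    # parent: $childpid holds the child's process id
    waitpid $childpid, 0;
} else {
    # child: do the work, then exit so we don't fall back into parent code
    exit 0;
}
```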

) #Wait for the final child processes to terminate
) while ($childcount>0) {
)   sleep 1;
) }

I *think* you can use wait() here instead of polling with sleep --
wait() blocks until a child terminates, and returns -1 once there are
no children left, so the parent never spins.

| `This is the operative statement.                                     |
|  The others are inoperative.'                        jasonp at uq.net.au |
