[H-GEN] bash command-line question [was: Re: ...self executing tar.gz's?]
Tony Nugent
tony at linuxworks.com.au
Tue Jun 11 11:13:28 EDT 2002
On Tue Jun 11 2002 at 10:27, Christopher Biggs wrote:
> Matthew, here's a document that explains how to do it using a shell
> script header:
>
> http://linux.org.mt/article/selfextract
Elegant :) [thanx for the url]
I have a (bash v2) shell problem that I've been trying to solve for
a while now, and I'm wondering if anyone has some suggestions...
An example of what I'm trying to do:
$ find . -type f -name \*.txt -exec command1 {} ; command2 {} | filter_prog \;
That is, get find's -exec to run a *sequence* of commands on each
filename it groks. It fails to work. I can produce what I want if
I create a small shell script and use it thus:
$ find . -type f -name \*.txt -exec myscript {} \;
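where myscript is nothing more than a wrapper along these lines
(command1, command2 and filter_prog being the same stand-ins as in
the example above):

  #!/bin/bash
  # run the whole sequence of commands on the one filename find hands us
  command1 "$1"
  command2 "$1" | filter_prog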
But this is very inconvenient, especially to implement cleanly
within shell scripts... creating external shell script files on the
fly seems an unnecessary resort. Being able to do it with a
one-liner at a command prompt would be useful.
I have tried quoting the multiple commands in all sorts of ways,
-exec'ing them wrapped as (exported) functions and aliases [1],
but no joy.
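A typical one of those failed attempts looks something like this:

  $ find . -type f -name \*.txt -exec 'command1 {} ; command2 {} | filter_prog' \;

(find just treats the whole quoted string as a single command name,
so it never works).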
Is this an issue related to the shell, or a feature-bug of
/usr/bin/find? Perhaps a behaviour stipulated by some POSIX
requirement?
Does anyone have a solution or ideas/alternatives?
Oh, I'm aware of xargs...
$ find -type d | xargs /bin/ls -ld
... but that isn't robust enough for what I have in mind. It has the
potential to feed a huge list of filenames as arguments to the command
that xargs runs, whereas "find -exec" runs its command for each result
right away. In fact xargs shows many of the same apparent limitations
as find, in that it won't/can't accept multiple commands or exported
shell functions.
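A quick way to see the difference, with echo standing in for a real
command:

  $ find . -type d | xargs echo        # all the names piled onto one (or a few) big command lines
  $ find . -type d -exec echo {} \;    # one command run per directory, as each one is found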
[1] To balance my request for help with a contribution, here is a
nice shell trick (illustrating my question) that I'm sure others
will find handy... (and perhaps help forgive my unusual verbosity
on the list this evening:-)
It is often useful to be able to monitor dynamically (in real time)
things that are happening on your system, especially if you are testing
or looking for problems.
Most people know about tricks such as "tail -f /var/log/messages&".
But that often isn't so useful... what about monitoring things like
processes as they come and go, or as their activity and memory usage change?
Sure, you could use "ps aux|grep something" repeatedly, but that is
very inconvenient. What about watching a directory as the files
change or come and go - how to monitor this in a sane way?
There is a little-known utility (standard at least on Red Hat boxes)
called "watch" that works by repeatedly clearing the screen and
running a command (or commands) every two seconds (by default). For
example:
watch -n 3 ls -ld /var/log/*
"-n 3" means run it every 3 seconds; the interval should be increased
if the command(s) watch is running take longer than that to complete
(eg, you can't expect it to cope with running grep over a very large
log file sixty times a minute).
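For something heavier you might back the interval right off, eg
(the "something" pattern is just a placeholder here):

  watch -n 60 'grep something /var/log/maillog | tail -20'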
It has all sorts of uses. Here are a couple of commands that let you
monitor your computer's interrupt or memory activity:
watch cat /proc/interrupts
watch cat /proc/meminfo
There are ways to make it do some very complex things. For example,
you may want to watch two (or more) different but related events at
the same time, but how do you do that? If you quote (or shell-escape)
the commands you want watch to run, that does the trick, eg:
watch "ps auxwww | grep sendmail ; tail -10 /var/log/maillog | grep sendmail"
OK, not so efficient... using an anonymous function (really just a
brace-grouped command list) is better:
"{ ps auxwww ; tail -10 /var/log/maillog ; } | grep sendmail"
That gets rid of one process, but it's only an example :-)
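Put together with watch, that would look something like:

  watch -n 3 '{ ps auxwww ; tail -10 /var/log/maillog ; } | grep sendmail'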
To keep the grep command itself out of the result, do it in a way
similar to this (the bracket expression still matches "sendmail" in
the ps output, but grep's own command line, which contains the
brackets, no longer matches the pattern):
ps auxww | grep sendmai\[l\]
Even more complex things can be achieved by using watch with
functions -- which also happens to be a very elegant way to avoid
lots of nested quoting and character escaping...
function pswatch () {
    # fail if there is no parameter (process/daemon name to look for)
    [ -z "$1" ] && echo "error - give me a parameter you big dill!" && return 1
    ps auxwww | grep "$1"
    # the double tail helps to ensure that 10 lines of the logs you
    # want to see are always displayed if logging is busy
    tail -50 /var/log/messages /var/log/maillog | grep "$1" | tail -10
    # ... other stuff as desired ...
}
And to make this work, you do:
export -f pswatch
watch -n 5 pswatch sendmail
If you call any other functions from within your shell function, they
will also need to be exported, and they will all need to be re-exported
if you make any changes to them.
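For example (logtail and pswatch2 are just made-up names for the sake
of illustration):

  function logtail () { tail -50 /var/log/messages /var/log/maillog | grep "$1" | tail -10 ; }
  function pswatch2 () { ps auxwww | grep "$1" ; logtail "$1" ; }
  # both have to be exported, or the shell that watch spawns won't see logtail
  export -f logtail pswatch2
  watch -n 5 pswatch2 sendmail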
(This is exactly the sort of thing I want to do with the -exec
parameter of the find command).
Your imagination can go wild from here. It is really useful.
It is wonderful how Unix gives you such a vast armoury of simple,
elegant but powerful plug-together tools as standard issue.
When was the last time Microsoft updated its "sort" and "more"
commands? :-) [oh, it's now all done with their GUI...]
Cheers
Tony
--