All I know is that I have hit the "Arg list too long" error. That was on
redhat-6.x, so maybe it was just an old system.
I think it is worth mentioning the workaround in case readers ever run into that problem.
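(For reference, the usual workaround when that error does show up is to let find hand the file names to the command in manageable batches rather than expanding one giant command line in the shell; the directory, pattern and gzip command below are only placeholders, not anything taken from this thread:)
$ find /some/dir -name '*.log' -exec gzip {} +
$ find /some/dir -name '*.log' -print0 | xargs -0 gzip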
On March 16, 2018 11:04:37 AM "Robert P. J. Day" <rpjday [ at ] crashcourse [ dot ] ca> wrote:
On Fri, 16 Mar 2018, James wrote:
It is necessary for running a command on all files matching a
pattern.
i'm pretty sure that's not true.
Wildcard expansion substitutes the full name of every matching file
onto the command line. Thousands of matching
names/paths can easily blow past the buffer limit.
but that's the very point i was making ... that once upon a time,
there was limited space to construct a command before running it but,
these days, that limit is clearly much larger.
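one easy way to see what the limit actually is on a given box (assuming
getconf is available, and GNU xargs for the second command) is:
$ getconf ARG_MAX
$ xargs --show-limits < /dev/null
on linux the kernel ties that value to the stack rlimit (roughly a
quarter of it), so the exact number varies from box to box, but it is
typically a couple of megabytes or more.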
in the old days when i wanted to demonstrate this issue happening, i
would run a series of commands that looked like this:
$ ls /*
$ ls /*/*
$ ls /*/*/*
$ ls /*/*/*/*
... etc ...
note how that wildcard expansion blows up pretty quickly; years ago,
that would generate the "Arg list too long" error in a hurry. these
days, though, as long as the wildcard expansion succeeds and the
command is invoked, i believe we can say that we haven't reached that
limit.
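(if you actually want to reproduce the error on a modern box, you
generally have to manufacture far more arguments than those wildcards
produce; something along these lines should still trip it, with seq
used purely as a convenient way to generate a huge argument list, and
the exact count needed depending on the stack rlimit:)
$ /bin/true $(seq 1 2000000)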
and it's easy to see the length of each expansion:
$ echo /* | wc -c
119
$ echo /*/* | wc -c
159988
$ echo /*/*/* | wc -c
688989
$ echo /*/*/*/* | wc -c
3882671
$
so if i can successfully run:
$ ls /*/*/*/*
i'm fairly sure i can conclude that a command line can be at least 3882671
characters long, can i not?
rday