
Is there some sort of character limit imposed in bash (or other shells) for how long an input can be? If so, what is that character limit?

That is, is it possible to write a command in bash that is too long for the command line to execute? If there is no hard limit, is there a suggested one?

 Answers


The limit for the length of a command line is not imposed by the shell, but by the operating system. This limit is usually on the order of a few hundred kilobytes. POSIX calls this limit ARG_MAX, and on POSIX-conformant systems you can query it with

$ getconf ARG_MAX    # Get argument limit in bytes

On Cygwin, for example, this is 32000; on the various BSD and Linux systems I use it is anywhere from 131072 to 2621440.

If you need to process a list of files exceeding this limit, you might want to look at the xargs utility, which calls a program repeatedly with a subset of arguments not exceeding ARG_MAX.
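For illustration, a sketch of that pattern (the path and command here are placeholders, not from the question):

$ find /some/big/tree -type f -print0 | xargs -0 rm --

find streams the file names, and xargs splits them into as many rm invocations as necessary, each kept safely below ARG_MAX. The -print0/-0 pairing assumes GNU or BSD find and xargs.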

To answer your specific question: yes, it is possible to attempt to run a command whose argument list is too long. The shell will fail with a message along the lines of "Argument list too long".
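A quick way to provoke this (the count is only illustrative; the exact threshold depends on your ARG_MAX):

$ /bin/echo {1..500000}
bash: /bin/echo: Argument list too long

The explicit /bin/echo matters here: it forces an execve() of the external binary rather than using the echo builtin, which would not be subject to the limit.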

Note that the input to a program (as read on stdin or any other file descriptor) is not limited in this way (only by the program's available resources). So if your shell script reads a string into a variable, you are not restricted by ARG_MAX. The restriction also does not apply to shell builtins.
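To illustrate both points (numbers again illustrative):

$ echo {1..500000} >/dev/null          # echo is a builtin: no execve(), no error
$ v=$(printf 'x%.0s' {1..500000})      # printf is also a builtin; ~500 kB lands in a variable
$ echo "${#v}"
500000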

Classified · answered June 1, 2021

In bash, the OS-enforced limit on command-line length, which causes the "Argument list too long" error, does not apply to shell builtins.

This error is triggered when the execve() syscall returns the error code E2BIG. No execve() call is involved when invoking a builtin, so the error cannot occur.

Thus, both of your proposed operations are safe: cmd <<< "$string" writes $string to a temporary file, which does not require passing it as an argv element (or as an environment variable, which is stored in the same pool of reserved space); and printf '%s\n' "$cmd" runs entirely inside the shell, unless the shell's configuration has been modified (as with enable -n printf) to use an external printf implementation.
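A sketch of the here-string pattern (sizes illustrative): a string larger than Linux's per-argument cap (MAX_ARG_STRLEN, 128 kB) still passes through cleanly, because it travels over stdin rather than in argv:

$ big=$(printf 'x%.0s' {1..400000})   # ~400 kB in a shell variable
$ wc -c <<< "$big"                    # data arrives on stdin, not as an argument
400001

(The extra byte is the newline the here-string appends.)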

Uours · answered June 3, 2021

What about this trick...

ls -lrt -d -1 "$PWD"/{*,.*}

OR

ls -lrt -d -1 "$PWD"/*

I think this has problems with empty directories, but if another poster has a tweak I'll update my answer. Also, you may already know this, but this is probably a good candidate for an alias given its length.
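For example, something like (the alias name is arbitrary):

alias lsfull='ls -lrt -d -1 "$PWD"/*'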

[update] added some tweaks based on comments, thanks guys.

[update] as pointed out in the comments, you may need to tweak the matching expressions depending on the shell (bash vs zsh). I've re-added my older command for reference.

cegfault · answered July 27, 2021

The shell does not limit this; the limit comes from the operating system.

You can see the limit for your system with (this was run on my 64-bit Linux box):

$ getconf ARG_MAX
2097152

See this very informative page: http://www.in-ulm.de/~mascheck/various/argmax/
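That page also explains that the environment is charged against the same pool, so the space actually available for arguments is somewhat less than ARG_MAX. A rough estimate (approximate, since per-string pointers and padding also count):

$ echo $(( $(getconf ARG_MAX) - $(env | wc -c) ))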

leetwinski · answered August 5, 2021

For this kind of thing I always use find together with xargs:

$ find output-* -name "*.chunk.??" | xargs -I{} ./myexecutable -i {} -o {}.processed

Since your script processes only one file at a time, using -exec (or -execdir) directly with find, as already suggested, is just as efficient here. I'm in the habit of using xargs anyway, since it is generally much more efficient when feeding a command that operates on many arguments at once; it's a very useful tool to keep in one's utility belt, so I thought it ought to be mentioned.
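One caveat: the newline-delimited pipe above breaks on file names containing whitespace. If your find and xargs support -print0/-0 (GNU and modern BSDs do), the null-delimited variant is safer:

$ find output-* -name "*.chunk.??" -print0 | xargs -0 -I{} ./myexecutable -i {} -o {}.processed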

mplappert · answered August 16, 2021