Bash: pipe ‘find’ output into ‘readarray’

I'm trying to search for files using find, and put those files into a Bash array so that I can do other operations on them (e.g. ls or grep them). But I can't figure out why readarray isn't reading the find output as it's piped into it.

Say I have two files in the current directory, file1.txt and file2.txt. So the find output is as follows:

$ find . -name "file*"
./file1.txt
./file2.txt

So I want to pipe that into an array whose two elements are the strings "./file1.txt" and "./file2.txt" (without quotes, obviously).

I've tried this, among a few other things:

$ declare -a FILES
$ find . -name "file*" | readarray FILES
$ echo "${FILES[@]}"; echo "${#FILES[@]}"

0

As you can see from the echo output, my array is empty.

So what exactly am I doing wrong here? Why is readarray not reading find's output as its standard input and putting those strings into the array?

Best Answer

In a pipeline, bash runs each command in its own subshell. So readarray does populate the array, but only inside that subshell; the array disappears when the subshell exits, and the parent shell never sees it.
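You can see the effect directly with a minimal sketch (no files needed; printf stands in for find):

```shell
#!/usr/bin/env bash
# 'readarray' runs in a pipeline subshell, so 'lines' is filled there
# and discarded; the parent shell's 'lines' stays empty.
printf 'a\nb\n' | readarray lines
echo "${#lines[@]}"
```

This prints 0, just as in the question. (As an aside, bash 4.2+ offers `shopt -s lastpipe`, which runs the last pipeline command in the current shell in non-interactive shells with job control off, but process substitution below is the more portable fix.)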

Use process substitution instead, so that readarray runs in the current shell. Add -t so each element doesn't keep its trailing newline:

readarray -t FILES < <(find . -name "file*")
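A minimal end-to-end check (it creates the two example files in a temporary directory; the sort is only there to make the element order deterministic):

```shell
#!/usr/bin/env bash
# Recreate the question's setup in a scratch directory.
dir=$(mktemp -d)
touch "$dir/file1.txt" "$dir/file2.txt"
cd "$dir"

# Process substitution keeps readarray in the current shell;
# -t strips the trailing newline from each element.
readarray -t FILES < <(find . -name "file*" | sort)
printf '%s\n' "${#FILES[@]}"    # 2
printf '%s\n' "${FILES[0]}"     # ./file1.txt
```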

Note that this still breaks for file names that contain newlines. If that could be the case, delimit the entries with NUL bytes instead (readarray -d requires bash 4.4+; -t then strips the trailing NUL from each element):

readarray -t -d '' FILES < <(find . -name "file*" -print0)
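To see why the NUL-delimited form matters, here is a sketch with a deliberately awkward file name containing an embedded newline (bash 4.4+ assumed for readarray -d):

```shell
#!/usr/bin/env bash
dir=$(mktemp -d)
cd "$dir"
touch $'bad\nname.txt'    # one file whose name contains a newline

# -print0 separates names with NUL bytes; readarray -d '' splits on NUL,
# so the newline inside the name does not split the entry in two.
readarray -t -d '' FILES < <(find . -name "*.txt" -print0)
printf '%s\n' "${#FILES[@]}"    # 1 — the whole name is a single element
```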