I have access to a distributed computing cluster with a job scheduler (Slurm) that gives each parallel job an integer ID from 1 to n (I know the value of n; in the example below, n = 10).
I am using find -maxdepth 1 -name '2019 - *' to find the list of file names I want to pass to my program as an argument.
Sample file names:
2019 - Alphabet
2019 - Foo Bar
2019 - Reddit
2019 - StackExchange
The order does not matter, but each matching file should be used exactly once.
This is an example of a "template" script I can use:
#!/bin/bash
# in this case, from i = 1 to i = 10
#SBATCH --array=1-10
# pseudocode begins
# it is given that filename_array has 10 unique elements
filename_array="$(find -maxdepth 1 -name '2019 - *')"
# SLURM_ARRAY_TASK_ID is the value of i, from i = 1 to i = 10
filename=filename_array[$SLURM_ARRAY_TASK_ID]
# pseudocode ends
./a.out "$filename"
This is more or less what the script should accomplish (but with each process running on a different machine in parallel):
./a.out "./2019 - Alphabet" &
./a.out "./2019 - Foo Bar" &
./a.out "./2019 - Reddit" &
./a.out "./2019 - StackExchange" &
How can I write the bash script so that ./a.out is run exactly once for each of the file names given by find -maxdepth 1 -name '2019 - *'?
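For reference, here is my best attempt at filling in the pseudocode, using mapfile to build a real bash array instead of a single string (I am not sure this is the right approach, in particular whether the offset between the 1-based SLURM_ARRAY_TASK_ID and bash's 0-based array indices is handled correctly):

```shell
#!/bin/bash
#SBATCH --array=1-10

# Read the matching file names into a bash array, one element per line,
# so names containing spaces stay intact (-t strips the trailing newlines)
mapfile -t filename_array < <(find . -maxdepth 1 -name '2019 - *')

# SLURM_ARRAY_TASK_ID runs from 1 to 10, but bash arrays are 0-indexed
filename="${filename_array[SLURM_ARRAY_TASK_ID - 1]}"

./a.out "$filename"
```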