I want to split a file of 10,000 records into a number of files, each containing an equal number of records. These new files need to be given as input to a shell script, and the shell script should run in parallel for each file. Can we use any looping here?
1 Answer
Suppose your data file is called data.txt and the script you want to run is called script.sh. Then you could do something like the following:
#!/bin/bash
# Create a temporary directory
splitdir="$(mktemp -d)"
# Split the data file into files of 1000 lines each
split --lines=1000 -d --suffix-length=3 data.txt "${splitdir}/chunk"
# Run your script on each data file separately
for chunk in "${splitdir}/"*; do nohup script.sh "${chunk}" & done
You could also do something similar using xargs or GNU parallel instead of a Bash loop.
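For example, a minimal sketch assuming the chunk files produced by the loop above, and that script.sh takes the chunk file as its only argument (the -P 4 value is an arbitrary choice of how many copies run at once):
# With xargs: run script.sh on one chunk at a time, at most 4 in parallel
printf '%s\n' "${splitdir}/"* | xargs -n 1 -P 4 script.sh
# With GNU parallel: one job per chunk, one job per CPU core by default
parallel script.sh ::: "${splitdir}/"*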
Comments:
man split and have a try with option -l. If you can't succeed, edit the question to show what you tried and maybe give an easy example.
parallel -a myfile --pipepart --block -1 wc, or parallel -a myfile --pipepart --block -1 --cat wc, or parallel -a myfile --pipepart --block -1 --fifo wc.
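Applied to this question, a rough, untested sketch of the --pipepart approach from the comments (it assumes data.txt plays the role of myfile above, and that script.sh either reads its records from standard input or accepts a file name argument when --cat is used):
# GNU parallel splits data.txt into chunks itself and pipes each chunk
# to a separate script.sh on its standard input
parallel -a data.txt --pipepart --block -1 script.sh
# If script.sh expects a file name instead, --cat writes each chunk to a
# temporary file and {} is replaced with that file's name
parallel -a data.txt --pipepart --block -1 --cat script.sh {}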