
I have a file of 1000 lines. I need to read that file 10 or 20 lines at a time and either execute those lines or save them to another file. The next time it should read from line 11 or 21 and do the same. This should be done until EOF.

How should I restrict the number of lines read from the file at a time?

  • How are you reading the file? Is this in a shell script? Commented Jul 3, 2017 at 11:22
  • Yes, using a shell script. Commented Jul 3, 2017 at 11:23
  • Please edit your question and show us how you are reading the file. Normally, you read one line at a time. What are you trying to achieve by reading 10 lines at once? Commented Jul 3, 2017 at 11:24
  • After executing the first bunch of lines, what triggers execution of the next? Commented Jul 3, 2017 at 15:02

4 Answers


Simplistically:

while read -r one
do 
  read -r two && 
  read -r three && 
  read -r four && 
  read -r five && 
  read -r six && 
  read -r seven && 
  read -r eight && 
  read -r nine && 
  read -r ten && 
  printf "%s\n" "$one" "$two" "$three" "$four" "$five" "$six" "$seven" "$eight" "$nine" "$ten"
  ## or whatever you want to do to process those lines
  echo END OF SECTION 
done < input-file

This could "easily" enough be extended to read twenty lines at a time.


This would do it:

while read line1 && [do something with $line1]
do
    read line2 && [do something with $line2]
    read line3 && [do something with $line3]
    […]
done < file.txt

However, it is unusual to require reading exactly N lines at a time unless your data genuinely comes in fixed-size records of N lines. Usually what is really being attempted by reading several lines at once is some sort of parallelism, which is better achieved with either xargs (to hand multiple lines to a single command), parallel (to use a worker model that handles lines as soon as possible), or a combination of the two.
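
For instance, if the goal is to hand each group of lines to some command, something along these lines may be closer to the mark. This is only a sketch: process-chunk is a hypothetical placeholder for your own command, and GNU xargs and GNU parallel are assumed to be installed.

# run process-chunk once per 10 input lines (note: xargs word-splits each line into arguments)
xargs -L 10 process-chunk < input_file

# or feed blocks of 10 lines on stdin to parallel workers
parallel --pipe -N 10 process-chunk < input_file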


You can do the following to read 5 lines at a time:

N=5   # Number of lines to process together (YMMV)
while IFS= read -r v1; do
   eof=                                              # defined while a full chunk is being read
   for i in $(seq 2 "$N"); do
      IFS= read -r "v$i" || { unset -v eof; break; } # input ran out: undefine eof
   done
   ${eof+:} break   # ": break" (a no-op) while eof is defined, a bare "break" otherwise,
                    # so the loop stops at EOF and a short final chunk is not printed
   echo "The $N lines read in are: $v1 $v2 $v3 $v4 $v5"
done < input_file

We can split and filter the file...

split -l 20 --filter='command'  input_file

Example: split into chunks of 20 lines and choose one random line from each chunk (shuf -n 1):

split -l 20 --filter='shuf -n 1' input_file

The same command (split) may also be used to create a file for each chunk:

split -l 20 input-file input-file-chunk-

creating input-file-chunk-aa, input-file-chunk-ab, etc.
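
Once the chunks exist as files, they can be processed one at a time. A minimal sketch, assuming the chunk prefix from the previous command and a hypothetical process-chunk command standing in for whatever needs to run on each chunk:

split -l 20 input-file input-file-chunk-
for chunk in input-file-chunk-*; do
  process-chunk "$chunk"   # replace with the real processing or execution step
  rm -- "$chunk"           # optional: remove each chunk once it has been handled
done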
