I have the following bash script (from this post):

#!/bin/bash
# Check each URL sequentially and record its HTTP status code.
while read -r LINE; do
  curl -o /dev/null --silent --head --write-out "%{http_code} $LINE\n" "$LINE"
done < infile > outfile

infile:

google.com
facebook.com

outfile:

301 google.com
302 facebook.com

Problem: It is very slow because it checks the URLs one at a time.

Tests: I have already tried alternatives such as fping (too limited for a list this size), pyfunceble (freezes), wget, GNU parallel, and so on, but none of them convinced me. There is also a solution with xargs, but its output format differs from the original script's.
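
For reference, a GNU parallel invocation that keeps the original "code URL" output format could look like the following sketch (it assumes GNU parallel is installed; -j 3 is an arbitrary cap on concurrent jobs):

# Run up to 3 checks at once; parallel buffers each job's output,
# so lines in outfile do not interleave.
parallel -j 3 'curl -o /dev/null --silent --head --write-out "%{http_code} {}\n" {}' < infile > outfile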

Question: How can I run multiple queries in parallel from this script, processing many lines at the same time, ideally with a manually configurable limit on the number of concurrent requests so that neither the script nor the PC freezes?

Update: Solved! Thanks.

xargs -I {} -P 3 curl -o /dev/null --silent --head --write-out "%{http_code} {}\n" {} < infile > outfile

P.S.: "-P 3" sets the number of parallel curl instances.
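
If you prefer to stay close to the original while-read loop, plain bash job control can enforce the same kind of manual cap. This is only a minimal sketch (it assumes bash 4.3 or newer for wait -n; MAXJOBS is a name invented here):

#!/bin/bash
# Sketch: keep at most MAXJOBS curl processes running at once.
MAXJOBS=3
while read -r LINE; do
  curl -o /dev/null --silent --head --write-out "%{http_code} $LINE\n" "$LINE" &
  # If the pool is full, wait for any one background job to finish (bash 4.3+).
  while (( $(jobs -rp | wc -l) >= MAXJOBS )); do
    wait -n
  done
done < infile > outfile
wait   # let the remaining background jobs finish

Note that, unlike GNU parallel, this does not buffer per-job output, so lines from concurrent curl processes could in principle interleave in outfile.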
