
Launching Multiple Queries With Bash Script

I have the following bash script (from this post):

#!/bin/bash
# Read each URL from infile and write "status_code url" to outfile
while IFS= read -r LINE; do
  curl -o /dev/null --silent --head --write-out "%{http_code} $LINE\n" "$LINE"
done < infile > outfile

infile:

google.com
facebook.com

outfile:

301 google.com
302 facebook.com

Problem: it is very slow, since it checks the URLs one line at a time.

Tests: I have already tried other alternatives, such as fping (too limited for the size of my list), PyFunceble (it freezes), wget, GNU parallel, and so on, but none of them has convinced me. There is also a solution with xargs, but its output format differs from that of the original script.

Question: How can I launch multiple queries in parallel with this script so that many lines are processed at the same time, ideally with a way to set the number of simultaneous jobs manually so the script does not freeze or lock up the PC?
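
For reference, this is roughly the shape of the parallel approach I am imagining, based on xargs -P; the helper function name and the job count of 10 are just placeholders, and I do not know whether this keeps the output identical to the original script:

#!/bin/bash
# Sketch only: same curl call as the loop above, but fed through xargs -P
# so several URLs are checked at once.
check_url() {
  curl -o /dev/null --silent --head --write-out "%{http_code} $1\n" "$1"
}
export -f check_url
# -a infile: read URLs from the file; -P 10: up to 10 parallel jobs;
# -I{}: substitute one URL per invocation
xargs -a infile -P 10 -I{} bash -c 'check_url "$1"' _ {} > outfile

Whether the output lines keep the same format, and whether 10 is a safe number of simultaneous jobs, is exactly what I am unsure about.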
