
I am trying to parse about 6,000 URLs from a text file, line by line, and return their HTTP status codes. It works for the most part, but when a URL contains semicolons, ampersands, or other special characters, the script breaks. Is there a way to escape these characters so that cURL still runs properly?

This is my code so far:

#!/bin/bash
while read line
do
    name=$line
    test="curl -s -o /dev/null -I -w "%{http_code}" $name"
    eval "$test"
done < $1

This is an example of an error:

./checkURL.sh: eval: line 6: `curl -s -o /dev/null -I -w %{http_code} http://xtblast.com/ptv/?attachment_id=855&repl&replytoco;&replytocom=32954'
./checkURL.sh: eval: line 6: syntax error near unexpected token `;&'

1 Answer


Too many quotes: the inner double quotes end the string early, so when eval runs the command, the & and ; in the URL are interpreted as shell metacharacters. Try single quotes around the URL instead:

test="curl -s -o /dev/null -I -w %{http_code} '$name'"

To address the comments, it should be sufficient to use a script like:

#!/bin/bash
while read -r line
do
    /usr/bin/curl -s -o /dev/null -I -w %{http_code} -- "$line"
done < "$1"
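
For example, assuming the script is saved as checkURL.sh and the URL list is in urls.txt (both names are only illustrative), it can be run as:

./checkURL.sh urls.txt

Since -w %{http_code} prints no newline, all 6,000 codes would run together; one optional variant is to let curl print the URL and a newline along with each code, using curl's %{url_effective} write-out variable:

#!/bin/bash
# Variant sketch: print "URL status_code" on its own line for each input URL.
while read -r line
do
    /usr/bin/curl -s -o /dev/null -I -w '%{url_effective} %{http_code}\n' -- "$line"
done < "$1"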

3 Comments

It's best to avoid putting commands in strings. If you want to be able to both print and execute a command, an array is better (see the sketch after these comments).
There's also, in this case, literally no reason to need the variable and eval.
You must use double quotes, "$line", instead of '$line'. Also try adding ` -- ` before $line so the value of $line is treated as an argument rather than an option: /usr/bin/curl -s -o /dev/null -I -w %{http_code} -- "$line"
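
A minimal sketch of the array approach mentioned above, assuming the same one-URL-per-line input file passed as the first argument (the array name cmd is only for illustration): the command is built as an array, printed for debugging, then executed directly, with no eval and no quoting problems.

#!/bin/bash
# Sketch: keep the command in an array so it can be both printed and run
# without eval; special characters in the URL never reach the shell parser.
while IFS= read -r line
do
    cmd=(curl -s -o /dev/null -I -w '%{http_code}' -- "$line")
    printf '%q ' "${cmd[@]}"; echo    # print the exact command being run
    "${cmd[@]}"                       # execute it
    echo                              # newline after the status code
done < "$1"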
