I have to fetch a website (following multiple redirects with -L) and save the HTML content in a file named [HTTP_Status_code]_[Website_name].html.
Currently I am using two curl calls: one to dump the body and one to fetch the headers. Is there any way to combine them into a single call?
Script:
while IFS= read -r url; do
    # First call: HEAD request for the status code.
    # Note: without -L this reports the FIRST response's code (e.g. 301), not the last.
    status=$(curl -sI "$url" | head -n 1 | cut -d' ' -f2)
    # Second call: download the body, following redirects
    if curl -L "$url" -o "${status}_$(basename "$url").html"
    then
        :
    else
        echo "$url" >> error.txt
    fi
done < url_list.txt
EDIT: I need the status code of the last response, i.e. after all redirections have been followed.
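For reference, curl can report the status code itself via its -w/--write-out option, and with -L the %{http_code} variable holds the code of the last retrieved transfer, which is exactly what the EDIT asks for. A minimal sketch of a single-call version (downloading to a temporary file first, since the code isn't known until the transfer finishes; the temp-file handling is my own choice):

```shell
#!/bin/bash
# Sketch: one curl call per URL. -w '%{http_code}' prints the status code
# of the last response after all redirects have been followed.
while IFS= read -r url; do
    tmp=$(mktemp)    # body goes here until we know the status code
    if status=$(curl -sL "$url" -o "$tmp" -w '%{http_code}'); then
        # e.g. "200_index.html" for https://example.com/index
        mv "$tmp" "${status}_$(basename "$url").html"
    else
        echo "$url" >> error.txt
        rm -f "$tmp"
    fi
done < url_list.txt
```

This also keeps the original success/failure split: curl's exit status drives the if/else, so unreachable URLs still land in error.txt.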