I wrote a simple bash script that takes a list of URLs and outputs a CSV with some data for each one: URL, status code, and target URL:
while IFS= read -r url
do
    # One HEAD request per URL: status code plus the redirect target, if any
    urlstatus=$(curl -H 'Cache-Control: no-cache' -o /dev/null --silent --head --insecure --write-out '%{http_code} , %{redirect_url}' "$url")
    echo "$url , $urlstatus" >> "$1-out.csv"
done < "$1"
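For reference, I run it like this (the file names are just examples):

bash checkurls.sh urls.txt
# appends one line per URL to urls.txt-out.csv, e.g.:
# http://example.com , 301 , https://www.example.com/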
Sometimes a URL has 2 or 3 redirects, and I'd like to get them all and print them in the output file.
I've found the -L option and the %{url_effective} variable, which give me the last URL in the chain:
urlstatus2=$(curl -H 'Cache-Control: no-cache' -o /dev/null --silent --head --insecure -L --write-out ' , %{url_effective}' "$url")
But I'd like to have every URL in the chain, from the original one to the final one, and add them all to the CSV.
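To make the goal concrete, here is a rough, untested sketch of the loop I have in mind (follow_chain is just a name I made up). Without -L, curl stops at each redirect, and I believe the %{redirect_url} write-out variable then holds the Location header of that hop (and is empty once there are no more redirects), so it should be possible to walk the chain one request at a time:

# Sketch: collect every URL in the redirect chain, one HEAD request per hop
follow_chain() {
    local url="$1" chain="$1" line code next
    while :; do
        line=$(curl -H 'Cache-Control: no-cache' -o /dev/null --silent --head --insecure \
                    --write-out '%{http_code} %{redirect_url}' "$url")
        code=${line%% *}    # status code of this hop
        next=${line#* }     # next URL in the chain, empty when the chain ends
        [ -z "$next" ] && break
        chain="$chain , $next"
        url="$next"
    done
    echo "$chain , $code"
}

The idea would be to replace the echo in my loop with something like echo "$(follow_chain "$url")" >> "$1-out.csv", so each CSV line lists the whole chain followed by the final status code. Is there a cleaner way to get all the intermediate URLs from curl itself?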