
I want to fetch some APIs from GitHub.

So this is the code:

array=(link1 link2 link3 link4)
readarray -t item < <(for i in "${array[@]}"
      do curl -s "$i" | jq '.tag_name'
      done)

and then I can read the data from the item array.

But the problem is that if one of the links doesn't respond, curl just gets stuck there.

I want it to exit immediately, and instead of the previous data I need it to print some text like "error", or to get its exit code so that I can implement some logic in case of any failure.

Please provide an efficient solution for this.

2 Answers


You could fetch all URLs with one curl invocation and use the --fail-early option to exit immediately on the first detected transfer error. Add --max-time to abort each transfer after a given number of seconds, and --connect-timeout if you want to restrict the connection time.

The item array will be empty in case of an error; you can check for that.

urls=(link1 link2 link3 link4)
readarray -t item < <(
    # one curl invocation for all URLs; --fail-early aborts on the first failed transfer
    curl -s --fail-early --connect-timeout 10 --max-time 120 "${urls[@]}" |
    jq '.tag_name')

if [ "${#item[@]}" -eq 0 ]; then
    echo "Uh, failed to fetch data." >&2
    exit 1
fi
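
If you also need curl's exit code, for example to tell a timeout apart from another failure, a minimal variant (same timeout values as above, adjust to taste) is to capture the output in a variable first:

urls=(link1 link2 link3 link4)

json=$(curl -s --fail-early --connect-timeout 10 --max-time 120 "${urls[@]}")
status=$?
if [ "$status" -ne 0 ]; then
    # curl exits with 28 on a timeout, for example
    echo "error (curl exit code $status)" >&2
    exit 1
fi
readarray -t item < <(jq '.tag_name' <<<"$json")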

You can use curl's -f option to make it fail (exit with a non-zero code) when the response code is greater than or equal to 400. You can also add the -w '%{http_code}' option to capture the HTTP status code, and the -o option to redirect the response body to /dev/null so it isn't printed to the console.

array=(link1 link2 link3 link4)

for i in "${array[@]}"
do
    # -s silences the progress meter, -f makes curl fail on HTTP errors (>= 400)
    response=$(curl -sf -w '%{http_code}' -o /dev/null "$i")
    # a failed connection yields 000, which test treats as 0
    if [ "$response" -ge 400 ] || [ "$response" -eq 0 ]; then
        echo "error"
    else
        curl -s "$i" | jq '.tag_name'
    fi
done
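
Note that this fetches each successful URL twice. A sketch that fetches only once and also guards against the hangs mentioned in the question, assuming a --max-time of 30 seconds (an example value, adjust to taste), could look like this:

array=(link1 link2 link3 link4)

for i in "${array[@]}"
do
    # fetch once; --max-time aborts a transfer that hangs
    if body=$(curl -sf --max-time 30 "$i"); then
        jq '.tag_name' <<<"$body"
    else
        # $? here is curl's exit code, e.g. 22 for an HTTP error, 28 for a timeout
        echo "error (curl exit code $?)"
    fi
done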
