  • Does each curl produce multiple child links (in which case they might be parallelizable), or just one (in which case they can only be sequential)? Also, if the network is the bottleneck, would parallelization actually help at all (since multiple simultaneous curls will just compete for bandwidth)? Commented Sep 3, 2021 at 7:39
  • @GordonDavisson Each curl produces multiple links. By "network" I meant the API's GET response time. I'll edit my post now. Commented Sep 3, 2021 at 7:47
  • External tools like GNU parallel or xargs -P won't work with a bash function directly, because the function only exists inside the shell where it was defined. You can work around that by exporting the function with export -f so that a child bash -c 'task ...' can see it (first sketch below these comments), and they work fine with a script that implements the same logic, just as with any other executable. If you really want to keep it as a function, you can run task in the background with & but you'd then have to write your own queueing code to cap the number of simultaneous tasks (second sketch below). Commented Sep 3, 2021 at 8:48
  • As @GordonDavisson says, though, since CPU isn't the bottleneck, parallelisation is unlikely to help and may even harm performance. Commented Sep 3, 2021 at 8:49
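
For reference, here is a minimal sketch of the export -f workaround mentioned above. Everything in it is illustrative: task stands in for the real function from the question, and the URL list is a placeholder; xargs -P 4 caps the number of concurrent invocations at four.

    #!/usr/bin/env bash

    # Hypothetical stand-in for the task function from the question:
    # fetch one URL with curl (the real one would also extract child links).
    task() {
        curl -fsS "$1"
    }

    # Export the function so the child bash processes started by xargs can
    # see it; a plain function is invisible outside the defining shell.
    export -f task

    # Placeholder URL list.
    urls=('https://example.com/a' 'https://example.com/b' 'https://example.com/c')

    # Run at most 4 tasks at a time. Each xargs slot starts a fresh bash that
    # inherits the exported function and calls it on one URL ($1; _ fills $0).
    printf '%s\n' "${urls[@]}" | xargs -P 4 -n 1 bash -c 'task "$1"' _

Note that -n 1 starts one fresh bash per URL, which is simple but pays a process-startup cost per task; passing a larger -n and looping over "$@" inside bash -c would amortise that.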
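And here is a sketch of the run-in-the-background alternative with hand-rolled throttling. It assumes bash 4.3 or newer for wait -n; on older versions you would have to track and wait on specific PIDs instead.

    #!/usr/bin/env bash

    # Same hypothetical stand-in as above.
    task() { curl -fsS "$1"; }

    urls=('https://example.com/a' 'https://example.com/b' 'https://example.com/c')
    max_jobs=4

    for url in "${urls[@]}"; do
        # Throttle: while the limit is reached, block until any one job exits.
        while (( $(jobs -rp | wc -l) >= max_jobs )); do
            wait -n    # bash 4.3+: returns when the next background job exits
        done
        task "$url" &    # run the function in this shell, in the background
    done

    wait    # let the remaining jobs finish before exiting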