Timeline for Running thousands of curl background processes in parallel in bash script
Current License: CC BY-SA 3.0
13 events
| when | what | by | license | comment |
|---|---|---|---|---|
| Sep 26, 2022 at 3:57 | answer added | David | | timeline score: 0 |
| Nov 14, 2015 at 20:15 | comment added | Nemo | | Why don't you use `ab` (Apache Benchmark)? You can set any concurrency. *(see the `ab` sketch below the timeline)* |
| Apr 3, 2015 at 1:00 | comment added | Mark Hudson | | The original question didn't specify how long a single request takes... what if the earlier instances have completed? |
| Aug 2, 2014 at 18:07 | answer added | Ole Tange | | timeline score: 16 |
| Nov 18, 2013 at 20:21 | history edited | zavg | CC BY-SA 3.0 | deleted 20 characters in body |
| Nov 18, 2013 at 20:15 | answer added | jthill | | timeline score: 2 |
| Nov 18, 2013 at 15:07 | history migrated | | | from serverfault.com (revisions) |
| Nov 7, 2013 at 18:48 | comment added | zavg | | @zhenech @HBruijn I launched `parallel` and it told me I may run only 500 parallel tasks due to the system limit on file handles. I raised the limit in limits.conf, but now when I try to run 5000 simultaneous jobs it instantly eats all my memory (49 GB) even before starting, because every `parallel` perl script eats 32 MB. |
| Nov 7, 2013 at 18:44 | comment added | Dennis Williamson | | I agree with HBruijn. Notice how your process count is halved when you double the number of processes (by adding `awk`). |
| Nov 7, 2013 at 18:44 | comment added | Dennis Williamson | | Try `start=$SECONDS` and `end=$SECONDS` - and use lower-case or mixed-case variable names by habit to avoid potential name collisions with shell variables. However, you're really only getting the ever-increasing interval between the start of each process; you're not getting how long the download took, since the process is in the background (and `start` is only calculated once). In Bash you can do `(( diff = end - start ))`, dropping the dollar signs and allowing more flexible spacing. Use `pgrep` if you have it. *(see the timing sketch below the timeline)* |
| Nov 7, 2013 at 17:50 | comment added | zhenech | | I would also suggest using `parallel(1)` for such tasks: manpages.debian.org/cgi-bin/… *(see the `parallel` sketch below the timeline)* |
| Nov 7, 2013 at 17:07 | comment added | HBruijn | | You're probably hitting one of the limits on max running processes or max open sockets. `ulimit` will show some of those limits. *(see the `ulimit` sketch below the timeline)* |
| Nov 7, 2013 at 16:53 | history asked | zavg | CC BY-SA 3.0 | |
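
zhenech's `parallel(1)` suggestion (and Ole Tange's later answer) boil down to bounding the number of concurrent `curl` jobs instead of forking thousands at once. A minimal sketch, assuming GNU parallel is installed; the URL, job count, and request count are placeholders, not from the thread:

```bash
# Run at most 100 curl jobs at a time; {} is replaced by each input line.
seq 1 100000 | parallel -j 100 curl -s -o /dev/null "http://example.com/?id={}"

# Lower-overhead alternative: xargs forks no per-job perl process, which
# sidesteps the 32 MB-per-job memory issue zavg reports in the timeline.
# -P sets the number of parallel workers; -I{} substitutes each input line.
seq 1 100000 | xargs -P 100 -I{} curl -s -o /dev/null "http://example.com/?id={}"
```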
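
The limits HBruijn mentions can be inspected directly; `ulimit` is a bash builtin, so the values shown apply to the current shell and everything it spawns:

```bash
ulimit -u   # maximum number of user processes
ulimit -n   # maximum number of open file descriptors (each socket uses one)
ulimit -a   # list all current limits at once
```

Persistent changes are typically made in /etc/security/limits.conf, which is the route zavg took in the Nov 7 comment.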
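
Dennis Williamson's timing advice, assembled into a runnable form. A minimal sketch with a placeholder URL; as his comment notes, it measures only how long launching and waiting for the background jobs takes, not any individual download:

```bash
#!/usr/bin/env bash
# Sketch of the $SECONDS timing pattern from the comment above.
start=$SECONDS
for i in {1..1000}; do
    curl -s -o /dev/null "http://example.com/?id=$i" &   # background job
done
wait                        # block until every background curl has exited
end=$SECONDS
(( diff = end - start ))    # arithmetic context: no dollar signs required
echo "elapsed: ${diff}s"
echo "running curl processes: $(pgrep -c curl)"   # pgrep -c counts matches
```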
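
Nemo's `ab` (Apache Benchmark) suggestion fits when the goal is load testing rather than scripting. A sketch with placeholder numbers and URL; `ab` ships with the Apache HTTP server utilities:

```bash
# -n: total requests to issue; -c: how many to keep in flight concurrently.
# ab requires a path in the URL, hence the trailing slash.
ab -n 100000 -c 1000 http://example.com/
```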