  • 2
    You're probably hitting one of the limits on the maximum number of running processes or open sockets. ulimit will show some of those limits (a quick check is sketched after this list). Commented Nov 7, 2013 at 17:07
  • 6
    I would also suggest using parallel(1) for such tasks: manpages.debian.org/cgi-bin/… (a usage sketch follows this list). Commented Nov 7, 2013 at 17:50
  • Try start=$SECONDS and end=$SECONDS, and use lower-case or mixed-case variable names by habit to avoid potential name collisions with shell variables. However, you're really only getting the ever-increasing interval at which each process was started. You're not getting how long the download took, since the process is in the background (and start is only calculated once). In Bash you can do (( diff = end - start )), dropping the dollar signs and allowing more flexible spacing (see the timing sketch below). Use pgrep if you have it. Commented Nov 7, 2013 at 18:44
  • I agree with HBruijn. Notice how your process count is halved when you double the number of processes (by adding awk). Commented Nov 7, 2013 at 18:44
  • @zhenech @HBrujin I launched parallel and it told me I could run only 500 parallel tasks due to the system limit on file handles. I raised the limit in limits.conf, but now when I try to run 5000 simultaneous jobs it instantly eats all my memory (49 GB) even before starting, because each parallel Perl process eats 32 MB. Commented Nov 7, 2013 at 18:48
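
A minimal sketch of the limit check the first comment points at; the exact limits and their defaults vary by distribution, and wget is only an assumed process name here:

    ulimit -u        # max user processes
    ulimit -n        # max open file descriptors (each socket counts against this)
    pgrep -c wget    # how many download processes are currently running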
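A hedged sketch of the parallel(1) suggestion: let GNU parallel cap concurrency instead of backgrounding every job at once. The file urls.txt and the job count of 100 are assumptions for illustration:

    # run at most 100 downloads at a time, one URL per input line
    parallel -j 100 wget -q {} < urls.txt

Capping the job count this way also sidesteps the process and file-handle limits mentioned above, since only that many jobs exist at any moment.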
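A small sketch of the timing idiom from the later comment, assuming a single foreground download so that start and end actually bracket the transfer; url is a hypothetical variable:

    start=$SECONDS
    wget -q "$url"              # foreground, so the interval covers the download itself
    end=$SECONDS
    (( diff = end - start ))    # arithmetic without dollar signs, as the comment notes
    echo "download took ${diff}s"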