
I have a script that loads a massive directory listing and, by its nature, takes forever to load. In some cases the script is now timing out, and I'm curious whether I could use something like usleep to keep it from timing out, or whether that would just make the situation worse.

  • How could usleep prevent an execution timeout? Commented Nov 12, 2011 at 23:02
  • Are you generating any output? Sometimes Apache will kill an active request if it has been running for a long time without sending any output to the client. Commented Nov 12, 2011 at 23:10
  • Is it necessary to load the directory listing for every request? Could you run a periodic task to cache the file listing instead? Commented Nov 12, 2011 at 23:36
  • How many files and directories are in this directory? Commented Nov 13, 2011 at 0:43

4 Answers


Can't you set the timeout limit with set_time_limit()?

If you set it to 0, the script will run forever.

set_time_limit(0);

usleep() halts the execution of the PHP script for the given number of microseconds. During that time your script won't be listing any directories at all; it just freezes until it is allowed to continue.
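
For illustration, a minimal sketch of the set_time_limit() approach (the directory path is just a placeholder, not from the question):

<?php
// Sketch only: lift the execution time limit before the slow listing.
set_time_limit(0);                            // 0 = no execution time limit

$entries = scandir('/path/to/massive/dir');   // placeholder path
foreach ($entries as $entry) {
    if ($entry === '.' || $entry === '..') {
        continue;
    }
    echo $entry, PHP_EOL;
}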



PHP can spend a long time looking for a file or directory that doesn't exist, so if you're already using something like:

if ((is_dir($path) || file_exists($path)) && ($dh = opendir($path)))
{
  while (($file = readdir($dh)) !== false)
  {
    // file or dir is found, do stuff :)
  }

  closedir($dh);
}

then you can ignore this. But if you simply use:

$dh = opendir($path);

it can take a few minutes for the script to time out on a path that doesn't exist, and in all that time it accomplishes nothing.



Have you tried using RecursiveDirectoryIterator for generating the directory listing?

I used to use a recursive function to generate directory listings, which inadvertently caused script timeouts when I had to work with a massive number of files nested several levels deep. Using RecursiveDirectoryIterator solved many of my problems.
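
For illustration, a minimal sketch of the iterator approach (the path is just a placeholder, not from the original answer):

<?php
// Sketch only: walk a large directory tree with SPL iterators
// instead of a hand-rolled recursive function.
$path = '/path/to/massive/directory';   // placeholder

$iterator = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($path, FilesystemIterator::SKIP_DOTS),
    RecursiveIteratorIterator::SELF_FIRST
);

foreach ($iterator as $item) {
    // each $item is an SplFileInfo object for a file or subdirectory
    echo $item->getPathname(), PHP_EOL;
}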



You can try set_time_limit or see if you can optimise your code:

  • run ls -l > results.txt & on your system so that the listing runs in the background and writes its output to the results.txt file (see the sketch after this list)
  • reduce the number of files in your directory by using subdirectories
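
A rough sketch of the first idea, launching the listing in the background and serving a cached copy; the directory path, cache location, and 10-minute refresh window are all assumptions for illustration:

<?php
// Sketch only: regenerate results.txt in the background, serve the cached copy.
$dir   = '/path/to/massive/directory';   // placeholder
$cache = '/tmp/results.txt';             // placeholder cache location

// Refresh the cache if it is missing or older than 10 minutes.
// Redirecting output and appending "&" lets exec() return immediately.
if (!file_exists($cache) || time() - filemtime($cache) > 600) {
    exec('ls -l ' . escapeshellarg($dir) . ' > ' . escapeshellarg($cache) . ' 2>/dev/null &');
}

// Serve whatever listing is currently cached (may be empty on the very first request).
if (file_exists($cache)) {
    readfile($cache);
}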

