
I have a long-running script that dies for no reason. It's supposed to run for over 8 hours, but it dies after an hour or two with no errors, nothing. I tried running it via CLI and via HTTP; no difference.

I have the following parameters set:

set_time_limit(0);
ini_set('memory_limit', '1024M');

I've been monitoring the memory usage, and it doesn't go over 200M.

Is there anything else that I'm missing? Why would it die out?

  • When you mention "no errors" does that include the log files? Commented Jan 11, 2011 at 14:20
  • What platform are you running the script on? Win/Linux? Commented Jan 11, 2011 at 14:22
  • @andre: yes, error_log('test') will output to those log files, but I get no errors when the script dies out. Commented Jan 11, 2011 at 14:39
  • Could be only me, but that's not really how PHP should be used, now is it? Commented Jan 21, 2011 at 16:58
  • 1
    @Hannes The point of PHP CLI is that it can be used in this fashion. I agree that a script like this should not be run via Apache, however. Commented Jan 21, 2011 at 17:00

4 Answers


One possible explanation could be that the PHP garbage collector is interfering with the script. That could be why you're seeing random die-offs. When the garbage collector is turned on, the cycle-finding algorithm is executed whenever the root buffer runs full.

The PHP manual states:

The rationale behind the ability to turn the mechanism on and off, and to initiate cycle collection yourself, is that some parts of your application could be highly time-sensitive.

You could try disabling the PHP garbage collector using gc_disable. The manual recommends you call gc_collect_cycles right before disabling to free the buffer.
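A minimal sketch of that sequence, placed once at the top of the long-running script:

```php
<?php
// Flush the root buffer first, so nothing already queued for
// collection is left behind when the collector is switched off.
gc_collect_cycles();
gc_disable();
```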

Another explanation could be the code itself. An 8-hour script is a long script, and if it's complex, it could easily be hitting a snag that causes it to exit. For your troubleshooting now, you should definitely turn error reporting up to report everything using error_reporting(-1);.

Also, if your script is communicating with other services, say a database for example, it's quite possible that could be the issue. If the database server runs out of memory or times out, it could be causing your script to hang and die. If this is the case, you could split up your connections to the database and connect/disconnect at specific timed intervals during the script to keep that connection fresh. The same mentality could be applied to any other service you may be communicating with.
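A sketch of that connect/disconnect idea, using hypothetical $connect and $close callables so it isn't tied to one driver (with mysqli, they would wrap mysqli_connect() and mysqli::close()); the batch size of 100 is arbitrary:

```php
<?php
// Run the work in batches, opening a fresh database connection for
// each batch so no single connection has to survive the whole run.
function run_in_batches(array $jobs, callable $connect, callable $close, int $batchSize = 100): void
{
    foreach (array_chunk($jobs, $batchSize) as $batch) {
        $db = $connect();        // fresh connection for this batch
        foreach ($batch as $job) {
            $job($db);           // one query / unit of work
        }
        $close($db);             // release before the next batch
    }
}
```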

You could, for testing purposes only, make your script write each successful query to a log file, including a timestamp for when the query begins and another for when it ends. You might not get any errors, but it may help you determine whether there is a specific problem query or whether a query is hanging longer than usual. You could also check that your MySQL connection is still valid and log that as well.

An example log file:

[START 2011/01/21 13:12:23] MySQL Connection: TRUE [END 2011/01/21 13:12:28] Query took 5s
[START 2011/01/21 13:12:28] MySQL Connection: TRUE [END 2011/01/21 13:12:37] Query took 9s
[START 2011/01/21 13:12:39] MySQL Connection: TRUE [END 2011/01/21 13:12:51] Query took 12s
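A hypothetical helper that produces lines in that format; $query is the work to time, and $connected would come from a connection check such as mysqli::ping():

```php
<?php
// Time a query callable and return a log line matching the sample
// format above. Append the result to a log file of your choosing.
function log_query(callable $query, bool $connected): string
{
    $start = time();
    $query();                    // run the actual query/work here
    $end = time();
    return sprintf(
        '[START %s] MySQL Connection: %s [END %s] Query took %ds',
        date('Y/m/d H:i:s', $start),
        $connected ? 'TRUE' : 'FALSE',
        date('Y/m/d H:i:s', $end),
        $end - $start
    );
}
```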

3 Comments

Some very good points. Error logging is turned on with E_ALL. But I don't see any errors. I'll test by disabling the garbage collection. Also, how do I check my current database timeout setting? I'm using mysql.
Honestly, I couldn't tell you how to check your MySQL settings as I have no one-on-one experience with MySQL. Doing some research of my own shows that there may be ways to extend the timeouts if you have access to a configuration panel for the MySQL installation.
Were you able to figure out what your problem was?

It's probably something related to the code.

I have scripts running for weeks and months with no trouble.

Your database connection might time out and output an error. It's also possible you run out of file descriptors if you open connections or files. Or your shared memory region is full. It depends on the code.

Check the system logs to make sure SELinux is not interfering; in that case your script would not print any error. The system logs will also show whether you have crossed user limits on any system resources (see ulimit).
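A few starting points for those checks; log paths and tools vary by distribution, so treat these as examples:

```shell
# Look for evidence of a silent kill in kernel and system logs.
dmesg | grep -i 'killed process' || true            # kernel OOM killer activity
grep -i oom /var/log/syslog /var/log/messages 2>/dev/null || true
ulimit -a                                           # resource limits for this shell
getenforce 2>/dev/null || true                      # SELinux mode, if installed
```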

It's really strange that you run it in CLI and get nothing, not even a segfault. Did you check both stdout and stderr?



Maybe it segfaults. Try launching your script this way:

$ ulimit -c unlimited
$ php script.php

And see if you find a core dump file (core.xxxx) in the working directory when it dies.



Apache also has its own request timeout; you will need to tweak the httpd.conf file.
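For example, the core Timeout directive caps how long Apache waits on I/O for a request (300 seconds by default in Apache versions of that era). A sketch of raising it in httpd.conf; the 8-hour value here is purely illustrative:

```apache
# httpd.conf — seconds Apache waits on request I/O before aborting.
Timeout 28800
```

Note that with mod_fcgid or a proxy setup there are additional per-module timeouts, so this directive alone may not be enough.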

