  • I would quote the args on the tar line; a quoted sketch follows after this list. Maybe it takes five minutes to get to the first file that has whitespace in its name. Or maybe a comma, which could mess with the read. Commented Mar 28 at 10:05
  • Have you tried running it from the command line to see if it behaves the same way, or outputs something more useful? What about experimenting with a cron job using the same parameters that does a sleep 600 (followed by a log write), and seeing if it gets killed too? A diagnostic sketch follows after this list. Commented Mar 28 at 17:57
  • When you say stopped, do you mean stopped as in suspended, as if with the SIGSTOP signal? Or terminated/killed, as if with the SIGTERM signal? Commented Mar 31 at 6:23
  • With the v option of tar, the list of files being archived will be printed on stdout, and that will end up in an email sent to root. Is that what you want? Beware that there's usually a limit on the size of emails being sent; a redirection sketch follows after this list. Commented Mar 31 at 6:25
  • The archive that you produce seems to be quite big. Do you have enough space on /mnt/backup/website? Do you have local mail delivery set up? Does the cron daemon send any messages to the owner of the crontab? Commented Mar 31 at 6:27