The perl File::Tail module is good for this kind of task. It may be available pre-packaged for your distribution (e.g. on Debian and derivatives: sudo apt-get install libfile-tail-perl).

e.g.
#!/usr/bin/perl

use strict;
use warnings;
use File::Tail;

my $dir='/mnt/zandrologs';

# Get the list of logfiles in the target directory
my @logfiles = glob("$dir/*");

# Set up an array of File::Tail objects, one for each filename.
my @logs=();
foreach (@logfiles) {
  push(@logs,File::Tail->new(name => $_));
}

# Now watch those logs and do stuff when the script sees matching patterns
while (1) {
  my ($nfound,$timeleft,@pending)= File::Tail::select(undef,undef,undef,undef,@logs);

  if ($nfound) {
    foreach my $input (@pending) {

      # read the line of data from the current File::Tail object into $_
      $_ = $input->read;
      chomp;

      # Get the filename of the current File::Tail object.
      # This script doesn't use it, so it's commented out.
      # my $fn = $input->{'input'};

      if (m/somestring/) {
         # do stuff here
         # any perl code, including executing external programs with system().
      } elsif (m/somestring2/) {
         # do different stuff here
      } elsif (m/somestring3/) {
         # and more different stuff
      } elsif (m/somestring999/) {
         # yet more different stuff
      }
    }
  }
};

This will loop over the log files forever (or until killed). If any of the input files are rotated, File::Tail will automatically close and re-open the file (i.e. similar to tail -F).
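For a single file, the rough coreutils equivalent of that behaviour is tail -F piped into a matching loop. A sketch (the filename some.log is made up for illustration):

```shell
# -F (i.e. --follow=name --retry) keeps following the *name* across
# rotations, re-opening the file when logrotate replaces it:
tail -F /mnt/zandrologs/some.log | while read -r line; do
    case "$line" in
        *somestring2*) echo "matched somestring2: $line" ;;
        *somestring*)  echo "matched somestring: $line" ;;
    esac
done
```

Unlike the Perl script, though, this follows only one file; multiplexing over every file in a directory is exactly what File::Tail::select() adds.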

When there is new data in one or more of the logs, the File::Tail::select() method returns:

  • $nfound - the number of File::Tail objects found with data pending (i.e. the number of elements in the @pending array).
  • $timeleft - the time remaining before select() would have timed out. This script doesn't pass a timeout value to select() (it passes undef for everything except the @logs array), so this value is unused here.
  • @pending - an array of File::Tail objects with new, unread data

Each element of @pending is a File::Tail object with various methods (e.g. read() which returns the line of text pending for that object) and hash keys (e.g. {'input'} containing the filename).

See man File::Tail and perldoc -f select for details.
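As an aside, if the chain of m// tests keeps growing, it can be rewritten as a dispatch table. A sketch in plain Perl (the patterns and handlers are placeholders, not part of the original script; note that more specific patterns must be listed first, since m/somestring/ would also match lines containing "somestring2"):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Pairs of [ pattern, handler ] checked in order; first match wins.
my @handlers = (
    [ qr/somestring2/, sub { print "matched somestring2: $_[0]\n" } ],
    [ qr/somestring/,  sub { print "matched somestring: $_[0]\n" } ],
);

sub dispatch {
    my ($line) = @_;
    foreach my $h (@handlers) {
        my ($re, $code) = @$h;
        if ($line =~ $re) {
            $code->($line);
            last;    # stop after the first matching pattern
        }
    }
}

# In the main loop this would be called as: dispatch($_);
dispatch("2021-01-01 somestring2 from host x");
```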


As written, the script will ignore any lines in the file that existed before this script started. You can change that to start by reading either the last n lines or even the entire file from the start by changing this line:



push(@logs,File::Tail->new(name => $_));

to this (start by reading the last 10 lines of each log file):

push(@logs,File::Tail->new(name => $_, tail => 10));

or (start by reading all log files from the beginning):

push(@logs,File::Tail->new(name => $_, tail => -1));
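The constructor takes further options to tune polling behaviour. From memory of the module's documentation (verify the option names interval and maxinterval against man File::Tail before relying on them):

```perl
# Start 10 lines back, check frequently at first, and never let the
# adaptive poll interval back off beyond 10 seconds:
push(@logs, File::Tail->new(
    name        => $_,
    tail        => 10,
    interval    => 1,     # initial seconds between checks for new data
    maxinterval => 10,    # upper bound on the check interval
));
```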

This is an effective but simple use of the module. See man File::Tail for more details and alternative ways to use it. The module also comes with a few good example scripts.


PS: I've used this module a lot over the years. e.g. in the 1990s I maintained my own File::Tail script that called ipchains to automatically block IP addresses trying to do bad things. Then fail2ban came along and I switched to that instead. I still use File::Tail for log-monitoring scripts to this day.