syslog
Here are 454 public repositories matching this topic...
I currently need to connect to a Cloudera Impala instance in order to run queries and analyses.
It would be nice to be able to integrate this type of connection into DataStation.
I think it would require some use of the Impala ODBC Connector/Driver.
Thanks!
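As a rough sketch of what such an integration might look like: the usual route to Impala from Python is `pyodbc` on top of the Cloudera ODBC driver. The driver name, host, and database below are assumptions for illustration, not DataStation's actual configuration.

```python
# Hypothetical sketch: building an ODBC connection string for the Cloudera
# Impala driver. Driver name, host, port, and database are assumptions.

def impala_odbc_conn_str(host, port=21050, database="default"):
    """Build an ODBC connection string for the Cloudera Impala driver."""
    return (
        "Driver={Cloudera ODBC Driver for Impala};"
        f"Host={host};Port={port};Database={database};AuthMech=0;"
    )

conn_str = impala_odbc_conn_str("impala.example.com")
print(conn_str)

# A client (DataStation or otherwise) would then query through pyodbc:
# import pyodbc
# with pyodbc.connect(conn_str) as conn:
#     rows = conn.cursor().execute("SELECT 1").fetchall()
```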
I have an application that receives log messages from a firewall and writes them into MongoDB. My goal is to process 30,000 messages per second (more or less constantly, 24/7, not just as a transient peak).
As a peak value I expect approximately 50,000 messages per second.
With various settings I have reached up to 20,000 msg/sec, but that is not sufficient for our live traffic. The MongoDB ho
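At these rates, the first lever is usually batching: one `insert_many` round-trip per batch instead of one `insert_one` per message. A minimal sketch of the batching logic (the pymongo calls in the comment are the standard API, but the collection name and write-concern choice are assumptions — unordered, unacknowledged writes may or may not be acceptable for firewall logs):

```python
# Group an incoming message stream into fixed-size batches so each batch is
# written with a single driver round-trip.

from itertools import islice

def batches(iterable, size):
    """Yield lists of up to `size` items from `iterable`."""
    it = iter(iterable)
    while batch := list(islice(it, size)):
        yield batch

# 30,000 messages in batches of 1,000 -> 30 round-trips instead of 30,000.
messages = ({"seq": i, "msg": "deny tcp"} for i in range(30_000))
round_trips = sum(1 for _ in batches(messages, 1_000))
print(round_trips)

# With pymongo this becomes (hypothetical database/collection names):
# from pymongo import MongoClient
# coll = MongoClient(w=0).firewall.logs
# for batch in batches(message_stream, 1_000):
#     coll.insert_many(batch, ordered=False)
```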
Currently, when rsyslog starts, it checks whether a pidfile exists, and if it does, rsyslog refuses to start.
However, if rsyslog crashes or is killed with a -9, it never gets the chance to remove the pidfile, so a replacement instance cannot be started.
As an enhancement, rather than depending only on the existence of a pid file, rsyslog should read the PID from the file and check whether that process is still running.
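The proposed staleness check can be sketched in a few lines: read the PID from the file and probe it with signal 0, which tests for process existence without sending anything. The pidfile path is an assumption; rsyslog itself is C, but the logic is the same.

```python
# Sketch of a stale-pidfile check: only refuse to start if the PID named in
# the file actually belongs to a live process.

import os

def pidfile_is_stale(path):
    """Return True if the pidfile is missing, unreadable, or names a dead PID."""
    try:
        with open(path) as f:
            pid = int(f.read().strip())
    except (OSError, ValueError):
        return True  # no file or garbage content: safe to start
    try:
        os.kill(pid, 0)  # signal 0: existence check only, nothing is delivered
    except ProcessLookupError:
        return True  # PID not running: leftover from a crash or kill -9
    except PermissionError:
        return False  # process exists but is owned by another user
    return False

print(pidfile_is_stale("/run/rsyslogd.pid"))
```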
I hope the author can provide the method described above.
Add DataDog webhook


I have noticed when ingesting backlog (older-timestamped data) that the "Messages per minute" line graph and the "sources" data do not line up.
The messages-per-minute graph appears correct for the ingest rate, but the sources breakdown below it only shows messages whose timestamps fall within the time window. This means that if in the last hour you've ingested logs from 2 days ago, the data is
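The mismatch can be reproduced in miniature: one view buckets by ingest (arrival) time, the other filters by the event timestamp inside each message, so a backfilled message appears in the first but not the second. Field names below are illustrative, not the actual schema.

```python
# Demonstrate ingest-time vs event-time bucketing for a 1-hour window.

from datetime import datetime, timedelta

now = datetime(2022, 6, 8, 12, 0)
window_start = now - timedelta(hours=1)

# Two messages arriving right now: one fresh, one backfilled from 2 days ago.
messages = [
    {"source": "firewall", "event_time": now, "ingest_time": now},
    {"source": "switch", "event_time": now - timedelta(days=2), "ingest_time": now},
]

by_ingest = [m for m in messages if m["ingest_time"] >= window_start]
by_event = [m for m in messages if m["event_time"] >= window_start]

print(len(by_ingest))  # what a "Messages per minute" (ingest-rate) graph counts
print(len(by_event))   # what an event-timestamp-filtered sources breakdown shows
```

Both views are internally consistent; they just answer different questions, which is why they diverge exactly when backlog is ingested.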