Timeline for Data architecture for event log metrics?
Current License: CC BY-SA 3.0
4 events
| when | what | action | by | license | comment |
|---|---|---|---|---|---|
| Sep 20, 2012 at 15:29 | comment | added | TMN | | OP mentioned volumes of "hundreds of thousands of events a day". One million events a day is less than seven hundred a minute, or about eleven a second. Unless the input is some lengthy XML, your average server should be able to handle that without breaking a sweat. It's definitely something that should be considered when designing (and deploying) the solution, though. |
| Sep 20, 2012 at 13:53 | comment | added | Waylon | | Are there concerns with the volume of logging the OP mentioned and doing the filtering and aggregating as events come in? It seems like it might be a dangerous bottleneck if the log volume is high and/or the aggregation is non-trivial. |
| Sep 17, 2012 at 18:31 | comment | added | elliot42 | | "Aggregate the data, but store the details in a (compressed) file". Great thought in particular, thanks! |
| Sep 17, 2012 at 13:31 | history | answered | TMN | CC BY-SA 3.0 | |
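
As a rough check of the rate quoted in TMN's comment above, here is a minimal sketch of the arithmetic; the one-million-events-per-day figure is the comment's hypothetical round number, not a measured value.

```python
# Back-of-the-envelope throughput for the comment's hypothetical volume.
EVENTS_PER_DAY = 1_000_000  # assumed round figure from the comment

per_minute = EVENTS_PER_DAY / (24 * 60)        # ~694 events/minute
per_second = EVENTS_PER_DAY / (24 * 60 * 60)   # ~11.6 events/second

print(f"{per_minute:.0f} events/minute, {per_second:.1f} events/second")
```

Running this prints roughly 694 events/minute and 11.6 events/second, which matches the "less than seven hundred a minute, or about eleven a second" estimate in the comment.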