Getting Data In

Monitor all remaining files not specifically matched

davidstuffle
Path Finder

We have several syslog-ng collectors with UFs on them. The UF monitors the syslog-ng output paths and files we point it to, but I suspect there are several systems sending syslog data that we're missing. Is there a way to point a UF monitor stanza at the top-level file path, tell it to monitor everything not matched elsewhere, and send that data to a specific index so we can search the index to see what we're missing?

frobert
New Member

Hi, recent versions of syslog-ng Premium Edition can send log messages to Splunk HEC directly. syslog-ng also has a wildcard file source to monitor files and directories for log messages.
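
For example, a minimal sketch along these lines (host, token, and paths are placeholders, and the generic http() destination stands in here for PE's dedicated Splunk driver) could pick up whole directories and forward them to HEC:

    # Hypothetical example: watch every .log file under /var/log/remote
    source s_all_files {
        wildcard-file(
            base-dir("/var/log/remote")
            filename-pattern("*.log")
            recursive(yes)
        );
    };

    # Post each message to the Splunk HEC endpoint
    # (a real config should JSON-escape, e.g. with $(format-json))
    destination d_splunk_hec {
        http(
            url("https://splunk.example.com:8088/services/collector/event")
            method("POST")
            headers("Authorization: Splunk 00000000-0000-0000-0000-000000000000")
            body('{"event": "${MESSAGE}", "host": "${HOST}", "sourcetype": "syslog"}')
        );
    };

    log { source(s_all_files); destination(d_splunk_hec); };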

richgalloway
SplunkTrust

While you probably could monitor * and blacklist the files covered by other monitor stanzas, I wouldn't advise it. Wide wildcards force the UF to track a huge number of files and can seriously degrade its performance.
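
To illustrate what you'd be signing up for, the catch-all stanza would look something like this (paths and index name are made up), and the UF still has to track every file under the tree just to apply the blacklist:

    # inputs.conf -- the catch-all approach (not recommended)
    [monitor:///var/log/remote]
    index = syslog_unmatched
    # regex excluding paths already covered by dedicated monitor stanzas
    blacklist = /var/log/remote/(app1|app2|firewall)/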

IIRC, syslog-ng has a default directory for data that does not match any rule. I suggest you have your UFs monitor that directory, then create an alert to let you know when something requiring attention lands there.
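
That keeps the UF config small and targeted, something like this (directory and index names are just examples):

    # inputs.conf -- monitor only syslog-ng's fallback directory
    [monitor:///var/log/remote/not_matched]
    index = syslog_unmatched
    sourcetype = syslog

The alert can then be a scheduled search as simple as index=syslog_unmatched | stats count by host, source that fires whenever it returns results.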

---
If this reply helps you, Karma would be appreciated.

davidstuffle
Path Finder

Yeah, it probably makes more sense to do this in the syslog-ng config. I could run all the input through filters with "final" flags, sending matched data to their respective folders, and then route whatever is left to a "not_matched" folder or something.
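
Something like this sketch, assuming source and filter names from our config (s_network, f_app1, f_app2 are placeholders); flags(final) stops processing for matched messages, so only leftovers reach the catch-all:

    # Matched traffic: each log path claims its messages and stops there
    log {
        source(s_network);
        filter(f_app1);
        destination(d_app1_files);
        flags(final);
    };

    log {
        source(s_network);
        filter(f_app2);
        destination(d_app2_files);
        flags(final);
    };

    # Anything still unclaimed lands in the not_matched folder
    destination d_not_matched {
        file("/var/log/remote/not_matched/${HOST}.log" create-dirs(yes));
    };

    log {
        source(s_network);
        destination(d_not_matched);
    };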
