I need to solve an issue that is as simple as this: my system generates many files, and each file is an isolated event.
Each file has many lines (more than 700), but for my business each file is just one single event.
How do I configure Splunk to treat each file as a single event?
I am using the Splunk plugin in Jenkins. Where would I make a change so that Splunk considers a Jenkins log file as one event? I do not have access to the .conf files.
If I have to change a .conf file, I can ask an admin to make the change, but I don't know what change to make. Help is appreciated.

I am using the Splunk plugin in Jenkins to send Jenkins logs to Splunk. I want Splunk to treat one log file as a single event. Where would I use ((*FAIL)) to achieve this? Do I have to make changes to props.conf and inputs.conf? What if I do not have access to those files on the Splunk server?
One way is to set up a dummy/impossible LINE_BREAKER that can never match anything in your files.
In props.conf (on the indexer, if you are using a universal forwarder):

[my_system]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]*)-=-=-=-=ThIs-iS-An-ImPoSsiBle-StRiNg=-=-=-=-
If these files change after they are first read, you may also want to set CHECK_METHOD on the forwarder itself.
In props.conf (on the forwarder):
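A minimal sketch of that forwarder-side stanza, assuming the same [my_system] sourcetype; modtime is one of the documented CHECK_METHOD values (the others are endpoint_md5 and entire_md5):

[my_system]
CHECK_METHOD = modtime

With modtime, the forwarder decides whether to re-read a file based on its modification time rather than a checksum of the file's head, so a file that is rewritten in place gets picked up again.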
The easiest and most efficient way is to set a single sourcetype for your files and define the rules for that sourcetype:
[mysinglefilesourcetype]
SHOULD_LINEMERGE = false
LINE_BREAKER = ((*FAIL))
TRUNCATE = 99999999
This disables line merging, which sounds wrong, but in fact you don't want or need it, since you won't be breaking the file into separate events in the first place. The specified LINE_BREAKER, ((*FAIL)), is a PCRE backtracking control verb that forces the match to fail, so it is guaranteed never to break on any line in the file. The TRUNCATE setting is there to make sure the entire file is kept as one event, because the default maximum event size is only 10000 characters; set it above the expected maximum size of your files. It's not recommended to set it to 0 (no limit), because something could go wrong, or a file that shouldn't be there could end up in the directory.
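For completeness, a sketch of how this sourcetype might be attached to the monitored files in inputs.conf; the path below is only a placeholder for wherever your files actually land:

[monitor:///opt/myapp/events]
sourcetype = mysinglefilesourcetype

Each new file dropped into that directory would then be indexed as a single event under the props.conf rules above.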
@gkanapathy: Thanks for sharing this, really useful. I am facing a similar issue: I need to ingest all lines in a file as a single event. But the config works for me only in a stand-alone environment, not when deployed on a heavy forwarder.
Is that because the logs arrive already partially parsed (and event-segmented by the UF)?