Getting Data In

Dump files and filtering and sending to indexer


I have a .out dump file generated by Bamboo logs that I want to monitor in Splunk.

I need to filter out certain content and then push the rest through the universal forwarder to the indexer.

What is the best way to go about this?

A few options:

i) Convert the .out file to a .log file and filter out content using some process (not sure how). Alternatively, I could use a nullQueue setup on the indexer to filter out whatever should not be indexed in Splunk.

ii) Point the configuration on the universal forwarder DIRECTLY to the .out file:

whitelist = \.out$|\.out[0-9]*$

I have been trying to get alternative (ii) to work, but the dry run was largely unsuccessful.

iii) Somehow write the content of the .out file to sysout

Suggestions are appreciated : ) thanks !
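For what it's worth, option (ii) would normally be a monitor stanza in inputs.conf on the forwarder. A minimal sketch (the path and sourcetype here are assumptions; adjust to the actual Bamboo log directory):

```ini
# inputs.conf on the universal forwarder
# path and sourcetype are hypothetical examples
[monitor:///opt/bamboo/logs]
whitelist = \.out$|\.out[0-9]*$
sourcetype = bamboo:out
disabled = false
```

Note the whitelist is a regex matched against the full path, with no spaces around the alternation.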


Splunk Employee

Is the file a binary file? If not, you don't need to convert it to .log. Just monitor the file, then use props.conf/transforms.conf to look for what you don't want sent to the indexer and route it to the nullQueue.

transforms.conf:

[dumpme]
REGEX = <pattern matching the events to drop>
DEST_KEY = queue
FORMAT = nullQueue

props.conf:

[your_sourcetype]
TRANSFORMS-dumpme = dumpme

Be careful: if the regex matches everything, this will dump everything, so make sure your regex is correct. Also, this can't be done on a UF; it needs to be done when the data hits the indexer.


It's like catalina.out from Tomcat, MINUS the timestamps, which presents a challenge of its own.
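If the events truly have no timestamps, one common workaround is to tell Splunk to stamp each event with the current index time in props.conf. A sketch, assuming the sourcetype name from earlier:

```ini
# props.conf on the indexer -- sourcetype name is a hypothetical example
[bamboo:out]
# No timestamps in the data, so use the time at indexing
DATETIME_CONFIG = CURRENT
# Treat each line as its own event rather than merging on timestamps
SHOULD_LINEMERGE = false
```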
