Splunk Enterprise

Can someone help me on how to parse the whole log file and get each line parsed and indexed in one single index?

Tufail
Observer

Hi,
I want to use Splunk, but I'm not sure where to start; I am new to it.


I have a situation wherein I have a log file that contains all sorts of logs, say category1, category2, and category3 logs. I have a dedicated regex parser for each category, say parser1, parser2, and parser3. Any single log line will match only one of the parsers. If no suitable parser matches a line, that line is not eligible to be indexed.
I want all of this to happen before indexing. The log source could be either a log file or a stream of logs.
Can someone help me on how to parse the whole log file and get each line parsed and indexed into one single index, say myidx?
I understand I will have to deploy props.conf and transforms.conf, but I am not sure how to configure these files to achieve this. Please help or suggest a better way.
TIA

sample log lines.
1.
Sep 01 23:43:47 test_device001 test_device001 default default-log [test_domain][0x0001][mp][alert] mp(Rrocessor): trans(53)[request][109.2.x.z] gtid(127d3b333052): event((test.xsl) Transaction Initiated) TestURI(my/mapped/url) Size(0) Node((test_domain)) userID(test_uid)

2.
Sep 05 23:43:47 test_device001 test_device001 default default-log [test_domain][0x0001][mp][alert] mp(Rrocessor): trans(53)[request][109.2.x.z] gtid(127d3b33305): (set-client-idy-head.xsl)*** P O N O D E T<entry><url event((test.xsl) Transaction Initiated) TestURI(my/mapped/url) <http-method>GET</http-method>

3.
Sep 04 23:43:47 test_device001 test_device001 default default-log [test_domain][0x0001][mp][alert] mp(Rrocessor): trans(53)[request][109.2.x.z] gtid(127d3b333052): *** NODETYPE(SS) ***FLOW(HTTP{->HTTP) ***OUTG(mysite.test.com)

 


richgalloway
SplunkTrust

To answer the question in the title: that is the normal process. Each line of an input stream is parsed and indexed into a single index.

To answer the question in the message: yes, you can keep some lines and discard others using transforms. Splunk documents the procedure at https://docs.splunk.com/Documentation/Splunk/9.0.1/Forwarding/Routeandfilterdatad#Keep_specific_even... and there are answers on the topic, including a good one at https://community.splunk.com/t5/Getting-Data-In/Filtering-events-using-NullQueue/m-p/66392

In this case, you'd need four transforms: one for each of the three categories to send matching events to indexQueue, and a fourth to send everything else to nullQueue.
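A minimal sketch of that setup, assuming a hypothetical sourcetype name `mylogs` and placeholder regexes (replace them with your actual parser patterns). The nullQueue transform is listed first so the category transforms can override it for matching lines, since when multiple transforms set the same key, the last match wins:

```ini
# props.conf
[mylogs]
# Order matters: discard everything by default, then keep matches.
TRANSFORMS-routing = drop_all, keep_cat1, keep_cat2, keep_cat3

# transforms.conf
[drop_all]
# Match every event and send it to nullQueue (i.e., discard it).
REGEX = .
DEST_KEY = queue
FORMAT = nullQueue

[keep_cat1]
# Placeholder regex for category1 -- replace with your parser1 pattern.
REGEX = Transaction Initiated
DEST_KEY = queue
FORMAT = indexQueue

[keep_cat2]
# Placeholder regex for category2 -- replace with your parser2 pattern.
REGEX = <http-method>
DEST_KEY = queue
FORMAT = indexQueue

[keep_cat3]
# Placeholder regex for category3 -- replace with your parser3 pattern.
REGEX = NODETYPE
DEST_KEY = queue
FORMAT = indexQueue
```

To land everything in one index (myidx), the simplest option is to set `index = myidx` on the input stanza in inputs.conf; alternatively, a transform with `DEST_KEY = _MetaData:Index` and `FORMAT = myidx` can override the index at parse time. Note these settings must live on the first "heavy" Splunk instance that parses the data (an indexer or heavy forwarder), not on a universal forwarder.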

---
If this reply helps you, Karma would be appreciated.