Getting Data In

How to split my input file into multiple events?

lakromani
Builder

Hi

My input file /tmp/log.txt looks like this (the columns are src_ip, dest_ip, byte and packet):

192.168.22.5 93.x.x.x 456 2
192.168.22.10 183.x.x.x 63 1

When I add this file as an input in Splunk, all of the data ends up in one large event.
I would like it to be split into separate events, one per line.

So in props.conf I added:

[source::///tmp/log.txt]
SHOULD_LINEMERGE = false

But that did not help.
I'm not sure if I need **/** or **///** before the file name, but nothing splits the lines.

1 Solution

gcusello
SplunkTrust

Hi lakromani,
in your props.conf use:

[monitor:///tmp/log.txt]
SHOULD_LINEMERGE = false
index = your_index
sourcetype = your_sourcetype
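
Once the data is in, a quick sanity check (index and sourcetype here are just the placeholders above) is a search like

index=your_index sourcetype=your_sourcetype

where each line of /tmp/log.txt should show up as its own event.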

Bye.
Giuseppe


lakromani
Builder

I did find the [monitor:///tmp/log.txt] stanza in inputs.conf under my app, so I tried to add it there, but the lines are still in one event.

[monitor:///tmp/log.txt]
SHOULD_LINEMERGE = false
disabled = false
sourcetype = test

I also restarted Splunk.


gcusello
SplunkTrust

Sorry, but I was still sleeping 😉
You have to put SHOULD_LINEMERGE = false in your indexer's props.conf, not in inputs.conf.

inputs.conf

[monitor:///tmp/log.txt]
disabled = false
sourcetype = test

props.conf

[test]
SHOULD_LINEMERGE = false

Remember that if you receive logs from forwarders, you have to put inputs.conf on your forwarders and props.conf on your indexers.
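
If the sample lines really have no timestamp, a slightly fuller props.conf sketch could look like this (LINE_BREAKER is just the default value, and DATETIME_CONFIG = CURRENT is an assumption so that Splunk stamps each event with the time it was indexed):

[test]
SHOULD_LINEMERGE = false
# default line breaker: start a new event at every newline
LINE_BREAKER = ([\r\n]+)
# assumption: no timestamp in the data, so use the current (index) time
DATETIME_CONFIG = CURRENT

After restarting, splunk btool props list test --debug shows which props.conf file each of the merged settings actually comes from.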
Bye.
Giuseppe


lakromani
Builder

Now it worked perfectly. Thanks
