Deployment Architecture

Configure Receiver To Create Events by Source

cpt12tech
Contributor

I'm new to Splunk and having some issues getting logs to create events correctly.
I've installed the universal forwarder and pointed it at a directory of plain text logs for a specific application.
One log file is created per day, and each line should be an event.
I configured the universal forwarder to get local data from the logs directory (which contains multiple subdirectories with log files in them).

When it reaches the receiver, Splunk creates events based on the log files and not the line items in the logs.
I was able to connect the Splunk server to the computer and import the files, and I can see them in the data preview.

However, I haven't figured out how to set this up on a receiver. I don't want the client computer to do the processing; I would rather have the Splunk server split the events up correctly at index time.

Is there an app I need to install to configure a receiver?
Should I have multiple receivers for different source types? Or do I modify the config file to look for the source and run an event filter based on the source?

1 Solution

Ayn
Legend

This should already be the case though - Universal Forwarders do not perform parsing, and so do not decide how events are broken up. This all happens on the indexer (or to be more precise, on the first parsing Splunk instance that events arrive at, which is usually the indexer unless you have some kind of multi-layered setup with several chained forwarding instances).

The rules Splunk uses for breaking events are documented here: http://docs.splunk.com/Documentation/Splunk/5.0.2/Data/Indexmulti-lineevents

Basically, by default Splunk will break into a new event when it encounters a line with a valid timestamp, but you can configure this pretty much any way you want.
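
For example, something along these lines in props.conf on the indexer should give you one event per line (the stanza name is just a placeholder - use whatever sourcetype you've assigned to these logs):

[my_app_logs]
# don't merge lines into multi-line events
SHOULD_LINEMERGE = false
# break into a new event at every newline
LINE_BREAKER = ([\r\n]+)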


cpt12tech
Contributor

I grabbed a file from the client and uploaded it to Splunk using the web manager (Data inputs > Files & directories):
- changed the Data Preview setting to "Every Line is One Event"
- saved the data preview as a new source type
- then cancelled

I read through inputs.conf and set up the following:
[tcp://9997]
connection_host = myComputerName
sourcetype = mySourceName

Installed the forwarder, pointed to Splunk and the data is coming in parsed out by the timestamp on each line!
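
For anyone following along, the forwarder side boils down to something like this from the universal forwarder's bin directory (the server name and log path are just placeholders for my setup):

# point the forwarder at the indexer's receiving port
./splunk add forward-server mySplunkServer:9997
# monitor the application's log directory (subdirectories are included by default)
./splunk add monitor /path/to/application/logs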

Very Cool.


cpt12tech
Contributor

Just realized that I'm now getting duplicates reimported. When a new line is written to the end of the log file, the entire file is imported again, and not just the new event.



cpt12tech
Contributor

Very Cool.

I'm assuming the connection_host parameter only applies the sourcetype to matching hosts, and isn't setting the host field for all inputs?

I plan on using Splunk for other servers, data streams, SNMP, WMI, etc. Do I need to create a different receiver for different source files? Or can I use the host field to match different sourcetypes?


kristian_kolb
Ultra Champion

Hm, you might have to rephrase that a little bit ('Splunk creates events based on the log files and not the line items in the logs').

Also it's a bit unclear if you are using the forwarder OR getting the indexer to read the files directly (presumably through a file share?).

With a universal forwarder reading and sending the logs, the indexer will do the work of splitting the stream of incoming data into separate events.

No app needed to enable the receiver. See: http://docs.splunk.com/Documentation/Splunk/latest/Deploy/Enableareceiver#Set_up_receiving_with_Splu...
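
If you prefer the CLI, enabling receiving on the indexer is just something like this (port and credentials are examples):

./splunk enable listen 9997 -auth admin:yourpassword

or the equivalent stanza in inputs.conf on the indexer:

[splunktcp://9997]
disabled = 0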

/K
