Deployment Architecture

Configure Receiver To Create Events by Source

cpt12tech
Contributor

I'm new to Splunk and having some issues getting logs to create events correctly.
I've installed the universal forwarder and pointed it at a directory of plain-text logs for a specific application.
The logs are created one per day, and each line should be an event.
I configured the universal forwarder to collect local data from the logs directory (which contains multiple subdirectories with log files in them).

When it reaches the receiver, Splunk creates events based on the log files rather than the individual lines in the logs.
I was able to connect the Splunk server to the computer and import the files directly, and can see them in the data preview.

However, I haven't figured out how to set this up on a receiver. I don't want the client computer to do the processing; I'd rather have the Splunk server split the events up correctly at index time.

Is there an app I need to install to configure a receiver?
Should I have multiple receivers for different source types? Or do I modify the config file to look for the source and run an event filter based on the source?

0 Karma
1 Solution

Ayn
Legend

This should already be the case, though - Universal Forwarders do not perform parsing, so they do not decide how events are broken up. That all happens on the indexer (or, to be more precise, on the first parsing Splunk instance that events arrive at, which is usually the indexer unless you have some kind of multi-layered setup with several chained forwarding instances).

The rules Splunk uses for breaking events are documented here: http://docs.splunk.com/Documentation/Splunk/5.0.2/Data/Indexmulti-lineevents

By default, Splunk breaks into a new event when it encounters a line with a valid timestamp, but you can configure this pretty much any way you want.
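For reference, event breaking is controlled in props.conf on the parsing instance (usually the indexer). A minimal sketch for strictly one-event-per-line data; the sourcetype name my_app_logs is a placeholder:

```ini
# props.conf on the indexer
[my_app_logs]
# Treat each line as a complete event; skip the line-merging pass
SHOULD_LINEMERGE = false
# Break events on newlines (the capture group is discarded)
LINE_BREAKER = ([\r\n]+)
```

Disabling line merging is generally cheaper at index time than merging lines and re-splitting on timestamps, which fits the "each line is an event" case in this thread.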


cpt12tech
Contributor

I grabbed a file from the client and uploaded it to Splunk using the web
manager: Data inputs > Files & directories.
I changed the Data Preview setting to "Every line is one event",
saved the data preview as a new source type,
then cancelled.

I read through the inputs.conf and set up the following:
[tcp://9997]
connection_host = myComputerName
sourcetype = mySourceName
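A side note on the stanza above: data arriving from a universal forwarder is normally received with a splunktcp stanza rather than a plain tcp one, and connection_host takes a lookup method (ip, dns, or none) rather than a hostname. A sketch of the more conventional receiving-side config, reusing the port from this thread:

```ini
# inputs.conf on the indexer (receiving side)
[splunktcp://9997]
# Resolve the sending host's name via reverse DNS
connection_host = dns
```

The sourcetype for forwarded data is usually assigned in the monitor stanza on the forwarder itself, not on the receiving stanza.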

Installed the forwarder, pointed it at Splunk, and the data is coming in parsed out by the timestamp on each line!

Very Cool.

0 Karma

cpt12tech
Contributor

Just realized that I'm now getting duplicates reimported. When a new line is written to the end of the log file, the entire file is reimported, not just the new event.
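Two common causes of whole-file reimports like this: the writing application rewrites the entire file on each update instead of appending (which Splunk necessarily sees as a new file), or Splunk's file-tracking CRC (taken over the first few hundred bytes of each file) collides because multiple files share an identical header. For the latter case, a hedged sketch of the usual inputs.conf knob, with a hypothetical path and the sourcetype name from this thread:

```ini
# inputs.conf on the universal forwarder
[monitor:///var/log/myapp]
sourcetype = mySourceName
# Salt the file-tracking CRC with the full path, so files with
# identical leading bytes are still tracked as distinct files
crcSalt = <SOURCE>
```

crcSalt does not help if the application truncates and rewrites the file; in that case the duplication has to be fixed on the application side.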

0 Karma


cpt12tech
Contributor

Very Cool.

• I'm assuming the connection_host parameter only applies the sourcetype to matching hosts, and doesn't set the host field for all inputs?

I plan on using Splunk for other servers, data streams, SNMP, WMI, etc. Do I need to create a different receiver for each source type? Or can I use the host field to match different sourcetypes?
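On the multiple-receivers question: a single receiving port can carry many source types, because the sourcetype is normally assigned per input on each forwarder rather than on the receiver. A sketch with hypothetical paths and sourcetype names:

```ini
# inputs.conf on a universal forwarder; all of these
# stanzas send through the same receiving port
[monitor:///var/log/app1]
sourcetype = app1_logs

[monitor:///var/log/app2/*.log]
sourcetype = app2_logs
```

The indexer then applies per-sourcetype parsing rules from props.conf, so one receiver is typically enough.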

0 Karma

kristian_kolb
Ultra Champion

Hm, you might have to rephrase that a little bit ('Splunk creates events based on the log files and not the line items in the logs').

Also, it's a bit unclear whether you are using the forwarder OR having the indexer read the files directly (presumably through a file share?).

With a universal forwarder reading and sending the logs, the indexer will do the work of splitting the stream of incoming data into separate events.

No app needed to enable the receiver. See: http://docs.splunk.com/Documentation/Splunk/latest/Deploy/Enableareceiver#Set_up_receiving_with_Splu...

/K
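Per the docs linked above, receiving can be enabled either through Manager or from the CLI on the indexer. A sketch; the port and credentials below are placeholders:

```shell
# Run on the indexer: open port 9997 for incoming forwarder connections
splunk enable listen 9997 -auth admin:changeme
```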

0 Karma