Getting Data In

How can I configure Splunk to read a csv file from a universal forwarder?

chadman
Path Finder

I'm new to Splunk and having a hard time getting it set up to parse a CSV file. I'm able to send my CSV logs to the indexer using the universal forwarder. From what I have read, it looks like we want to do the field extraction at search time and not on the forwarder. Could someone tell me which files I would have to edit to make this happen?

So far I see

C:\Program Files\Splunk\etc\apps\search\local\inputs.conf
[splunktcp://9997]
connection_host = ip
disabled = 1

My log files from the clients all have a name and path that look like:
C:\Program Files\SysCheck\Logs\11-2014_Log.txt
I have tried to edit inputs.conf, props.conf, and transforms.conf without any luck. I did have some success with one log file on my local workstation, but have not been able to do the same thing with the forwarder and multiple clients. So any help on which files I need to edit, so that I can see all the fields when I search my CSV files, would be great!


tanwadh
New Member

Hi Team,

This is regarding the integration of Splunk with JMeter.
I am able to get the data into Splunk in the form below. Every sample has different parameter values, as shown below.
Could you please assist me in extracting the required fields, either by using a regex or by updating the conf files?

Sample Set 1:-

jmeter.TS01_01_Launch_ok.avg
jmeter.TS01_01_Launch_ok.count
jmeter.TS01_01_Launch_ok.max.
jmeter.TS01_01_Launch_ok.min.
jmeter.TS01_01_Launch_ok.pct90
jmeter.TS01_01_Launch_ok.pct95
jmeter.TS01_01_Launch_ok.pct99

How can I extract the fields below from the samples above?

Label = TS01_01_Launch
Average response time = avg
Max. response time = Max.
Min. response time = Min.
Sample Count = Count
90th Per response time = pct90
95th Per response time = pct95
99th Per response time = pct99

###############################################################################################

Sample Set 2:-

jmeter.TS01_01_01_Launch_index.html_ok.avg
jmeter.TS01_01_01_Launch_index.html_ok.count
jmeter.TS01_01_01_Launch_index.html_ok.max.
jmeter.TS01_01_01_Launch_index.html_ok.min.
jmeter.TS01_01_01_Launch_index.html_ok.pct90
jmeter.TS01_01_01_Launch_index.html_ok.pct95
jmeter.TS01_01_01_Launch_index.html_ok.pct99

How can I extract the fields below from the samples above?

Label = TS01_01_01_Launch_index.html
Average response time = avg
Max. response time = Max.
Min. response time = Min.
Sample Count = Count
90th Per response time = pct90
95th Per response time = pct95
99th Per response time = pct99
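
For reference, a search-time extraction along these lines might work as a starting point. This is only a sketch: it assumes each metric name above arrives as its own raw event, and the field names Label and metric are placeholders.

| rex field=_raw "^jmeter\.(?<Label>.+)_ok\.(?<metric>avg|count|max|min|pct90|pct95|pct99)\.?$"

The captured metric value (avg, count, max, and so on) could then be mapped to the friendlier names (Average response time, Sample Count, and so on) with an eval or a lookup.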


simon_lavigne
Path Finder

I downvoted this post because it is not related to the original poster's question.


esix_splunk
Splunk Employee

There are a few approaches to enabling field extraction for a CSV file. If the data is static and the fields are always the same, you can define this per sourcetype; refer to http://docs.splunk.com/Documentation/Splunk/6.2.1/Data/Extractfieldsfromfileheadersatindextime#Edit_...

Basically, you would need this in your props.conf on your indexer (a universal forwarder cannot parse the data, it just sends it).

Generally, the stanza below is a good starting point.

[mysourcetypecsv]
FIELD_DELIMITER=,
FIELD_NAMES=myfield1,myfield2,myfield3,myfield4
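
A fuller version of this stanza, based on the linked documentation, might look like the sketch below; the sourcetype name and field names are placeholders, and FIELD_NAMES supplies column names when the CSV files have no header line.

[mysourcetypecsv]
INDEXED_EXTRACTIONS = csv
FIELD_DELIMITER = ,
FIELD_NAMES = myfield1,myfield2,myfield3,myfield4

Note that, per the linked documentation, structured-data parsing driven by INDEXED_EXTRACTIONS happens on the universal forwarder, so the same stanza may also need to be present in props.conf on the forwarder that monitors the files.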

vsingla1
Communicator

I noticed that the link to the documentation has changed in newer Splunk versions.
The page on how to extract fields from structured files, for Splunk 7.3.1, is below:
https://docs.splunk.com/Documentation/Splunk/7.3.1/Data/Extractfieldsfromfileswithstructureddata


jayannah
Builder

Hi Chadman

Since you mentioned you are new to Splunk and you are using a universal forwarder to send data to the indexer, start by reading http://docs.splunk.com/Documentation/Splunk/6.2.1/Forwarding/Setupforwardingandreceiving

After installing the forwarder and the indexer, follow the steps below to send data from the forwarder to the indexer.

1. Configure the receiving port on the indexer (inputs.conf for receiving data on a port, say 9997; see the sketch after this list).
   Read the details at http://docs.splunk.com/Documentation/Splunk/6.2.1/Forwarding/Enableareceiver

2. Configure the monitoring inputs on the universal forwarder (inputs.conf for monitoring the file).
   Use the conf example given by lguinn below. Also read more inputs config details and many examples at the end of http://docs.splunk.com/Documentation/Splunk/latest/Admin/Inputsconf

3. Configure the universal forwarder to send the logs to the indexer (outputs.conf).
   Use the conf example given by lguinn below. Also read more outputs config details and many examples at the end of http://docs.splunk.com/Documentation/Splunk/latest/Admin/Outputsconf
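
As a minimal sketch of step 1, assuming the default receiving port of 9997:

# inputs.conf on the indexer -- listen for data from forwarders on TCP port 9997
[splunktcp://9997]
disabled = 0

The same receiver can also be enabled in Splunk Web under Settings > Forwarding and receiving, or with the CLI command splunk enable listen 9997.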

chadman
Path Finder

Thanks! I have the forwarder working. I have a server and 4 test clients set up, and I'm getting the log files on my indexer. The issue I have is getting the fields from my CSV file. My log files have a .txt extension, but they are really CSV files without headers on the first line. I was trying to get Splunk to read these files and break the CSV data up into fields.


lguinn2
Legend

Well, for starters I would remove the connection_host and disabled lines from your inputs.conf on the indexer. At least remove disabled=1 - it means that you have disabled the stanza and are not accepting forwarded data on port 9997! I assume that the inputs.conf above is on the indexer.

On the forwarder, you need an inputs.conf to tell the forwarder what data to send. You also need an outputs.conf to tell the forwarder where to send the data. Here is the minimum forwarder configuration IMO, based on the data given.

inputs.conf

[monitor://C:\Program Files\SysCheck\Logs\*.txt]

outputs.conf

[tcpout:anyName]
server=indexer.myco.com:9997

Instead of indexer.myco.com, substitute the FQDN or IP address of the indexer. Note that the forwarder's inputs.conf will pick up any files in the directory that end with .txt.
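
If you also want the CSV field extraction discussed earlier to apply, the monitor stanza can assign the sourcetype that the props.conf stanza keys on. A sketch, reusing the placeholder sourcetype name from that answer:

# inputs.conf on the forwarder -- monitor the logs and tag them with the CSV sourcetype
[monitor://C:\Program Files\SysCheck\Logs\*.txt]
sourcetype = mysourcetypecsv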
