Getting Data In

How do I configure a universal forwarder to send data to a specific index on our Splunk Cloud instance?

dbcase
Motivator

Hi,

I'm trying to send data to a specific index on our Splunk Cloud instance.

I've tried several methods found on answers.splunk.com, but still with no apparent success.

What I've tried:

/opt/splunkforwarder/bin/splunk add monitor /home/oracle/workdir/*csv -index top10
Parameters must be in the form '-parameter value'
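
(Side note: one likely cause of that error is that, without quotes, the shell expands /home/oracle/workdir/*csv into the matching file names before splunk parses its arguments, so the extra paths look like malformed parameters. A sketch of the same command with the wildcard quoted:

 /opt/splunkforwarder/bin/splunk add monitor '/home/oracle/workdir/*csv' -index top10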

# cat /opt/splunkforwarder/etc/system/local/inputs.conf
[default]
host = hostname omitted but it is there

"The code block has been omitted but it is there"

[monitor:///home/oracle/workdir/*csv]
sourcetype=csv
index=top10

The latter was followed by a restart of the forwarder.

In Splunk, an all-time search of index=top10 yields 0 results. Not sure what I'm missing.

0 Karma
1 Solution

masonmorales
Influencer

What user are you running the Splunk Forwarder as? Does that user have read access to /home/oracle/workdir/*csv?

Look at $SPLUNK_HOME/var/log/splunk/splunkd.log for possible ERROR or WARN messages that may indicate why data is not being picked up and sent to the indexer. Also, if it's a very large file you might just have to give it a few minutes to process.
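
For example, a few quick checks from the forwarder host (paths assume a default /opt/splunkforwarder install):

 grep -iE "error|warn" /opt/splunkforwarder/var/log/splunk/splunkd.log | tail -50
 /opt/splunkforwarder/bin/splunk list monitor
 /opt/splunkforwarder/bin/splunk list forward-server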

You can try changing your inputs.conf to:

 [monitor:///home/oracle/workdir/]
 whitelist = \.csv$
 sourcetype=csv
 index=top10
 crcSalt = <SOURCE>
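
(For context: whitelist is a regex matched against the full path of each file under the monitored directory, and crcSalt = <SOURCE> adds the file's path to the CRC Splunk uses to track files, which helps when many files begin with identical header lines.)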

Remember to restart Splunk after updating inputs.conf. Also, ensure that your outputs.conf is configured and pointed at your Cloud indexers, and that you have network connectivity between your forwarder and the Cloud indexers.
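
For reference, a minimal outputs.conf sketch; the server value below is only a placeholder, and in practice Splunk Cloud supplies these settings (plus SSL certificates) in a forwarder credentials app you install on the forwarder:

 [tcpout]
 defaultGroup = splunkcloud

 [tcpout:splunkcloud]
 server = inputs.<your-stack>.splunkcloud.com:9997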

As others have mentioned, make sure that the index you are sending to has been created in your Cloud instance as well.

0 Karma

dbcase
Motivator
[root@datamine splunkforwarder]# cat /opt/splunkforwarder/etc/system/local/inputs.conf
[default]
host = datamine.icontrol.com

[monitor:///home/oracle/workdir/*csv]
whitelist = \.csv$
sourcetype=csv
index=top10
0 Karma

dbcase
Motivator

Found it!!! (*&)(&@#$&)(*& syntax error in inputs.conf
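
For anyone hitting the same thing, a quick way to surface config typos is btool, which ships with the forwarder; a sketch:

 /opt/splunkforwarder/bin/splunk btool check
 /opt/splunkforwarder/bin/splunk btool inputs list --debug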

0 Karma

masonmorales
Influencer

Hi @dbcase, glad you found the issue! Could you please choose Accept Answer for whichever response helped you the most in getting this resolved?

0 Karma

dbcase
Motivator

Hi All, Thanks for the hints!

splunkd.log has a warning. Not sure what it means, though...

04-14-2016 13:47:49.355 -0500 WARN  CsvLineBreaker - CSV StreamId: 5516642215406685943 has extra incorrect columns in certain fields. - data_source="/opt/splunkforwarder/var/log/splunk/metrics.log", data_host="datamine.icontrol.com", data_sourcetype="csv"
04-14-2016 13:47:49.358 -0500 WARN  CsvLineBreaker - CSV StreamId: 15295132286795016394 has extra incorrect columns in certain fields. - data_source="/opt/splunkforwarder/var/log/splunk/splunkd.log", data_host="datamine.icontrol.com", data_sourcetype="csv"

The user I'm running the forwarder as is root.

The index had been created beforehand:
top10    _cluster_admin    1 MB    500 GB    0

0 Karma

khourihan_splun
Splunk Employee

Hey DBCase,

Make sure you have created the index under Settings --> Indexes before you send the data. If not, Splunk will drop the data and you should see an error in your GUI, something like this:

Received event for unconfigured/disabled/deleted index=top10
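
If you want to confirm whether that is happening, a sketch of a search over the internal logs (assuming your Cloud stack lets you search _internal):

 index=_internal sourcetype=splunkd "Received event for unconfigured/disabled/deleted index"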

Let us know!
-K

ryandg
Communicator

What's your outputs.conf? Is the data in your main index?
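
A quick way to check the latter, as a sketch (assuming the files keep their original source path):

 index=* source="/home/oracle/workdir/*" | stats count by index, sourcetype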

0 Karma