Hi,
I'm trying to send data to a specific index on our Splunk Cloud instance.
I've tried several methods found on answers.splunk.com, but so far with no apparent success.
What I've tried:
/opt/splunkforwarder/bin/splunk add monitor /home/oracle/workdir/*csv -index top10
Parameters must be in the form '-parameter value'
# cat /opt/splunkforwarder/etc/system/local/inputs.conf
[default]
host = hostname omitted but it is there
"The code block has been omitted but it is there"
[monitor:///home/oracle/workdir/*csv]
sourcetype=csv
index=top10
The latter one was followed by a restart of the forwarder.
In Splunk, an all-time search of index=top10 yields 0 results. Not sure what I'm missing.
What user are you running the Splunk Forwarder as? Does that user have read access to /home/oracle/workdir/*csv?
Look at $SPLUNK_HOME/var/log/splunk/splunkd.log for possible ERROR or WARN messages that may indicate why data is not being picked up and sent to the indexer. Also, if it's a very large file you might just have to give it a few minutes to process.
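For example, something along these lines can surface monitor-related problems quickly (just a sketch; paths assume the default forwarder install location shown above):

# show what the forwarder is actually monitoring (will prompt for admin credentials)
/opt/splunkforwarder/bin/splunk list monitor
# scan splunkd.log for errors/warnings related to file monitoring
grep -E "ERROR|WARN" /opt/splunkforwarder/var/log/splunk/splunkd.log | grep -iE "tail|monitor|watchedfile"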
You can try changing your inputs.conf to:
[monitor:///home/oracle/workdir/]
whitelist = \.csv$
sourcetype=csv
index=top10
crcSalt = <SOURCE>
Remember to restart Splunk after updating inputs.conf. Also, ensure that your outputs.conf is configured and pointed at your Cloud indexers, and that you have network connectivity between the forwarder and the Cloud indexers.
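For reference, a bare-bones outputs.conf looks roughly like this; the group name and server value below are placeholders, since Splunk Cloud normally supplies the real endpoints via the universal forwarder credentials app it provides:

[tcpout]
defaultGroup = splunkcloud

[tcpout:splunkcloud]
server = inputs.yourstack.splunkcloud.com:9997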
As others have mentioned, make sure that the index you are sending to has been created in your Cloud instance as well.
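A quick way to confirm the index is visible from the search head (a sketch; it needs permission to run the rest command):

| rest /services/data/indexes | search title=top10 | table title currentDBSizeMB maxTotalDataSizeMB totalEventCount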
[root@datamine splunkforwarder]# cat /opt/splunkforwarder/etc/system/local/inputs.conf
[default]
host = datamine.icontrol.com
[monitor:///home/oracle/workdir/*csv]
whitelist = \.csv$
sourcetype=csv
index=top10
Found it!!! (*&)(&@#$&)(*&
It was a syntax error in inputs.conf.
Hi @dbcase glad you found the issue! Could you please choose Accept Answer for whichever response helped you the most in getting this resolved?
Hi All, Thanks for the hints!
splunkd.log has a warning. Not sure what it means, though...
04-14-2016 13:47:49.355 -0500 WARN CsvLineBreaker - CSV StreamId: 5516642215406685943 has extra incorrect columns in certain fields. - data_source="/opt/splunkforwarder/var/log/splunk/metrics.log", data_host="datamine.icontrol.com", data_sourcetype="csv"
04-14-2016 13:47:49.358 -0500 WARN CsvLineBreaker - CSV StreamId: 15295132286795016394 has extra incorrect columns in certain fields. - data_source="/opt/splunkforwarder/var/log/splunk/splunkd.log", data_host="datamine.icontrol.com", data_sourcetype="csv"
The user that I'm running the forwarder as is root.
The index had been created beforehand:
top10 (app: _cluster_admin, current size: 1 MB, max size: 500 GB, event count: 0)
Hey DBCase,
Make sure you have created the index under Settings --> Indexes before you send the data. If not, Splunk will drop the data and you should see an error in your GUI, something like this:
Received event for unconfigured/disabled/deleted index=top10
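You can also look for that message with a search, assuming internal logs are reaching your Cloud stack (they are forwarded by default):

index=_internal "Received event for unconfigured/disabled/deleted index=top10"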
Let us know!
-K
What's your outputs.conf? Is the data in your main index?
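For example, a quick way to see whether the CSV data landed in a different index (source path taken from the question; you may need an all-time or wide time range):

index=* source="/home/oracle/workdir/*" | stats count by index, sourcetype, source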