Getting Data In

Configuring Logstash to send files to Splunk

Fr3nchee
Engager

Hello all,

So I'm very new to Splunk - I've been playing around with it for less than 3 months. I have been tasked with sending logs from Logstash into Splunk; however, I have no idea where to start. I've been looking online, but the information I find is very confusing.

Does someone have some kind of guide that explains how to get data from Logstash to Splunk in detail, including what files need to be configured in Logstash?

Any help would be appreciated.

Thanks

1 Solution

kiran_panchavat
Influencer

@Fr3nchee 

Before Logstash can send logs, Splunk needs to be configured to receive them.
 
  • In Splunk Web, create a new HEC data input: go to Settings > Data Inputs > HTTP Event Collector.
  • Click New Token and give it a name.
  • Select the index where you want the data to be stored (e.g., "logstash"). Note that the index has to be created on the HF and also on the indexers.
  • Copy the token for later use.
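Once the token exists, a Logstash pipeline can forward events to HEC using the stock http output plugin. A minimal sketch, assuming a placeholder host and token (replace both with your own values; exact option names can vary by plugin version, so check your installed logstash-output-http docs):

```conf
# Logstash pipeline: forward events to Splunk HEC over HTTPS
output {
  http {
    url         => "https://your-splunk-host:8088/services/collector/event"
    http_method => "post"
    format      => "json"
    # HEC authenticates via an Authorization header carrying the token
    headers     => { "Authorization" => "Splunk 00000000-0000-0000-0000-000000000000" }
    # HEC expects the payload wrapped in an "event" key
    mapping     => { "event" => "%{message}" }
  }
}
```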

 Refer to the Splunk documentation on the HTTP Event Collector for more information.

 Verify the data in Splunk:

Go to a search head and search the index you specified, e.g. index=logstash
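Before pointing Logstash at the endpoint, it can help to smoke-test the token first. A minimal sketch in Python that builds the request HEC expects (the URL, token, and index below are placeholders; the actual POST is shown in a comment because it needs a live instance):

```python
import json

# Placeholder values - substitute your own HEC endpoint and token.
HEC_URL = "https://your-splunk-host:8088/services/collector/event"
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"

def hec_request(event, index="logstash", sourcetype="_json"):
    """Build the Authorization header and JSON body that HEC expects."""
    headers = {"Authorization": "Splunk " + HEC_TOKEN}
    body = json.dumps({"event": event, "index": index, "sourcetype": sourcetype})
    return headers, body

headers, body = hec_request({"message": "hello from a HEC smoke test"})
print(headers)
print(body)
# To actually send it (requires the requests package and a reachable HEC endpoint):
#   requests.post(HEC_URL, headers=headers, data=body)
```

If the event lands, it should show up under index=logstash within a few seconds.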

 

Did this help? If yes, please consider giving kudos, marking it as the solution, or commenting for clarification — your feedback keeps the community going!


Fr3nchee
Engager

Thanks for the info, I really appreciate the help.


PickleRick
SplunkTrust
SplunkTrust

Logstash is an external tool which doesn't have direct native support for Splunk. You can, however, configure several different types of outputs to send data to Splunk - syslog, HTTP, or simply writing to files and picking the data up from those files with monitor inputs using Splunk's UF.

But.

As @gcusello pointed out - since Logstash is an external tool meant for something completely different than being used with Splunk, it prepares data in its own way, and its output is generally not compatible with what normal Splunk apps expect. So you could be better off - especially if you're receiving data over syslog - "branching" your event ingestion pipeline before Logstash, so that Logstash (if you're also using it for some other solution) receives its copy of the data while another copy is sent from upstream of Logstash to your Splunk environment (for example, to a syslog receiver).
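For the file-plus-UF route mentioned above, a rough sketch (the spool path and sourcetype name are placeholders): Logstash writes events to a file,

```conf
# Logstash pipeline: spool events to a local file, one JSON document per line
output {
  file {
    path  => "/var/log/logstash/splunk-spool.log"
    codec => "json_lines"
  }
}
```

and a Universal Forwarder on the same host picks them up via inputs.conf:

```conf
# inputs.conf on the Universal Forwarder
[monitor:///var/log/logstash/splunk-spool.log]
index = logstash
sourcetype = logstash_json
disabled = false
```

Remember to rotate or clean up the spool file so it doesn't grow unbounded.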

gcusello
SplunkTrust
SplunkTrust

Hi @Fr3nchee ,

in addition to the perfect information from @kiran_panchavat , remember that Logstash modifies the original logs, putting the original log in a JSON field called message.

This means that the add-ons you can find on Splunkbase don't work.

You have two choices:

  1. restore the original log format by extracting the message field from the JSON and putting it in _raw,
  2. create your own parsers.

I suggest the first solution, because the second one takes very long to implement.

Even so, the first isn't easy to implement either, because you must use the INGEST_EVAL setting and the json_extract function; for this reason I suggest engaging someone who has already done this kind of job, at least to prepare it.
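As a rough sketch of option 1, assuming the events arrive with a made-up sourcetype of logstash_json and that your Splunk version supports json_extract in INGEST_EVAL (verify against your version's docs), the pair of .conf stanzas deployed on the HF/indexers could look like:

```conf
# props.conf
[logstash_json]
TRANSFORMS-restore_raw = restore_original_raw

# transforms.conf
[restore_original_raw]
# Replace _raw with the original log that Logstash wrapped in the "message" field
INGEST_EVAL = _raw=json_extract(_raw, "message")
```

After this, downstream props for the real sourcetype (and Splunkbase add-ons) see the log in its original format.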

Ciao.

Giuseppe
