Getting Data In

Reading Data from Elasticsearch to Splunk

apoonia
New Member

My goal is to forward the data from all ES indexes to Splunk using Logstash.

I have installed Logstash on the ES node and configured the elasticsearch input plugin to read from it. For output I am sending over TCP to a Splunk TCP data input.

So far I am testing with a single index, and it appears that all of the index's data ends up in a single event even though I am using a line breaker.

My Logstash config looks like this:

input {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "nuage_event-2018-06-09"
    size => 500
    scroll => "5m"
    docinfo => true
  }
}
filter {
}
output {
  tcp {
    host => "10.0.0.4"
    port => "333"
    codec => json
  }
  stdout { codec => rubydebug }
}
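
(A variant I have not tried yet: the tcp output also supports a json_lines codec, which appends a newline after each event, so Splunk's default line breaking should then see one JSON object per line.)

output {
  tcp {
    host => "10.0.0.4"
    port => "333"
    # json_lines newline-terminates each event instead of
    # concatenating raw JSON objects back to back
    codec => json_lines
  }
}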

My Splunk props.conf source type config looks like this:
[es_source]
pulldown_type = true
INDEXED_EXTRACTIONS = json
KV_MODE = none
category = Structured
MAX_EVENTS = 10000
TRUNCATE = 0
BREAK_ONLY_BEFORE = {"@version":\d+,
SHOULD_LINEMERGE = false

My questions are:

  1. Is using Logstash and a Splunk TCP input the right approach to ingest all data from ES into Splunk?
  2. Can you suggest what I am doing wrong in my source type config? I am seeing all the data in a single event.
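
One detail I am unsure about, from my reading of the props.conf spec: BREAK_ONLY_BEFORE is only honored when SHOULD_LINEMERGE = true, so with SHOULD_LINEMERGE = false Splunk should fall back to LINE_BREAKER, whose default breaks on runs of newlines. If that is right, a stanza closer to what I need might look like this (untested sketch):

[es_source]
SHOULD_LINEMERGE = false
# LINE_BREAKER is a regex whose first capture group marks the
# event boundary; this default form breaks on runs of newlines
LINE_BREAKER = ([\r\n]+)
KV_MODE = json
TRUNCATE = 0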

larmesto
Path Finder

This might be helpful for anyone visiting: I have started working on an add-on for Elasticsearch instances; feel free to use it!
https://splunkbase.splunk.com/app/4175/


cpetterborg
SplunkTrust

You may want to take a look at the information in this Answers web page:

https://answers.splunk.com/answers/372999/is-there-anyway-to-push-data-from-elasticsearch-or.html

I would not use Logstash to send the data to Splunk myself, but perhaps that is what would work for you.


apoonia
New Member

@cpetterborg Sorry for replying a bit late. I looked at the web page provided in your response, but I can't seem to find an exact answer on importing the data into Splunk using Logstash.

Any other pointers?


cpetterborg
SplunkTrust

I was responding to the part of your question that says:

Is using Logstash and a Splunk TCP input the right approach to ingest all data from ES into Splunk?

and there is a partial answer to that in the comments of the linked Answer:

I've been researching this question myself, and the best I can tell from the docs there is no mechanism to push all data from Elasticsearch to another destination. ... However, having worked with Logstash before, I can say that it would definitely support sending all data to a second destination -- and with special handling (filtering, transformation, etc.) if you desired. Search "logstash output-plugins" and you should find what you need.

I'm not sure this is the best answer to the question, but it does say it should be possible to get the ES data into Splunk, as long as Logstash is the one sending the data to Splunk rather than ES pushing it directly. I hope that clears up what I was trying to convey in my answer.


apoonia
New Member

@cpetterborg Yes, that clears up your answer. So far I am able to send Elasticsearch data (one index at the moment) to Splunk using Logstash, but it appears Splunk is reading all of that index's data as a single event. I have tried using a line breaker, etc., but so far I cannot reach the end state I am looking for.

Ideally each JSON object from any index should show up as a separate event, but that's not the case here. If I run tcpdump, I don't see any line break at the end of each object. Or maybe I am missing something.
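
If the newline really is missing on the wire, I suppose the cleanest fix would be to add it on the Logstash side (e.g. the json_lines codec I sketched above) rather than trying to split the stream inside Splunk.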


cpetterborg
SplunkTrust

Can you supply some example data? Perhaps it will then become more obvious what you need to do. Please use the "code" button (101010 in the formatting options of the comment/answer text box) on your example data. I don't know what the data looks like, so I would need to see some of it before I could give you any kind of good answer.


apoonia
New Member

@cpetterborg Sure, please find the details below.

I have defined the Logstash config as follows:

[root@elastic conf.d]# cat splunk.conf
input {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "nuage_event*"
    size => 500
    scroll => "5m"
    docinfo => true
  }
}
filter {
}
output {
  tcp {
    host => "10.0.0.4"
    port => "333"
    codec => json
  }
  stdout { codec => rubydebug }
}

This is my Splunk source type definition:

[es_source]
pulldown_type = true
KV_MODE = json
category = Structured
MAX_EVENTS = 10000
TRUNCATE = 0
SHOULD_LINEMERGE = false
LINE_BREAKER = "}(){"

I am using a line breaker so that instead of one single event, each object appears as a separate event. This is my first event:

{"value":2,"type":"ACL_DENY","nuage_metadata":{"subnetName":"Dallas","domainId":"a52428f8-1453-40e0-ab25-ba4f3465a8e8","zoneName":"Branches","spgName":null,"zoneId":"d379e845-1ea5-485c-b8e8-f02cebd9e812","vportId":"257137d5-24a6-4423-96e0-428f95351405","subnetId":"8497d32a-c04c-4711-a27c-5a5765e2d8a5","domainName":"VSS_Domain","dpgName":null,"enterpriseName":"VSS"},"timestamp":1530131382229,"@timestamp":"2018-06-27T20:31:12.690Z","@version":"1"}

This is my second event:

{"value":6,"type":"ACL_DENY","nuage_metadata":{"subnetName":"Dallas","domainId":"a52428f8-1453-40e0-ab25-ba4f3465a8e8","zoneName":"Branches","spgName":null,"zoneId":"d379e845-1ea5-485c-b8e8-f02cebd9e812","vportId":"257137d5-24a6-4423-96e0-428f95351405","subnetId":"8497d32a-c04c-4711-a27c-5a5765e2d8a5","domainName":"VSS_Domain","dpgName":null,"enterpriseName":"VSS"},"timestamp":1530131081979,"@timestamp":"2018-06-27T20:31:12.690Z","@version":"1"}

By the way, is there a way I can put a dynamic value (a date, for example) in the Logstash config?


apoonia
New Member

I forgot to mention: the below is a single event (a portion of it).

Even though I have put in a regular expression to split it, I can see that a single event is created first and then gets divided into multiple events per the regular expression. If my Elasticsearch index is larger than the maximum size allowed for an event, the Splunk GUI will fail to show the event, right? Any suggestions on that?

{"nuage_metadata":{"subnetName":"NewYork","subnetId":"5e129b87-38fc-4417-be27-467ede4f999e","zoneName":"Branches","zoneId":"d379e845-1ea5-485c-b8e8-f02cebd9e812","dpgName":null,"spgName":null,"enterpriseName":"VSS","vportId":"30a826b8-e977-43dc-9051-7c1d74833c76","domainName":"VSS_Domain","domainId":"a52428f8-1453-40e0-ab25-ba4f3465a8e8"},"type":"ACL_DENY","value":1,"@version":"1","@timestamp":"2018-06-27T20:35:37.319Z","timestamp":1530058904992}{"nuage_metadata":{"subnetName":"Dallas","subnetId":"8497d32a-c04c-4711-a27c-5a5765e2d8a5","zoneName":"Branches","zoneId":"d379e845-1ea5-485c-b8e8-f02cebd9e812","dpgName":null,"spgName":null,"enterpriseName":"VSS","vportId":"257137d5-24a6-4423-96e0-428f95351405","domainName":"VSS_Domain","domainId":"a52428f8-1453-40e0-ab25-ba4f3465a8e8"},"type":"ACL_DENY","value":2,"@version":"1","@timestamp":"2018-06-27T20:35:37.319Z","timestamp":1530058904992}{"nuage_metadata":{"subnetName":"SanFrancisco","subnetId":"7504cb32-c1e2-447e-b017-84ae9b5c3042","zoneName":"Branches","zoneId":"d379e845-1ea5-485c-b8e8-f02cebd9e812","dpgName":null,"spgName":null,"enterpriseName":"VSS","vportId":"a22e0257-56f4-4c50-815f-3d3b492f7ebf","domainName":"VSS_Domain","domainId":"a52428f8-1453-40e0-ab25-ba4f3465a8e8"},