Getting Data In

scripted inputs and duplicate event data

himynamesdave
Contributor

Hi all.

I have built a simple scripted input that grabs XML data over HTTP:

#!/bin/bash
curl http://www.a.com/EN.XML

All works fine, but Splunk indexes all events each time it polls the file, resulting in duplicate events.

What is the best way to check the XML file against what Splunk has already indexed, so that Splunk only pulls in events that have not been indexed yet?

Thanks!


Ayn
Legend

The best (and possibly only) way would be to implement this logic in your script. Splunk doesn't have the ability to compare incoming data against what's already in the index.

My suggested approach would be to edit your script so that it keeps the last version of the XML file; when you issue the next request, compare the new data against the previous version and output only the events that weren't there before.
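A minimal sketch of that approach in bash, assuming the feed is line-oriented enough that new entries show up as new lines; the URL and the state-file path are placeholders, not anything from the original post:

#!/bin/bash
# Scripted input that emits only lines not seen in the previous fetch.
URL="http://www.a.com/EN.XML"
STATE="/opt/splunk/var/run/en_xml.last"   # placeholder path for the saved copy

CURRENT=$(mktemp)
curl -s "$URL" -o "$CURRENT"

if [ -f "$STATE" ]; then
    # Print lines present in the new download but not in the previous one.
    grep -F -x -v -f "$STATE" "$CURRENT"
else
    # First run: no previous version, so emit everything.
    cat "$CURRENT"
fi

# Save the current version for the next comparison.
mv "$CURRENT" "$STATE"

If individual events span multiple lines in the XML, a line-level diff like this won't be enough and you'd need to parse the file and track event IDs or timestamps instead.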

himynamesdave
Contributor

Thought so (was hoping I could cheat) 🙂

Thanks for your help!
