Getting Data In

scripted inputs and duplicate event data

himynamesdave
Contributor

Hi all.

I have built a simple scripted input that grabs XML data over HTTP:

#!/bin/bash
# Fetch the XML feed and write it to stdout for Splunk to index.
curl -s http://www.a.com/EN.XML

Everything works fine, but Splunk indexes every event each time the script polls the file, resulting in duplicate events.

What is the best way to compare incoming events against what Splunk has already indexed from the XML file, so that Splunk only pulls in events that have not been indexed before?

Thanks!



Ayn
Legend

The best (and possibly only) way would be to implement this logic in your script. Splunk doesn't have that kind of ability to compare incoming data to what's already in the index.

My suggested approach would be to edit your script so that it keeps the previous version of the XML file; when you issue the next request, compare the freshly fetched data against that saved copy and emit only what is new.
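
For reference, here is a minimal sketch of that approach, assuming each event in the feed occupies a single line and that the script can keep a state file between runs. FEED_URL, STATE_FILE, and the paths below are illustrative, not from the original post.

#!/bin/bash
# Sketch: emit only lines that were not present in the previous fetch.
# FEED_URL and STATE_FILE are placeholder values - adjust for your environment.

FEED_URL="http://www.a.com/EN.XML"
STATE_FILE="/opt/splunk/var/lib/splunk/en_xml.last"   # previous copy of the feed
TMP_FILE="$(mktemp)"

# Fetch the current version of the feed; bail out quietly on failure.
curl -s "$FEED_URL" -o "$TMP_FILE" || { rm -f "$TMP_FILE"; exit 1; }

if [ -f "$STATE_FILE" ]; then
    # comm -13 prints only lines unique to the new fetch (requires sorted input).
    comm -13 <(sort "$STATE_FILE") <(sort "$TMP_FILE")
else
    # First run: no baseline yet, so emit everything.
    cat "$TMP_FILE"
fi

# Save the current fetch as the baseline for the next run.
mv "$TMP_FILE" "$STATE_FILE"

If the feed is not line-oriented, the same idea works at the record level: parse the XML (for example with xmllint or a small Python script), extract a stable ID or timestamp per record, and only emit records whose IDs are not already in the saved state.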

himynamesdave
Contributor

Thought so (was hoping I could cheat) 🙂

Thanks for your help!
