Getting Data In

scripted inputs and duplicate event data

himynamesdave
Contributor

Hi all.

I have built a simple scripted input that grabs XML data over HTTP:

#!/bin/bash
# Fetch the XML feed and write it to stdout for Splunk to index
curl -s http://www.a.com/EN.XML
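
The script is registered as a scripted input that runs on a polling interval, roughly like this (the app path, interval, sourcetype and index below are just placeholders):

[script://$SPLUNK_HOME/etc/apps/my_app/bin/get_en_xml.sh]
interval = 300
sourcetype = en_xml
index = main
disabled = false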

Everything works fine, but Splunk indexes all the events each time the script polls the file, resulting in duplicate events.

What is the best way to check the incoming data against what is already indexed, so that Splunk only pulls in events that have not already been indexed?

Thanks!


Ayn
Legend

The best (and possibly only) way would be to implement this logic in your script. Splunk doesn't have a built-in way to compare incoming data against what's already in the index.

My suggested approach would be to edit your script so it keeps the last version of the XML file; when you issue the next request, compare the new data against that previous version and output only what's new.
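
Something along these lines could serve as a starting point (a rough sketch only: the cache path is an assumption, and it compares line by line, so it assumes each event sits on a single line of the XML):

#!/bin/bash
# Sketch: cache the previous response and print only lines that were not
# present last time. Splunk indexes whatever the script writes to stdout.
STATE_DIR=/opt/splunk/var/en_xml_cache   # assumption: any writable directory
CURRENT="$STATE_DIR/EN.current.xml"
PREVIOUS="$STATE_DIR/EN.previous.xml"

mkdir -p "$STATE_DIR"
touch "$PREVIOUS"

# Fetch the latest copy of the feed
curl -s http://www.a.com/EN.XML > "$CURRENT"

# Print only lines not seen in the previous version (line-level diff;
# on the first run the previous file is empty, so everything is printed)
grep -vxF -f "$PREVIOUS" "$CURRENT"

# Keep this version around for the next run's comparison
mv "$CURRENT" "$PREVIOUS"

If events span multiple lines or identical lines legitimately repeat, you'd want to compare at the record level instead, for example by keying on a unique ID in each item.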


himynamesdave
Contributor

Thought so (was hoping I could cheat) 🙂

Thanks for your help!
