Getting Data In

How to remove or delete duplicate event entries from a file before indexing, using inputs.conf, props.conf, or a Perl script?

kkarthik2
Observer

Before indexing, how do I delete, remove, or avoid duplicate log files or events coming from a saturated file on the server? And how do I export events or log files every 15 minutes?

0 Karma

stephane_cyrill
Builder

Hi, while waiting for a better solution, let me tell you that you can do it after indexing:
1- First identify the duplicated events or file.
2- Build a search that fetches what you want to remove and pipe it to the delete command.
3- You can schedule that search to run periodically.
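
The steps above might look like the following sketch. The index, source path, and time range are placeholders you would replace with whatever identifies your duplicated events; also note that delete requires the can_delete role and only hides events from search, it does not reclaim disk space:

```
index=my_index source="/path/to/saturated/file.log" earliest=-15m
| delete
```

Be careful to verify the search returns only the events you intend to remove before appending | delete, since delete affects every event the search returns.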


Now, to export events you can use the dump command:
1- Build the query that matches the events you want to export.
2- Then pipe it like this: ... | dump basefilename=MyExport

Note: see all the options for the dump command in the Splunk Search Reference manual.
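
As a minimal sketch of step 2, assuming a hypothetical index named my_index and a 15-minute export window (the basefilename and format values are just examples; check the Search Reference for the options your version supports):

```
index=my_index earliest=-15m@m latest=@m
| dump basefilename=MyExport
```

The exported files land on the search head under the dispatch directory for that search job.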

You can also use outputcsv.
After that, schedule the search to run every 15 minutes.
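
For example, again assuming a hypothetical index named my_index, the outputcsv variant could look like this (the CSV is written to $SPLUNK_HOME/var/run/splunk/csv on the search head):

```
index=my_index earliest=-15m@m latest=@m
| outputcsv MyExport.csv
```

You can then save this as a report and schedule it with a cron expression such as */15 * * * * so it runs every 15 minutes.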
