Getting Data In

How do I remove or delete duplicate event entries from a file before indexing, using inputs.conf, props.conf, or a Perl script?

kkarthik2
Observer

Before indexing, how do I delete, remove, or avoid duplicate log files or events coming from a saturated file on the server? I also need to export the events or log files every 15 minutes.

0 Karma

stephane_cyrill
Builder

Hi, while waiting for a better solution, let me tell you that you can do it after indexing:
1- Identify the duplicated events or file.
2- Build a query that fetches what you want to remove and pipe it to delete.
3- Schedule that search to run periodically.
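As a sketch of steps 2 and 3, a scheduled search along these lines could remove the unwanted events (the index, sourcetype, and matching criteria are placeholders you would replace with whatever identifies your duplicates; running `delete` requires a role with the `can_delete` capability, and it only hides events from search, it does not free disk space):

```
index=my_index sourcetype=my_sourcetype "text that identifies the duplicate copies"
| delete
```

Note that `delete` removes everything the search returns, so the search must match only the duplicate copies, not the originals. If you just want one copy of each event at search time, `| dedup _raw` in your reporting searches is often simpler than deleting.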


Now, to export events you can use the dump command:
1- Build the query that matches the events you want to export.
2- Then pipe it like this: ... | dump basefilename=MyExport

Note: see all the options for the dump command in the Splunk Search Reference manual.
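For example, a search exporting the last 15 minutes might look like this (index and sourcetype are placeholders; as far as I know, `dump` writes its output files under the search's dispatch directory on the search head):

```
index=my_index sourcetype=my_sourcetype earliest=-15m@m latest=@m
| dump basefilename=MyExport
```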

You can also do an outputcsv.
After that, schedule the search to run every 15 minutes.
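A minimal outputcsv variant, again with placeholder index and sourcetype (outputcsv writes the file to $SPLUNK_HOME/var/run/splunk/csv on the search head):

```
index=my_index sourcetype=my_sourcetype earliest=-15m@m latest=@m
| dedup _raw
| outputcsv MyExport
```

Scheduling this search with a 15-minute cron interval covers both the dedup and the periodic export in one step.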
