Getting Data In

How do I remove or delete duplicate event entries from a file before indexing, using inputs.conf, props.conf, or a Perl script?

kkarthik2
Observer

Before indexing, how do I delete, remove, or avoid duplicate log files or events coming from a saturated file on the server? And how do I export events or log files every 15 minutes?


stephane_cyrill
Builder

Hi, while waiting for a better solution, let me tell you that you can do it after indexing:
1. Identify the duplicated events or files.
2. Build a search that fetches what you want to remove, and pipe it to the delete command.
3. Schedule that search to run periodically.
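As a sketch, a search of that shape could look like the following. The index, sourcetype, and matching text here are hypothetical placeholders; note that the delete command only marks events as unsearchable (it does not reclaim disk space) and requires a role with the can_delete capability:

```
index=my_index sourcetype=my_duplicated_source "text identifying the duplicate events"
| delete
```

Run the search without the `| delete` first to confirm it returns only the events you intend to remove, since delete affects every event the search matches.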


Now, to export events you can use the dump command:
1. Build a search that matches the events you want to export.
2. Pipe it like this: ... | dump basefilename=MyExport

Note: see all the options for the dump command in the Splunk Search Reference manual.

You can also use outputcsv.
After that, schedule the search to run every 15 minutes.
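Putting the pieces together, a scheduled export search might look like the following sketch. The index name is a hypothetical placeholder; the earliest=-15m window matches a search scheduled every 15 minutes so each run exports only the new events:

```
index=my_index earliest=-15m
| dump basefilename=MyExport
```

Or, with outputcsv instead of dump:

```
index=my_index earliest=-15m
| outputcsv MyExport.csv
```

dump writes the results to the search head's local dispatch area, while outputcsv writes a CSV file under $SPLUNK_HOME/var/run/splunk/csv; pick whichever destination suits your downstream process.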
