Knowledge Management

How to delete repeat events that upload files automatically?

zhimeng_yu1506
New Member

When I forward files to Splunk automatically, some events from the old files are indexed repeatedly. I do not need the old events from those files. How can I delete them?


gcusello
SplunkTrust

If you don't want to load old events, you can add ignoreOlderThan = <nonnegative integer>[s|m|h|d] to the monitor stanza in your inputs.conf.
With this setting, files whose modification time is older than the given value are not read, so their old events are never indexed.
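For example, a monitor stanza might look like this (the path and the 7-day threshold are only illustrative):

[monitor:///var/log/myapp/data.log]
# files whose modification time is more than 7 days old will not be read
ignoreOlderThan = 7d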
Bye.
Giuseppe


TStrauch
Communicator

Hi,

You can delete events by piping them to the delete command. This does not free any disk space; it only flags the events as deleted in Splunk so they no longer appear in future searches. But be careful: this command cannot be undone. Once deleted, the events are gone.

You need the "can_delete" role; no user has it by default. Log in with your admin account, go to Settings --> Access Control --> Users, and add the "can_delete" role to your user.

Before piping anything to delete, you should verify that your search returns exactly the events you want to remove.

Example:

<your_search> | delete
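For instance (the index and source path below are hypothetical), run the bare search first, check the results, and only then append delete:

index=main source="/data/old_file.log"
index=main source="/data/old_file.log" | delete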

zhimeng_yu1506
New Member

Thanks for your answer.
There is a file being monitored. When the file changes, its data is automatically uploaded to Splunk, but the data keeps getting duplicated. I would like to delete the earlier copies, or remove the duplicate data, so that only the changed data is kept when uploading.
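For what it's worth, if the duplicated events have identical raw text, one search-time workaround is to hide the duplicates with dedup (the source path below is only an example):

source="/path/to/monitored_file.log" | dedup _raw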
