Getting Data In

Splunk indexing far more data than normal after re-installation of the universal forwarder

soumdey0192
Explorer

The universal forwarder installed on "server A" was uninstalled on 14th May due to an issue.
So from 14th May onwards, logs from "server A" were not being indexed in Splunk.
On 30th May we re-installed the universal forwarder on "server A", but there was a huge spike in the data ingested over the next couple of days.
Where the daily ingestion rate had been about 1 GB per day, it jumped to roughly 15 GB per day for the next two days.
Moreover, the source from which the logs are collected on "server A" keeps only one day's worth of data.

So can somebody please explain how, in the above scenario, the amount of data indexed increased almost 15 times?


somesoni2
Revered Legend

Did you see any data being duplicated? You can look at license usage (index=_internal source=*license_usage.log) for those sources (files) to see if historical data was ingested, or run a tstats search to check whether you got data for just those 2 days or for all the missing days since 14th May.
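
In case it helps, here is a rough sketch of the two checks (the host value "server A" and index=* are placeholders, adjust them to your environment).

Daily license usage per source for that host, from the internal license_usage.log (h = host, s = source, b = bytes):

    index=_internal source=*license_usage.log* type=Usage h="server A"
    | eval GB=b/1024/1024/1024
    | timechart span=1d sum(GB) AS GB_indexed BY s

Event counts per day for the same host, run over a time range covering 14th to 30th May; if the counts fill in the missing days rather than just the two spike days, the forwarder re-read and backfilled the old data:

    | tstats count WHERE index=* host="server A" BY _time span=1d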
