
Runaway indexing with the Splunk Add-on for AWS: how to enforce a Log Start Date

Glasses
Builder

Hi,
I was moving about 20 AWS data inputs (S3 sources) from an EOL server to an AWS EC2 instance.
The EC2 instance was a clone and already had the Splunk_TA_aws app on it.
When I started up the new EC2 instance, I disabled the TA.
I configured the Log Start Date directly in inputs.conf for all the data inputs (and restarted).
Then I enabled the TA and disabled each of the inputs in the web UI.
I went one by one, disabling the input on the old host and enabling it on the new host.
Everything looked good, but now my indexing is blowing up.
I set the Log Start Date to 7/1/2019, but the new EC2 host seems to be fetching data from before that date, as if it's not obeying the configs.
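For reference, here is roughly what one of my inputs.conf stanzas looks like. The input name, account, and bucket below are placeholders, and I'm going from memory on the key names: I believe the generic S3 input (aws_s3) uses initial_scan_datetime for the start date, while the incremental S3 input (splunk_ta_aws_logs) uses log_start_date, so treat the exact keys as my assumption.

    [aws_s3://example_s3_input]
    # placeholder account/bucket names
    aws_account = example_account
    bucket_name = example-log-bucket
    # start date for the initial scan (my assumption: this backs the UI "start date" field)
    initial_scan_datetime = 2019-07-01T00:00:00Z
    sourcetype = aws:s3
    index = aws
    interval = 1800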

Any help is appreciated, thank you.

Glasses
Builder

It turned out that because I had upgraded the AWS add-on, it did not retain my start date/time settings. I recreated the inputs with the start dates set again and it worked.
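If anyone else hits this, a quick sanity check after an upgrade is to dump the effective input settings with btool and confirm the start date actually survived. This is just the check I used; the key names you grep for depend on your input type (see my stanza sketch above):

    $SPLUNK_HOME/bin/splunk btool inputs list --debug | grep -i -E "aws_s3|log_start_date|initial_scan_datetime"

The --debug flag shows which .conf file each effective setting comes from, which is how you can spot that the old start dates are gone after an upgrade.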
