Getting Data In

Can I restrict log ingestion when the index capacity reaches its limit on a per-day basis?

mala_splunk_91
Explorer

Hi, 

In Splunk Cloud, can I restrict log ingestion when the index capacity reaches its limit on a per-day basis?

I have logs that exceed the indexing capacity on certain days. Is there any way I can block ingestion once the capacity reaches its threshold?

Also, another question: is it possible for me to edit the configuration files on Splunk Cloud to filter logs or send them to the null queue?

If I need to create a custom app to do so, please share any related documentation to follow.

Thanks, 

Mala Sundaramoorthy


gcusello
SplunkTrust

Hi @mala_splunk_91,

As @VatsalJagani said, there isn't any automatic way to do this.

You can, of course, create an alert that fires when you reach, for example, 50% of the daily quota at midday or 80% at 5 PM.

Then you can turn off some inputs when the alert fires, but that is a manual step, not an automatic one.
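As a rough starting point, the alert could be driven by a search over the internal license usage logs, assuming they are searchable in your _internal index; in this minimal sketch the 500 GB daily quota and the 80% threshold are placeholder assumptions:

index=_internal source=*license_usage.log type=Usage earliest=@d
| stats sum(b) AS bytes_today
| eval gb_today = round(bytes_today/1024/1024/1024, 2)
| eval pct_of_quota = round(gb_today / 500 * 100, 2)
| where pct_of_quota >= 80

Scheduled hourly and set to trigger when it returns a result, this gives an early warning that the daily quota is about to be reached.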

It might be possible to automate this with Phantom, but I have never tried it.

About configurations: on Splunk Cloud you can modify them only through the web interface.

It's easier if you collect on-premises logs using Forwarders, because you can edit their configuration files directly, but either way it is always a manual step, not an automatic one.
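For example, on a Universal Forwarder you would stop collecting a noisy file by disabling its inputs.conf stanza by hand (the monitor path here is only a hypothetical example) and then restarting the forwarder:

# inputs.conf on the forwarder (hypothetical path)
[monitor:///var/log/myapp/verbose.log]
disabled = true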

About the way to create a custom app for the filtering, it's a very simple app; a minimal sketch follows:
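Assuming a hypothetical sourcetype my_sourcetype and a placeholder regex matching the events to discard, the app would pair a props.conf and a transforms.conf like this:

# props.conf
[my_sourcetype]
TRANSFORMS-null = setnull

# transforms.conf
[setnull]
REGEX = DEBUG
DEST_KEY = queue
FORMAT = nullQueue

Events matching the REGEX are routed to the nullQueue and never indexed. On Splunk Cloud these settings would typically be packaged into a small private app and installed through the standard app upload/vetting process rather than edited on disk.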

It will be easier when the Data Stream Processor becomes available (https://docs.splunk.com/Documentation/DSP/1.3.0/User/Filter).

Ciao.

Giuseppe


VatsalJagani
SplunkTrust

@mala_splunk_91 

 

I hope this helps!!!
