
Send alert when indexing volume limit exceeded?

I-Man
Communicator

Is there a way to send an alert if I exceed my license limit? Does Splunk generate a log when this happens?

Thanks in advance!

1 Solution

joshd
Builder

You can use the following search to see the amount currently indexed by all non-internal indexes over a 1 day period:

index=_internal metrics kb group="per_index_thruput" series!=_* | eval totalGB = (kb / 1024) / 1024 | timechart span=1d sum(totalGB) as total

Then you can simply create a saved search that runs every X minutes or hours and alerts when a custom condition is met. That custom condition would be if total > 10, meaning it would alert if the total indexed is greater than 10 GB. Just adjust the value to meet your needs.
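For reference, here is a minimal savedsearches.conf sketch of such a scheduled alert. The stanza name, cron schedule, threshold, and recipient address are placeholders I've assumed, not values from this thread; adjust them to your environment:

[Daily indexing volume alert]
# Run the search above on a schedule and fire on a custom condition.
search = index=_internal metrics kb group="per_index_thruput" series!=_* | eval totalGB = (kb / 1024) / 1024 | timechart span=1d sum(totalGB) as total
enableSched = 1
# Hourly; change to whatever interval suits you.
cron_schedule = 0 * * * *
# Custom condition: a secondary search evaluated against the results.
counttype = custom
alert_condition = search total > 10
# Example action; assumed recipient address.
action.email = 1
action.email.to = splunk-admins@example.com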

I've summarized some useful usage statistics here (along with links to Splunk's docs):

http://www.joshd.ca/content/splunk-usage-statistic-searches


glitchcowboy
Path Finder

How could I get this per index? I'd like a total for each index (series).
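One way to approximate that (my assumption based on the same metrics data, not something confirmed in this thread) is to split the timechart by the series field, which holds the index name in per_index_thruput events:

index=_internal metrics kb group="per_index_thruput" series!=_* | eval totalGB = (kb / 1024) / 1024 | timechart span=1d sum(totalGB) by series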


DaveSavage
Builder

You may find everything you need for index analysis in the Splunk on Splunk (SoS) app...it's rather good. Download it, then run 'Metrics' from the menu and check out the second bar chart down, which breaks things out by index.


joshd
Builder

Glad to hear it's figured out. Sorry, those weren't typos; I should have just used the code tag, since the wiki messed up the formatting of the search. It should have been index=_internal and series!=_*, which excludes all the internal indexes; those aren't charged against your license usage, so you don't want them counted. Then you shouldn't need the earliest parameter because the timechart span is set.



jimcall
Engager

It may be because you are using a basic conditional alert rather than an advanced conditional alert.

See http://docs.splunk.com/Documentation/Splunk/latest/User/SchedulingSavedSearches#Define_alerts_that_a... for more details.
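If I'm reading the docs correctly (treat this as my assumption rather than something stated outright in this thread), the advanced/custom condition is itself a small secondary search evaluated against the results of the saved search, so it is written as a search rather than an if-expression, e.g.:

search total > 10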


afields
New Member

I'm running the exact search mentioned above by I-Man. Running it manually works perfectly; Splunk just doesn't seem to like the alert setting.


joshd
Builder

What is the search you are running? What happens when you run it manually (what does it return)?


afields
New Member

My Splunk is not liking the custom alert condition "if total > 3". What am I doing wrong?


I-Man
Communicator

There were a couple of typos in your search, but it works like this:

index=_internal metrics kb group="per_index_thruput" series=* earliest=@d | eval totalGB = (kb/1024)/1024 | timechart span=1d sum(totalGB) as total

Thanks Man!
