Monitoring Splunk

How to get average license usage per sourcetype for a month



I need to get average license utilization per index for 30 days. I was trying this, but it's taking forever. Anyone have any suggestions?

index=_internal [`set_local_host`] source=*license_usage.log* type="Usage"
| eval h=if(len(h)=0 OR isnull(h),"(SQUASHED)",h)
| eval s=if(len(s)=0 OR isnull(s),"(SQUASHED)",s)
| eval idx=if(len(idx)=0 OR isnull(idx),"(UNKNOWN)",idx)
| bin _time span=1d
| stats avg(b) as b by pool, st, idx
| eval idx=pool.":".idx
| timechart limit=0 useother=f span=1d sum(b) AS volumeB by idx fixedrange=false
| join type=outer _time
    [ search index=_internal [`set_local_host`] source=*license_usage.log* type="RolloverSummary" earliest=-30d@d
    | eval _time=_time - 43200
    | bin _time span=1d
    | stats latest(stacksz) AS "stack size" by _time]
| fields - _timediff
| foreach * [eval <<FIELD>>=round('<<FIELD>>'/1024/1024/1024, 3)]

Super Champion

@a212830 please do not forget to accept an answer if you are happy with it


Splunk Employee

Along with the great search-based solutions, check out what Splunk already provides:

Navigate to your Monitoring Console.
In the Monitoring Console app, go to Indexing -> License Usage -> License Usage - Previous 30 Days.
Set the Split By input to "By index" and scroll down to the Average Peak Daily Volume panel at the bottom, which includes the average.

Also worth noting that if you are looking for an average by day, make sure any manual search you run snaps to the start of day for both earliest and latest - otherwise you'll be measuring partial days and skewing the results.
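For instance, a 30-day window snapped to day boundaries might look like this (a minimal sketch; field names follow license_usage.log as used elsewhere in this thread):

```
index=_internal source=*license_usage.log* type=Usage earliest=-30d@d latest=@d
| bin _time span=1d
| stats sum(b) AS daily_bytes by _time
```

Both earliest=-30d@d and latest=@d snap to midnight, so every bucket covers a full day.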

For improved performance, specify a sourcetype and source explicitly. Since you're looking at day-over-day totals, you can probably use type=RolloverSummary instead of type=Usage. Even better, take a look at the "Daily License Usage - Rollover Summary" data model as a faster way to do a simple pivot on index and answer this question.
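A minimal sketch of the RolloverSummary approach (note: RolloverSummary reports totals per pool/stack, not per index, and the field names here are assumed to follow the license_usage.log format used in the question above):

```
index=_internal [`set_local_host`] source=*license_usage.log* type="RolloverSummary" earliest=-30d@d latest=@d
| eval _time=_time - 43200
| bin _time span=1d
| stats latest(b) AS daily_bytes by _time, pool
| stats avg(daily_bytes) AS avg_bytes by pool
| eval avgGB=round(avg_bytes/1024/1024/1024, 3)
```

Because RolloverSummary logs one event per pool per day, this search touches far fewer events than aggregating the raw Usage lines.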



This only applies when you have actually configured the DMC.


Splunk Employee

Absolutely! Yet another great win for the DMC (now the MC) 😉



If you just need license usage per index for the last 30 days:

index=_internal source=*license_usage.log* type=Usage earliest=-30d@d latest=@d | timechart span=1d sum(b) AS volume_b by idx | foreach * [eval <<FIELD>>=round('<<FIELD>>'/1024/1024/1024, 3)]
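If you want the 30-day average per index rather than daily volumes, one way (a sketch building on the query above) is to untable the timechart result and average per index:

```
index=_internal source=*license_usage.log* type=Usage earliest=-30d@d latest=@d
| timechart span=1d sum(b) AS volume_b by idx
| untable _time idx volume_b
| stats avg(volume_b) AS avg_b by idx
| eval avgGB=round(avg_b/1024/1024/1024, 3)
```

untable converts the per-index columns produced by timechart back into rows, so a plain stats avg by idx works afterwards.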


Super Champion

Let's see if the following helps.

Option 1: Use the following search (if you want to group by sourcetype instead of index simply replace idx with st)

    index=_internal [`set_local_host`] source=*license_usage.log* type="Usage" earliest=-30d@d latest=@d 
| fields _time, pool, idx, b
| eval idx=if(len(idx)=0 OR isnull(idx),"(UNKNOWN)",idx) 
| bin _time span=1d 
| stats sum(b) as b by _time, pool, idx 
| stats sum(b) AS volume by idx, _time
| stats avg(volume) AS avgVolume max(volume) AS maxVolume by idx 
| eval avgVolumeGB=round(avgVolume/1024/1024/1024,3) 
| eval maxVolumeGB=round(maxVolume/1024/1024/1024,3) 
| fields idx, avgVolumeGB, maxVolumeGB 
| rename avgVolumeGB AS "average" maxVolumeGB AS "peak" idx AS "Index"

Option 2: enable acceleration for your license reports in your license master and use the built-in ones (go to Settings > System > Licensing > Usage report > Previous 30 days)

Option 3: Create a summary index with the bits you need from _internal on a daily basis and compute from that

My preferences would be:

  1. Option 1 if this is a one-off. Waiting a few minutes shouldn't be a problem; if it takes more than a couple of minutes, there may be other performance issues worth looking into
  2. Option 2 if you are going to do this on a regular basis
  3. Option 3 if you want to be very specific on what you report or you want to play around with different types of queries grouping by different time ranges
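Option 3 could be sketched as a daily scheduled search that writes into a summary index (the index name license_summary is an assumption; create the index first and schedule this to run shortly after midnight):

```
index=_internal source=*license_usage.log* type=Usage earliest=-1d@d latest=@d
| bin _time span=1d
| stats sum(b) AS volume_b by _time, idx
| collect index=license_summary
```

Reporting then becomes a cheap search over the summary, e.g. index=license_summary | stats avg(volume_b) by idx.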

