
Need help building faster searches - How to make dashboards efficient?

dsenapaty
Explorer

Hello All,

 

I am new to Splunk and am trying to create an executive-level dashboard for a few of our enterprise applications. All of these applications produce very large volumes of data (around 100 GB/day), and my normal indexed searches take forever and sometimes time out as well. I need some guidance on how I can achieve faster/optimised searches.

 

I tried using tstats, but I am running into problems because the data is not fully structured: I am not able to run aggregate functions on the response times, since those values either have the string "ms" appended or are not in key=value form. See the examples below.

 

Data 1:

2022-09-11 22:00:59,998 INFO -(Success:true)-(Validation:true)-(GUID:68D74EBE-CE3B-7508-6028-CBE1DFA90F8A)-(REQ_RCVD:2022-09-11T22:00:59.051)-(RES_SENT:2022-09-11T22:00:59.989)-(SIZE:2 KB)-(RespSent_TT:0ms)-(Actual_TT:938ms)-(DB_TT:9ms)-(Total_TT:947ms)-(AppServer_TT:937ms)

 

Data 2: 

09/27/2022 16:34:57:998|101.123.456.789|106|1|C97EC2DA10C64F64A83C87AEEC1CDDBE703A546E1B554AD1|POST|/api/v1/resources/ods-passthrough|200|97

Data 2 fields:

date_time|client_ip|appid|clientid|guid|http_method|uri_path|http_status_code|response_time

 

 

Need help/suggestions on how to achieve faster searches.


bowesmana
SplunkTrust

Lots of ways to make things faster and more efficient. If you're looking to use that ms timing counter as a number, then you should extract it as a field.

tstats will not give you data unless you're taking it from a data model, in which case you will no doubt have extracted fields by virtue of having passed the data through the model.
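
For illustration, if you do have an accelerated data model, tstats runs against the pre-summarised data instead of the raw events, which is where the big speed-up comes from. A minimal sketch, assuming a hypothetical data model named Web with a response_time field (your model and field names will differ):

| tstats avg(Web.response_time) as avg_rt count from datamodel=Web where Web.http_status_code=200 by Web.uri_path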

Efficient searching is about reading the minimum amount of data needed to satisfy the search, so give as many restrictive criteria as possible (index, sourcetype, time range, keywords) up front, then aggregate as early as you can to reduce the data volume as much as possible.
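
As a sketch of that principle, with hypothetical index, sourcetype and field names, the following restricts the events up front and then collapses them into a small aggregate:

index=app_logs sourcetype=app:access earliest=-24h@h latest=@h
| stats avg(response_time) as avg_rt max(response_time) as max_rt count by uri_path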

Dashboard efficiency can be achieved by using base searches:

https://docs.splunk.com/Documentation/Splunk/9.0.1/Viz/Savedsearches
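
In Simple XML, a base search runs once and its results are shared by the post-process searches in each panel, so the heavy lifting happens only once per dashboard load. A minimal sketch (index, field and panel names are illustrative):

<dashboard>
  <label>App response times</label>
  <!-- Base search: runs once, shared by the panels below -->
  <search id="base_rt">
    <query>index=app_logs sourcetype=app:access | stats avg(response_time) as avg_rt count by uri_path</query>
    <earliest>-24h@h</earliest>
    <latest>now</latest>
  </search>
  <row>
    <panel>
      <table>
        <!-- Post-process search: reuses the base results, reads no new data -->
        <search base="base_rt">
          <query>sort - avg_rt | head 10</query>
        </search>
      </table>
    </panel>
  </row>
</dashboard>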

Another technique is to have a saved search that runs frequently, performs some aggregation of the large volume of data, and writes that aggregated data back to a summary index; your dashboard can then search the already-created aggregations instead of the raw events.
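
A minimal sketch of that pattern, assuming a summary index named summary_app_rt that you have created beforehand, with the saved search scheduled to run hourly over the previous hour:

index=app_logs sourcetype=app:access earliest=-1h@h latest=@h
| stats avg(response_time) as avg_rt max(response_time) as max_rt count by uri_path
| collect index=summary_app_rt

The dashboard then queries the much smaller summary index, for example: index=summary_app_rt | timechart span=1h avg(avg_rt) by uri_path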

As for extracting those ms values at search time, here's an example that will extract all the (*_TT:NNms) fields from your example line:

| makeresults 
| eval _raw="2022-09-11 22:00:59,998 INFO -(Success:true)-(Validation:true)-(GUID:68D74EBE-CE3B-7508-6028-CBE1DFA90F8A)-(REQ_RCVD:2022-09-11T22:00:59.051)-(RES_SENT:2022-09-11T22:00:59.989)-(SIZE:2 KB)-(RespSent_TT:0ms)-(Actual_TT:938ms)-(DB_TT:9ms)-(Total_TT:947ms)-(AppServer_TT:937ms)"
| rex max_match=0 "\((?<fn>\w+)_TT:(?<tt>\d+)ms\)"
| foreach 0 1 2 3 4 [ eval f=mvindex(fn, <<FIELD>>), tt_{f}=mvindex(tt, <<FIELD>>) ]
| fields - fn tt f

The first two lines set up your example, and the last three lines extract those numbers and create fields named tt_XX, where XX is the name of the time-taken field and the value is the numeric time with the "ms" suffix stripped, so aggregate functions will work on it.
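
Your pipe-delimited second format can be handled the same way, and once the value is numeric you can aggregate it. A minimal sketch using your sample event (replace the first two lines with your own index/sourcetype search):

| makeresults
| eval _raw="09/27/2022 16:34:57:998|101.123.456.789|106|1|C97EC2DA10C64F64A83C87AEEC1CDDBE703A546E1B554AD1|POST|/api/v1/resources/ods-passthrough|200|97"
| rex "^(?<date_time>[^|]+)\|(?<client_ip>[^|]+)\|(?<appid>[^|]+)\|(?<clientid>[^|]+)\|(?<guid>[^|]+)\|(?<http_method>[^|]+)\|(?<uri_path>[^|]+)\|(?<http_status_code>[^|]+)\|(?<response_time>\d+)$"
| stats avg(response_time) as avg_rt by uri_path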

 
