Dashboards & Visualizations

Splunk dashboard performance Issue

karthi2809
Builder

Hi,

I am writing a query for a Splunk dashboard. The base search contains multiple join subsearches, and the page loads very slowly. I need to improve the performance of the dashboard query.

This is my query.

index="Test"
applicationName="sapi" timestamp log.map.correlationId level message ("Ondemand Started*" OR "Process started*")
| rex field=message max_match=0 "\"Ondemand Started for. filename: (?<OnDemandFileName>[^\n]\w+\S+)"
| rex field=message max_match=0 "Process started for (?<FileName>[^\n]+)"
| eval OnDemandFileName=rtrim(OnDemandFileName, "Job")
| eval "FileName/JobName"=coalesce(OnDemandFileName, FileName)
| rename timestamp as Timestamp log.map.correlationId as CorrelationId level as Level message as Message
eval JobType=case(like('Message', "%Ondemand Started%"), "OnDemand", like('Message', "Process started%"), "Scheduled", true(), "Unknown")
eval Message=trim(Message, "\"")
table Timestamp CorrelationId Level JobType "FileName/JobName" Message
join CorrelationId type=left
[search index="Test" applicationName="sapi" level=ERROR
| rename log.map.correlationId as CorrelationId level as Level message as Message1
| dedup CorrelationId
| table CorrelationId Level Message1]
| table Timestamp CorrelationId Level JobType "FileName/JobName" Message1
| join CorrelationId type=left
[ search index="Test" applicationName="sapi" message="*file archived successfully *"
| rex field=message max_match=0 "\"Concur file archived successfully for file name: (?<ArchivedFileName>[^\n]\w+\S+)"
| eval ArchivedFileName=rtrim(ArchivedFileName,"\"")
| rename log.map.correlationId as CorrelationId
| table CorrelationId ArchivedFileName]
| table Timestamp CorrelationId Level JobType "FileName/JobName" ArchivedFileName Message1
join CorrelationId type=left
[ search index="Test" applicationName="sapi" (log.map.processorPath=ExpenseExtractProcessingtoOraclex AND (" Import*" OR "APL Import*"))
| rename timestamp as Timestamp1 log.map.correlationId as CorrelationId level as Level message as Message
| eval Status=case(like('Message', "%GL Import flow%"), "SUCCESS", like('Message', "%APL Import flow%"), "SUCCESS", like('Level', "%Exception%"), "ERROR")
| rename Message as Response
| table Timestamp1 CorrelationId Status Response]
| eval Status=if(Level="ERROR", "ERROR", Status)
| eval StartTime=round(strptime(Timestamp, "%Y-%m-%dT%H:%M:%S.%QZ"))
| eval EndTime=round(strptime(Timestamp1, "%Y-%m-%dT%H:%M:%S.%QZ"))
| eval ElapsedTimeInSecs=EndTime-StartTime
| eval "Total Elapsed Time"=strftime(ElapsedTimeInSecs, "%H:%M:%S")
eval Response=coalesce(Response, Message1) | table Status CorrelationId ArchivedFileName
| table Timestamp CorrelationId Level JobType "FileName/JobName" ArchivedFileName | search Status=* | stats count by JobType

gcusello
SplunkTrust

Hi @karthi2809,

the join command kills the system; you can use it only with a small amount of data.

You should use other options (the stats command). Besides, you're joining the same search with similar conditions.
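A minimal sketch of that pattern, using placeholder conditions (adapt the search terms to your real events): all the event types go into one base search combined with OR, and stats correlates them by the join key.

index="Test" applicationName="sapi" ("Ondemand Started*" OR level=ERROR OR message="*file archived successfully*")
| rename log.map.correlationId as CorrelationId
| stats values(level) AS Level values(message) AS Message BY CorrelationId

This reads the index once instead of once per join.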

You can find many examples (also from me!) in the Community.

Anyway, it's difficult to help you with this search because there are many missing parts (e.g. pipes) in the shared search; please share the correct one.

Ciao.

Giuseppe


karthi2809
Builder

@gcusello 

Please find the exact query.

index="Test"
applicationName="sapi" timestamp log.map.correlationId level message ("Ondemand Started*" OR "Process started*")
| rex field=message max_match=0 "\"Ondemand Started for. filename: (?<OnDemandFileName>[^\n]\w+\S+)"
| rex field=message max_match=0 "Process started for (?<FileName>[^\n]+)"
| eval OnDemandFileName=rtrim(OnDemandFileName, "Job")
| eval "FileName/JobName"=coalesce(OnDemandFileName, FileName)
| rename timestamp as Timestamp log.map.correlationId as CorrelationId level as Level message as Message
| eval JobType=case(like('Message', "%Ondemand Started%"), "OnDemand", like('Message', "Process started%"), "Scheduled", true(), "Unknown")
| eval Message=trim(Message, "\"")
| table Timestamp CorrelationId Level JobType "FileName/JobName" Message

| join CorrelationId type=left
[search index="Test" applicationName="sapi" level=ERROR
| rename log.map.correlationId as CorrelationId level as Level message as Message1
| dedup CorrelationId
| table CorrelationId Level Message1]
| table Timestamp CorrelationId Level JobType "FileName/JobName" Message1

| join CorrelationId type=left
[ search index="Test" applicationName="sapi" message="*file archived successfully *"
| rex field=message max_match=0 "\"Concur file archived successfully for file name: (?<ArchivedFileName>[^\n]\w+\S+)"
| eval ArchivedFileName=rtrim(ArchivedFileName,"\"")
| rename log.map.correlationId as CorrelationId
| table CorrelationId ArchivedFileName]
| table Timestamp CorrelationId Level JobType "FileName/JobName" ArchivedFileName Message1

| join CorrelationId type=left
[ search index="Test" applicationName="sapi" (log.map.processorPath=ExpenseExtractProcessingtoOraclex AND (" Import*" OR "APL Import*"))
| rename timestamp as Timestamp1 log.map.correlationId as CorrelationId level as Level message as Message
| eval Status=case(like('Message', "%GL Import flow%"), "SUCCESS", like('Message', "%APL Import flow%"), "SUCCESS", like('Level', "%Exception%"), "ERROR")
| rename Message as Response
| table Timestamp1 CorrelationId Status Response]

| eval Status=if(Level="ERROR", "ERROR", Status)
| eval StartTime=round(strptime(Timestamp, "%Y-%m-%dT%H:%M:%S.%QZ"))
| eval EndTime=round(strptime(Timestamp1, "%Y-%m-%dT%H:%M:%S.%QZ"))
| eval ElapsedTimeInSecs=EndTime-StartTime
| eval "Total Elapsed Time"=strftime(ElapsedTimeInSecs, "%H:%M:%S")
| eval Response=coalesce(Response, Message1) | table Status CorrelationId ArchivedFileName
| table Timestamp CorrelationId Level JobType "FileName/JobName" ArchivedFileName
| search Status=*
| stats count by JobType


gcusello
SplunkTrust

Hi @karthi2809,

don't use join: Splunk isn't a DB. Use stats, with something similar to this:

 

index="xxx" applicationName="api" ((environment=$env$ timestamp correlationId trace message ("Ondemand Started*" OR "Expense Process started")) OR (trace=ERROR) OR (message="*Before Calling flow archive-Concur*"))
| rename sourceFileName as SourceFileName content.JobName as JobName
| eval "FileName/JobName"= coalesce(SourceFileName,JobName)
| rename timestamp as Timestamp correlationId as CorrelationId tracePoint as Tracepoint message as Message
| eval JobType=case(like('Message',"%Ondemand Started%"), "OnDemand", like('Message',"Expense Process started%"), "Scheduled", true(), "Unknown")
| eval Message=trim(Message,"\"")
| rename correlationId as CorrelationId trace as TracePoint message as StatusMessage
| rename 
   correlationId AS CorrelationId 
   content.loggerPayload.archiveFileName AS ArchivedFileName
| stats
   earliest(Timestamp) AS Timestamp
   values(Tracepoint) AS Tracepoint
   values(JobType) AS JobType
   values("FileName/JobName") AS "FileName/JobName"
   values(Message) AS Message
   values(StatusMessage) AS StatusMessage
   values(ArchivedFileName) AS ArchivedFileName
   BY CorrelationId

 

in other words: put all the searches in OR in the main search, apply all the renames and evals, and at the end correlate the results using the join key in a stats command.

If you want some additional field, add it to the stats command.
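For example, to also carry a hypothetical Status field through the correlation (assuming an eval computes it before the stats), extend the stats command with one more values() clause:

| stats
   earliest(Timestamp) AS Timestamp
   values(Status) AS Status
   values(JobType) AS JobType
   BY CorrelationId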

Ciao.

Giuseppe

karthi2809
Builder

Hi @gcusello,


Let me explain my requirement properly.

I have a message field and I need to extract multiple values from it. For that I used multiple joins, so the dashboard takes a long time to load. As you mentioned in your answer, I tried to use the stats command, but I can't get the results into a table.

In the real-time scenario:

I am matching the message field against these keywords:

 

  • There are two job types: xxx and yyy
  • Message: "concur ondemand" and "expense" --- Started Successfully
  • Message: "Error" --- Error
  • Message: "Progress completed" --- Completed

and we have a unique correlationId. Based on the correlation ID we need to find out the result.
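That keyword-to-status mapping can be written as a single eval; this is only a sketch, so adjust the like() patterns to the exact message texts:

| eval Status=case(
    like('Message', "%concur ondemand%") OR like('Message', "%expense%"), "Started Successfully",
    like('Message', "%Error%"), "Error",
    like('Message', "%Progress completed%"), "Completed")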

I copied what you suggested with stats, but I am not good at it; can you please help me fix the issue?


index="xxx" applicationName="api" environment=DEV timestamp correlationId tracePoint message ("Concur Ondemand Started*") OR (message="Expense Extract Process started for jobName : AP/GL Extract V.3.0*") OR (trace=ERROR) OR ("Before Calling flow archive-Concur*") OR (message="Concur AP/GL File/s Process Status*")
| dedup correlationId
| rename content.SourceFileName as SourceFileName content.JobName as JobName
| eval "FileName/JobName"= coalesce(SourceFileName,JobName)
| rename timestamp as Timestamp correlationId as CorrelationId tracePoint as TracePoint message as Message
| eval JobType=case(like('Message',"%Concur Ondemand Started%"), "OnDemand", like('Message',"%Expense Extract Process started for jobName : *%"), "Scheduled")
| eval Message=trim(Message,"\"")
| rename correlationId as CorrelationId tracePoint as TracePoint message as Message
| rename content.loggerPayload.archiveFileName AS ArchivedFileName   
| eval Status=case(like('Message' ,"%Concur AP/GL File/s Process Status%"),"SUCCESS", like('TracePoint',"%EXCEPTION%"),"ERROR") 
| eval Response= coalesce(Response,Message)
| eval Status=if(TracePoint="ERROR","ERROR",Status) 
| join CorrelationId type=left 
    [ search index="xxx"  applicationName="api" | stats 
   earliest(timestamp) AS Timestamp
   values(TracePoint) AS TracePoint
   values(Response) AS Response 
   values(JobType) AS JobType
   values(Status) AS Status
   values("FileName/JobName") AS "FileName/JobName"
   values(Message) AS Message
   BY CorrelationId]
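If the evals above already produce Status, JobType, and "FileName/JobName" in the main search, the remaining join can be dropped entirely and the same stats run over the main results instead of a subsearch (a sketch, not a tested query):

| stats
   earliest(Timestamp) AS Timestamp
   values(TracePoint) AS TracePoint
   values(JobType) AS JobType
   values(Status) AS Status
   values("FileName/JobName") AS "FileName/JobName"
   values(Message) AS Message
   BY CorrelationId

Note that the dedup correlationId near the top keeps only one event per ID before stats can correlate them, so it would need to be removed for this to work.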
Status  | FileName/JobName | JobType | ArchivedFileName | CorrelationId | Timestamp
SUCCESS | karthi           | xxx     | test1            | 2essrfsf4dgs  |
SUCCESS | priya            | yyy     | test2            | 46dsfh68      |