Splunk Search

How to join multiple tables / stats to a single table

parameshjava
Explorer

In our application, we process the files we receive.
In various places, we have logs as follows:

Log when a file received:
Received Document [application, document id] : app_name, doc_uuid
Ex:
2017-07-02_20:46:08.073 INFO c.o.f.m.r.NotificationReceiver - Received Document [application, document id] : ApplicationOne, 5959a1dfe4b0fc54b544ae1e

Log when a file is in an error format:
Not able to process the request [application, document id, errorMessage] : app_name, doc_uuid, error_msg
Ex:
2017-06-29_20:03:54.868 [SimpleAsyncTaskExecutor-5] ERROR c.o.f.m.r.NotificationReceiver - Not able to process the request [application, document id, errorMessage] : ApplicationOne, 5959a1dfe4b0fc54b544ae1e, The field abcd - cannot be null

Log when document processing is successful:
Document Processed successfully [application, document id] : app_name, doc_uuid
Ex:
2017-06-29_20:03:54.868 [SimpleAsyncTaskExecutor-5] INFO c.o.f.m.r.NotificationReceiver - Document Processed successfully [application, document id] : ApplicationOne, 5959a1dfe4b0fc54b544ae1e
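All three messages end with a comma-separated tail that starts with the application name and the document id, so both values can be pulled out with one regex. A minimal sketch (the field names app_name and doc_uuid are only illustrative):

 index = "SplunkAppNameProd" sourcetype="appname-service-prod"
 | rex "\]\s*:\s*(?<app_name>[^,]+),\s*(?<doc_uuid>[^,\s]+)"
 | table _time app_name doc_uuid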

We need to prepare daily statistics for a week as follows:

[table image: daily Received / Successful / Error counts per application]

The index and sourcetype are the same for all the log entries.
Ex: index = "SplunkAppNameProd" sourcetype="appname-service-prod"

1 Solution

somesoni2
Revered Legend

Try like this

 index = "SplunkAppNameProd" sourcetype="appname-service-prod"
| eval Received=if(searchmatch("Received Document"),1,0)
| eval Successful=if(searchmatch("Document Processed successfully"),1,0)
| eval Error=if(searchmatch("Not able to process the request"),1,0)
| rex "^(\S+\s+){3}([^\[]+)\[(?<appId>[^\,]+)"
| eval Day=strftime(_time,"%F")
| stats sum(Received) as "Received Count" sum(Successful) as "Successful Count" sum(Error) as "Error Count" by Day appId
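For reference, the same table can also be built in a single stats call using the count(eval(...)) idiom, and, going by the sample lines above, the application name can be taken from the text after the "] :" delimiter. A sketch, assuming the last 7 full days and an illustrative appName field (adjust the rex to your real log layout):

 index = "SplunkAppNameProd" sourcetype="appname-service-prod" earliest=-7d@d latest=@d
 | rex "\]\s*:\s*(?<appName>[^,]+)"
 | eval Day=strftime(_time,"%F")
 | stats count(eval(match(_raw,"Received Document"))) as "Received Count" count(eval(match(_raw,"Document Processed successfully"))) as "Successful Count" count(eval(match(_raw,"Not able to process the request"))) as "Error Count" by Day appName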

parameshjava
Explorer

@somesoni2 Thanks a lot. It worked.


niketn
Legend

@parameshjava, you need to add more sample data. Do the three data sources you mentioned belong to different indexes, or to the same index with different sourcetypes?

From your explanation it seems you want to correlate on both app_name and doc_uuid, so it would be useful if you could add mocked data for those as well. The UUID is not present in your table, so would it be fine if the statuses were counted aggregated directly by app_name?

Ideally, if you have three different sourcetypes (or three different sources matching a wildcard pattern) for received, successful, and error events, you can try something similar to the following:

sourcetype="<YourReceivedSourceTypeName>" OR sourcetype="<YourSuccessSourceTypeName>" OR sourcetype="<YourErrorSourceTypeName>" app_name=*
| timechart span=1d count(eval(sourcetype=="<YourReceivedSourceTypeName>")) as ReceivedCount count(eval(sourcetype=="<YourSuccessSourceTypeName>")) as SuccessCount count(eval(sourcetype=="<YourErrorSourceTypeName>")) as ErrorCount by app_name
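Since in this thread the index and sourcetype turn out to be the same for all three messages, the same timechart idea can key off the message text instead of the sourcetype. A sketch (without the per-application split, which the accepted stats answer already covers):

 index = "SplunkAppNameProd" sourcetype="appname-service-prod"
 | timechart span=1d count(eval(match(_raw,"Received Document"))) as ReceivedCount count(eval(match(_raw,"Document Processed successfully"))) as SuccessCount count(eval(match(_raw,"Not able to process the request"))) as ErrorCount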
____________________________________________
| makeresults | eval message= "Happy Splunking!!!"

parameshjava
Explorer

@niketnilay, thanks for your time. I have updated my question with examples; the index and sourcetype are the same for all the log entries.


rjthibod
Champion

The community can best help you if you provide an example search showing which indexes, sourcetypes, sources, etc. are involved.
