Splunk Search

Search Count is different in direct search vs in table

asubramanian
Explorer

I am seeing odd behavior where my search event count differs when the exact same query is run on its own versus when it is used to build a table. This happens at large volumes (around 10 million log entries); for smaller volumes the counts match.

Below is a rough example of what the query looks like:

index="my_index"
"Message1"
OR "Message2"
OR "Message3"
 | stats count
 | fieldformat count =tostring(count,"commas")
 | eval "Type"="Metric1"
 | append [ search
       index="my_index"
       "Message2"
       OR "Message3"
       | stats count
       | fieldformat count =tostring(count,"commas")
       | eval "Type"="Metric2"
 ] | append [ search
         index="my_index"
        "Message1"
       OR "Message3"
        | stats count
        | fieldformat count =tostring(count,"commas")
        | eval "Type"="Metric3"
 ]| append [ search
        index="my_index"
        "Message2"
       OR "Message3"
        | stats count
        | fieldformat count =tostring(count,"commas")
        | eval "Type"="Metric4"
 ]
 | table count, Type

If I run the query for Metric2 separately, I get the right count; when it runs as part of building the table, the count is usually much lower.

Also, since I am searching for the same messages across these queries, is it possible to reuse the counts of those messages and just add a row based on them?
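
To illustrate what I mean, this is roughly what I am imagining: count each message once in a single pass, then derive the metric rows from those shared per-message counts. The msg1/msg2/msg3 helper fields are just placeholder names I made up, and the sums assume an event never contains more than one of the messages; otherwise they would overcount compared to the OR searches:

index="my_index" ("Message1" OR "Message2" OR "Message3")
```count each message exactly once```
| stats count(eval(searchmatch("Message1"))) AS msg1
        count(eval(searchmatch("Message2"))) AS msg2
        count(eval(searchmatch("Message3"))) AS msg3
```derive each metric from the shared per-message counts```
| eval Metric1=msg1+msg2+msg3, Metric2=msg2+msg3, Metric3=msg1+msg3, Metric4=msg2+msg3

Is something like this possible, or is there a more idiomatic way?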


gcusello
SplunkTrust

Hi asubramanian,
check the results of your subsearches: subsearches run with append are limited by default (50,000 results and a 60-second runtime, after which they are silently finalized), and with around 10 million events that is very likely your situation.
Why don't you use a different approach with only one search (remember that Splunk isn't a DB!)?
Something like this:

index=my_index ("Message1" OR "Message2" OR "Message3")
| eval Type=case(searchmatch("Message1"),"Metric1", searchmatch("Message2"),"Metric2", searchmatch("Message3"),"Metric3")
| stats count by Type
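
One caveat: case() assigns each event to only one Type, while your Metrics overlap (each one is an OR of several messages). If you need the overlapping totals, you can still compute them in one pass, with no append and therefore no subsearch limits; something like this (untested sketch, where transpose and rename only reshape the single result row into your count/Type table):

index="my_index" ("Message1" OR "Message2" OR "Message3")
```one count per metric, all evaluated over the same single scan of the index```
| stats count(eval(searchmatch("Message1") OR searchmatch("Message2") OR searchmatch("Message3"))) AS Metric1
        count(eval(searchmatch("Message2") OR searchmatch("Message3"))) AS Metric2
        count(eval(searchmatch("Message1") OR searchmatch("Message3"))) AS Metric3
        count(eval(searchmatch("Message2") OR searchmatch("Message3"))) AS Metric4
```turn the four columns into four rows```
| transpose
| rename column AS Type, "row 1" AS count
| eval count=tonumber(count) ```transpose yields strings; make count numeric again```
| fieldformat count = tostring(count, "commas")
| table count, Type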

Ciao.
Giuseppe
