Splunk Search

Problem with a query not returning proper data with nested search

rmcfarla
Explorer

I have a query with two nested searches. It had been working correctly for at least a few years on Splunk 5.x, but since I recently upgraded to Splunk 6.1 the nested queries are not working properly. The first one (1 day ago) works on some days but not always; the second subsearch (7 days ago) always returns the wrong results.

I can take each of the searches and run them on their own in the search bar: the primary search always works, but the two subsearches do not return the right results in smart mode, whereas in verbose mode they work correctly. I added a SearchMode control and could see that my post query had the proper adhoc_search_level value set, but it still would not work.

Here is my query:

  index=caabistats | sort ~_time | head 1
   | eval caa_cna_customer_count=(caa_customer_count+cna_customer_count)-customer_count
   | table customer_count, caa_customer_count, cna_customer_count, caa_cna_customer_count, total_found_devices, caa_found_devices, cna_found_devices, findit_found_devices
   | rename customer_count as "Total Customer Count",
            caa_customer_count as "CAA Customer Count",
            cna_customer_count as "CNA Customer Count",
            caa_cna_customer_count as "CAA and CNA Customer Count",
            total_found_devices as "Total Found Devices",
            caa_found_devices as "CAA Found Devices",
            cna_found_devices as "CNA Found Devices",
            findit_found_devices as "FindIt Found Devices"
   | transpose
   | rename column as "Basic Information", "row 1" as Current
     | join [search index=caabistats | sort ~_time | head 2 | tail 1
     | eval caa_cna_customer_count=(caa_customer_count+cna_customer_count)-customer_count
     | table customer_count, caa_customer_count, cna_customer_count, caa_cna_customer_count, total_found_devices, caa_found_devices, cna_found_devices, findit_found_devices
     | rename customer_count as "Total Customer Count",
              caa_customer_count as "CAA Customer Count",
              cna_customer_count as "CNA Customer Count",
              caa_cna_customer_count as "CAA and CNA Customer Count",
              total_found_devices as "Total Found Devices",
              caa_found_devices as "CAA Found Devices",
              cna_found_devices as "CNA Found Devices",
              findit_found_devices as "FindIt Found Devices"
     | transpose
     | rename column as "Basic Information", "row 1" as "1 day ago"]
       | join [search index=caabistats | sort ~_time | head 7 | tail 1
       | eval caa_cna_customer_count=(caa_customer_count+cna_customer_count)-customer_count
       | table customer_count, caa_customer_count, cna_customer_count, caa_cna_customer_count, total_found_devices, caa_found_devices, cna_found_devices, findit_found_devices
       | rename customer_count as "Total Customer Count",
                caa_customer_count as "CAA Customer Count",
                cna_customer_count as "CNA Customer Count",
                caa_cna_customer_count as "CAA and CNA Customer Count",
                total_found_devices as "Total Found Devices",
                caa_found_devices as "CAA Found Devices",
                cna_found_devices as "CNA Found Devices",
                findit_found_devices as "FindIt Found Devices"
       | transpose
       | rename column as "Basic Information", "row 1" as "7 days ago"]

I did figure out a workaround, listed below; the only change is that each subsearch now uses explicit earliest/latest time bounds instead of sort/head/tail:

  index=caabistats | sort ~_time | head 1
   | eval caa_cna_customer_count=(caa_customer_count+cna_customer_count)-customer_count
   | table customer_count, caa_customer_count, cna_customer_count, caa_cna_customer_count, total_found_devices, caa_found_devices, cna_found_devices, findit_found_devices
   | rename customer_count as "Total Customer Count",
            caa_customer_count as "CAA Customer Count",
            cna_customer_count as "CNA Customer Count",
            caa_cna_customer_count as "CAA and CNA Customer Count",
            total_found_devices as "Total Found Devices",
            caa_found_devices as "CAA Found Devices",
            cna_found_devices as "CNA Found Devices",
            findit_found_devices as "FindIt Found Devices"
   | transpose
   | rename column as "Basic Information", "row 1" as Current
     | join [search index=caabistats earliest=-1d@d latest=-1d@d+1d
     | eval caa_cna_customer_count=(caa_customer_count+cna_customer_count)-customer_count
     | table customer_count, caa_customer_count, cna_customer_count, caa_cna_customer_count, total_found_devices, caa_found_devices, cna_found_devices, findit_found_devices
     | rename customer_count as "Total Customer Count",
              caa_customer_count as "CAA Customer Count",
              cna_customer_count as "CNA Customer Count",
              caa_cna_customer_count as "CAA and CNA Customer Count",
              total_found_devices as "Total Found Devices",
              caa_found_devices as "CAA Found Devices",
              cna_found_devices as "CNA Found Devices",
              findit_found_devices as "FindIt Found Devices"
     | transpose
     | rename column as "Basic Information", "row 1" as "1 day ago"]
       | join [search index=caabistats earliest=-7d@d latest=-7d@d+1d
       | eval caa_cna_customer_count=(caa_customer_count+cna_customer_count)-customer_count
       | table customer_count, caa_customer_count, cna_customer_count, caa_cna_customer_count, total_found_devices, caa_found_devices, cna_found_devices, findit_found_devices
       | rename customer_count as "Total Customer Count",
                caa_customer_count as "CAA Customer Count",
                cna_customer_count as "CNA Customer Count",
                caa_cna_customer_count as "CAA and CNA Customer Count",
                total_found_devices as "Total Found Devices",
                caa_found_devices as "CAA Found Devices",
                cna_found_devices as "CNA Found Devices",
                findit_found_devices as "FindIt Found Devices"
       | transpose
       | rename column as "Basic Information", "row 1" as "7 days ago"]
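
As a side note on those time bounds: -1d@d+1d snaps to the start of yesterday and then adds one day, which is the start of today, so the first subsearch window could just as well be written like this (same window, slightly shorter):

  index=caabistats earliest=-1d@d latest=@d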

It looks like there might be some problem or bug in Splunk 6.1 when using sort ~_time | head x | tail 1.
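
To reproduce what I'm seeing, it should be enough to run just the subsearch fragment on its own, once in smart mode and once in verbose mode, and compare the row that comes back (rough sketch; any of the counter fields will do for the table):

  index=caabistats | sort ~_time | head 2 | tail 1 | table _time, customer_count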


martin_mueller
SplunkTrust

I assume this is the answer?

"I missed the '~'; it should have been a '-', and that fixed the problem. I guess I have been looking at it too long."

If so, mark the question as solved.
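
For reference, the corrected subsearch head with the descending sort modifier would look like this (minimal sketch; the rest of the subsearch stays unchanged):

  index=caabistats | sort -_time | head 2 | tail 1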


rmcfarla
Explorer

I missed the '~'; it should have been a '-', and that fixed the problem. I guess I have been looking at it too long.

There are only 325 rows being returned before the sort, one row for each day, looking like this:

  src_host=op2-prd1-db02, src_date="2014-05-19 10:04:01.271495+00", customer_count="11660", caa_customer_count="3038", cna_customer_count="9706", findit_customer_count="1", total_found_devices="217483", caa_found_devices="62984", cna_found_devices="123858", findit_found_devices="30547"

That query gives me the numbers for the current day, the previous day, and 7 days ago.

Thanks

martin_mueller
SplunkTrust

Something like this, run over -7d@d to now:

index=caabistats | eval date = strftime(_time, "%Y-%m-%d")
| stats latest(customer_count) as "Total Customer Count", latest(caa_customer_count) as "CAA Customer Count" by date
| untable date "Basic Information" count | xyseries "Basic Information" date count

That's for two fields and all eight days. The untable | xyseries pair transposes the table, in case you want a column per day like in your search rather than a row per day as stats would normally produce. The latest() picks the last value per day.
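
Extending that to all eight fields from your search might look like this (untested sketch; the time range is written inline instead of via the time picker, and the derived CAA-and-CNA count is computed after the stats):

index=caabistats earliest=-7d@d | eval date = strftime(_time, "%Y-%m-%d")
| stats latest(customer_count) as "Total Customer Count",
        latest(caa_customer_count) as "CAA Customer Count",
        latest(cna_customer_count) as "CNA Customer Count",
        latest(total_found_devices) as "Total Found Devices",
        latest(caa_found_devices) as "CAA Found Devices",
        latest(cna_found_devices) as "CNA Found Devices",
        latest(findit_found_devices) as "FindIt Found Devices" by date
| eval "CAA and CNA Customer Count" = ('CAA Customer Count' + 'CNA Customer Count') - 'Total Customer Count'
| untable date "Basic Information" count
| xyseries "Basic Information" date count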


martin_mueller
SplunkTrust

What is sort ~_time supposed to do? I don't think there's any sort modifier beyond + for ascending and - for descending.

How many rows are being returned before the sort?

The search seems very convoluted overall, I'm sure it can be rewritten entirely to avoid searching and sorting the same data thrice over.
