Splunk Search

Fix appendpipe [stats count | where count=0] in search

ebs
Communicator

Hi,

When using the suggested appendpipe [stats count | where count=0], I've noticed that the results which are not zero change. I realise this is because I haven't added more search detail to the command, but given the complexity of the search I need some help integrating this command into it.

Search:
| datamodel Test summariesonly=true search
| search "TEST.date"=2021-05-18 | rename "TEST.date" as date
| rename "TEST.uri_path" as uri_path
| eval category=case(like(uri_path, "/url1"), "highPriority", uri_path="/url2", "unattended",
uri_path="/url3", "lowPriority", uri_path="/url4", "largePayload")
| rename "TEST.response_time" as response_time
| stats avg(response_time) by category
| rename avg(response_time) as averageResponse
| appendpipe [stats count as averageResponse | where averageResponse=0]
| eval averageResponse=round(averageResponse,3)
| transpose 0 header_field=category
| fillnull value=0.000 highPriority, lowPriority, largePayload, unattended
| eval _time="$date$"
| fields highPriority, lowPriority, largePayload, unattended, _time

Note: fillnull only works when one of the other fields has a value. I want to eventually remove it, but it is there for reference at the moment.
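If the aim is to always end up with a row for every category, so that neither the fillnull nor a zero-only appendpipe is needed, one possible approach is to append a placeholder row per category and keep the real average wherever one exists. This is an untested sketch that would replace the stats, rename and appendpipe lines above, and it assumes the four category names are exactly the ones produced by the eval:

| stats avg(response_time) as averageResponse by category
| appendpipe
    [ stats count
    | eval category=split("highPriority,unattended,lowPriority,largePayload", ",")
    | mvexpand category
    | eval averageResponse=0
    | fields category averageResponse ]
| dedup category

Because the placeholder rows are appended after the real results, dedup category keeps the real average whenever one exists and the 0 row only for categories that are missing, so all four columns survive the transpose.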


ebs
Communicator

[Screenshots of the two search results attached]

ITWhisperer
SplunkTrust

Perhaps the reason you are getting different results is because a different number of events are returned in the two searches?


ebs
Communicator

My impression of appendpipe was that it uses the results from the earlier part of the search to produce the appropriate output. Even when I just have | appendpipe with no subsearch following it on the line, it still changes the expected results.


ITWhisperer
SplunkTrust

The appendpipe you have used only adds an event with averageResponse=0 if there are no results from the earlier part of the search; if you have results, it does nothing. If you look at the two screenshots you provided, you can see how many events are included in each search, and they are different, which probably accounts for the difference you are seeing, not the presence of the appendpipe.
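To see this behaviour in isolation, here is a minimal run-anywhere illustration using makeresults; the field and category names are just placeholders:

| makeresults count=3
| eval category="highPriority", response_time=120
| stats avg(response_time) as averageResponse by category
| appendpipe [stats count as averageResponse | where averageResponse=0]

The stats produces one row, so inside the appendpipe the count is 1, the where fails, and nothing is appended. If you filter everything out before the stats (for example by adding | where response_time>999), the stats produces no rows, the count inside the appendpipe is 0, and a single row with averageResponse=0 is appended.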


ebs
Communicator

The two searches are the same aside from the appendpipe: one has it and one does not. The one without the appendpipe produces higher values than the one with it.

If the issue is not the presence of the appendpipe, then how do I fix the search so that, when there are more than zero results, they don't change depending on whether the appendpipe is there?


ITWhisperer
SplunkTrust

They aren't the same; they have different time ranges and consequently a different number of events is used.

[Screenshots: ITWhisperer_0-1623219851036.png, ITWhisperer_1-1623219873598.png]

ebs
Communicator

I use a $date$ token, which specifies which day I'm looking over. I have tested the token multiple times on multiple occasions and got the same results. The run time shown in the screenshot affects nothing.


ITWhisperer
SplunkTrust
SplunkTrust

Those two times are the earliest and latest times of the events returned by the initial search, along with the number of events. These are clearly different. How are you specifying the time range for your searches? Can you show a difference in the results where the time ranges and number of events are identical?
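One quick way to check whether the two runs are actually seeing the same data is to compare the event count and time span feeding the stats. This is just a sketch using the front half of the search above, with the output formatted for readability:

| datamodel Test summariesonly=true search
| search "TEST.date"=2021-05-18
| stats count min(_time) as earliest max(_time) as latest
| eval earliest=strftime(earliest, "%Y-%m-%d %H:%M:%S"), latest=strftime(latest, "%Y-%m-%d %H:%M:%S")

If the count or the earliest/latest values differ between the run with the appendpipe and the run without it, the difference in the averages comes from the data being summarised rather than from the appendpipe itself.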


ebs
Communicator

I have created a $date$ token in the style YYYY-MM-DD.

It searches for events on that specific date only. For example, I searched for the results for 2021-05-18 yesterday and again a few weeks ago, and the results were the same, because the search only takes results that have the corresponding date field, which is based on the _time field. The output does not change depending on when I run the search, because it only includes events whose date field matches the one specified.
