Splunk Search

Comparing results from two searches

dcparker
Path Finder

Hello,

I am trying to compare the standard deviation from the last 24 hours to the standard deviation of the last 3 hours. I have my search, which is basically this:

earliest=-24h@h latest=@h | timechart span=1h count | stats stdev(count) as test | append [ search earliest=-3h@h latest=@h | timechart span=1h count | stats stdev(count) as testsub ]

The search runs fine, but it returns two rows, and each row has only one of the two fields populated. It looks like this:
[screenshot: two result rows, each with only one of test/testsub populated]

However, I cannot compare those two values, presumably because of the empty fields in each row. Is there a way to get them onto the same row, or a better way to compare them? I have tried eval, where, and several other approaches. Any help is appreciated!


kmattern
Builder

Use appendcols instead of append for your subsearch. You'll get everything in one row.

earliest=-24h@h latest=@h | timechart span=1h count | stats stdev(count) as test | appendcols [ search earliest=-3h@h latest=@h | timechart span=1h count | stats stdev(count) as testsub ]

[screenshot: a single result row with both test and testsub populated]
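Once both values are on the same row, a follow-on eval or where can compare them directly. A minimal sketch (the ratio field and the 2x threshold here are illustrative, not part of the original question):

```
earliest=-24h@h latest=@h
| timechart span=1h count
| stats stdev(count) as test
| appendcols
    [ search earliest=-3h@h latest=@h
      | timechart span=1h count
      | stats stdev(count) as testsub ]
| eval ratio=round(testsub/test, 2)
| where testsub > 2*test
```

The where clause keeps the row only when the 3-hour standard deviation exceeds twice the 24-hour baseline, which is a common pattern for spike alerting; adjust the multiplier to suit your data.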


