Splunk Search

How to get fields across different events - via eval and search/where?

GaryZ
Path Finder

Is there an easy way of capturing the fields across different events?

 

example:

event 1)

       abc: {

       build: 123

       duration: 1.1

       sw: gen1

       hardware: h1

}

 

event 2)

      def: {

       build: 124

       duration: 1.4

       sw: gen2

       hardware: h2

}

| rename abc.duration as a_duration, def.duration as d_duration
| stats avg(d_duration) as avg_d_duration, avg(a_duration) as avg_a_duration by def.build, def.hardware
| eval avg_d_duration = round(avg_d_duration,3)
| eval avg_a_duration='abc.duration'   ``` <= this is a limit line I want to implement based on the next search ```
| search abc.build="123" abc.hardware="h1"

 

 

NOTE:

There are multiple events similar to event 1 and event 2; the difference between them is the sw version.

 

Problem:

I am not able to specify a search/where clause to get abc.duration.

 

Question:

1) How can I add a search at the end, so I can query the data from a different event?


GaryZ
Path Finder

@richgalloway If I want a single value from one of the events, but want to extend it across all the results captured by the later stats, how would I do that?

At the moment, when I use append, it only stacks the first search's results below the later results instead of combining them.  This is still the case even if I use the same field name in both stats commands.

 

ie. 

example1

| stats avg(a_duration) as avg_a_duration by def.build, def.hardware

| append [ ...

| stats avg(d_duration) as avg_d_duration, avg(avg_a_duration) by def.build, def.hardware
| eval avg_d_duration = round(avg_d_duration,3)

]

 

example2

| stats avg(a_duration) as avg_a_duration by def.build, def.hardware

| eval avg_a_duration = round(avg_a_duration,2)

| append [ ...

| stats avg(d_duration) as avg_d_duration, avg(g_duration)  by def.build, def.hardware 

]

g_duration = avg_a_duration

 

example3:

| stats avg(a_duration) as avg_a_duration by def.build, def.hardware

| eval avg_a_duration = round(avg_a_duration,2)

| append [ ...

| stats avg(d_duration) as avg_d_duration, avg(avg_a_duration)  by def.build, def.hardware

| eval avg_d_duration = round(avg_d_duration,3)

]

avg_a_duration = 10

 

In Example 1, I see the results from the first search appended to the second search's results.

In Example 2, I see the same results as in Example 1, just under a different column, 'g_duration'.

In Example 3, I see that avg_a_duration is constant across both appended search results.     <= This is what I'm looking for, but I'm trying to achieve it with the results from the first search rather than a hard-coded value.

 

Sorry, I'm fairly new to Splunk, so there's a lot of learning on my end.  Please let me know if I need to clarify any parts of the question.


richgalloway
SplunkTrust
SplunkTrust

The append command adds its results to the bottom of the current result set.  If the main search returns results "foo" and "bar" and the append returns "baz" and "bat", then the final result will be

foo
bar
baz
bat

No relationship is made among any of the results.  You need to do that yourself.  Usually, that's done with the stats command using any field(s) common to all results.

<<main search>>
| append [ <<some other search>> ]
| stats values(*) as * by blah
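
For instance, with made-up index, sourcetype, and field names (host being the shared field), the pattern could look like this:

index=web sourcetype=access
| stats latest(status) as status by host
| append
    [ search index=web sourcetype=deploys
      | stats latest(version) as version by host ]
| stats values(*) as * by host

Each host then lands on a single row carrying both status and version.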
---
If this reply helps you, Karma would be appreciated.

yuanliu
SplunkTrust
SplunkTrust

I am confused.  You already performed

 

 | stats avg(d_duration) as avg_d_duration, avg(a_duration) as avg_a_duration by def.build, def.hardware

 

After this, abc.duration no longer exists in the data stream; only avg_a_duration does.  But then, you have this eval

 

| eval avg_a_duration=abc.duration  ```<= this is a limit line I want to implement based on the next search ```

 

This wipes avg_a_duration out with a null value because abc.duration no longer exists.

Of course, even without that eval, the first stats will not give you a value for avg_a_duration either, because abc.duration only exists in events that have abc.build and abc.hardware, not def.build and def.hardware.

The best way to start is to describe your use case.  What result do you expect from the illustrated data?  What is the logic to "cross" events?  Is there any key to correlate the two builds? (I don't see one in your illustrated data.)  What is the purpose of that last search for abc attributes after you run stats over def builds?

If there is no correlation, the best you can do is your search 1 with append.

 


GaryZ
Path Finder

Hello @yuanliu ,

 

Thank you for getting back.  I am looking to plot a chart where the x and y values are the build and duration values (respectively), based on the latest sw version.  I want to add a limit line (baseline) captured from the previous sw version and superimpose it on the current chart.

 

Since the build numbers will differ between the current and previous sw versions, I want to capture a single data point from the previous sw version and use it as the baseline.

 

How would I go about getting this in the Splunk search?

 

TIA


yuanliu
SplunkTrust
SplunkTrust

So, to clarify:

  1. You want to chart by sw, not hardware.  If you stats by hardware, you will never get the chart by sw.
  2. The designation of abc and def is just weird because they are just distractions.  Does the top node really change from event to event in the real data?

Assuming that the top node does change from event to event, you'll have to find some way to get rid of it, because it does not factor into your desired result. (An identical top node would make the search infinitely simpler.)  The function to call is coalesce, but you will need some way to enumerate the top nodes.  In the following, I use the foreach command to iterate.  It is less obvious what this does, so I also put a manual enumeration equivalent in comments.

| foreach *.*
    [ | eval <<MATCHSEG2>> = mvappend(<<MATCHSEG2>>, '<<MATCHSEG1>>.<<MATCHSEG2>>')]
``` the above is equivalent to the following
| eval build = coalesce('abc.build', 'def.build', 'ghi.build')
| eval duration = coalesce('abc.duration', 'def.duration', 'ghi.duration')
| eval sw = coalesce('abc.sw', 'def.sw', 'ghi.sw')```

The next question is: do you always know the latest sw version and the one before it?  It would be simpler if you do.  Suppose the latest version is gen2, as in the illustrated data, and the second to last is gen1.

| foreach *.*
    [ | eval <<MATCHSEG2>> = mvappend(<<MATCHSEG2>>, '<<MATCHSEG1>>.<<MATCHSEG2>>')]
| where sw == "gen2"
| chart values(duration) over build

This will give you the chart for gen2.  Then, to overlay a flat line for gen1, you calculate its average and spread it over the gen2 builds. (I believe that an average is better than a single data point.)  Like this:

| foreach *.*
    [ | eval <<MATCHSEG2>> = mvappend(<<MATCHSEG2>>, '<<MATCHSEG1>>.<<MATCHSEG2>>')]
| eventstats avg(duration) as avg_duration by sw
| eventstats values(eval(if(sw == "gen1", avg_duration, null()))) as previous_avg
| where sw == "gen2"
| chart values(duration) as duration values(previous_avg) as baseline by build

If you don't know the latest version, you can calculate it based on data.  It is just more calculations.
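
For instance, assuming the sw values sort in version order (so gen2 sorts after gen1) and only two generations are present in the data, one sketch that derives the latest version with eventstats could be:

| foreach *.*
    [ | eval <<MATCHSEG2>> = mvappend(<<MATCHSEG2>>, '<<MATCHSEG1>>.<<MATCHSEG2>>')]
| eventstats max(sw) as latest_sw
| eventstats avg(duration) as avg_duration by sw
| eventstats values(eval(if(sw != latest_sw, avg_duration, null()))) as previous_avg
| where sw == latest_sw
| chart values(duration) as duration values(previous_avg) as baseline by build

With more than two generations you would first isolate the second-latest version before computing previous_avg.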

richgalloway
SplunkTrust
SplunkTrust

The abc.duration and abc.hardware fields were discarded by the stats command, so they are not available to any commands that follow.  This is the nature of transforming commands like stats.

If you need to add data from another event, use the append command to add a query for the desired event.  You will then need to merge the results based on shared field(s).
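
As a rough sketch for the data in this thread (the base searches are placeholders; field names come from the illustrated events):

<<search for def events>>
| stats avg('def.duration') as avg_duration by def.build
| append
    [ <<search for abc events>>
      | search abc.build="123" abc.hardware="h1"
      | stats avg('abc.duration') as baseline ]
| eventstats values(baseline) as baseline
| where isnotnull(avg_duration)

Here eventstats spreads the single baseline value from the appended row onto every def row, and the final where drops the appended row itself.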
