Splunk Search

Compare unknown number of fields with previous values

FrankSPL
Path Finder

Hi all,
This is a challenge....
I do have some basic SPL knowledge but I can't get my head around this one.

I have a source with machine data: thousands of sensors, each reporting its own set of "parameters". Each sensor reports a "tagname", a sensor type, and ~100 different parameters. Some parameters are shared between sensor types, some are unique.
I want to compare these parameters between at least 2 sets (= 2 days), or even over 7-14 days, to detect differences over time.
As long as all values of the same parameter for the same sensor "tagname" are equal, everything is okay; I only want a report with the noted differences, like:

Tagname parameter oldValue newValue 
Tagxyp  temp      12       4

or later on
Tagname parameter ValueDay1 ValueDay2 ValueDay3 ValueDay4
Tagxyp  temp      12        4         12        12

After a lot of trying and testing I now have this:

    | stats list(*) as * by Tagname 
    | foreach * 
        [eval <<FIELD>>_val=if(mvcount(mvdedup(<<FIELD>>))>1,"Change","All good")]

This creates a table with 100+ columns (one per parameter), plus another 100+ columns with a "Change"/"All good" string that indicates whether the parameter changed (or rather, whether all listed values are the same).
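For anyone trying to follow the `stats list(*)` / `foreach` / `mvdedup` idea outside of Splunk, here is a minimal Python sketch of the same change-detection logic (the sample events are hypothetical, not the real sensor feed):

```python
# Simulate `| stats list(*) as * by Tagname` followed by the foreach
# check: a parameter is "Change" if its collected values for a tag are
# not all identical (mvcount(mvdedup(values)) > 1), else "All good".
from collections import defaultdict

# Hypothetical events: each dict is one sensor report.
events = [
    {"Tagname": "Tagxyp", "temp": "12", "pressure": "7"},
    {"Tagname": "Tagxyp", "temp": "4",  "pressure": "7"},
]

# stats list(*) by Tagname: collect all values per tag and parameter.
collected = defaultdict(lambda: defaultdict(list))
for ev in events:
    tag = ev["Tagname"]
    for param, val in ev.items():
        if param != "Tagname":
            collected[tag][param].append(val)

# foreach *: set(...) plays the role of mvdedup, len(...) of mvcount.
status = {
    (tag, param): ("Change" if len(set(vals)) > 1 else "All good")
    for tag, params in collected.items()
    for param, vals in params.items()
}

print(status[("Tagxyp", "temp")])      # Change
print(status[("Tagxyp", "pressure")])  # All good
```

This reproduces the wide status table, and also shows the problem described below: the result says *which* parameters changed, but not the old and new values side by side.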

I don't know how to proceed. I want only the "changed" values, but in a listed form like above.

So for each tag, for each parameter difference found, I need the column name (= parameter name) and all values in the value list with their corresponding data source.

Does anybody have an idea how to tackle this? Or some hints on how to proceed?
Help is much appreciated!


somesoni2
Revered Legend

Assuming you can clean up the list of fields and you're comparing today versus yesterday, try this.

For 2 days

your base search and cleanup logic to get only the fields _time, tagname, and parameters, kind of | table _time tagname para1, para2... 
| eval day=if(_time>=relative_time(now(),"@d"),"Today","Yesterday")
| eval timetagname=day."#".tagname | fields - _time day tagname
| untable timetagname parameter val 
| eval day=mvindex(split(timetagname,"#"),0) 
| eval tagname=mvindex(split(timetagname,"#"),1) 
| eval tagnameparam=tagname."#".parameter 
| chart values(val) over tagnameparam by day
| eval tagname=mvindex(split(tagnameparam,"#"),0) 
| eval parameter=mvindex(split(tagnameparam,"#"),1) 
| table tagname parameter Yesterday Today
| where Yesterday!=Today
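For readers less familiar with `untable`/`chart`, here is a Python sketch of what this pipeline computes: unpivot each day's snapshot into (day, tag, parameter, value) rows, pivot back to one row per tag/parameter with a Yesterday and a Today column, and keep only rows where the two differ. The sample snapshots are hypothetical:

```python
# Unpivot -> pivot-by-day -> filter, mirroring the SPL steps
# untable / chart values(val) over tagnameparam by day / where.

# Hypothetical per-day snapshots: {day: {tagname: {parameter: value}}}.
snapshots = {
    "Yesterday": {"Tagxyp": {"temp": "12", "pressure": "7"}},
    "Today":     {"Tagxyp": {"temp": "4",  "pressure": "7"}},
}

# untable: one (day, tagname, parameter, value) row per field.
rows = [
    (day, tag, param, val)
    for day, tags in snapshots.items()
    for tag, params in tags.items()
    for param, val in params.items()
]

# chart values(val) over tagname#parameter by day.
pivot = {}
for day, tag, param, val in rows:
    pivot.setdefault((tag, param), {})[day] = val

# where Yesterday!=Today: report only changed parameters.
changes = [
    (tag, param, days.get("Yesterday"), days.get("Today"))
    for (tag, param), days in sorted(pivot.items())
    if days.get("Yesterday") != days.get("Today")
]

print(changes)  # [('Tagxyp', 'temp', '12', '4')]
```

The "#"-joined key fields in the SPL play the same role as the `(tag, param)` tuple keys here: they carry two grouping dimensions through commands that only accept one.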




FrankSPL
Path Finder

Wow, thanks so far!

I'm not able to test it right away, but the first part is already working.
I've never used untable before, but this could be the game changer.
I need to fiddle around because my test data is not actually from "today" and "yesterday", but I get the point.


somesoni2
Revered Legend

Since we're using dates as column names (Today/Yesterday), we need to find a way to convert your two time ranges (the ones you want to compare) to static values. For a larger period (your second requirement of multiple days), the query would build on this but would be more complex.
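The multi-day extension the comment alludes to can be sketched in Python: bucket each timestamp into a static label (Day1..DayN, analogous to the `relative_time`/`eval day=` trick), pivot per tag/parameter over those labels, and flag rows whose values are not all equal. The dates and readings below are hypothetical:

```python
# Map each event's date to a static day label, pivot values per
# tag/parameter over those labels, and flag rows that changed.
from datetime import date

# Hypothetical readings: (date, tagname, parameter, value).
readings = [
    (date(2024, 10, 1), "Tagxyp", "temp", "12"),
    (date(2024, 10, 2), "Tagxyp", "temp", "4"),
    (date(2024, 10, 3), "Tagxyp", "temp", "12"),
    (date(2024, 10, 4), "Tagxyp", "temp", "12"),
]

start = date(2024, 10, 1)  # beginning of the comparison window

def day_label(d: date) -> str:
    """Static column name per day offset, like Today/Yesterday but for N days."""
    return f"Day{(d - start).days + 1}"

pivot = {}
for d, tag, param, val in readings:
    pivot.setdefault((tag, param), {})[day_label(d)] = val

for (tag, param), cols in pivot.items():
    if len(set(cols.values())) > 1:  # values differ across days
        print(tag, param, [cols.get(f"Day{i}") for i in range(1, 5)])
# Tagxyp temp ['12', '4', '12', '12']
```

This yields exactly the second table from the original question (one value column per day, rows only for changed parameters).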


FrankSPL
Path Finder

Yes, it works with some adjustments. I used source instead of day to distinguish the old and new data set.

sourcetype="dat:parameters" Blocktype!="EMPTY" 
| table * 
| fields - app change_type enabled eventtype host ids_type index linecount meta object_category product punct splunk_server sourcetype splunk_server_group status tag* timestamp user_type vendor range _raw _time 
| eval keydata=Tagname."#".source 
| fields - source 
| untable keydata parameter value 
| eval Tagname=mvindex(split(keydata,"#"),0) 
| eval source=mvindex(split(keydata,"#"),1) 
| eval TagPara=Tagname."#".parameter 
| chart values(value) over TagPara by source 
| rename Data0131.txt as Source_OLD Data0132.txt as Source_NEW 
| eval Tagname=mvindex(split(TagPara,"#"),0) 
| eval Parameter=mvindex(split(TagPara,"#"),1) 
| table Tagname Parameter Source_OLD Source_NEW 
| where Source_OLD!=Source_NEW

The result is exactly what I need.

somesoni2, thanks a lot for your help!


FrankSPL
Path Finder

Some additional information:

The raw events look like this:
tagname=abcdefg, sensortype=xyz,para1=123,para2=234,para3=456,......para100=999

In the total dataset there are 140 parameters; each sensor has on average 80 parameters.
The parameter values are mostly numerical, but some contain text characters.


somesoni2
Revered Legend

What are the actual parameter names? Do they follow any pattern (basically, we should be able to issue a table command to select just tagname and the parameters, excluding all other irrelevant fields)?


FrankSPL
Path Finder

No, parameter names do not follow any pattern.

When I do a

sourcetype="sensordata" 
|  fieldsummary 
|  table field 

I receive all possible fields, including some I don't want (like punct, splunk_server, etc.).
The same applies for the

| stats list(*) as * by Tagname 

I could clean it up before this statement and remove unneeded fields.
