Splunk Search

Use results from a search to make multiple different searches and place the results in a table

Bleepie
Communicator

Dear Splunk Community,

I need some help fetching data from a source, then using the results as a search parameter for several other searches, and putting all of the results into one statistics table.

I have (names have been changed):

  • One index : [myIndex]
  • One host : [myHost]
  • Source one : [/xxx/xxx/xxxxx/xxxx/log/xxxxxxx/*/*.log]
  • Source two : [/yyy/yyy/yyyyy/yyyy/log/yyyyyyyyy/firstlogfile.log]
  • Source three : [/zzz/zzz/zzzzz/zzzz/log/zzzzzzzz/seconflogfile.log]

I have the following dashboard:

old_splunk.png

In the upper left you see a statistics table. A RUNID is basically a timestamp. I have a lot of different files that contain the RUNID. I collect all the files and then display each RUNID only once (so duplicates are not shown) using the following query:

 

index="myIndex" host="myHost" source="/xxx/xxx/xxxxx/xxxx/log/xxxxxx/*/*.log" 
            | eval source=replace(source,"^/xxx/xxx/xxxxx/xxxx/log/xxxxxx/","") 
            | eval source=replace(source,"/.*","") 
            | stats values(source) by source
            | fields source 
            | rename source AS "RUNID" 
            | sort RUNID desc
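
(As an aside, the same list of unique RUNIDs can also be produced with rex and dedup instead of the double replace(); a sketch that should behave the same as the query above:)

index="myIndex" host="myHost" source="/xxx/xxx/xxxxx/xxxx/log/xxxxxx/*/*.log"
            | rex field=source "^/xxx/xxx/xxxxx/xxxx/log/xxxxxx/(?<RUNID>[^/]+)/"
            | dedup RUNID
            | table RUNID
            | sort RUNID desc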

 

When I click any RUNID, another search is performed in the upper-right (DATA) statistics table, using the following query:

 

index="myIndex" host="myHost" source="/xxx/xxx/xxxxx/xxxx/log/xxxxxx/$tokenRUNID$/*.log" CTJT* $meldingen$  

            | fields _time, _raw
            | rename _time AS "Datum"
            | convert timeformat="%m-%d-%Y %H:%M:%S" ctime(Datum)
            | eval _raw = replace(_raw,"^[^#]*#", "")
            | rename _raw AS "Sensor Activiteit"
            | sort Datum desc

 

In the bottom (center) I have three single-value fields that show ERRORS, WARNINGS and INFO. Each one uses the following query (only the ERROR/WARN/INFO term differs):

 

index="myIndex" host="myHost" source="/xxx/xxx/xxxxx/xxxx/log/xxxxxx/$tokenRUNID$/*.log" CTJT* AND CASE("ERROR") 

| stats count
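
(Side note: the three counts can also come out of a single search with the count(eval(...)) stats pattern instead of three separate panels; a rough sketch, assuming the events contain the literal terms ERROR/WARN/INFO and with errors/warnings/info as purely illustrative field names:)

index="myIndex" host="myHost" source="/xxx/xxx/xxxxx/xxxx/log/xxxxxx/$tokenRUNID$/*.log" CTJT*
| rex ".*(?<log_level>(INFO|WARN|ERROR))"
| stats count(eval(log_level="ERROR")) AS errors count(eval(log_level="WARN")) AS warnings count(eval(log_level="INFO")) AS info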

 


And lastly, I have a single-value field showing the profile:

 

index="myIndex" 
host="myHost" 
source="/yyy/yyy/yyyyy/yyyy/log/yyyyyyyyy/firstlogfile.log" OR
source="/zzz/zzz/zzzzz/zzzz/log/zzzzzzzz/seconflogfile.log"

$tokenRUNID$ 
"started with profile"

| rex field=_raw "profile\s(?<whatever>[^\s\r]+)"
| stats count by whatever | fields - count			

 

 

So right now I have multiple data searches in different representations (single-value fields, data tables, etc.). I would like to create the following:

A table of RUNIDs that also shows the PROFILE, the DATE, and the WARNINGS, ERRORS and INFO counts right next to each RUNID. It should look like this, but populated:

new_splunk.png

So I basically want to place multiple searches and results into one statistics table.

I have tried playing with appendcols as described in this topic:

https://community.splunk.com/t5/Splunk-Search/multiple-search-output-in-a-single-table-list-somethin...

But using index that way gives me the error:

Unknown search command 'index'.
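
That error usually just means that index= ended up where Splunk expects a command name, i.e. directly after a pipe; with appendcols the second search has to go inside square brackets and should start explicitly with the search command. A minimal, untested sketch of the shape I was aiming for, reusing the error-count and profile searches from above purely as an illustration:

index="myIndex" host="myHost" source="/xxx/xxx/xxxxx/xxxx/log/xxxxxx/$tokenRUNID$/*.log" CTJT* AND CASE("ERROR")
| stats count AS errors
| appendcols
    [ search index="myIndex" host="myHost"
      (source="/yyy/yyy/yyyyy/yyyy/log/yyyyyyyyy/firstlogfile.log" OR source="/zzz/zzz/zzzzz/zzzz/log/zzzzzzzz/seconflogfile.log")
      $tokenRUNID$ "started with profile"
    | rex field=_raw "profile\s(?<profile>[^\s\r]+)"
    | stats values(profile) AS profile ]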

How can I manage to get the above? Thanks in advance.

 

 


Bleepie
Communicator

One of my colleagues helped me get started; right now we have everything in the table except the profile:

splunk.png

index="myIndex" 
host="myHost" 
source="/xxx/xxx/xxxxx/xxxx/log/xxxxxx/*/*.log" CTJT* 
    | rex field=source "^/xxx/xxx/xxxxx/xxxx/log/xxxxxx/(?<RUNID>.+)/"     
    | rex ".*(?<log_level>(INFO|WARN|ERROR))" 
    | chart latest_time(_time) AS _time count(eval(log_level="INFO")) AS informational count(eval(log_level="WARN")) AS warnings count(eval(log_level="ERROR")) AS errors by RUNID
    | rename informational AS "Goedmeldingen"
    | rename warnings AS "Waarschuwingen"
    | rename errors AS "Foutmeldingen"
    | eval _time=strftime(_time,"%d/%m/%Y %H:%M:%S")
    | rename _time AS "Datum"
    | sort RUNID desc

The only thing left to add is the profile field in the table, as mentioned in the OP. I need to combine the following code with the code above:

index="myIndex" 
host="myHost" 
source="/yyy/yyy/yyyyy/yyyy/log/yyyyyyyyy/firstlogfile.log" OR
source="/zzz/zzz/zzzzz/zzzz/log/zzzzzzzz/seconflogfile.log"

$tokenRUNID$ 
"started with profile"

| rex field=_raw "profile\s(?<whatever>[^\s\r]+)"
| stats count by whatever | fields - count			

My colleague is unavailable at the moment, and I am not quite sure how to combine the above. Maybe someone in the community knows? Thanks in advance.


ITWhisperer
SplunkTrust
SplunkTrust

The implication of your second search seems to be that the RUNID and the profile are part of the same event. So, append the second search to the first, except that instead of filtering by tokenRUNID you extract the RUNID and whatever, and then join the two searches with a stats command:

first search
| append [
  | search index="myIndex" 
host="myHost" 
source="/yyy/yyy/yyyyy/yyyy/log/yyyyyyyyy/firstlogfile.log" OR
source="/zzz/zzz/zzzzz/zzzz/log/zzzzzzzz/seconflogfile.log"
"started with profile"

| rex field=_raw "profile\s(?<whatever>[^\s\r]+)"
| rex field=_raw "RUNID=(?<RUNID>whatever search string fits)"
| stats count by whatever RUNID | fields - count]
| stats values(*) as * by RUNID

Bleepie
Communicator

Hi ITWhisperer,

I tried your suggestion but could not get it to work. Thanks though. I eventually got it working with the search below:

index="myIndex" 
host="myHost" 
source="/opt/IBM/taddm/dist/log/sensors/*/*.log" CTJT* 
    | rex field=source "^/opt/IBM/taddm/dist/log/sensors/(?<RUNID>.+)/"     
    | rex ".*(?<log_level>(INFO|WARN|ERROR))" 
    | chart latest_time(_time) AS _time count(eval(log_level="INFO")) AS informational count(eval(log_level="WARN")) AS warnings count(eval(log_level="ERROR")) AS errors by RUNID
    | rename informational AS "Goedmeldingen"
    | rename warnings AS "Waarschuwingen"
    | rename errors AS "Foutmeldingen"
    | eval _time=strftime(_time,"%d/%m/%Y %H:%M:%S")
    | rename _time AS "Datum"
    | sort RUNID desc
    | join type=left RUNID
    [ search index="myIndex" 
        host="myHost" 
		source="/yyy/yyy/yyyyy/yyyy/log/yyyyyyyyy/firstlogfile.log" OR
		source="/zzz/zzz/zzzzz/zzzz/log/zzzzzzzz/seconflogfile.log"
        "started with profile" 
    | rex field=_raw "XXXXXXXXX\srun,\s(?<RUNID>[^\s]+)\sstarted\swith\sprofile\s(?<profile>[^\s\r]+)" 
    | stats count by profile RUNID 
    | fields profile RUNID]
    | rename profile AS Profiel
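
A note on this approach: join type=left keeps every RUNID row even when no matching "started with profile" event exists for it. Merging with stats instead of join, as in ITWhisperer's reply, is generally the more Splunk-idiomatic way to combine the two searches; a rough, untested sketch of that variant, using the same extractions as above:

index="myIndex" host="myHost" source="/opt/IBM/taddm/dist/log/sensors/*/*.log" CTJT*
    | rex field=source "^/opt/IBM/taddm/dist/log/sensors/(?<RUNID>.+)/"
    | rex ".*(?<log_level>(INFO|WARN|ERROR))"
    | append
        [ search index="myIndex" host="myHost"
          (source="/yyy/yyy/yyyyy/yyyy/log/yyyyyyyyy/firstlogfile.log" OR source="/zzz/zzz/zzzzz/zzzz/log/zzzzzzzz/seconflogfile.log")
          "started with profile"
        | rex field=_raw "XXXXXXXXX\srun,\s(?<RUNID>[^\s]+)\sstarted\swith\sprofile\s(?<profile>[^\s\r]+)" ]
    | stats latest(_time) AS _time count(eval(log_level="INFO")) AS Goedmeldingen count(eval(log_level="WARN")) AS Waarschuwingen count(eval(log_level="ERROR")) AS Foutmeldingen values(profile) AS Profiel by RUNID
    | eval Datum=strftime(_time,"%d/%m/%Y %H:%M:%S")
    | table RUNID Datum Profiel Goedmeldingen Waarschuwingen Foutmeldingen
    | sort RUNID desc

In this variant the Datum is taken over all events for a RUNID, including the profile event, so it can differ slightly from the join-based version.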

 
