Splunk Search

Help with Splunk JSON dynamic field extraction

Sanjana
Explorer

Hey all,

I am trying to extract dynamic fields from JSON.

{"period":{"start":"2023-04-17","end":"2023-05-14"},"check-ins":{"203":{"avail":5,"price":5},"204":{"avail":5,"price":5},"205":{"avail":5,"price":5},"206":{"avail":5,"price":5},"207":{"avail":5,"price":5},"208":{"avail":5,"price":5},"209":{"avail":5,"price":5},"210":{"avail":5,"price":5},"211":{"avail":5,"price":5},"212":{"avail":5,"price":5},"213":{"avail":5,"price":5},"214":{"avail":5,"price":5},"215":{"avail":5,"price":5},"216":{"avail":5,"price":5},"217":{"avail":5,"price":5},"218":{"avail":5,"price":5},"219":{"avail":19,"price":5},"220":{"avail":19,"price":5},"221":{"avail":19,"price":5},"222":{"avail":19,"price":5},"223":{"avail":19,"price":5},"224":{"avail":19,"price":5},"225":{"avail":19,"price":5},"226":{"avail":19,"price":5},"227":{"avail":19,"price":5},"228":{"avail":19,"price":5},"229":{"avail":20,"price":5},"230":{"avail":20,"price":5}}}

 

I need to extract the keys 203, 204, 205, ... up to 230 from the data above. Each extracted value should then be added to the period.start field, and at the end I need the resulting date value after the addition.

 

Thanks in advance


yuanliu
SplunkTrust

This is very much like https://community.splunk.com/t5/Splunk-Search/How-to-convert-check-ins-index-into-date-then-calculat....  You can use the same strategy.

 

| eval period_start = strptime('period.start', "%F")
| foreach check-ins.*.avail
    [eval suffix = mvappend(suffix, "<<MATCHSTR>>")]
| mvexpand suffix
| eval datevalue = period_start + 86400 * suffix
| fields datevalue
| fieldformat datevalue = strftime(datevalue, "%F")

 

Using your sample data, you get

datevalue
2023-11-05
2023-11-06
2023-11-07
2023-11-08
2023-11-09
2023-11-10
2023-11-11
2023-11-12
2023-11-13
2023-11-14
2023-11-15
2023-11-16
2023-11-17
2023-11-18
2023-11-19
2023-11-20
2023-11-21
2023-11-22
2023-11-23
2023-11-24
2023-11-25
2023-11-26
2023-11-27
2023-11-28
2023-11-29
2023-11-30
2023-12-01
2023-12-02

Here, I assume you want to display the dates in calendar format, hence the fieldformat. If you want the date value to stay numeric, just omit that command. (The advantage of fieldformat is that the field itself remains numeric.)
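To make the fieldformat point concrete, here is a minimal standalone sketch (using makeresults and an arbitrary date, not your data): fieldformat only changes how the value is rendered, so the underlying epoch number stays available for arithmetic.

| makeresults
| eval datevalue = strptime("2023-11-05", "%F")
| fieldformat datevalue = strftime(datevalue, "%F")
| eval next_day = datevalue + 86400
| fieldformat next_day = strftime(next_day, "%F")

Both fields display as calendar dates but remain epoch numbers underneath; replacing the fieldformat with eval datevalue = strftime(datevalue, "%F") would turn the field into a string and break further date arithmetic.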

Sanjana
Explorer

Hello @yuanliu,

I tried the above response, but no luck. Here is the exact scenario I am trying to make work.

This is the actual JSON structure of my Splunk data:

"content":
{"kIndexKey_EventMessage":
{
"period":{"start":"2022-10-03","end":"2022-10-19"},
"check-ins":{
"12":{"avail":13,"price":0},
"13":{"avail":13,"price":0},
"14":{"avail":13,"price":0},
"15":{"avail":13,"price":0},
"16":{"avail":13,"price":0},
"17":{"avail":13,"price":0},
"18":{"avail":13,"price":0},
"19":{"avail":13,"price":0},
"20":{"avail":13,"price":0},
"21":{"avail":13,"price":0},
"22":{"avail":13,"price":0},
"23":{"avail":13,"price":0},
"24":{"avail":13,"price":0},
"25":{"avail":13,"price":0},
"26":{"avail":13,"price":0},
"27":{"avail":13,"price":0},
"28":{"avail":13,"price":0}
}
}
}

My search is something like below:

index="index1"
| fields _raw
| spath content.kIndexKey_EventMessage{1}.check-ins
| spath input=content.kIndexKey_EventMessage{1}.check-ins
| spath output=period_start path=content.kIndexKey_EventMessage{1}.period.start

| eval period_start = strptime('period.start', "%F")
| foreach check-ins.*.avail
[eval suffix = mvappend(suffix, "<<MATCHSTR>>")]
| mvexpand suffix
| eval datevalue = period_start + 86400 * suffix
| fields datevalue
| fieldformat datevalue = strftime(datevalue, "%F")

 

After executing this, I get the error below and no data, unlike the output you showed:

Field 'suffix' does not exist in the data.

Could you please help me figure out where I am going wrong in this query?


kamlesh_vaghela
SplunkTrust

@Sanjana 

Can you please try this?

YOUR_SEARCH
| spath content.kIndexKey_EventMessage{1}.check-ins 
| spath input=content.kIndexKey_EventMessage{1}.check-ins 
| spath output=period_start path=content.kIndexKey_EventMessage{1}.period.start 
| eval period_start = strptime('period_start', "%F")
| foreach *.avail 
    [ eval suffix = mvappend(suffix, "<<MATCHSTR>>")] 
| mvexpand suffix
| eval datevalue = period_start + 86400 * suffix
| fields datevalue
| fieldformat datevalue = strftime(datevalue, "%F")
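The two changes from your earlier search are worth calling out (my reading of why it failed): strptime now reads the extracted period_start field, whereas 'period.start' was never created because your spath wrote its output to period_start; and the foreach wildcard is *.avail, because after spath input=...check-ins the extracted fields are named 12.avail, 13.avail, and so on without a check-ins. prefix, so check-ins.*.avail matched nothing and suffix was never built, which is exactly the error mvexpand reported.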

 

I hope this will help you.

Thanks
KV
If any of my replies help you to solve the problem or gain knowledge, an upvote would be appreciated.


Sanjana
Explorer

Hey @kamlesh_vaghela 

Thanks for the response. It worked fine.

Sorry to bug you again. I have two queries that each run fine separately.

Query 1:

MY_SEARCH

| spath content.kIndexKey_EventMessage{1}.check-ins
| spath input=content.kIndexKey_EventMessage{1}.check-ins
| spath output=period_start path=content.kIndexKey_EventMessage{1}.period.start
| eval period_start = strptime('period_start', "%F")
| foreach *.avail
[ eval suffix = mvappend(suffix, "<<MATCHSTR>>")]
| mvexpand suffix
| eval datevalue = period_start + 86400 * suffix
| fields datevalue
| fieldformat datevalue = strftime(datevalue, "%F")
| table datevalue

Output 

datevalue

2022-09-27
2022-09-28
2022-09-29
2022-09-30

 

Query 2:

My_Search

| fields _raw
| spath output=chain path=content.kIndexKey_EventMessage{1}.chain
| spath output=start path=content.kIndexKey_EventMessage{1}.period.start
| where like(chain, "IHG%")
| spath content.kIndexKey_EventMessage{1}.check-ins
| spath input=content.kIndexKey_EventMessage{1}.check-ins
| foreach *.check-ins
[| eval "<<MATCHSEG1>>"='<<FIELD>>'+'start']
| foreach *.avail
[| eval "<<MATCHSEG1>>"='<<FIELD>>']
| foreach *.price
[| eval "<<MATCHSEG1>>"='<<MATCHSEG1>>'+'<<FIELD>>']
| dedup _raw,_time
| fields - *.avail *.price check_ins check-ins _time
| untable _raw check_ins nb_desync
| search nb_desync > 0
| stats sum(nb_desync) as nb_desync by check_ins | sort -nb_desync
|join type=left
[search index="int-acrs-cde_appevent-shared"
| spath output=chain path=content.kIndexKey_EventMessage{1}.chain
| where like(chain, "IHG")
| spath output=nb_avail_desync path=content.kIndexKey_EventMessage{1}.total.avail
| spath output=nb_price_desync path=content.kIndexKey_EventMessage{1}.total.price
| eval tot_desync=nb_avail_desync+nb_price_desync
| stats sum(tot_desync) as total ]
| eval percent = round(nb_desync*100/total,1)
| search percent > 0
| fields - total_Desync , percent,total

Output:

check_ins    nb_desync

0            13
1            13
2            13


I am trying to combine both queries such that the output looks something like the table below:

1st column: datevalue from the 1st query

2nd column: nb_desync from the 2nd query

datevalue     nb_desync

2022-09-27    13
2022-09-28    13
2022-09-29    13
2022-09-30    13

 

Thanks in advance!! Looking for your response.


kamlesh_vaghela
SplunkTrust

@Sanjana 

Can you please let us know how you are relating the 1st query's output to the 2nd query's output? If it is a one-to-one mapping, then just try the search below.

YOUR_FIRST_SEARCH
| table datevalue  | eval key=1 | accum key | append [search YOUR_SECOND_SEARCH 
| table check_ins nb_desync | eval key=1 | accum key]
| stats values(*) as * by key
| fields - key

 

My Sample Search:

 

| makeresults 
| eval _raw="datevalue
2022-10-15
2022-10-16
2022-10-17
2022-10-18" 
| multikv forceheader=1 
| table datevalue  | eval key=1 | accum key | append [| makeresults 
| eval _raw="check_ins,nb_desync
0,13
1,13
2,13
3,13" 
| multikv forceheader=1 
| table check_ins nb_desync | eval key=1 | accum key]
| stats values(*) as * by key
| fields - key
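The idea behind this sketch: eval key=1 followed by accum key turns key into a running row number (1, 2, 3, ...) in each result set, and stats values(*) as * by key then stitches the nth datevalue to the nth check_ins/nb_desync row. It is purely positional pairing, so it assumes both searches return the same number of rows in matching order; if the correlation is actually by value (for example by date), a join or stats by that shared field would be needed instead.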

 

If there is some other correlation logic, then please share the _raw of the 2nd query and the logic.

KV

 

 
