
Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

All Posts

Where do I need to set kv_mode = json?
Hey @Karthikeya, as always "it depends" on what you mean by "extract json data" and what problem you are trying to solve. Are you seeing duplicate extractions? This thread talks about indexed extraction settings (in your case they would be needed on the UF) and search-time "kv mode" settings (which would be on the Search Head) colliding. I would not suggest "Indexed Extractions" as the first step when working with JSON data. Splunk can extract well-formed json at search time, saving storage and search performance when it is not necessary to store the entire json blob in indexed fields. So can you clarify what exactly you are doing, or even better, post a new question and we can move there? Thanks!
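As a rough sketch of the search-time route described above (the sourcetype name is a placeholder, not from the thread), the setting would live in props.conf on the Search Head:

```
# props.conf on the Search Head (search-time extraction)
# [my_json_sourcetype] is a placeholder for the real sourcetype name
[my_json_sourcetype]
KV_MODE = json
```

The indexed-extraction alternative (INDEXED_EXTRACTIONS = json in props.conf on the UF) writes every extracted field into the index, which is what the storage and performance caveat above refers to.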
Yes, foreach mode=multivalue appeared in 9.0.0.
I'm using 8.0.5.
@ITWhisperer wrote: Splunk's version of arrays is the multivalue field, so if you change your input to a multivalue field, you could do something like this

| eval Tag = split(lower("Tag3,Tag4"),",")
| spath
| foreach *Tags{}
    [| eval field="<<FIELD>>"
     | foreach <<FIELD>> mode=multivalue
        [| eval tags=if(isnull(tags), if(mvfind(Tag,lower('<<ITEM>>')) >= 0, field, null()), mvappend(tags, if(mvfind(Tag,lower('<<ITEM>>')) >= 0, field, null())))]]
| stats values(tags)

Thank you for your response and the example; currently it is returning 0 results for me. Could it have something to do with my Splunk version?
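For 8.x versions, which lack foreach mode=multivalue, a rough equivalent can sometimes be built with mvmap, which has been available since 8.0. This is only an untested sketch under the same assumptions as the search above (the Tag and *Tags{} field names come from that example):

```
| eval Tag = split(lower("Tag3,Tag4"),",")
| spath
| foreach *Tags{}
    [| eval tags=mvappend(tags,
        mvmap('<<FIELD>>', if(mvfind(Tag, lower('<<FIELD>>')) >= 0, "<<FIELD>>", null())))]
| stats values(tags)
```

Inside mvmap, a reference to the iterated field resolves to the current element, so '<<FIELD>>' inside the mvmap body stands in for the <<ITEM>> token of the 9.x syntax.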
@danielbb Please, don't forget to accept this solution if it fits your needs. 
spath works because you're extracting just the json part with the rex command and only applying spath to that json field. Yes, you can create a macro, but you will still need to manually execute that macro in your search. Setting KV_MODE to json will not hurt (except maybe for some minor performance hit) but will not help.
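As a sketch of what such a macro could look like (the macro name, field name, and rex pattern are placeholders; adjust the pattern to your actual event layout):

```
# macros.conf on the Search Head
[extract_payload_json]
definition = rex field=_raw "data=(?<json_payload>\{.*\})" | spath input=json_payload
iseval = 0
```

It would then be invoked manually in a search, e.g. index=foo sourcetype=bar `extract_payload_json` | stats count by some.nested.field.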
And can I try setting kv_mode = json just to check my luck? What will be the consequences if it doesn't work? Please guide me through the steps.
Hi everyone, I've recently tested the new Splunk AI feature within Splunk ITSI to define thresholds based on historic data/KPI points. ("Test" as in I literally created very obvious dummy data for the AI to process and find thresholds for; a sort of trust test of whether the AI really does find usable thresholds.)

Example: every 5 minutes the KPI takes the latest value, which I've set to correspond with the current weekday (+ minimal variance). For example: all KPI values on Mondays are within the range of 100-110, Tuesdays 200-210, Wednesdays 300-310, and so forth. This is a preview of the data:

Now, after a successful backfill of 30 days, I would have expected the AI to see that each weekday needs its own time policy and thresholds. However, the result was this: no weekdays detected, and instead it finds time policies for every 4 hours regardless of days?

By now I've tried all possible adjustments I could think of (increasing the number of data points, greater differences between data points, another algorithm, waiting for the next day in hopes it would recalibrate itself over midnight, etc.). Hardly any improvements at all, and the thresholds are not usable like this, as outliers on Mondays would not be detected (expected values 100-110; an outlier of 400 would go undetected as it's still within the thresholds).

Thus my question to the community: does anyone have ideas/suggestions for how I could make the AI understand the simple idea of "weekly time policies", and how I could tweak it (aside from doing everything manually and ditching the AI idea as a whole)? Does anyone have good experience with Splunk AI defining thresholds, and if so, what were the use cases?
Yes, it's the latter case. But the search query I mentioned above (spath) is working perfectly. Is there any way I can achieve this? If this is not possible, can I make a macro of that query and use it in the search query? I don't know how the customer feels about it.
I know it's json. But is it the whole event? Or does the event have additional pieces? So does the event look like this:

{ "a":"b", "c":"d" }

or more like this:

<12>Nov 12 20:15:12 localhost whatever: data={"a":"b","c":"d"}

and you only want the json part parsed? In the former case, it's enough to set KV_MODE to json (but KV_MODE=json doesn't handle multilevel field names). If it's the latter, that's the situation I described: Splunk cannot handle the structured _part_ automatically.
Hi @PickleRick, it's structured json we have, and it is not extracting field values automatically. Every time, we need to run a command in search, which is not what the customer wants. They want this extraction to be the default.
Yes, and the issue follows over to the cloned entry.
Unfortunately, at this moment Splunk can only do automatic structured data extraction if the whole event is well-formed structured data. So if your whole event is a json blob, Splunk can interpret it automatically. If it isn't, because it contains some header or footer, it's a no-go. There is an open idea about this on ideas.splunk.com - https://ideas.splunk.com/ideas/EID-I-208 - feel free to upvote it. For now, all you can do is trim your original event to contain only the json part (but then you might lose some data, I know).
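One hedged way to do that trimming at index time is a SEDCMD in props.conf on the parsing tier (indexers or a heavy forwarder); the stanza name and pattern here are assumptions, and the pattern simply drops everything before the first opening brace:

```
# props.conf on the indexers / heavy forwarder (index-time)
[my_json_sourcetype]
SEDCMD-strip_header = s/^[^{]*//
```

After this, the stored event is pure json, so KV_MODE = json on the Search Head can parse it automatically; as noted above, whatever was in the header is lost.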
Description: Hello, I am experiencing an issue with the "event_id" field when transferring notable events from Splunk Enterprise Security (ES) to Splunk SOAR. Details: When sending the event to SOAR using an Adaptive Response Action (Send to SOAR), the event is sent successfully, but the "event_id" field does not appear in the data received in SOAR. Any assistance or guidance to resolve this issue would be greatly appreciated. Thank you
Yes, this is the way. Thanks @ITWhisperer  this is exactly what I was looking for.
Hi @pumphreyaw, @mattymo, now I am stuck on the same problem. We don't have a HF, actually. We have a deployment server which pushes apps to our manager and deployer. From there, the manager will push apps to the peer nodes. We have 3 search heads and a deployer. Where do I need to put these configurations to extract json data? Can you please help me step by step?
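A minimal sketch for this topology, assuming the search-time route (KV_MODE = json) and placeholder app/sourcetype names: since KV_MODE is a search-time setting, the props.conf only needs to reach the search heads, which in a clustered setup happens via the deployer:

```
# On the deployer:
#   $SPLUNK_HOME/etc/shcluster/apps/my_json_app/local/props.conf
[my_json_sourcetype]
KV_MODE = json
```

Then push the bundle from the deployer, e.g. splunk apply shcluster-bundle -target https://<any_sh_member>:8089. Nothing is needed on the UF, deployment server, or peer nodes for search-time json extraction.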
It shouldn't hurt. If you escape something that doesn't need escaping, nothing bad should happen. It's just ugly.
Hello @dbray_sd, have you tried cloning the older input and creating a new one? Sometimes the checkpoint fails during an upgrade, but cloning a new input will create a checkpoint and may resolve your issue.
There can be multiple reasons behind streamfwd.exe not running; you should file a support case to get this fixed.