All Posts


Hello everyone, I'm trying to collect data in JSON format from Splunk Cloud, and I understand that one of the options is using the REST API. However, I'm not sure which endpoint I should use, or if there's another recommended way to achieve this directly from Splunk Cloud.

I've been testing with the following endpoints:

/services/search/jobs/
/servicesNS/admin/search/search/jobs

But in both cases, I only get a 404 error indicating that the URL is not valid.

Could you guide me on how to configure data collection in this format? What would be the correct endpoint? Which key parameters should I include in my request? Or, if there's an easier or more direct method, I'd appreciate it if you could explain it.

The version of Splunk I'm using is 9.3.2408.104. Thank you in advance for your help!
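Not sure this is the cause in your case, but a 404 from those paths often means the request is going to the web UI URL (port 443) rather than to the management port (8089), which on Splunk Cloud typically has to be opened for your client IPs first. As a minimal sketch of one way to run a search and get JSON back over REST (the hostname, token, and SPL below are placeholders/assumptions):

    import requests

    BASE = "https://YOURSTACK.splunkcloud.com:8089"   # management port, not the web UI
    TOKEN = "YOUR_SPLUNK_AUTH_TOKEN"                  # a Splunk authentication token

    resp = requests.post(
        f"{BASE}/services/search/jobs",
        headers={"Authorization": f"Bearer {TOKEN}"},
        data={
            "search": "search index=_internal | head 5",  # SPL must start with 'search'
            "exec_mode": "oneshot",   # run synchronously and return results directly
            "output_mode": "json",    # ask for JSON instead of the default XML
        },
        timeout=60,
    )
    resp.raise_for_status()
    print(resp.json())

For long-running searches you would drop exec_mode=oneshot, take the sid from the response, and poll /services/search/jobs/<sid>/results?output_mode=json instead.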
Hi @Karthikeya , you have to add this option to the stanza in props.conf where your sourcetype is defined. Then you have to add this props.conf to the add-on containing the inputs.conf and to the Search Head. Ciao. Giuseppe
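For reference, a minimal props.conf sketch; the sourcetype name here is a placeholder for the one defined in your add-on:

    [my:json:sourcetype]
    KV_MODE = json

KV_MODE is a search-time setting, so the copy deployed to the Search Head is the one that actually drives the extraction.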
Hi @kamlesh_vaghela "I followed your previous instructions but encountered an error in my console, which is consistent with the issue in my primary use case. I suspect the problem lies in the placement of my JavaScript file. Currently, the directory structure is as follows:

Python script: /opt/splunk/etc/apps/search/bin
JavaScript file: /opt/splunk/etc/apps/search/appserver/static

Could you please help me identify if this directory setup might be causing the issue?"
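In case it helps narrow things down: appserver/static is the conventional place for dashboard scripts, and a Simple XML dashboard references the file by its bare name. A minimal sketch (my_script.js and the labels are hypothetical):

    <dashboard script="my_script.js" version="1.1">
      <label>Script placement test</label>
      <row>
        <panel>
          <html>
            <div id="target"></div>
          </html>
        </panel>
      </row>
    </dashboard>

Note that Splunk caches static assets, so after adding or moving the file a restart (or hitting the /_bump endpoint) is usually needed before the change shows up.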
@mattymo this is what my Splunk events look like: <12>Nov 12 20:15:12 localhost whatever: data={"a":"b","c":"d"} and the rest are JSON fields. As of now we are using the spath command in the search, which is not acceptable to the customer. They want the JSON data fields to be extracted automatically once the onboarding is done. Can I set indexed_extractions=json or kv_mode=json to achieve this? I am not sure where to put these settings. If I can achieve my requirement through this, please guide me through the steps at least.
Hi Karthikeya! Are you parsing JSON out of a non-JSON payload? What would a sample event look like? Are they not JSON to begin with? Do you need the rest of the event in Splunk, or just the JSON part? The short answer is: once you prove your extraction works for all your events in search, you can move the regex parsing into the "props and transforms" configuration so you don't need to run it every time someone searches that sourcetype. It is not possible to give you every step, as it depends on your data, outcomes, and environment, but based on what you've shared, see this documentation: https://docs.splunk.com/Documentation/SplunkCloud/9.3.2408/Knowledge/Createandmaintainsearch-timefieldextractionsthroughconfigurationfiles
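One possible props.conf sketch for events shaped like yours, assuming it is acceptable to drop the syslog prefix at index time so that what remains is pure JSON (the sourcetype name is a placeholder; the SEDCMD belongs on the first full parsing instance, i.e. heavy forwarder or indexer, with KV_MODE on the Search Head):

    [my:json:sourcetype]
    # strip everything before the first '{' from _raw at index time
    SEDCMD-strip_prefix = s/^[^{]+//
    # then plain search-time JSON extraction works on the remaining event
    KV_MODE = json

The trade-off is that the syslog header (timestamp, host, tag) is gone from _raw for good, so only do this if that metadata is not needed downstream.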
I've seen that, but it doesn't explain the right way to move between topologies.
Sorry for not getting the terms right. So I started with an all-in-one (AIO) instance. I added a Cluster Manager and two indexers. I connected the AIO to this cluster as the Search Head. In that process I lost all of the settings and data that were on the AIO.
Hi @mattymo , here is the question link: https://community.splunk.com/t5/Getting-Data-In/Query-to-be-auto-applied/m-p/708893. Please help me out there.
Hi all- I've seen older posts on this topic but nothing in the past couple of years, so here goes. Is there a way to export the application interactions/dependencies seen on the Flow Map? E.g. Tier A calls Tier B over HTTP, Tier C calls these specific backends on nnn ports. Or is there some utility that recursively "walks" the tree of Tiers/Nodes/Backends using the Application Model API calls?
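Not an off-the-shelf exporter, but the Application Model entities can be enumerated with the standard Controller REST API (applications, tiers, backends, nodes). A rough sketch (controller URL, credentials, and field names are assumptions; note these endpoints list the entities rather than the flow-map edges themselves):

    import requests

    BASE = "https://CONTROLLER.saas.appdynamics.com/controller/rest"
    AUTH = ("apiuser@ACCOUNT", "PASSWORD")   # or an API client token
    JSON = {"output": "JSON"}                # default output is XML

    apps = requests.get(f"{BASE}/applications", auth=AUTH, params=JSON).json()
    for app in apps:
        name = app["name"]
        tiers = requests.get(f"{BASE}/applications/{name}/tiers", auth=AUTH, params=JSON).json()
        backends = requests.get(f"{BASE}/applications/{name}/backends", auth=AUTH, params=JSON).json()
        print(name)
        print("  tiers:", [t["name"] for t in tiers])
        print("  backends:", [b.get("name") for b in backends])

To get the actual call relationships (which tier calls which backend, over which protocol and port) you would likely have to pull metric paths or use the unofficial UI APIs, so treat this only as a starting point.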
Where do I need to set kv_mode = json?
Hey @Karthikeya , as always "it depends": what do you mean by "extract json data", and what problem are you trying to solve? Are you seeing duplicate extractions? This thread talks about indexed extraction settings (in your case they would be needed on the UF) and search-time "KV mode" settings (which would be on the Search Head) colliding. I would not suggest "Indexed Extractions" as the first step when working with JSON data: Splunk can extract well-formed JSON at search time, saving storage and improving search performance when it is not necessary to store the entire JSON blob in indexed fields. So can you clarify what exactly you are doing, or even better, post a new question and we can move there? Thanks!
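For concreteness, the two settings being contrasted live in different places; a sketch with a placeholder sourcetype:

    # props.conf on the Universal Forwarder: index-time structured parsing
    [my:json:sourcetype]
    INDEXED_EXTRACTIONS = json

    # props.conf on the Search Head: search-time extraction, usually sufficient
    [my:json:sourcetype]
    KV_MODE = json

If you do enable INDEXED_EXTRACTIONS = json, set KV_MODE = none on the Search Head for that sourcetype, otherwise every field is extracted twice; that is the collision described above.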
Yes, foreach mode=multivalue appeared in 9.0.0.
I'm using 8.0.5
@ITWhisperer wrote: Splunk's version of arrays is the multivalue field, so if you change your input to a multivalue field, you could do something like this

| eval Tag = split(lower("Tag3,Tag4"),",")
| spath
| foreach *Tags{}
    [| eval field="<<FIELD>>"
     | foreach <<FIELD>> mode=multivalue
        [| eval tags=if(isnull(tags),if(mvfind(Tag,lower('<<ITEM>>')) >= 0, field, null()),mvappend(tags, if(mvfind(Tag,lower('<<ITEM>>')) >= 0, field, null())))]
    ]
| stats values(tags)

Thank you for your response and the example; currently it is returning 0 results for me. Could it have something to do with my Splunk version?
@danielbb Please don't forget to accept this solution if it fits your needs.
spath works because you're extracting just the JSON part with the rex command and only applying spath to that JSON field. Yes, you can create a macro, but you will still need to manually invoke that macro in your search. Setting KV_MODE to json will not hurt (except maybe for a minor performance hit) but it will not help.
And can I try setting kv_mode = JSON just to check my luck? What will be the consequences if it doesn't work? Please guide me through the steps.
Hi everyone, I've recently tested the new Splunk AI feature within Splunk ITSI to define thresholds based on historic data/KPI points. ("Test" as in I literally created very obvious dummy data for the AI to process and find thresholds for; a sort of trust test of whether the AI really does find usable thresholds.)

Example: every 5 minutes the KPI takes the latest value, which I've set to correspond with the current weekday (plus minimal variance). For example: all KPI values on Mondays are within the range of 100-110, Tuesdays 200-210, Wednesdays 300-310, and so forth. This is a preview of the data:

Now, after a successful backfill of 30 days, I would have expected the AI to see that each weekday needs its own time policy and thresholds. However, the result was this:

No weekdays detected; instead it finds time policies for every 4 hours regardless of day?

By now I've tried all possible adjustments I could think of (increasing the number of data points, greater differences between data points, other algorithms, waiting for the next day in hopes it would recalibrate itself over midnight, etc.). Hardly any improvements at all, and the thresholds are not usable like this, as outliers on Mondays would not be detected (expected values 100-110; an outlier of 400 would go undetected as it's still within the thresholds).

Thus my question to the community: does anyone have ideas/suggestions for how I could make the AI understand the simple idea of "weekly time policies", and how I could tweak it (aside from doing everything manually and ditching the AI idea as a whole)? Does anyone have good experience with Splunk AI defining thresholds, and if so, what were the use cases?
Yes, it's the latter case. But the search query I mentioned above (spath) is working perfectly. Is there any way I can achieve this? If it is not possible, can I make a macro of that query and use it in the search query? I don't know how the customer will feel about it.
I know it's JSON. But is it the whole event? Or does the event have additional pieces? So does the event look like this:

{ "a":"b", "c":"d" }

or more like this:

<12>Nov 12 20:15:12 localhost whatever: data={"a":"b","c":"d"}

and you only want the JSON part parsed? In the former case, it's enough to set KV_MODE to json (but KV_MODE=json doesn't handle multilevel field names). If it's the latter, that's the situation I described: Splunk cannot handle the structured _part_ automatically.
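In the latter case, the usual search-time workaround looks like this (the index, sourcetype, and the json_payload field name are placeholders):

    index=your_index sourcetype=your:sourcetype
    | rex "data=(?<json_payload>\{.*\})"
    | spath input=json_payload

Once proven in search, the rex part can be moved into props/transforms as a search-time extraction per the docs linked earlier, though spath on the extracted field remains a search-time step unless the prefix is stripped at index time.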