All Posts

Where exactly do you want to add the hostname of the UF? To the log event itself, or do you want to override the host metadata field? Maybe the following links could be helpful for you: Set host values based on event data - Splunk Documentation, and Set a default host for a file or directory input - Splunk Documentation. Feel free to share your configuration so we can double-check it.
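For the metadata-override case, a minimal sketch of the documented props/transforms approach might look like the following. Note the sourcetype name, stanza name, and the hostname= pattern are illustrative assumptions, not from the original post:

```ini
# props.conf on the HF, applied to the relevant sourcetype
[my_sourcetype]
TRANSFORMS-sethost = set_host_from_event

# transforms.conf: capture a value from the raw event and
# write it into the host metadata field
[set_host_from_event]
REGEX = hostname=(\S+)
DEST_KEY = MetaData:Host
FORMAT = host::$1
```

This only works where the events are parsed (HF or indexer), and the regex must actually match something in the raw event.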
Congratulations on heeding @PickleRick's advice and reposting your search as text. Now, let me try to understand this use case. You are trying to use a lookup file to generate SPL code for some other purpose, and for that generated code you wish to use multisearch. But that multisearch has nothing to do with the question itself. Is this accurate? Then, you want to use the returned values from inputlookup as regexes to match an indexed field named Web.url in a tstats command. Is this correct?

Documentation on tstats will tell you that the where clause of this command can only accept filters applicable in the search command; in fact, only a fraction of those filters. In other words, you cannot use those regexes directly in the tstats command. This is not to say that your search goal cannot be achieved. You just need to restructure the subsearches so you can use the where command instead of the where clause in tstats.

But let me first point out that your text illustration of the search not only does not match your screenshot, it is also wrong, because url_regex is no longer used in the field filter and therefore no longer used in the formulation of the search field. You cannot possibly get the output your screenshot shows. There is another "transcription" error in the last eval command as well, because the syntax is incorrect. Correcting for those errors and simplifying the commands, here is something you can adapt:

| inputlookup my_lookup_file where Justification="Lookup Instructions"
| eval search = "[| tstats `summariesonly` prestats=true count from datamodel=Web where sourcetype=\"mysourcetype\" by Web.url Web.user | where match(Web.url, \"" . url_regex . "\")]"
| stats values(search) as search
| eval search = "| multisearch " . mvjoin(search, " ")

Suppose your my_lookup_file contains the following entries (ignoring the description field, as it is not being used; also ignore fillnull, because "*" is not a useful regex to match any URL):
url_regex
regex
[re]gex
^regex
regex$

The above search will give you:

search
| multisearch [| tstats `summariesonly` prestats=true count from datamodel=Web where sourcetype="mysourcetype" by Web.url Web.user | where match(Web.url, "[re]gex")] [| tstats `summariesonly` prestats=true count from datamodel=Web where sourcetype="mysourcetype" by Web.url Web.user | where match(Web.url, "^regex")] [| tstats `summariesonly` prestats=true count from datamodel=Web where sourcetype="mysourcetype" by Web.url Web.user | where match(Web.url, "regex")] [| tstats `summariesonly` prestats=true count from datamodel=Web where sourcetype="mysourcetype" by Web.url Web.user | where match(Web.url, "regex$")]

Is this what you are looking for? Here is a full emulation to get the above input and output:

| makeresults format=csv data="url_regex
regex
[re]gex
^regex
regex$"
``` the above emulates | inputlookup my_lookup_file where Justification="Lookup Instructions" ```
| eval search = "[| tstats `summariesonly` prestats=true count from datamodel=Web where sourcetype=\"mysourcetype\" by Web.url Web.user | where match(Web.url, \"" . url_regex . "\")]"
| stats values(search) as search
| eval search = "| multisearch " . mvjoin(search, " ")

Play with it and compare with your real lookup.
"field for label" and "field for value" are not generic terms used in Splunk practice. Maybe their meaning is clear in your context or in your organization, but for the volunteers here, you need to define them and describe them in plain language without SPL. You need to give some example search where you are using a token, illustrate what values the token carries (you mentioned something works with a single value but not when more than one value is passed), and illustrate what the result is supposed to look like (expected results) - to do this, you may also need to illustrate the data given to that search - and illustrate what actual result you get when multiple values are passed to the search. Additionally, explain the difference between the actual result and the expected result if that is not painfully obvious.

In short, you need to follow the golden rules of asking an answerable question. I call them the Four Commandments:
1. Illustrate data input (in raw text, anonymized as needed), whether it is raw events or output from a search (SPL that volunteers here do not have to look at).
2. Illustrate the desired output from the illustrated data.
3. Explain the logic between the illustrated data and the desired output without SPL.
4. If you also illustrate attempted SPL, illustrate the actual output and compare it with the desired output, explaining why they look different to you if that is not painfully obvious.
Hello @yuanliu,

Thanks for your reply.

The query of the input dropdown can already pass multiselect values. Here I have two field values: one is the "field for label" and the other is the "field for value". I need to pass the "field for value" to the search, which is working fine in the current search. But I also need to pass the "field for label" values to the search, whereas that is a multi-select value. Please let me know if I missed anything. Thanks!
I was able to get rid of one error:
Thanks @PickleRick for your reply. Hello everyone, I am pretty new to app dev, so pardon me; let me explain step by step:
1) The external_cmd script ucd_category_lookup.py is present (screenshot).
2) The lookup definition was not created previously, so I just created one: gave permissions to the Search app, selected ucd_category_lookup.py as the external command, and added and verified the fields properly.
3) Lookup definition done, Splunk restart done, but still the same error.
4) I thought an automatic lookup was what was needed, so I created one.
5) Splunk restart done, still the same error.
PickleRick-san, gcusello-san, thank you for your responses. As you pointed out, the KV store did not seem to be working properly. I reinstalled with FIPS mode turned off and it worked fine. Is it impossible to install Splunk ES on a Splunk server running in FIPS mode?
Hi Splunk Community, is there a way to capture the host of a UF as it's passing through a HF, so the host is added right before the log message is processed? I have tried a few things with no luck, but I'm asking here while I dig through the documentation. Thanks!
Also, unless you absolutely have to, avoid using leading wildcards in search terms such as host="*...".
It looks like this mystery has not been solved within those 11 years, though! If your guess is correct, I suppose they hadn't implemented pull requests yet when that code was added... I went looking for this answer because in Splunk Cloud the _audit index is set to this default value for retention, and you are unable to change the retention setting for internal indexes in Splunk Cloud. So if you have a requirement to retain audit logs for 6 years, you're technically short by 6 days!
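For reference, the arithmetic behind that shortfall - assuming the stock indexes.conf default of frozenTimePeriodInSecs = 188697600 - can be checked with a quick run-anywhere search:

```
| makeresults
| eval default_secs = 188697600
| eval retention_days = default_secs / 86400
| eval six_years_days = 6 * 365
| eval short_by_days = six_years_days - retention_days
```

This yields retention_days=2184 and short_by_days=6 (using 365-day years, ignoring leap days).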
Thank you for the response; I've updated the query now.
I need to forward data from a heavy forwarder to two different indexer clusters. One of the clusters needs to have a field removed. If I use SEDCMD in props.conf on the HF, it removes it for both, and putting SEDCMD in props.conf on one of the indexers doesn't work (it does work if I bypass the HF). Is there a way to do this?

Edit: I was thinking of using an intermediate forwarder (heavy forwarder -> another heavy forwarder -> indexer cluster), but props.conf on the intermediate heavy forwarder does not work.
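One pattern worth trying - a sketch only, where the stanza names, sourcetypes, servers, and the SEDCMD pattern are all illustrative and untested - is to clone the event on the HF with CLONE_SOURCETYPE, scrub only the clone, and route each copy to a different output group via _TCP_ROUTING:

```ini
# outputs.conf on the HF: one target group per indexer cluster
[tcpout:cluster_a]
server = idxA1:9997,idxA2:9997

[tcpout:cluster_b]
server = idxB1:9997,idxB2:9997

# props.conf: original events get cloned and routed to cluster A;
# the clone gets its own sourcetype, is scrubbed, and goes to cluster B
[my_sourcetype]
TRANSFORMS-clone = clone_for_b
TRANSFORMS-route = route_to_a

[my_sourcetype_redacted]
SEDCMD-dropfield = s/secret_field=\S+//g
TRANSFORMS-route = route_to_b

# transforms.conf
[clone_for_b]
REGEX = .
CLONE_SOURCETYPE = my_sourcetype_redacted

[route_to_a]
REGEX = .
DEST_KEY = _TCP_ROUTING
FORMAT = cluster_a

[route_to_b]
REGEX = .
DEST_KEY = _TCP_ROUTING
FORMAT = cluster_b
```

The idea is that the cloned event is parsed under its new sourcetype, so its SEDCMD and routing apply only to that copy. Verify against the transforms.conf documentation before relying on it.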
Sorry I should also mention that this also happens with the roles endpoint which is where I got that message.
I'm just an API consumer, so not sure if I have the addon installed (do you have more specific instructions for me to pass on to the Splunk admin?). As far as logs here is what I see: <msg type="ERROR"> In handler 'splunk_ta_aws_iam_roles': bad character (52) in reply size</msg> </messages>
Ok. First and foremost - do you have the addon installed? Secondly - did you look what's in the logs?
1. Please don't post screenshots - copy-paste your code and results into code blocks or preformatted paragraphs. It makes things easier for everyone and is searchable.
2. You're trying to do something that is generally not supported - you can generate conditions for a search dynamically by means of a subsearch, but not whole searches. To some extent you could use the map command, but it is relatively limited.
3. You can't use multisearch with non-streaming commands (like tstats).
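As a rough illustration of the map approach mentioned in point 2 (the lookup name, field name, and datamodel filter are assumptions, not from the original question), each row returned by a search can drive a templated subsearch:

```
| inputlookup my_lookup_file
| map maxsearches=10 search="| tstats count from datamodel=Web where sourcetype=mysourcetype by Web.url | where match(Web.url, \"$url_regex$\")"
```

map substitutes $url_regex$ from each input row into the templated search; note the maxsearches cap and the performance limitations mentioned above.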
Hello Splunk experts,

I'm currently trying to create a search using a multisearch command, where I need to dynamically apply regex patterns from a lookup file to the Web.url field in a tstats search. With my current approach, the regex value is added directly as a literal search condition instead of being applied as a regex filter. For example, instead of dynamically matching URLs with the regex, it ends up as if it's searching for the literal pattern.

I have a lookup that contains fields like url_regex and other filter parameters, and I need to:
1. Dynamically use these regex patterns in the search, so that only URLs matching the regex from the lookup get processed further.
2. Ensure that the logic integrates correctly within a multisearch, where the base search is filtered dynamically based on these values from the lookup.

I've shared some screenshots showing the query and the resulting issue, where the regex appears to be used incorrectly. How can I properly use these regex values to match URLs instead of treating them as literal strings?

Search:

| inputlookup my_lookup_file
| search Justification="Lookup Instructions"
| fields url_regex, description
| fillnull value="*"
| eval url_regex="Web.url=\"" . url_regex . "\""
| eval filter="source=\"my_sourcetype\" " . "filter_field=" . " \""
| eval search="| tstats `summariesonly` prestats=true count from datamodel=Web where sourcetype=\"" . filter . " by Web.url Web.user"
| stats values(search) as search
| eval search=multisearch [ mvjoin(search, " | ") ] . " | stats count by search"

As highlighted in yellow above, I wanted to have the regex matching string instead of the direct regex search on events. Also, lastly, once the multisearch query generates another search as output, how can I automatically execute that resulting search within my main query? Any guidance would be greatly appreciated!
Hi there, I'm using this API: https://splunk.github.io/splunk-add-on-for-amazon-web-services/APIreference/ Whenever I send a POST request to create metadata inputs that already exist, I get a 500 Internal Server Error. Error: Unable to create metadata inputs: Unexpected HTTP status: 500 Internal Server Error (500) Expected behaviour: do not return 500; return a payload that indicates that the resource already exists.
I've found the custom triggers to be unreliable at best.  What works better is to put the alert condition in the search query and have the alert trigger when the number of results is not zero.
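As a sketch of that pattern (the index, sourcetype, and threshold are made up for illustration), move the condition into the search itself so the search returns rows only when the alert should fire:

```
index=my_index sourcetype=my_sourcetype
| stats count
| where count > 100
```

Then configure the alert with the trigger condition "Number of Results is greater than 0".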