All Posts

My query uses a timechart count by clause, and I need to compare the dynamic field values it returns.

Query: index=sample sample="value1" | timechart count by field1

It returns results like:

_time                  output1  output2
2024-11-13 04:00:00    8        30
2024-11-13 04:01:00    8        30

My question: I need to compare output1 and output2, for example to flag when output1 is more than 30% of output2 over a 10-minute interval.
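A minimal sketch of one way to express that comparison, assuming field1 yields exactly the two columns output1 and output2 shown above:

index=sample sample="value1"
| timechart span=10m count by field1
| where output1 > (output2 * 0.3)

span=10m buckets the counts into 10-minute intervals, and the where clause keeps only the intervals in which output1 exceeds 30% of output2.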
I am using multiple filters; Error Search is one of them, where I need to type one or more comma-separated values and filter the results accordingly.
Ok, for the first time I don't know which answer I should label as solution XD That's because hints from both @isoutamo and @dural_yyz helped me build the final search. Final result is:

| rest splunk_server=local /servicesNS/-/-/configs/conf-savedsearches search="eai:acl.app=<app name here>"
| rename "alert.track" as alert_track
| eval type=case(
    alert_track=1, "alert",
    (isnotnull(actions) AND actions!="") AND (isnotnull(alert_threshold) AND alert_threshold!=""), "alert",
    (isnotnull(alert_comparator) AND alert_comparator!="") AND (isnotnull(alert_type) AND alert_type!="always"), "alert",
    true(), "report")
| table title, type

With this I get a table of saved search titles and their typology, I mean alert or report. Thanks to both!
Hi @karthi2809,
at first, if you have the fields in the main search, don't use the search command in secondary rows but always filter in the main search. Then the easiest way is to use the OR boolean operator between the words to search, instead of commas (e.g. typing error OR warning in the text box instead of error,warning):

index=test Message="*" ($Text_Token$)
| sort PST_Time

Ciao.
Giuseppe
How do I filter events in a dashboard with the help of a search box? In the search box I need to enter multiple strings, like error,warning, and keep only the error and warning logs.

In the dashboard XML:

<input type="text" token="Text_Token" searchWhenChanged="true">
  <label>Error Search (comma-separated)</label>
</input>

index=test Message="*"
| eval error_list=split("$Text_Token$", ",")
| table PST_Time Environment Host Component FileName Message
| search Message IN ("error_list") OR Environment=QDEV Component IN (AdminServer) FileName=*
| search NOT Message IN ("*null*")
| sort PST_Time
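One thing to note: search Message IN ("error_list") compares Message against the literal string "error_list", not against the multivalue field built by split. A minimal sketch of an alternative, assuming the terms typed in the box are plain substrings without regex metacharacters:

index=test Message="*"
| eval pattern=replace("$Text_Token$", ",", "|")
| where match(Message, pattern)
| table PST_Time Environment Host Component FileName Message
| sort PST_Time

Here replace turns the comma-separated token (error,warning) into a regex alternation (error|warning), and match keeps only events whose Message contains one of the terms.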
You can store data in lookup files or KV store.
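For example, a minimal sketch of writing results to a CSV lookup and reading them back (my_saved_results.csv is a hypothetical lookup name):

... | stats count by field1
| outputlookup my_saved_results.csv

| inputlookup my_saved_results.csv
| table field1 count

outputlookup creates or overwrites the lookup file; inputlookup reads it back in a later search.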
Your data is JSON, but you are showing a screenshot of Splunk presenting that data to you in a formatted way. Please click "show as raw text" and show what the time looks like in the raw data, not the pretty-printed version.
Hi @avifyi,
Good day to you, and thanks for the interesting question.

>>> cribl.io (for log optimization purpose and reducing log size)

1) May we know approximately how much data you plan to send from Splunk DB Connect to cribl.io?
2) May we know approximately how much optimization and log size reduction you plan to achieve using cribl.io?
3) Though it is a doable task, it may not always be necessary.
4) From where is Splunk DB Connect reading the logs? Let's say you have a database X:

X DB -----> Splunk DB Connect -----> cribl.io -----> back to Splunk

Instead of this, maybe plan for:

X DB -----> cribl.io -----> Splunk

Thanks and Best Regards
@tgombos Thank you thank you thank you! I just tried this and it worked perfectly. Sorry for the late reply on this. I think I worked around the issue in a kludgy way a while ago, but ran into the same situation again recently. I went searching for an answer and found my old question with your answer on it. Again. Thank you. Great to have this working.
Hi @christophecris,
Good day to you. More details needed, please:
1) Is it a test/dev or prod instance?
2) Do only you face this issue, or your colleagues as well?
3) Any recent changes/upgrades in Splunk Enterprise (or is it Splunk Cloud)?
4) Is it Python 2 or 3?
5) Are you a user with the most access privileges, or is your access very tightly customized?
6) Did someone modify the Splunk menu navigation XML file?
Thanks and best regards
May I know where I can get the Splunk Enterprise REST API OpenAPI Specification (OAS) JSON file?

Thanks
Hello, the issue got resolved. Port 8088 was being used by another service, which caused the problem; I had to kill that service to resolve it. It is now working as expected. Thank you so much, all.
Hey @isoutamo, thank you for letting me know. The token and the host URL I posted are not the actual values; I changed them a little, so we should be fine. Thank you so much again.
Hi @majilan1,
1) May I ask whether you understand search time vs index time?
2) Index time: while indexing the data itself, you can "catch" the required fields (this is called index time).
3) Search time: if you didn't configure index-time extractions, the fields may not be indexed (not caught during data onboarding); then we need to write rex to extract the fields at search time. This is acceptable, but if we use too many rex extractions, Splunk will struggle.
4) Search time is generally preferred over index time (this is a debatable topic), but as far as I remember, the Splunk docs suggest using search time instead of index time.
5) A situation like yours, with a complex list of field extractions, can be prepared and planned through index time, so Splunk will not run into its own limitations.
Thanks and best regards.
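For illustration, a minimal sketch of a search-time extraction with rex (the user field and the key=value layout are hypothetical, just to show the idea):

index=sample sourcetype=app_logs
| rex field=_raw "user=(?<user>[^\s,]+)"
| stats count by user

The same pattern could instead be configured as an index-time extraction via props.conf/transforms.conf when the field must be available at indexing.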
PS - 2 karma points were given, I got notified, read this again, and wanted to edit my answer to these questions. Sorry for my wrong answer last time; I hope nobody followed it, and even if somebody did, no harm done, the system just would not come up as a workable Splunk.

>>> is it really necessary to run the splunk install on the new server, even when i detach and attach the disk...

Yes, it is necessary to install Splunk once again. When you install Splunk on an instance that already has it, the installer script will detect the existing install and ask whether you really want to install, upgrade, etc.

>>> the disk with /opt/splunk has the full setup.

The installer script will "intelligently" take care of this situation and ask for your confirmation.

>>> i understand to change the server.conf and inputs.conf, which ill take care.. but wanted to know if the splunk install was necessary in this case...

It is suggested to add the drive from the old system to the new system and then install Splunk on the new system. The installer script will "understand" the existing data, version, etc.
@christophecris This looks like Python core functionality is broken. Any details about your instance? What version? This might be a bug or an incompatible OS package. Did this happen after a change or an upgrade? If this helps, please upvote.
You could try using split to break up the field:

| eval fields=split(_raw, ";")
| eval h_db_host=mvindex(fields, 1)

etc. (Note that mvindex is 0-based, so mvindex(fields, 1) returns the second element.)
As @sainag_splunk says, use of unbounded wildcards (+, *) is usually the cause. For others to help, you will need to post sample data that triggers these errors. Usually the remedy is to analyze your data boundaries and find a more restrictive regex.
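For example, a minimal sketch of tightening a greedy pattern (the field names here are hypothetical): instead of

| rex "user=(?<user>.*) action=(?<action>.*)"

bound each capture to the delimiter that actually ends the value:

| rex "user=(?<user>\S+) action=(?<action>\S+)"

Bounded character classes stop the regex engine from backtracking across the whole event.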
It appears to me that you are overthinking the search language. Assuming that RenderedMessage is already extracted (as is implied in your illustrated code), you can use:

Properties.application="xyz.api" (RenderedMessage="*$text_fnum$*" AND RenderedMessage="*$text_fdate$*")
| spath Level
| search Level!=Verbose AND Level!=Debug
| eval combined_search_condition=mvjoin(mvfilter(search_condition_fnum!="") + mvfilter(search_condition_fdate!=""), " OR ")

If you run this on paper, you will see that the wildcards will cause the search to behave as you described.
I am trying to create a dashboard. It has two text input fields, and I want to run a search query based on these two inputs:

If input A is null AND input B is null, then no search results.
If input A is not null AND input B is null, then search using only A.
If input A is null AND input B is not null, then search using only B.
If input A is not null AND input B is not null, then search using both A and B.

Following is my query. It returns no results:

Properties.application="xyz.api"
| spath Level
| search Level!=Verbose AND Level!=Debug
| eval search_condition_fnum=if(len(trim("$text_fnum$"))=0 OR isnull("$text_fnum$"), "", "RenderedMessage=\"*$text_fnum$*\"")
| eval search_condition_fdate=if(len(trim("$text_fdate$"))=0 OR isnull("$text_fdate$"), "", "RenderedMessage=\"*$text_fdate$*\"")
| eval combined_search_condition=mvjoin(mvfilter(search_condition_fnum!="") + mvfilter(search_condition_fdate!=""), " OR ")
| table search_condition_fnum, search_condition_fdate, combined_search_condition
| search [| makeresults | eval search_condition=mvjoin(mvfilter(search_condition_fnum!="") + mvfilter(search_condition_fdate!=""), " OR ") | fields search_condition]
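A minimal sketch of one way to express this logic, assuming a Simple XML dashboard where $text_fnum$ and $text_fdate$ substitute directly into the search and RenderedMessage is already extracted:

Properties.application="xyz.api"
| spath Level
| search Level!=Verbose AND Level!=Debug
| eval fnum=trim("$text_fnum$"), fdate=trim("$text_fdate$")
| where fnum!="" OR fdate!=""
| where (fnum="" OR like(RenderedMessage, "%".fnum."%")) AND (fdate="" OR like(RenderedMessage, "%".fdate."%"))

The first where drops everything when both inputs are empty; the second applies each filter only when its input is non-empty, which covers the remaining three cases.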