Yeah, same for me. I assume that it is a "feature" of Splunk Add-on Builder: it uses the template from the UI to generate globalConfig.json and <appname>_rh_account.py and does not read the content of these files. So from my side the best option is to edit <appname>_rh_account.py and globalConfig.json to add new fields or edit existing ones, and keep a copy of these files somewhere. After each change in the Add-on Builder code editor you need to replace both files. Another option is to change the code directly in /bin/input_<inputname>.py and only change the version of the app in the properties, so there is no need to use the code editor, which would overwrite your changes. Unfortunately, when you want to add a new parameter to an input, you will need to prepare a new version of globalConfig.json manually, because the best option is to add the parameter via the Add-on Builder UI and then apply your custom modifications to it.
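The "keep a copy and put it back after each builder edit" workflow above can be sketched as a small script. This is only an illustration: the app directory, backup directory, and tracked file paths below are assumptions you would adjust to your own add-on layout.

```python
import shutil
from pathlib import Path

# Hypothetical locations -- adjust APP and BACKUP_DIR to your environment.
APP = Path("/opt/splunk/etc/apps/my_addon")          # assumption: add-on install dir
BACKUP_DIR = Path("/opt/splunk/etc/addon_backups")   # assumption: any writable dir

# Files the Add-on Builder regenerates from its UI template on every edit
# (relative paths are assumptions; check where your add-on keeps them).
TRACKED = [
    Path("appserver/static/js/build/globalConfig.json"),
    Path("bin/my_addon_rh_account.py"),
]

def backup(app: Path = APP, dest: Path = BACKUP_DIR) -> list:
    """Copy the hand-edited files aside before touching the Add-on Builder UI."""
    dest.mkdir(parents=True, exist_ok=True)
    saved = []
    for rel in TRACKED:
        src = app / rel
        if src.exists():
            out = dest / rel.name
            shutil.copy2(src, out)
            saved.append(out)
    return saved

def restore(app: Path = APP, src: Path = BACKUP_DIR) -> list:
    """Put the saved copies back after the builder has overwritten them."""
    restored = []
    for rel in TRACKED:
        backup_file = src / rel.name
        target = app / rel
        if backup_file.exists():
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(backup_file, target)
            restored.append(target)
    return restored
```

Run `backup()` before opening the Add-on Builder code editor and `restore()` after clicking Finish, so your manual edits survive the regeneration.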
Hi, we use the app Splunk Add-on for Microsoft Cloud Services version 5.3.1 on our Heavy Forwarder. We ingest data from an event hub which is split into a lot of event hub names for different Microsoft services (e.g. SharePoint, Exchange, etc.). The default sourcetype is "mscs:azure:eventhub", but the data isn't parsed with that. In some forums it was mentioned to use the sourcetype "ms:o365:management". Has anyone had the same trouble finding the correct sourcetype? The app itself has a lot of config in props/transforms. Thanks
Hi @marnall, Thanks a lot for your feedback! I also thought about using a password field instead of the Global account settings. But I would have to re-enter the client secret every time I update the input, which is not a good experience for the end user. Again, this solution "works", but only temporarily, and I don't understand why it is not persistent.
Hello @nvonkorff, @tgombos It didn't quite work for me. It works at first, but for example when I update the script of the add-on and then click the "Finish" button, it goes back to the default value. Did you have the same issue? Thanks,
Hi @hahhhaxin, for csv files you have to add the configuration also to the UFs. Ciao. Giuseppe
Hi @BalajiRaju
try using stats, but you have to span the timestamps, e.g. every hour:
index=sample sample="value1" | bin span=1h _time | stats count BY _time field1 | where field1>30 | timechart values(count) AS count BY field1
Ciao. Giuseppe
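The bin + stats + where pipeline above can be sketched in plain Python to show what the bucketing and thresholding do. This is only a sketch with synthetic data and hypothetical field names, and it assumes the intent of the `where` clause is to keep buckets whose event count exceeds 30:

```python
from collections import Counter

def hourly_counts(events):
    """Mimic `bin span=1h _time | stats count BY _time field1`:
    floor each timestamp to the hour and count events per (hour, field1)."""
    counts = Counter()
    for ts, field1 in events:
        hour = ts.replace(minute=0, second=0, microsecond=0)
        counts[(hour, field1)] += 1
    return counts

def over_threshold(counts, threshold=30):
    """Mimic the thresholding step: keep only buckets above the threshold."""
    return {k: v for k, v in counts.items() if v > threshold}
```

Feeding it 40 "output2" events and 8 "output1" events in the same hour keeps only the "output2" bucket, which is the shape of result the timechart step then pivots.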
Hi @gcusello except for the default props and transforms, we do not add any other stanzas in the local UF props and transforms
Yes, it is a typo. We do not use sourcetype because various types of csv files are ingested; to keep this change effective only for the current asset, we locate it with the sourcetype
Hi @bowesmana the raw text looks like below.
{"traceid":"00000000000000000033000000000000","spanid":"0000000000000000","datacontenttype":"application/json","data":{"retentionMinutes":43200,"batchSize":1000,"windowDurationSeconds":600},"messages":[],"specversion":"1.0","classofpayload":"com.vl.decanter.decanter.generici.domain.command.PurgeCommand","id":"ccbae519-foa4-4c0c-ad75-261720d764e5","source":"decanter-scheduler","time":"2024-11-19T09:30:00.058376Z","type":"PurgeEventOutbox"}
My query: we have used a timechart count by clause in the Splunk query, and we need to compare the dynamic field values. Query:
index=sample sample="value1" | timechart count by field1
It returns results like:
_time                  output1   output2
2024-11-13 04:00:00    8         30
2024-11-13 04:01:00    8         30
My question here is that we need to compare output1 and output2, e.g. whether output1 is more than 30% of output2 in a 10-minute interval.
I am using multiple filters; the Error search is one of the filters, in which I need to type a value or multiple values separated by commas, and I need to filter the results accordingly.
Ok, for the first time I don't know which answer I should label as the solution XD That's because both @isoutamo's and @dural_yyz's hints helped me to build the final search. Final result is: | rest splunk_server=local /servicesNS/-/-/configs/conf-savedsearches search="eai:acl.app=<app name here>" | rename "alert.track" as alert_track | eval type=case(alert_track=1, "alert", (isnotnull(actions) AND actions!="") AND (isnotnull(alert_threshold) AND alert_threshold!=""), "alert", (isnotnull(alert_comparator) AND alert_comparator!="") AND (isnotnull(alert_type) AND alert_type!="always"), "alert", true(), "report") | table title, type With this, I can get a table with the search titles and their type, I mean alert or report. Thanks to both!
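For reference, the case() logic in that eval can be mirrored in Python. This is a sketch only: the attribute names follow the REST output in the search above, and it assumes the values arrive as strings (as the REST endpoint typically returns them):

```python
def classify(entry: dict) -> str:
    """Mirror the eval case() above: decide whether a saved search is an
    alert or a report from its savedsearches.conf attributes."""
    def present(key):
        v = entry.get(key)
        return v is not None and v != ""

    # alert_track=1 -> alert
    if str(entry.get("alert.track", "")) == "1":
        return "alert"
    # non-empty actions AND non-empty alert_threshold -> alert
    if present("actions") and present("alert_threshold"):
        return "alert"
    # non-empty alert_comparator AND alert_type other than "always" -> alert
    if present("alert_comparator") and entry.get("alert_type") not in (None, "always"):
        return "alert"
    return "report"
```

For example, `classify({"alert.track": "1"})` yields `"alert"`, while an entry with none of those attributes falls through to `"report"`.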
Hi @karthi2809, at first, if you have the fields in the main search, don't use the search command in the secondary lines but always in the main search; then, the easiest way is to use the OR boolean operator to divide the words to search, instead of commas: index=test Message="* ($Text_Token$) | sort PST_Time Ciao. Giuseppe
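If the comma-separated token has to stay, another option is to expand it into that OR expression before it reaches the search. Here is a hypothetical helper to illustrate the idea; the function name, the wildcard wrapping, and the empty-token fallback are my own assumptions, not part of the dashboard above:

```python
def token_to_or_clause(token: str, field: str = "Message") -> str:
    """Turn a comma-separated token like "error,warning" into an OR'd
    wildcard filter such as (Message="*error*" OR Message="*warning*")."""
    terms = [t.strip() for t in token.split(",") if t.strip()]
    if not terms:
        return "*"  # empty token: match everything (assumption)
    return "(" + " OR ".join(f'{field}="*{t}*"' for t in terms) + ")"
```

So `token_to_or_clause("error,warning")` produces `(Message="*error*" OR Message="*warning*")`, which can be spliced into the base search in place of the raw token.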
How to filter events in the dashboard with the help of a search box? In the search box I have to give multiple strings like error,warning, so I need to pick out only the error and warning logs.

In Dashboard XML:
<input type="text" token="Text_Token" searchWhenChanged="true">
<label>Error Search (comma-separated)</label>
</input>

index=test Message="*" | eval error_list=split("$Text_Token$", ",") | table PST_Time Environment Host Component FileName Message | search Message IN ("error_list") OR Environment=QDEV Component IN (AdminServer) FileName=* | search NOT Message IN ("*null*") | sort PST_Time
You can store data in lookup files or KV store.
Your data is JSON, but you are showing a screenshot of Splunk presenting that data to you in a formatted way. Please click "show as raw text" and show what the time looks like in the raw data, not the pretty-printed version.
Hi @avifyi Good day to you. Thanks for the interesting question.
>>> cribl.io (for log optimization purposes and reducing log size)
1) May we know some details about how much data (approx.) you are planning to send from Splunk DB Connector to cribl.io?
2) May we know approximately how much optimization and log size reduction you are planning to achieve using cribl.io?
3) Though it is a doable task, it may not be necessary at all sometimes.
4) From where is the Splunk DB Connector reading the logs? Let's say you have a DB X:
X DB -----> Splunk DB Connector -----> cribl.io ------> back to Splunk
Instead of this, maybe plan for:
X DB ------> cribl.io -------> to Splunk
Thanks and Best Regards (PS - my karma stats - given 2000+ and received 500. thanks for reading)
@tgombos Thank you thank you thank you! I just tried this and it worked perfectly. Sorry for the late reply on this. I think I worked around the issue in a kludgy way a while ago, but ran into the same situation again recently. I went searching for an answer and found my old question with your answer on it. Again. Thank you. Great to have this working.
Hi @christophecris Good day to you. More details needed, please:
1) Is it a test/dev or prod instance?
2) Do only you face this issue, or your colleagues as well?
3) Any recent changes/upgrades in Splunk Enterprise (or is it Splunk Cloud)?
4) Is it Python 2 or 3?
5) Are you a user with the most access privileges, or is the access very tightly customized?
6) Did someone play with the Splunk menu navigation XML file?
Thanks and best regards (PS - my karma stats - given 2000 and received 500. thanks for reading)
May I know where I can get the Splunk Enterprise REST API OpenAPI Specification (OAS) JSON file? Thanks