All Posts

| rex max_match=0 "(?<deleted_count>\{[^\}\{]+deleted_count[^\}]+\})"
| rex max_match=0 field=deleted_count "\"\d+\":\"(?<count>\d+)"
| eval count=sum(count)
This looks awfully like https://community.splunk.com/t5/Splunk-Search/How-to-extract-a-value-from-the-message-using-rex-command/m-p/652673/highlight/true#M225564, but the data fragment is somewhat more straightforward.  Let me give it a try using a similar method. (BTW, the data frame is not CSV, but a JSON object.) You never indicated whether the cited data is the complete raw event; I will assume that it is in the following.  I also assume that you are using Splunk 8 or higher, so you can take advantage of JSON functions.  Furthermore, I assume that you want to list each deleted_count with the corresponding table_name.  An additional assumption used in both is that each message contains fewer than 9999 table names.

| eval message = replace(_raw, "Dataframe row : ", "")
| spath input=message
| eval json0 = json_object()
| foreach c0.* [eval json0 = json_set(json0, printf("%04d", <<MATCHSTR>>), '<<FIELD>>')]
| eval json1 = json_object()
| foreach c1.* [eval json1 = json_set(json1, printf("%04d", <<MATCHSTR>>), '<<FIELD>>')]
| eval json2 = json_object()
| foreach c2.* [eval json2 = json_set(json2, printf("%04d", <<MATCHSTR>>), '<<FIELD>>')]
| eval json3 = json_object()
| foreach c3.* [eval json3 = json_set(json3, printf("%04d", <<MATCHSTR>>), '<<FIELD>>')]
| fields - _* c*.* message
| eval frag = mvrange(1, mvcount(json_array_to_mv(json_keys(json0))))
| eval frag = mvmap(frag, printf("%04d", frag))
| eval jsontable = mvmap(frag, json_object(json_extract(json0, "0000"), json_extract(json0, frag), json_extract(json1, "0000"), json_extract(json1, frag), json_extract(json2, "0000"), json_extract(json2, frag), json_extract(json3, "0000"), json_extract(json3, frag)))
| mvexpand jsontable
| foreach table_name deleted_count load_date redelivered_count [ eval "<<FIELD>>" = json_extract(jsontable, "<<FIELD>>")]
| table table_name deleted_count redelivered_count load_date

Your sample data should give:

table_name  deleted_count  redelivered_count  load_date
pc_dwh_rdv.gdh_ls2lo_s99  18  0  2023-08-28
pc_dwh_rdv.gdh_spar_s99  8061  1  2023-08-28
pc_dwh_rdv.cml_kons_s99  0  0  2023-08-28
pc_dwh_rdv.gdh_tf3tx_s99  366619  0  2023-08-28
pc_dwh_rdv.gdh_wechsel_s99  2  0  2023-08-28
pc_dwh_rdv.gdh_revolvingcreditcard_s99  1285  0  2023-08-28
pc_dwh_rdv.gdh_phd_s99  2484  204  2023-08-28
pc_dwh_rdv.gdh_npk_s99  1705  0  2023-08-28
pc_dwh_rdv.gdh_npk_s98  1517  0  2023-08-28
pc_dwh_rdv.gdh_kontokorrent_s99  12998  0  2023-08-28
pc_dwh_rdv.gdh_gds_s99  13  0  2023-08-28
pc_dwh_rdv.gdh_dszins_s99  57  0  2023-08-28
pc_dwh_rdv.gdh_cml_vdarl_le_ext_s99  0  0  2023-08-28
pc_dwh_rdv.gdh_cml_vdarl_s99  0  0  2023-08-28
pc_dwh_rdv.gdh_avale_s99  0  0  2023-08-28
pc_dwh_rdv.gdh_spar_festzi_s99  0  0  2023-08-28
pc_dwh_rdv_gdh_monat.gdh_phd_izr_monthly_s99  1315  0  2023-08-28
pc_dwh_rdv.gdh_orig_sparbr_daily_s99  0  0  2023-08-28
pc_dwh_rdv.gdh_orig_terming_daily_s99  0  0  2023-08-28
pc_dwh_rdv.gdh_orig_kredite_daily_s99  0  0  2023-08-28
pc_dwh_rdv.gdh_orig_kksonst_daily_s99  0  0  2023-08-28
pc_dwh_rdv.gdh_orig_baufi_daily_s99  0  0  2023-08-28
pc_dwh_rdv_creditcard.credit_card_s99  410973  0  2023-08-28
pc_dwh_rdv_csw.fkn_security_classification_s99  18588725  9293073  2023-08-28
pc_dwh_rdv_loan_appl.ccdb_loan_daily_s99  0  0  2023-08-28
pc_dwh_rdv_loan_appl.leon_loan_monthly_s99  0  0  2023-08-28
pc_dwh_rdv_loan_appl.nospk_loan_daily_s99  0  0  2023-08-28
pc_dwh_rdv_partnrdata.fkn_special_target_group_s99  0  0  2023-08-28
pc_dwh_rdv_talanx.insurance_s99  25238  0  2023-08-28

The following is an emulation you can play with and compare with real data:

| makeresults
| eval _raw = "Dataframe row : {\"_c0\":{\"0\":\"deleted_count\",\"1\":\"18\",\"2\":\"8061\",\"3\":\"0\",\"4\":\"366619\",\"5\":\"2\",\"6\":\"1285\",\"7\":\"2484\",\"8\":\"1705\",\"9\":\"1517\",\"10\":\"12998\",\"11\":\"13\",\"12\":\"57\",\"13\":\"0\",\"14\":\"0\",\"15\":\"0\",\"16\":\"0\",\"17\":\"1315\",\"18\":\"0\",\"19\":\"0\",\"20\":\"0\",\"21\":\"0\",\"22\":\"0\",\"23\":\"410973\",\"24\":\"18588725\",\"25\":\"0\",\"26\":\"0\",\"27\":\"0\",\"28\":\"0\",\"29\":\"25238\"},\"_c1\":{\"0\":\"load_date\",\"1\":\"2023-08-28\",\"2\":\"2023-08-28\",\"3\":\"2023-08-28\",\"4\":\"2023-08-28\",\"5\":\"2023-08-28\",\"6\":\"2023-08-28\",\"7\":\"2023-08-28\",\"8\":\"2023-08-28\",\"9\":\"2023-08-28\",\"10\":\"2023-08-28\",\"11\":\"2023-08-28\",\"12\":\"2023-08-28\",\"13\":\"2023-08-28\",\"14\":\"2023-08-28\",\"15\":\"2023-08-28\",\"16\":\"2023-08-28\",\"17\":\"2023-08-28\",\"18\":\"2023-08-28\",\"19\":\"2023-08-28\",\"20\":\"2023-08-28\",\"21\":\"2023-08-28\",\"22\":\"2023-08-28\",\"23\":\"2023-08-28\",\"24\":\"2023-08-28\",\"25\":\"2023-08-28\",\"26\":\"2023-08-28\",\"27\":\"2023-08-28\",\"28\":\"2023-08-28\",\"29\":\"2023-08-28\"},\"_c2\":{\"0\":\"redelivered_count\",\"1\":\"0\",\"2\":\"1\",\"3\":\"0\",\"4\":\"0\",\"5\":\"0\",\"6\":\"0\",\"7\":\"204\",\"8\":\"0\",\"9\":\"0\",\"10\":\"0\",\"11\":\"0\",\"12\":\"0\",\"13\":\"0\",\"14\":\"0\",\"15\":\"0\",\"16\":\"0\",\"17\":\"0\",\"18\":\"0\",\"19\":\"0\",\"20\":\"0\",\"21\":\"0\",\"22\":\"0\",\"23\":\"0\",\"24\":\"9293073\",\"25\":\"0\",\"26\":\"0\",\"27\":\"0\",\"28\":\"0\",\"29\":\"0\"},\"_c3\":{\"0\":\"table_name\",\"1\":\"pc_dwh_rdv.gdh_ls2lo_s99\",\"2\":\"pc_dwh_rdv.gdh_spar_s99\",\"3\":\"pc_dwh_rdv.cml_kons_s99\",\"4\":\"pc_dwh_rdv.gdh_tf3tx_s99\",\"5\":\"pc_dwh_rdv.gdh_wechsel_s99\",\"6\":\"pc_dwh_rdv.gdh_revolvingcreditcard_s99\",\"7\":\"pc_dwh_rdv.gdh_phd_s99\",\"8\":\"pc_dwh_rdv.gdh_npk_s99\",\"9\":\"pc_dwh_rdv.gdh_npk_s98\",\"10\":\"pc_dwh_rdv.gdh_kontokorrent_s99\",\"11\":\"pc_dwh_rdv.gdh_gds_s99\",\"12\":\"pc_dwh_rdv.gdh_dszins_s99\",\"13\":\"pc_dwh_rdv.gdh_cml_vdarl_le_ext_s99\",\"14\":\"pc_dwh_rdv.gdh_cml_vdarl_s99\",\"15\":\"pc_dwh_rdv.gdh_avale_s99\",\"16\":\"pc_dwh_rdv.gdh_spar_festzi_s99\",\"17\":\"pc_dwh_rdv_gdh_monat.gdh_phd_izr_monthly_s99\",\"18\":\"pc_dwh_rdv.gdh_orig_sparbr_daily_s99\",\"19\":\"pc_dwh_rdv.gdh_orig_terming_daily_s99\",\"20\":\"pc_dwh_rdv.gdh_orig_kredite_daily_s99\",\"21\":\"pc_dwh_rdv.gdh_orig_kksonst_daily_s99\",\"22\":\"pc_dwh_rdv.gdh_orig_baufi_daily_s99\",\"23\":\"pc_dwh_rdv_creditcard.credit_card_s99\",\"24\":\"pc_dwh_rdv_csw.fkn_security_classification_s99\",\"25\":\"pc_dwh_rdv_loan_appl.ccdb_loan_daily_s99\",\"26\":\"pc_dwh_rdv_loan_appl.leon_loan_monthly_s99\",\"27\":\"pc_dwh_rdv_loan_appl.nospk_loan_daily_s99\",\"28\":\"pc_dwh_rdv_partnrdata.fkn_special_target_group_s99\",\"29\":\"pc_dwh_rdv_talanx.insurance_s99\"}}" ``` data emulation above ```
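For readers who find the foreach/json_set gymnastics hard to follow: the search is essentially transposing a column-oriented structure into rows. The same pivot can be sketched in plain Python (the payload below is a shortened, assumed version of the event):

```python
import json

# Shortened, assumed sample of the "Dataframe row" payload: each _cN object
# holds one column, and index "0" of each column carries the field name.
message = json.loads('''
{"_c0": {"0": "deleted_count", "1": "18", "2": "8061"},
 "_c1": {"0": "load_date", "1": "2023-08-28", "2": "2023-08-28"},
 "_c2": {"0": "redelivered_count", "1": "0", "2": "1"},
 "_c3": {"0": "table_name", "1": "pc_dwh_rdv.gdh_ls2lo_s99", "2": "pc_dwh_rdv.gdh_spar_s99"}}
''')

# For each non-header index, build one row mapping field name -> value.
indices = sorted((k for k in message["_c0"] if k != "0"), key=int)
rows = [{col["0"]: col[i] for col in message.values()} for i in indices]

print(rows[0])  # first pivoted row as a dict
```

This mirrors what the `mvmap`/`json_object` step does event-side: pairing index "0000" (the header) of each column with index N of the same column.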
UP
Thank you for the workaround. I will check out this approach.
I think the issue is that at times I insert manual input into this textbox. When I do that, on the next click the token value is not inserted properly. I am wondering whether I should clear/reset the data/token.
Hi @Thulasinathan_M, I tried this in my Splunk and it works:

<fieldset submitButton="false">
  <input type="text" token="token1">
    <label>Token1</label>
  </input>
  <input type="text" token="token2">
    <label>Token2</label>
    <default>$token1$</default>
  </input>
</fieldset>

Ciao. Giuseppe
Hi @gcusello Yes, I tried that, but it doesn't seem to be working.
| rex field=log "\<>\<>\<>\<>\|\|(?<temp>[^\|]+)\|\|\s(?<message>.+)"
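A quick way to sanity-check that pattern outside Splunk is Python's `re` module (the sample line below is assumed; Python spells named groups `(?P<name>...)`, and the `<` characters need no escaping):

```python
import re

# Same structure as the rex above: four <> markers, then ||temp||, then the message.
pattern = re.compile(r"<><><><>\|\|(?P<temp>[^|]+)\|\|\s(?P<message>.+)")

sample = "<><><><>||1407|| payload text here"  # assumed sample log line
m = pattern.search(sample)
print(m.group("temp"), "/", m.group("message"))
```

Note that `\s(?P<message>.+)` requires at least one whitespace character and some text after the second `||`; a line that ends right at `||` will not match.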
Hi @Thulasinathan_M, did you try to insert the token in the default tag?

<input type="text" token="your_text_token">
  <label>Text label</label>
  <default>$your_previous_token$</default>
  <prefix>your_field="</prefix>
  <suffix>"</suffix>
</input>

Ciao. Giuseppe
Hi Splunk Experts. I have a table with multiple fields, and based on a click I've created a token to capture a value from it. I need to pass this token's value to a textbox in another panel. Is that possible? Please advise!
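For reference, the usual SimpleXML mechanism for this is a <drilldown> on the table that sets a token, which the text input then consumes via its <default> tag. A minimal sketch, with illustrative panel contents and token names:

```xml
<panel>
  <table>
    <search><query>index=_internal | stats count by sourcetype</query></search>
    <drilldown>
      <!-- On row click, copy the clicked cell's value into clicked_value -->
      <set token="clicked_value">$click.value2$</set>
    </drilldown>
  </table>
</panel>
<panel>
  <input type="text" token="textbox_token">
    <label>Value from table</label>
    <default>$clicked_value$</default>
  </input>
</panel>
```

Here `$click.value2$` is the value of the clicked cell; `$click.value$` would give the value of the row's first column instead.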
I tried to follow the answer and found that the data could not be transferred. The cause was that the client's certificate is not trusted for TLS communication. Do you know how to solve this?
@ITWhisperer Will you be able to provide the rex for the below log format too? <><><><>||1407||
Dataframe row : {"_c0":{"0":"deleted_count","1":"18","2":"8061","3":"0","4":"366619","5":"2","6":"1285","7":"2484","8":"1705","9":"1517","10":"12998","11":"13","12":"57","13":"0","14":"0","15":"0","16":"0","17":"1315","18":"0","19":"0","20":"0","21":"0","22":"0","23":"410973","24":"18588725","25":"0","26":"0","27":"0","28":"0","29":"25238"},"_c1":{"0":"load_date","1":"2023-08-28","2":"2023-08-28","3":"2023-08-28","4":"2023-08-28","5":"2023-08-28","6":"2023-08-28","7":"2023-08-28","8":"2023-08-28","9":"2023-08-28","10":"2023-08-28","11":"2023-08-28","12":"2023-08-28","13":"2023-08-28","14":"2023-08-28","15":"2023-08-28","16":"2023-08-28","17":"2023-08-28","18":"2023-08-28","19":"2023-08-28","20":"2023-08-28","21":"2023-08-28","22":"2023-08-28","23":"2023-08-28","24":"2023-08-28","25":"2023-08-28","26":"2023-08-28","27":"2023-08-28","28":"2023-08-28","29":"2023-08-28"},"_c2":{"0":"redelivered_count","1":"0","2":"1","3":"0","4":"0","5":"0","6":"0","7":"204","8":"0","9":"0","10":"0","11":"0","12":"0","13":"0","14":"0","15":"0","16":"0","17":"0","18":"0","19":"0","20":"0","21":"0","22":"0","23":"0","24":"9293073","25":"0","26":"0","27":"0","28":"0","29":"0"},"_c3":{"0":"table_name","1":"pc_dwh_rdv.gdh_ls2lo_s99","2":"pc_dwh_rdv.gdh_spar_s99","3":"pc_dwh_rdv.cml_kons_s99","4":"pc_dwh_rdv.gdh_tf3tx_s99","5":"pc_dwh_rdv.gdh_wechsel_s99","6":"pc_dwh_rdv.gdh_revolvingcreditcard_s99","7":"pc_dwh_rdv.gdh_phd_s99","8":"pc_dwh_rdv.gdh_npk_s99","9":"pc_dwh_rdv.gdh_npk_s98","10":"pc_dwh_rdv.gdh_kontokorrent_s99","11":"pc_dwh_rdv.gdh_gds_s99","12":"pc_dwh_rdv.gdh_dszins_s99","13":"pc_dwh_rdv.gdh_cml_vdarl_le_ext_s99","14":"pc_dwh_rdv.gdh_cml_vdarl_s99","15":"pc_dwh_rdv.gdh_avale_s99","16":"pc_dwh_rdv.gdh_spar_festzi_s99","17":"pc_dwh_rdv_gdh_monat.gdh_phd_izr_monthly_s99","18":"pc_dwh_rdv.gdh_orig_sparbr_daily_s99","19":"pc_dwh_rdv.gdh_orig_terming_daily_s99","20":"pc_dwh_rdv.gdh_orig_kredite_daily_s99","21":"pc_dwh_rdv.gdh_orig_kksonst_daily_s99","22":"pc_dwh_rdv.gdh_orig_baufi_daily_s99","23":"pc_dwh_rdv_creditcard.credit_card_s99","24":"pc_dwh_rdv_csw.fkn_security_classification_s99","25":"pc_dwh_rdv_loan_appl.ccdb_loan_daily_s99","26":"pc_dwh_rdv_loan_appl.leon_loan_monthly_s99","27":"pc_dwh_rdv_loan_appl.nospk_loan_daily_s99","28":"pc_dwh_rdv_partnrdata.fkn_special_target_group_s99","29":"pc_dwh_rdv_talanx.insurance_s99"}}
Hi @theprophet01, using a search like yours in Real-Time isn't a good idea, because it occupies one CPU permanently for this one search, reducing the resources of your global Splunk infrastructure. It's better to schedule a search, e.g. every 5 minutes, so that when a run finishes, the search releases the CPU for other jobs. In addition, your search could be optimized to reduce execution time and CPU use:

| tstats max(_time) AS latest count BY host
| eval recent = if(latest > relative_time(now(), "-5m"), 1, 0), realLatest = strftime(latest, "%Y-%m-%d %H:%M:%S")
| where recent = 0
| rename host AS Host, realLatest AS "Latest Timestamp"
| table Host, "Latest Timestamp"

At best, this search finds only the hosts that didn't send logs in the last 5 minutes but did send logs earlier in the search window (e.g. the previous 10 minutes of a 15-minute timeframe); if a host sends no logs at all for the full 15 minutes, you lose that information. The better approach is to have a lookup containing all the hosts to monitor (called e.g. perimeter.csv), with at least one column (host), and to run a search like the following:

| tstats max(_time) AS latest count BY host
| append [ | inputlookup perimeter.csv | eval count=0 | fields host count ]
| stats max(latest) AS latest sum(count) AS total BY host
| where total = 0
| eval realLatest = strftime(latest, "%Y-%m-%d %H:%M:%S")
| rename host AS Host, realLatest AS "Latest Timestamp"
| table Host, "Latest Timestamp"

In this way you have to maintain the lookup, but you get a more reliable control. Ciao. Giuseppe
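The lookup-based approach above boils down to an outer join of "hosts seen" against "hosts expected": every expected host with zero events is silent. A minimal Python sketch of that logic, with assumed host names and timestamps:

```python
# The perimeter.csv inventory: every host that *should* be sending logs.
perimeter = {"web01", "web02", "db01"}  # assumed host inventory

# (host, epoch-seconds) pairs standing in for the tstats results.
events = [("web01", 1693200000), ("db01", 1693200060), ("web01", 1693200120)]

# Equivalent of "tstats max(_time) AS latest BY host".
latest = {}
for host, ts in events:
    latest[host] = max(ts, latest.get(host, 0))

# Hosts in the inventory with no events at all are the silent ones
# (the "where total = 0" branch of the appended-lookup search).
silent = sorted(perimeter - set(latest))
print(silent)
```

The `eval count=0` / `sum(count)` trick in the SPL serves the same purpose as the set difference here: lookup-only rows contribute nothing to the total, so they surface as `total = 0`.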
Hello, I have created a Splunk app and it is currently in the marketplace. I am getting a timeout error while pulling data from my API into the Splunk app. Upon investigation, I figured out that I need to increase 'splunkdConnectionTimeout' from 30 sec to a higher value, set in `$SPLUNK_HOME/lib/python3.7/site-packages/splunk/rest/__init__.py`, line 52. I want this modification to be applied when the user installs my app and restarts Splunk. I tried doing this with a `web.conf` file in my app, but I am not sure where and how to use it. Please advise on how I can do this.
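For reference, rather than patching the library file, `splunkdConnectionTimeout` can be raised via `web.conf`; a sketch of the stanza an app could ship in its `default` directory (the value 300 is just an example, and whether an app-level `web.conf` takes precedence depends on Splunk's configuration layering):

```ini
# $SPLUNK_HOME/etc/apps/<your_app>/default/web.conf
[settings]
# Timeout (in seconds) for splunkd REST calls made by Splunk Web.
splunkdConnectionTimeout = 300
```

A restart of Splunk is needed for the change to take effect.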
Hi @yr, as I said, I don't know why, but some time ago Splunk changed its approach, using the same sourcetype for all WinEventLogs and distinguishing them by source. I saw that you forced the sourcetype in each inputs stanza; this way you can be sure to get the sourcetype you want, and you shouldn't miss any logs. I disagree with the last input stanza: Splunk's own logs are already ingested by another input stanza, so this is a duplication; in addition, by forcing the sourcetype you lose some internal monitoring features (e.g. the Monitoring Console). Ciao. Giuseppe
Hello! Can you write a guide or instructions with examples about this, and help us integrate Splunk with Git? Beginning with installing Git, and finishing with putting scripts under version control so that scheduled or production rules can easily be managed, with dedicated shell menus and so on. Thank you!
Hi @gcusello, Thank you very much for your assistance. What you understood is correct, and both of your queries work perfectly as expected.
Hi, I tried to follow the link https://splunkonbigdata.com/failed-to-start-kv-store-process-see-mongod-log-and-splunkd-log-for-details/ and, surprisingly, after a restart everything was back to normal.
Hi, I need to build a dashboard in Dashboard Studio. I have already configured some rules, and if any of them triggers I need to show it on the dashboard, like this...