I am having a hard time integrating OpenCTI into Splunk. Not sure if you have done it; can you help me?
@priyanshuraj400 - I'm assuming you are using the `rest.simpleRequest` method. You can pass it a parameter called `timeout`. For example: rest.simpleRequest(apiPath, sessionKey=mySessionKey, method='GET', timeout=None). I hope this helps!
Good day,

The following problem: I load data into Splunk once a week, but not always on the same day. I now want to show a trend relative to last week on a dashboard, but the span option must fit the day. Is there a way for the span option to automatically adjust to the next date where there is data? Or do you have another suggestion for how I can solve the problem? Currently, if the span does not fit exactly, I get an increase of 100%.

My current search query is very basic:

index=test CVSS_v3_Severity=$severity_tok$ Operating_System_Generation=$os_dd_tok$ | dedup CVE | timechart span=7d count

Thanks in advance and best regards,
Nico
Hi. I have a search that shows a chart covering 14 months. If I change the time picker, it still shows 14 months for some reason. As you can see in the picture, the time picker says 30 days, but the graph still shows 14 months. What gives? Also, is there a way to display a trend line on the graph? If I use `| trendline sma10(Cores)` or the like, it changes the graph instead of just overlaying a linear line.
Please share the complete event which is not working for you (anonymised of course). Please use a code block </> so the formatting and special characters are preserved.
@ITWhisperer the rex expression you provided is not fetching the values.
Have you tried my suggestion on your actual events? (You don't need to include the lines which attempt to set up sample data based on the example you posted.)
Hi @Thulasinathan_M, good for you, see you next time! Let me know if I can help you more, or, please, accept one answer for the other people of the Community. Ciao and happy splunking, Giuseppe. P.S.: Karma Points are appreciated
Can someone please help to fetch the other fields like groupByUser?  
@ssharm223 One thing that may be worth trying is adding the app to your connection parameters. I get a different error when I do this and it may simply be that my permissions aren't set up correctly.
@ssharm223 did you ever get an answer to this? Guessing no? I'm having the same issue with a CSV lookup that I can access via the web UI, but attempting to access it via the API gets me: "Non-result: ERROR The lookup table 'asset_lookup-by_str' requires a .csv or KV store lookup definition." However, changing the search to "|inputlookup asset_lookup-by_str.csv" still gets me: "Non-result: ERROR The lookup table 'asset_lookup-by_str.csv' requires a .csv or KV store lookup definition." I suspect there is some combination of non-filesystem access and non-default CSV locations that means we are SOL, but happy to be proven wrong by the brains trust!
So, we can't write a regex in search to fetch the field values?
After resetting the token, it started working. Thanks!
It needs a longer explanation. I believe that a long time ago things were as you tried to set them up: the events were distinguishable by sourcetype. But there is no actual need to treat them as separate sourcetypes (a sourcetype defines how the data is processed, i.e. ingested and parsed), because the data is in the same format regardless of which particular EventLog channel it came from, and having a separate sourcetype for each EventLog channel would mean you'd need to define settings for every new channel you ingest (and you can pull any of the channels you see in your EventLog!). So there was a shift in the approach to Windows events, and it happened a long time ago.

To accommodate all those forwarders installed long ago and still working with the old defaults (configured as you tried to set it up), there are transforms in TA_windows which "normalize" the sources and sourcetypes. This is from default/transforms.conf:

## Setting generic sourcetype and unique source
[ta-windows-fix-classic-source]
DEST_KEY = MetaData:Source
REGEX = (?m)^LogName=(.+?)\s*$
FORMAT = source::WinEventLog:$1

[ta-windows-fix-xml-source]
DEST_KEY = MetaData:Source
REGEX = <Channel>(.+?)<\/Channel>.*
FORMAT = source::XmlWinEventLog:$1

[ta-windows-fix-sourcetype]
SOURCE_KEY = MetaData:Sourcetype
DEST_KEY = MetaData:Sourcetype
REGEX = sourcetype::([^:]*)
FORMAT = sourcetype::$1

Even if you explicitly configure your inputs to provide source and sourcetype "old style", the transforms will be invoked during indexing and will overwrite the metadata fields to the "new style". So all Windows EventLog-sourced events are of either the WinEventLog sourcetype or the XmlWinEventLog one (depending on whether you ingest them as "classic" or XML).
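To see concretely what those REGEX/FORMAT pairs do, the two source-fixing extractions can be reproduced with Python's re module. A small sketch for illustration; the two event fragments below are made-up samples, not real events:

```python
import re

# Made-up "classic" and XML event fragments (illustrative samples only)
classic = "LogName=Security\nEventCode=4624"
xml = "<Event><System><Channel>Security</Channel></System></Event>"

# ta-windows-fix-classic-source: REGEX = (?m)^LogName=(.+?)\s*$
# -> FORMAT = source::WinEventLog:$1
m = re.search(r"(?m)^LogName=(.+?)\s*$", classic)
classic_source = "WinEventLog:" + m.group(1)

# ta-windows-fix-xml-source: REGEX = <Channel>(.+?)<\/Channel>.*
# -> FORMAT = source::XmlWinEventLog:$1
m = re.search(r"<Channel>(.+?)</Channel>", xml)
xml_source = "XmlWinEventLog:" + m.group(1)

print(classic_source)  # WinEventLog:Security
print(xml_source)      # XmlWinEventLog:Security
```

Either way the event ends up with a per-channel source and the generic WinEventLog/XmlWinEventLog sourcetype, which is exactly what the transforms enforce at index time.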
| rex max_match=0 "(?&lt;deleted_count&gt;\{[^\}\{]+deleted_count[^\}]+\})"
| rex max_match=0 field=deleted_count "\"\d+\":\"(?&lt;count&gt;\d+)"
| eval count=sum(count)
This looks awfully like https://community.splunk.com/t5/Splunk-Search/How-to-extract-a-value-from-the-message-using-rex-command/m-p/652673/highlight/true#M225564, but the data fragment is somewhat more straightforward. Let me give it a try using a similar method. (BTW, the dataframe is not CSV, but a JSON object.)

You never indicated whether the cited data is the complete raw event. I will assume that it is in the following. I also assume that you are using Splunk 8 or higher, so you can take advantage of JSON functions. Furthermore, I assume that you want to list each deleted_count with the corresponding table_name. An additional assumption used in both is that each message contains < 9999 table names.

| eval message = replace(_raw, "Dataframe row : ", "")
| spath input=message
| eval json0 = json_object()
| foreach c0.* [eval json0 = json_set(json0, printf("%04d", <<MATCHSTR>>), '<<FIELD>>')]
| eval json1 = json_object()
| foreach c1.* [eval json1 = json_set(json1, printf("%04d", <<MATCHSTR>>), '<<FIELD>>')]
| eval json2 = json_object()
| foreach c2.* [eval json2 = json_set(json2, printf("%04d", <<MATCHSTR>>), '<<FIELD>>')]
| eval json3 = json_object()
| foreach c3.* [eval json3 = json_set(json3, printf("%04d", <<MATCHSTR>>), '<<FIELD>>')]
| fields - _* c*.* message
| eval frag = mvrange(1, mvcount(json_array_to_mv(json_keys(json0))))
| eval frag = mvmap(frag, printf("%04d", frag))
| eval jsontable = mvmap(frag, json_object(json_extract(json0, "0000"), json_extract(json0, frag), json_extract(json1, "0000"), json_extract(json1, frag), json_extract(json2, "0000"), json_extract(json2, frag), json_extract(json3, "0000"), json_extract(json3, frag)))
| mvexpand jsontable
| foreach table_name deleted_count load_date redelivered_count [ eval "<<FIELD>>" = json_extract(jsontable, "<<FIELD>>")]
| table table_name deleted_count redelivered_count load_date

Your sample data should give

table_name deleted_count redelivered_count load_date
pc_dwh_rdv.gdh_ls2lo_s99 18 0 2023-08-28
pc_dwh_rdv.gdh_spar_s99 8061 1 2023-08-28
pc_dwh_rdv.cml_kons_s99 0 0 2023-08-28
pc_dwh_rdv.gdh_tf3tx_s99 366619 0 2023-08-28
pc_dwh_rdv.gdh_wechsel_s99 2 0 2023-08-28
pc_dwh_rdv.gdh_revolvingcreditcard_s99 1285 0 2023-08-28
pc_dwh_rdv.gdh_phd_s99 2484 204 2023-08-28
pc_dwh_rdv.gdh_npk_s99 1705 0 2023-08-28
pc_dwh_rdv.gdh_npk_s98 1517 0 2023-08-28
pc_dwh_rdv.gdh_kontokorrent_s99 12998 0 2023-08-28
pc_dwh_rdv.gdh_gds_s99 13 0 2023-08-28
pc_dwh_rdv.gdh_dszins_s99 57 0 2023-08-28
pc_dwh_rdv.gdh_cml_vdarl_le_ext_s99 0 0 2023-08-28
pc_dwh_rdv.gdh_cml_vdarl_s99 0 0 2023-08-28
pc_dwh_rdv.gdh_avale_s99 0 0 2023-08-28
pc_dwh_rdv.gdh_spar_festzi_s99 0 0 2023-08-28
pc_dwh_rdv_gdh_monat.gdh_phd_izr_monthly_s99 1315 0 2023-08-28
pc_dwh_rdv.gdh_orig_sparbr_daily_s99 0 0 2023-08-28
pc_dwh_rdv.gdh_orig_terming_daily_s99 0 0 2023-08-28
pc_dwh_rdv.gdh_orig_kredite_daily_s99 0 0 2023-08-28
pc_dwh_rdv.gdh_orig_kksonst_daily_s99 0 0 2023-08-28
pc_dwh_rdv.gdh_orig_baufi_daily_s99 0 0 2023-08-28
pc_dwh_rdv_creditcard.credit_card_s99 410973 0 2023-08-28
pc_dwh_rdv_csw.fkn_security_classification_s99 18588725 9293073 2023-08-28
pc_dwh_rdv_loan_appl.ccdb_loan_daily_s99 0 0 2023-08-28
pc_dwh_rdv_loan_appl.leon_loan_monthly_s99 0 0 2023-08-28
pc_dwh_rdv_loan_appl.nospk_loan_daily_s99 0 0 2023-08-28
pc_dwh_rdv_partnrdata.fkn_special_target_group_s99 0 0 2023-08-28
pc_dwh_rdv_talanx.insurance_s99 25238 0 2023-08-28

The following is an emulation you can play with and compare with real data

| makeresults | eval _raw = "Dataframe row : 
{\"_c0\":{\"0\":\"deleted_count\",\"1\":\"18\",\"2\":\"8061\",\"3\":\"0\",\"4\":\"366619\",\"5\":\"2\",\"6\":\"1285\",\"7\":\"2484\",\"8\":\"1705\",\"9\":\"1517\",\"10\":\"12998\",\"11\":\"13\",\"12\":\"57\",\"13\":\"0\",\"14\":\"0\",\"15\":\"0\",\"16\":\"0\",\"17\":\"1315\",\"18\":\"0\",\"19\":\"0\",\"20\":\"0\",\"21\":\"0\",\"22\":\"0\",\"23\":\"410973\",\"24\":\"18588725\",\"25\":\"0\",\"26\":\"0\",\"27\":\"0\",\"28\":\"0\",\"29\":\"25238\"},\"_c1\":{\"0\":\"load_date\",\"1\":\"2023-08-28\",\"2\":\"2023-08-28\",\"3\":\"2023-08-28\",\"4\":\"2023-08-28\",\"5\":\"2023-08-28\",\"6\":\"2023-08-28\",\"7\":\"2023-08-28\",\"8\":\"2023-08-28\",\"9\":\"2023-08-28\",\"10\":\"2023-08-28\",\"11\":\"2023-08-28\",\"12\":\"2023-08-28\",\"13\":\"2023-08-28\",\"14\":\"2023-08-28\",\"15\":\"2023-08-28\",\"16\":\"2023-08-28\",\"17\":\"2023-08-28\",\"18\":\"2023-08-28\",\"19\":\"2023-08-28\",\"20\":\"2023-08-28\",\"21\":\"2023-08-28\",\"22\":\"2023-08-28\",\"23\":\"2023-08-28\",\"24\":\"2023-08-28\",\"25\":\"2023-08-28\",\"26\":\"2023-08-28\",\"27\":\"2023-08-28\",\"28\":\"2023-08-28\",\"29\":\"2023-08-28\"},\"_c2\":{\"0\":\"redelivered_count\",\"1\":\"0\",\"2\":\"1\",\"3\":\"0\",\"4\":\"0\",\"5\":\"0\",\"6\":\"0\",\"7\":\"204\",\"8\":\"0\",\"9\":\"0\",\"10\":\"0\",\"11\":\"0\",\"12\":\"0\",\"13\":\"0\",\"14\":\"0\",\"15\":\"0\",\"16\":\"0\",\"17\":\"0\",\"18\":\"0\",\"19\":\"0\",\"20\":\"0\",\"21\":\"0\",\"22\":\"0\",\"23\":\"0\",\"24\":\"9293073\",\"25\":\"0\",\"26\":\"0\",\"27\":\"0\",\"28\":\"0\",\"29\":\"0\"},\"_c3\":{\"0\":\"table_name\",\"1\":\"pc_dwh_rdv.gdh_ls2lo_s99\",\"2\":\"pc_dwh_rdv.gdh_spar_s99\",\"3\":\"pc_dwh_rdv.cml_kons_s99\",\"4\":\"pc_dwh_rdv.gdh_tf3tx_s99\",\"5\":\"pc_dwh_rdv.gdh_wechsel_s99\",\"6\":\"pc_dwh_rdv.gdh_revolvingcreditcard_s99\",\"7\":\"pc_dwh_rdv.gdh_phd_s99\",\"8\":\"pc_dwh_rdv.gdh_npk_s99\",\"9\":\"pc_dwh_rdv.gdh_npk_s98\",\"10\":\"pc_dwh_rdv.gdh_kontokorrent_s99\",\"11\":\"pc_dwh_rdv.gdh_gds_s99\",\"12\":\"pc_dwh_rdv.gdh_dszins_s99\",\"13\":\"pc
_dwh_rdv.gdh_cml_vdarl_le_ext_s99\",\"14\":\"pc_dwh_rdv.gdh_cml_vdarl_s99\",\"15\":\"pc_dwh_rdv.gdh_avale_s99\",\"16\":\"pc_dwh_rdv.gdh_spar_festzi_s99\",\"17\":\"pc_dwh_rdv_gdh_monat.gdh_phd_izr_monthly_s99\",\"18\":\"pc_dwh_rdv.gdh_orig_sparbr_daily_s99\",\"19\":\"pc_dwh_rdv.gdh_orig_terming_daily_s99\",\"20\":\"pc_dwh_rdv.gdh_orig_kredite_daily_s99\",\"21\":\"pc_dwh_rdv.gdh_orig_kksonst_daily_s99\",\"22\":\"pc_dwh_rdv.gdh_orig_baufi_daily_s99\",\"23\":\"pc_dwh_rdv_creditcard.credit_card_s99\",\"24\":\"pc_dwh_rdv_csw.fkn_security_classification_s99\",\"25\":\"pc_dwh_rdv_loan_appl.ccdb_loan_daily_s99\",\"26\":\"pc_dwh_rdv_loan_appl.leon_loan_monthly_s99\",\"27\":\"pc_dwh_rdv_loan_appl.nospk_loan_daily_s99\",\"28\":\"pc_dwh_rdv_partnrdata.fkn_special_target_group_s99\",\"29\":\"pc_dwh_rdv_talanx.insurance_s99\"}}" ``` data emulation above ```  
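For comparison outside Splunk, the same header-row pivot can be sketched in a few lines of Python. This is just a sanity-check sketch: it uses a trimmed sample in the same shape as the payload above (each _cN column maps row indexes to values, with row "0" holding the header name); the values are copied from the event.

```python
import json

# Trimmed sample in the same shape as the "Dataframe row" payload
raw = json.loads("""
{"_c0": {"0": "deleted_count", "1": "18", "2": "8061"},
 "_c3": {"0": "table_name", "1": "pc_dwh_rdv.gdh_ls2lo_s99", "2": "pc_dwh_rdv.gdh_spar_s99"}}
""")

# Pivot: row "0" of each column is the header; the rest are the row values.
rows = {}
for col in raw.values():
    header = col["0"]
    for idx, value in col.items():
        if idx != "0":
            rows.setdefault(int(idx), {})[header] = value

for idx in sorted(rows):
    print(rows[idx]["table_name"], rows[idx]["deleted_count"])
# pc_dwh_rdv.gdh_ls2lo_s99 18
# pc_dwh_rdv.gdh_spar_s99 8061
```

The SPL above does the same pivot, except it zero-pads the row indexes (printf("%04d", ...)) so they sort correctly as strings, which is why the < 9999 rows assumption matters.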
UP
Thank you for the workaround. I will check out this approach.
I think the issue is that, at times, I insert manual input into this textbox. When I do that, on the next click the token value is not inserted properly. I am wondering whether I should clear/reset the data/token.
Hi @Thulasinathan_M, I tried it in my Splunk and it runs:

<fieldset submitButton="false">
  <input type="text" token="token1">
    <label>Token1</label>
  </input>
  <input type="text" token="token2">
    <label>Token2</label>
    <default>$token1$</default>
  </input>
</fieldset>

Ciao. Giuseppe