All Topics

Hello Splunk community,

Let's say my input to Splunk is three csv files that use the following schema. Each csv populates an index: Faults, Incidents and Status. For each Faults entry there is one (and just one) Status entry. That Status entry will have parent_id = id of that fault. In the same way there is also a Status entry for each Incident.

When I am querying Splunk or making dashboards I have to retrieve information not only from the Faults or Incidents indexes but also from Status. That makes me use a lot of index-joining queries like this:

index="faults" | join type=outer status_id [search index="status" | rename id as status_id]

I liked this solution at first because the Faults and Incidents indexes look very clean, but I have read that these types of SPL queries are computationally expensive, and I am concerned that this may not scale well in the future. Should I perhaps modify the schema, remove the Status index, and put all that information in Faults and Incidents like this?

Thank you all a lot in advance for your answers.
Fran

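For reference, a commonly suggested alternative to join is to search both indexes at once and merge on the shared key with stats; a minimal sketch, assuming (as in the join above) that Faults events carry a status_id field and Status events carry their own id:

(index="faults") OR (index="status")
| eval join_key=coalesce(status_id, id)
| stats values(*) as * by join_key

This keeps the two-index schema while avoiding the subsearch and result limits that come with join.
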
We have multiple environments and we have to check whether Splunk logs are indexed or not. Based on the environment, we have to run a query. The query below currently runs as follows (Dev1 is missing in env=trafficui):

index=*_dev sourcetype=WebSphere* OR sourcetype=http* env=trafficui OR env=trafficbatchDev1 NOT source=*/nodeagent/* NOT source=*/dmgr/* NOT source=*native_stdout* NOT source=*native_stderr* | stats min(_time) as FirstTime, max(_time) as lastTime by host index sourcetype source | convert ctime(FirstTime) | convert ctime(lastTime)

But I wanted it like:

index=*_dev sourcetype=WebSphere* OR sourcetype=http* env=trafficuiDev1 OR env=trafficbatchDev1 NOT source=*/nodeagent/* NOT source=*/dmgr/* NOT source=*native_stdout* NOT source=*native_stderr* | stats min(_time) as FirstTime, max(_time) as lastTime by host index sourcetype source | convert ctime(FirstTime) | convert ctime(lastTime)

Original Query:

<form>
  <label>DEV1 - Logs Indexed</label>
  <fieldset submitButton="false" autoRun="true">
    <input type="dropdown" token="env_tok" searchWhenChanged="true">
      <label>Select Environment</label>
      <choice value="Dev1">DEV1</choice>
      <choice value="Dev2">DEV2</choice>
      <default>Dev1</default>
      <initialValue>Dev1</initialValue>
    </input>
    <input type="dropdown" token="app_tok" searchWhenChanged="true">
      <label>Select Application</label>
      <fieldForLabel>env</fieldForLabel>
      <fieldForValue>env</fieldForValue>
      <choice value="trafficui OR env=trafficbatch">Traffic</choice>
      <choice value="roadsui OR env=roadsbatch">Roads</choice>
      <change>
        <condition value="Roads">
          <set token="new_search">index=*_dev sourcetype=WebSphere* OR sourcetype=http* env=$app_tok$$env_tok$ NOT source=*/nodeagent/* NOT source=*/dmgr/* NOT source=*native_stdout* NOT source=*native_stderr* | stats min(_time) as FirstTime, max(_time) as lastTime by host index sourcetype source | convert ctime(FirstTime) | convert ctime(lastTime)</set>
        </condition>
      <default>Roads</default>
      <initialValue>Roads</initialValue>
    </input>
    <input type="time" searchWhenChanged="true">
      <label>Select Date</label>
      <default>
        <earliest>-1d@d</earliest>
        <latest>now</latest>
      </default>
    </input>
  </fieldset>
  <row>
    <panel>
      <table>
        <search>
          <query>$new_search$</query>
        </search>
        <option name="showPager">true</option>
        <option name="count">50</option>
        <option name="dataOverlayMode">none</option>
        <option name="drilldown">none</option>
        <option name="percentagesRow">false</option>
        <option name="refresh.display">progressbar</option>
        <option name="rowNumbers">false</option>
        <option name="totalsRow">false</option>
        <option name="wrap">true</option>
      </table>
    </panel>
  </row>
</form>

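One way to get the Dev1/Dev2 suffix onto both environment names is to write the full env clause into each <condition> and reference $env_tok$ after each name, rather than appending it once to the end of $app_tok$; a sketch of that <change> handler under that assumption (the rest of the search is copied unchanged from the original):

<change>
  <!-- repeat the environment token after each env name so both parts get the Dev1/Dev2 suffix -->
  <condition value="trafficui OR env=trafficbatch">
    <set token="new_search">index=*_dev sourcetype=WebSphere* OR sourcetype=http* env=trafficui$env_tok$ OR env=trafficbatch$env_tok$ NOT source=*/nodeagent/* NOT source=*/dmgr/* NOT source=*native_stdout* NOT source=*native_stderr* | stats min(_time) as FirstTime, max(_time) as lastTime by host index sourcetype source | convert ctime(FirstTime) | convert ctime(lastTime)</set>
  </condition>
  <condition value="roadsui OR env=roadsbatch">
    <set token="new_search">index=*_dev sourcetype=WebSphere* OR sourcetype=http* env=roadsui$env_tok$ OR env=roadsbatch$env_tok$ NOT source=*/nodeagent/* NOT source=*/dmgr/* NOT source=*native_stdout* NOT source=*native_stderr* | stats min(_time) as FirstTime, max(_time) as lastTime by host index sourcetype source | convert ctime(FirstTime) | convert ctime(lastTime)</set>
  </condition>
</change>
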
Hi Splunkers,

We have a plan to upgrade the Splunk version to 8.1 in the PROD environment. Before that, we upgraded the version in our test environment. In our organisation we are using one customized app, which is used for sending alerts from Splunk to CAUIM, and this app was built in Splunk Add-on Builder. In PROD we have the source code of the old application; we copied that same file into test after upgrading the version. Now we have to recompile this application so that it is compatible with Python 3 and the new Splunk version. We want to know the flow of recompilation. We tried to export the file but couldn't. Does anybody know the location where that file is stored? Kindly let us know.

Thanks & Regards,
Abhijeet Bandre.

Hi,

I have a log file like this and need to extract "id" from lines where A=20, match those lines to lines where B=10, and finally show them in a single table.

1. Where A=20, extract the id(s) from these lines:

07:59:42.213 app module: Z[200]id[12]A[20]
07:59:42.213 app module: Y[300]id[88]A[20]

2. If an id extracted in the previous step matches a line where B=10, join them and make a table. The shared field between these lines is "id".

07:58:21.533 app module: Q[230]id[12]B[10]
07:58:21.533 app module: V[230]id[88]B[10]

Expected result:

id    A     B
12    20    10
88    20    10

Any idea? Thanks,

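A minimal SPL sketch of one way to do this, assuming the lines above arrive as individual events (the index and sourcetype names are placeholders):

index=your_index sourcetype=your_sourcetype ("A[20]" OR "B[10]")
| rex "id\[(?<id>\d+)\]"
| rex "A\[(?<A>\d+)\]"
| rex "B\[(?<B>\d+)\]"
| stats values(A) as A values(B) as B by id
| search A=20 B=10
| table id A B

The stats by id merges the A-lines and B-lines that share the same id, and the final search keeps only ids that have both A=20 and B=10.
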
I'm a Splunk Add-on/App developer, and the Splunk app I developed has passed AppInspect and runs well on Splunk Enterprise, but how can I get the Splunk Cloud certification flag? The flag looks like this picture:

Can only customers send a Splunk Cloud App request? Really? I think it's too inconvenient that, as a developer, I can't submit my own review request...

Hello, I have a Universal Forwarder and a Heavy Forwarder on a Linux machine; how would I stop and restart them? Any help will be highly appreciated. Thank you so much, I appreciate your support in these efforts.

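For reference, both forwarders ship a splunk binary under their own install directory and are controlled with it; a sketch assuming the default install paths (adjust if yours are installed elsewhere):

# Universal Forwarder (default install path /opt/splunkforwarder)
/opt/splunkforwarder/bin/splunk stop
/opt/splunkforwarder/bin/splunk start
# or, in one step
/opt/splunkforwarder/bin/splunk restart

# Heavy Forwarder (a full Splunk Enterprise install, default path /opt/splunk)
/opt/splunk/bin/splunk restart
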
Is there an SPL search to list all my hosts (Windows & Linux), the version of their UF, and their date, time & TZ please? Thanks a million.

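A sketch of one common approach, using the forwarder-connection metrics that indexers write to _internal; this gives the forwarder version, OS, and last-seen time per host (the forwarder's local time zone is not directly in this data, so that part would need another source):

index=_internal source=*metrics.log* group=tcpin_connections
| stats latest(version) as uf_version latest(os) as os latest(_time) as last_seen by hostname
| convert ctime(last_seen)
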
Has anyone run into an issue where the Microsoft 365 App for Splunk is causing a search head to crash? I'm wondering if part of the issue is the use of the Sankey visualizations on some of the dashboards, given the volume of data it is trying to display. The SHs that are crashing are moderately beefy.

https://splunkbase.splunk.com/app/3786/

Does the new Answers backend support tagging the app? I forget how to do that.

We have Splunk Enterprise + ES. I have a dashboard that I'd like to install in Security Essentials. What permission level does a user need to install this in Security Essentials? ES admin or Enterprise admin level permission? Thanks a million for your reply.

Coming from an older version of Splunk: we basically have HTML links that, when selected, opened a new tab with a pre-saved search. We have about 30 searches on a single page, all unique, and all open in a new tab to display. How is this done with the newest version of Splunk? The only thing I can find is a panel that launches a saved search in a new tab, but it also shows on the dashboard, which is what I don't want, because I need to have 29 other items that can be selected.

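One pattern that still works in current Simple XML is an <html> panel full of plain links pointed at the saved-search redirector, so nothing runs on the dashboard itself; a sketch, with hypothetical report names:

<row>
  <panel>
    <html>
      <!-- each link opens the named saved search in a new tab via the @go redirector -->
      <a href="/app/search/@go?s=My Saved Search 1" target="_blank">Saved search 1</a><br/>
      <a href="/app/search/@go?s=My Saved Search 2" target="_blank">Saved search 2</a><br/>
      <!-- add the remaining links the same way -->
    </html>
  </panel>
</row>
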
I was asked to ask: our alerts rely on various lookups, lookup generators, and other searches. If anything in these underlying layers fails, we have an alert with failing SPL, and these failures are silent, so the alert fails and we have no idea whether it's because of an error in the SPL or because there are no events generating it. Would you ask the Splunk Support groups: do we have any option to create an alert action that sends us an email whenever a scheduled alert's SPL fails due to errors in that SPL? We really need that.

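As a starting point, the scheduler's own log in _internal can itself be alerted on; a sketch that flags scheduled searches whose most recent run did not succeed (status values vary a little by version, so treat the filter as illustrative):

index=_internal sourcetype=scheduler savedsearch_name=*
| stats latest(status) as last_status latest(_time) as last_run by app savedsearch_name
| where last_status!="success"
| convert ctime(last_run)

Saving this as its own scheduled alert gives an email whenever any other scheduled search stops completing cleanly.
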
I have a large environment where the TZs between hosts & Splunk are off by minutes & hours at times. How do I get started? If you have done such a project, please share the procedures and any helpful SPL. Thanks a million.

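A sketch of a first step for sizing the problem: compare event time with index time per host, since a large, consistent offset usually points at a wrong TZ on the host or in props.conf (the 300-second threshold and the time range are illustrative; narrow index=* in a big environment):

index=* earliest=-1h
| eval skew_sec = _indextime - _time
| stats avg(skew_sec) as avg_skew max(skew_sec) as max_skew by host
| where abs(avg_skew) > 300
| sort - avg_skew
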
I have a text input and a dropdown input that both need to allow a blank value. They cannot be null, since the token must be set or the queries that use it won't run. I need a prefix and suffix, both wildcards, only when there's a value, and a single wildcard (*) in its absence.

I tried this, but the prefix and suffix keep multiplying, and soon I have 10 suffixes and 10 prefixes. Here's the text input:

<prefix/>
<suffix/>
<change>
  <eval token="assetFilter">if(len($assetFilter$)&gt;0, "*" . $assetFilter$ . "*", "*")</eval>
</change>

After entering, removing, entering, and removing values, the suffix and prefix kept multiplying and eventually looked like this:

****tag=****

Here's the other one, a dropdown of macros, so it needs the ` character with the wildcard. After selecting, unselecting, selecting, and unselecting values in the dropdown:

<change>
  <eval token="asset">if(len($asset$)&gt;0, "*`" . $asset$ . "`*", "*")</eval>
</change>

*`*`*`*`*`*`*`*`*`*`*`*`*`*`*`ED_ENDI_Asdf`*`*`*`*`*`*`*`*`*`*`*`*`*`*`*

How can I use prefix and suffix conditionally a little better, or in a way that works?

-c

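One way around the self-feeding eval is to build the decorated value in a second token and key the eval off $value$ (the input's raw value), so the raw input is never re-wrapped; a sketch for the text input, with hypothetical token names (the dropdown can follow the same pattern with the backticks):

<input type="text" token="assetFilter_raw" searchWhenChanged="true">
  <label>Asset filter</label>
  <change>
    <!-- $value$ is the raw text just entered; the wildcarded form goes into a separate token -->
    <eval token="assetFilter">if(len($value$)&gt;0, "*" . $value$ . "*", "*")</eval>
  </change>
</input>

Searches would then reference $assetFilter$ while the input owns $assetFilter_raw$; a default for $assetFilter$ (for example via <init>) may still be needed so the panels run before the first change fires.
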
I have a list of file names under one field called "attachment" and I would like to split this string into multiple rows by file name. I tried the split/makemv method but it was unsuccessful. I'd really appreciate any tips on separating this string by file. Thank you.

For example, the raw output:

"image001.png Project_Preference_crosstab (3).csv GC_Preferences - 8 Oct.xlsx GC updated skills with ratings - 30 Sep 2021.xlsx Skill_Details_-_Base_crosstab (3).csv AP Talent list - 30 Sep 2021.xlsx UCD_Skills_Compliance_Details_crosstab (2).csv"

I would like to see:

image001.png
Project_Preference_crosstab (3).csv
GC_Preferences - 8 Oct.xlsx
GC updated skills with ratings - 30 Sep 2021.xlsx
Skill_Details_-_Base_crosstab (3).csv
AP Talent list - 30 Sep 2021.xlsx
UCD_Skills_Compliance_Details_crosstab (2).csv

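Since the file names themselves contain spaces, splitting on whitespace won't work; one sketch, appended to the existing search, is to let rex capture everything up to each file extension and then expand the matches into rows (the extension list only covers the ones visible in the sample):

| rex field=attachment max_match=0 "(?<file>.+?\.(?:png|csv|xlsx))\s*"
| mvexpand file
| table file
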
I need help using the values from a lookup table in multiple fields, where the output from the lookup table is a list of values. The values from the table should replace the "*" in a_imsi, b_imsi, c_imsi... I tried the query below and some other variations, but none of them worked.

index=* sourcetype=v_main (a_imsi=* OR b_imsi=* OR c_imsi=* OR d_imsi=* OR Imsi=*)
| lookup ADHOC.csv Comment OUTPUT Imsi
| eval IMSI=mvappend(a_imsi,b_imsi,c_imsi,d_imsi,Imsi)
| mvexpand IMSI
| bin span=1d _time
| stats sum(TCDuration) as TCDuration by _time IMSI
| eval TCDuration=TCDuration/1000
| eval Utilization=round(((TCDuration/86400)*100),1)
| eval Utilization=if(Utilization >100, 100, Utilization)
| fields - TCDuration
| timechart eval(round(avg(Utilization),1)) by IMSI limit=0

Any ideas would be really helpful. Thanks so much.

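One sketch of replacing the wildcards with the lookup's values is one subsearch per field, each of which expands into an OR list of that field's allowed values (assuming ADHOC.csv has an Imsi column, as the lookup above implies); the rest of the original pipeline then follows unchanged:

index=* sourcetype=v_main
    (  [| inputlookup ADHOC.csv | fields Imsi | rename Imsi as a_imsi]
    OR [| inputlookup ADHOC.csv | fields Imsi | rename Imsi as b_imsi]
    OR [| inputlookup ADHOC.csv | fields Imsi | rename Imsi as c_imsi]
    OR [| inputlookup ADHOC.csv | fields Imsi | rename Imsi as d_imsi]
    OR [| inputlookup ADHOC.csv | fields Imsi] )
| eval IMSI=mvappend(a_imsi,b_imsi,c_imsi,d_imsi,Imsi)
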
What search can I do to find peers with status=down? I'm looking to create an alert when this happens but can't find it with a search.

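A sketch using the REST endpoints rather than indexed events; which one applies depends on whether "peers" means distributed-search peers or indexer-cluster peers.

For distributed-search peers (run on the search head):

| rest /services/search/distributed/peers
| search status!="Up"
| table peerName status

For an indexer cluster (run on the cluster manager):

| rest /services/cluster/master/peers
| search status!="Up"
| table label status
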
I have a dashboard that has a timechart displaying a count of values occurring every hour. My query is:

index=app host=... sourcetype="..." siteType=...
| timechart span=1h count(eval(status!=200)) as Fails
| eval Time = strftime(_time, "%Y-%d-%m %H:%M:%S")
| fields - _time
| table Time, Fails

This works perfectly, but I want to add a drilldown on my table so the user can click on a row and see all the values for that hour. The closest thing I have been able to come up with is this query:

index=app host=... sourcetype="..." siteType=... status!=200 ((earliest=$earliest$ latest<$latest$))

But if I click on a row, it gives me a search that looks like this:

index=app host=... sourcetype="..." siteType=... status!=200 ((earliest=1633096800 latest<1633702712))

And I have an error in the search, "Invalid earliest_time." What is going on here? Is there a conversion I need to do on the earliest and latest tokens to get the correct time?

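For reference, one sketch of a working row drilldown: inline time modifiers need "=" rather than "<", and the clicked hour's epoch boundaries have to travel with the row, because the formatted Time string is not a valid earliest value. The token and field names below (epoch_start, epoch_end, drill_earliest, drill_latest) are illustrative:

<table>
  <search>
    <query>index=app host=... sourcetype="..." siteType=...
| timechart span=1h count(eval(status!=200)) as Fails
| eval epoch_start=_time, epoch_end=_time+3600
| eval Time=strftime(_time, "%Y-%d-%m %H:%M:%S")
| table Time Fails epoch_start epoch_end</query>
  </search>
  <drilldown>
    <!-- hand the clicked row's hour boundaries to a detail panel -->
    <set token="drill_earliest">$row.epoch_start$</set>
    <set token="drill_latest">$row.epoch_end$</set>
  </drilldown>
</table>

A detail panel's search can then use: index=app host=... sourcetype="..." siteType=... status!=200 earliest=$drill_earliest$ latest=$drill_latest$
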
Hello there, I hope someone can help!

A report we generated doesn't send emails, or only sporadically. Also, the action time for this is very high, as you can see here (field "action_time_ms"):

10-08-2021 14:20:35.156 +0200 INFO SavedSplunker - savedsearch_id="nobody;search;SIGNL4 - High or Critical Notable Events Clone", search_type="scheduled", user="maximilian.wehner", app="search", savedsearch_name="SIGNL4 - High or Critical Notable Events Clone", priority=default, status=success, digest_mode=0, scheduled_time=1633695120, window_time=0, dispatch_time=1633695122, run_time=2.838, result_count=1, alert_actions="email", sid="scheduler_bWF4aW1pbGlhbi53ZWhuZXI__search__RMD55d86aa6233cebf27_at_1633695120_428", suppressed=0, fired=1, skipped=0, action_time_ms=509817, thread_id="AlertNotifierWorker-1", message="", workload_pool=""

action_time_ms is a LOT, so something prevents it from being sent, or something else is going on. Usually I think Splunk could send an email without configuring a mail server, but currently we want to use our O365 mail server for it. This has been tested with another environment and there it definitely works like a charm.

Here is the config of the alert and the mail server config:

--> we've artificially set the maxtime very high to check if Splunk finally sends the mail after a while. The record was over 8 minutes until a mail was sent.

My questions now are: how can this happen? Is there a way to further investigate and resolve this issue? Currently this alert is mandatory for a security view, and if this alert only comes every now and then, it's a major issue.

[email]
auth_password = ****
auth_username = user@xyz.de
from = splunk@sxyz.de
mailserver = smtp.office365.com
pdf.header_left = none
pdf.header_right = none
use_tls = 1
reportPaperSize = a4
hostname = somehostname
maxtime = 20m

Is there something wrong with the config? What can I do to further troubleshoot this issue and hopefully resolve it? I guess this issue has come up in the past.

Thanks a lot for the help!

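One place to look while troubleshooting is the email action's own logging in _internal: the sendemail script writes its SMTP activity and errors to python.log, so slow or failing sends usually leave a trace there; a sketch:

index=_internal source=*python.log* "sendemail"
| sort - _time
| table _time host _raw
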
Hello, I have a field with these values:

/v1/accounts/96ea01b5-7ea7-4dc6-b534-39ae8b114bba/transactions
/v1/accounts/ff572b85-c3c6-4e54-8343-75c5aa954285
/v1/accounts/469754d0-9169-45ca-af86-a885142d6ad4/transactions
/v1/accounts/c68b8246-bd76-4d34-9d33-7fb4be4ebe9f/limits
/v1/accounts/d9f1e948-e9aa-4a46-9e78-deeaf1d21143/limits
/v1/accounts/f6fa235c-858d-42d2-80ae-85b12a750351
/v1/accounts/f4a0877f-5807-41ed-b7ee-c6be2e4e25be
/v1/accounts/042c6b58-ea01-48cd-838e-06929b427f75

I need a query that shows me only the lines that don't have anything after the ID. For example:

/v1/accounts/ff572b85-c3c6-4e54-8343-75c5aa954285
/v1/accounts/f6fa235c-858d-42d2-80ae-85b12a750351
/v1/accounts/f4a0877f-5807-41ed-b7ee-c6be2e4e25be
/v1/accounts/042c6b58-ea01-48cd-838e-06929b427f75

Thanks

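A sketch using a regex anchored at the end of the value, assuming the field is called uri (swap in the real field, index, and sourcetype names):

index=your_index sourcetype=your_sourcetype
| regex uri="^/v1/accounts/[0-9a-f-]+$"
| table uri

The $ anchor drops any value that has a further path segment (such as /transactions or /limits) after the ID.
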
I am trying to install the Splunk App for Jenkins (v 2.0.4) in our Splunk Cloud environment (v 8.1.2103.3) and the app is failing vetting. On the Splunkbase page for the app, the compatibility indicates that I should be able to install it. Is someone out there actively maintaining this app who can take note of some changes in a future version?

Thanks,
REID

These are the messages related to the failures (there are some warnings too):

[ Failure Summary ]
Failures will block the Cloud Vetting. They must be fixed.

check_rest_handler_python_executable_exists
  The handler of stanza [script:customIndex] should be `python3` executable. File: default/restmap.conf Line Number: 12
  The handler of stanza [script:customPanel] should be `python3` executable. File: default/restmap.conf Line Number: 1
  The handler of stanza [script:userValidation] should be `python3` executable. File: default/restmap.conf Line Number: 23

check_for_telemetry_metrics_in_javascript
  The telemetry operations are not permitted. Match: window._splunk_metrics_events.push File: appserver/static/pages/job.js Line Number: 6
  The telemetry operations are not permitted. Match: window._splunk_metrics_events.push File: appserver/static/pages/health.js Line Number: 6
  The telemetry operations are not permitted. Match: window._splunk_metrics_events.push File: appserver/static/pages/audit.js Line Number: 6

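For the restmap.conf failures specifically, AppInspect is usually satisfied by declaring the Python version on each script stanza; a sketch of what an app maintainer would add (stanza names taken from the messages above; the handler scripts themselves also have to be Python 3 compatible, and the telemetry findings need the JavaScript changed separately):

# default/restmap.conf
[script:customIndex]
python.version = python3

[script:customPanel]
python.version = python3

[script:userValidation]
python.version = python3
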