Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

Or maybe it is easier to mention the date in the description when an e-mail is sent. This is my search at the moment: index=smsc tag=MPRO_PRODUCTION DATA="*" command_id_description="*" NOT (...) command_status_code="*" NOT (...) | dedup DATA | chart count by SHORT_ID, command_status_code | search NOT ESME_RTHROTTLED=0 | sort - ESME_RTHROTTLED | head 15 Thanks for your help!
| metadata type=sourcetypes index=* group by index | search sourcetype=* | where lastTime < (now() - 86400) | eval Duration=tostring(now() - lastTime,"duration") | search Duration="*" | fields sourcetype lastTime Duration | sort - lastTime | eval lastTime = strftime(lastTime,"%Y/%m/%d %H:%M" ) | rex field=Duration "(?(\d+))+"
Hi, I am trying to display active/inactive directory users in Splunk. Splunk active/inactive users <input type="radio" token="active_account"> <label>Active accounts</label> <choice value="*">all</choice> <choice value="1">active</choice> <choice value="0">inactive</choice> <default>1</default> </input> <input type="text" token="user_field" searchWhenChanged="true"> <label>User:</label> <default>*</default> </input> <input type="text" token="role_field" searchWhenChanged="true"> <label>Role:</label> <default>*</default> </input> <panel> <table> <search> <query>| rest /services/authentication/users | dedup title | rename title as user | eval firstHit=0 | eval lastHit=0 | eval active=1 | table user, firstHit, lastHit, roles, active | inputlookup append=true splunk_users | eval user=if(isnull(_key), user, _key) | stats max(firstHit) as firstHit, max(lastHit) as lastHit, values(roles) as roles, max(active) as active by user | convert timeformat="%Y-%m-%d %H:%M:%S" ctime(firstHit) | convert timeformat="%Y-%m-%d %H:%M:%S" ctime(lastHit) | eval active=if(active==1, active, 0) | search user="$user_field$" | search active=$active_account$ | search roles="$role_field$"</query> <earliest>-15m@m</earliest> <latest>now</latest> </search> <option name="wrap">true</option> <option name="rowNumbers">true</option> <option name="dataOverlayMode">none</option> <option name="drilldown">none</option> <option name="count">100</option> </table> </panel> User/Role/Index Management <panel> <title>Splunk indexes with corresponding roles</title> <input type="radio" token="view_field1" searchWhenChanged="true"> <label>View:</label> <choice value="| nomv index">One line</choice> <choice value="">Human readable (currently not working)</choice> <default>| nomv index</default> </input> <input type="text" token="role_field1" searchWhenChanged="true"> <label>Role:</label> <default>*</default> </input> <input type="text" token="index_field1"> <label>Index:</label> <default>*</default> </input> <table> <search>
<query>| inputlookup admin_role_indexes | eval index = mvappend(srchIndexesAllowed, imported_srchIndexesAllowed) | fields role, index $view_field1$ | search role=$role_field1$ | search index=$index_field1$ | dedup role | rex field=index max_match=200 "(?<idx>\w+)" | lookup admin_indexes_data_owners index as idx | stats values(index) as index, values(data_owner) as data_owner by role</query> <earliest>-15m@m</earliest> <latest>now</latest> </search> <option name="count">20</option> <option name="dataOverlayMode">none</option> <option name="drilldown">none</option> <option name="rowNumbers">false</option> <option name="wrap">true</option> </table> </panel> <panel> <title>Splunk users details</title> <input type="radio" token="view_field2" searchWhenChanged="true"> <label>View:</label> <choice value="| nomv index | nomv role">One line</choice> <choice value="">Human readable (currently not working)</choice> <default>| nomv index | nomv role</default> </input> <input type="text" token="user_field2" searchWhenChanged="true"> <label>User:</label> <default>*</default> </input> <input type="text" token="role_field2" searchWhenChanged="true"> <label>Role:</label> <default>*</default> </input> <input type="text" token="index_field2"> <label>Index:</label> <default>*</default> </input> <table> <search> <query>| inputlookup admin_user_index_role | rename roles as role $view_field2$ | search user=$user_field2$ | search role=$role_field2$ | search index=$index_field2$ | lookup splunk_users _key as user OUTPUT lastHit as last_seen| eval user=if(isnull(_key), user, _key) | convert timeformat="%Y-%m-%d %H:%M:%S" ctime(last_seen) | table user, last_seen, index, role | eval last_seen=if(isnull(last_seen), "never", last_seen)</query> <earliest>-15m@m</earliest> <latest>now</latest> </search> <option name="wrap">true</option> <option name="rowNumbers">false</option> <option name="dataOverlayMode">none</option> <option name="drilldown">none</option> <option name="count">20</option> </table> </panel>
Hello All, Trying to set up the Splunk add-on for New Relic and getting these errors in the logs:
ValueError: No JSON object could be decoded
return _default_decoder.decode(s)
File "/opt/splunk/lib/python2.7/json/decoder.py", line 364, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/opt/splunk/lib/python2.7/json/decoder.py", line 382, in raw_decode
raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded
2020-02-05 07:09:20,410 INFO pid=19395 tid=MainThread file=connectionpool.py:_new_conn:758 | Starting new HTTPS connection (1): 127.0.0.1
2020-02-05 07:09:21,302 INFO pid=19395 tid=MainThread file=connectionpool.py:_new_conn:758 | Starting new HTTPS connection (1): 127.0.0.1
2020-02-05 07:09:23,247 INFO pid=19395 tid=MainThread file=connectionpool.py:_new_conn:758 | Starting new HTTPS connection (1): 127.0.0.1
2020-02-05 07:09:25,140 INFO pid=19395 tid=MainThread file=setup_util.py:log_info:114 | Proxy is not enabled!
Looking at the logs using this search: index=_internal newrelic sourcetype="splunk_ta_new_relic_new_relic_account_input-too_small" Any help is much appreciated. Regards, Serge
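For what it's worth, "ValueError: No JSON object could be decoded" is what Python 2's json module raises whenever it is handed an empty or non-JSON payload, so a common cause is the New Relic API returning an empty body or an HTML error page (bad API key, proxy in the way, etc.) rather than JSON. A minimal sketch of that failure mode, under those assumptions (the parse_api_body helper is hypothetical, purely for illustration; Python 3 raises json.JSONDecodeError, a ValueError subclass):

```python
import json

def parse_api_body(body):
    """Parse an API response body, raising a clearer error when the
    payload is empty or blank instead of the bare json traceback."""
    if not body.strip():
        raise ValueError("empty response body - check API key/endpoint/proxy")
    return json.loads(body)  # non-JSON bodies still raise ValueError here
```

In practice this means capturing the raw HTTP response the add-on receives (or replaying the request with curl) to see what the API actually returned.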
Hi, intermittent events are missing when pulling through the Microsoft Log Analytics Add-on (formerly known as OMS). I cannot find any ERROR or WARN entries in the internal logs. When I tried pulling with a larger Event Delay / Lag Time it pulled all the events, so it is working; but when I changed back to 15 min it again has this intermittent event loss. interval: 60 Also, is there a plan for Python 3 support? Eventually Splunk 8 will move to Python 3.
I would like to export splunkd.log from production and import it into my sandbox for analysis. Once I export splunkd.log using raw format, the file looks like this:
"01-17-2020 13:53:20.815 +0800 INFO loader - Splunkd starting (build 2dc56eaf3546)."
"01-17-2020 13:53:20.816 +0800 INFO loader - Detected 8 (virtual) CPUs, 8 CPU cores, and 7822MB RAM"
"01-17-2020 13:53:20.816 +0800 INFO loader - Maximum number of threads (approximate): 3911"
It has double quotes around the actual raw data. Is there a quick method to remove them so that I can add it to my sandbox?
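One quick option is a small one-off script that strips the wrapping quotes before you ingest the file. A sketch, assuming each exported line is wrapped in exactly one pair of double quotes and that any quote characters inside an event were doubled CSV-style (if your export does not double them, raw log lines rarely contain consecutive double quotes, so the replace is usually harmless):

```python
def unquote_line(line):
    """Remove one pair of surrounding double quotes from an exported raw
    line, collapsing CSV-style doubled quotes inside it."""
    line = line.rstrip("\n")
    if len(line) >= 2 and line.startswith('"') and line.endswith('"'):
        line = line[1:-1].replace('""', '"')
    return line

def unquote_file(src_path, dst_path):
    """Rewrite a whole exported file with the quoting removed."""
    with open(src_path) as fin, open(dst_path, "w") as fout:
        for line in fin:
            fout.write(unquote_line(line) + "\n")
```

Run unquote_file("export.log", "splunkd_clean.log") and point your sandbox monitor input at the cleaned file.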
I'm trying to embed a webpage within a Splunk dashboard; this is the source code: <panel> <html> <h3>Embedded Web Page! $url$</h3> <iframe src="$url$" width="100%" height="300"/> </html> </panel> This works perfectly in Splunk Enterprise, but the iframe is not recognized at all in Splunk Cloud. Is that a restriction in Cloud, or am I missing something?
Hello Splunkers, Please help me find a solution for this use case. Use case: Can we add "View dashboard in Splunk" as a link in the alert e-mail notification, like the existing "View results in Splunk" and "Link to Alert" links in the alert mail configuration? What I tried: <a href="https://www.splunk.com/">Visit Dashboard</a> I thought "Visit Dashboard" would act as a hyperlink like it does in HTML, but here it is not working. I also tried the link tag, but still get the same result.
I have created an alert and configured it for "Triggered Alerts" and "Send email". I want the "link to results" sent in the email to expire after 60 days. Is it possible to change the expiry duration of the email link? If yes, how can I do so? I have tried changing it via Settings > Searches, Reports & Alerts > (alert name) > Advanced Edit > alert.expires. I tried setting "alert.expires" to 5 mins but the link did not expire in 5 mins. I also tried changing the ttl value under the [email] stanza in alert_actions.conf, but this did not work either. Please suggest a method to change the expiration duration of the email's "link to results". I also want to add one more link in the email, pointing to a dashboard that displays the total alerts generated. How can I add a "link to dashboard" in the mail along with the already present "link to results" of the current alert?
Hi folks, I have a requirement to add a custom app to the default Splunk executable. Currently we have a Splunk .tar setup which we untar to install, and when the Splunk service starts we place our custom application in the "etc/apps" folder. I was thinking: is it possible to untar the setup file, place my app in the "etc/apps" folder, and tar it up again, then hand that over for deployment so that it becomes a one-step process? Is that possible, or would it result in some hash mismatch since the original setup is being tampered with?
Hello, Please advise how one can back up an existing Splunk instance (7.0) and restore the saved data to another, new Splunk instance (latest version). I can schedule a shutdown for the backup/restore. Thanks
Need some suggestions for field extraction. Take this as an example: I have a file path /opt/splunk/var/log/splunk/splunkd.log. There is already a field extraction for this called file_name. I would like to do a field extraction with just the directory path (/opt/splunk/var/log/splunk/), named dir_name. The problem arises when I try to do a new extraction, as the path is already partly used by file_name, and Splunk tells me "To highlight text that is already part of an existing extraction, first turn off the existing extractions". My doubt is this: if I turn off the existing extraction and then create one for dir_name, will I still be able to use file_name, or does it get overridden by the new extraction? Thanks, AKN
Hi All, I've got into a weird situation where I need to get data from 10 different companies into Splunk, with two indexers in a single-site cluster and one SH. I'm in the planning stage. I'm wondering how we can install apps for each company individually on the same indexers, since each company will have its own indexes. For example: if I install the Windows add-on, it would give results for only one company. How can I get data for the other 9 companies as well? Please suggest.
Hi, I have an alert action that triggers a python script; In the intended workflow, this alert action can either 1. be manually executed by a user, or 2. be scheduled to execute as an alert action of several different alerts How can I pass 1. the username that manually runs the search in case 1, and 2. the name of the alert that triggered this alert action into the python script itself? E.g. if user Alice@zzz.com runs search xxx | sendalert alert_action_1 I want to use the variable "Alice@zzz.com" in the python script; and if "Alert_ABC" triggers the action alert_action_1, I want to use the variable "Alert_ABC" in the python script I've got fields like "_raw" working, but couldn't find any parameters related to what/who triggered the alert action itself... Any hints would be really appreciated!
Currently on our UFs we have a very old version of splunk_ta_windows, 4.8.1. We want to upgrade to 7.0, the current version. Do we need to upgrade stepwise from 4.8.1 and then to 6, or can we upgrade directly to 7? And do we need to wait any length of time between upgrading to 6 and then to 7?
I have a lookup that has 2 columns, IP address and hostname; I see output when I run | inputlookup serverip.csv. Now I want to use these IPs to look at the src_ip field in firewall logs and find matches; after a match, look at destinations like dest_ip, dest_location, etc. So: | inputlookup serverip.csv, and then the other query index=firewall | stats values(dest_IP), values(url), values(dest_location) by src_ip, where src_ip is an IP from the csv. How do I do the correlation? Also, the csv has the server name, which I would like to pull into the output. Thanks in advance for any help.
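The usual SPL pattern here is a subsearch that turns the lookup into a filter, something like index=firewall [| inputlookup serverip.csv | rename ip AS src_ip | fields src_ip] | lookup serverip.csv ip AS src_ip OUTPUT hostname | stats values(dest_ip) values(url) values(dest_location) by src_ip hostname. The column names ip and hostname are assumptions; rename them to match your csv header. The underlying join logic, sketched in Python:

```python
import csv, io

def correlate(lookup_csv_text, events):
    """Keep only events whose src_ip appears in the lookup, attaching the
    lookup's hostname to each match (columns 'ip'/'hostname' assumed)."""
    ip_to_host = {row["ip"]: row["hostname"]
                  for row in csv.DictReader(io.StringIO(lookup_csv_text))}
    return [{**ev, "hostname": ip_to_host[ev["src_ip"]]}
            for ev in events if ev.get("src_ip") in ip_to_host]
```

The lookup command at the end is what pulls the hostname column into the output alongside the stats fields.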
Hi, For https://docs.appdynamics.com/display/PRO45/Install+the+.NET+Agent+for+Windows#Installthe.NETAgentforWindows-install_procedure, the AppDynamics Download Center link is pointing to "https://www.appdynamics.com/download", resulting in a 404. I believe the actual link should be "https://download.appdynamics.com/download/" Thanks, William
Hello, I want to break up the TestTransaction inside the TestVal values; the JSON needs to be broken up to show all the field values inside it. How can this search be rewritten? Splunk search: index=* sourcetype=WORKER | fields TestVal TestVal values: {"@t":"2020-02-04T22:16:20.8458700Z","@mt":"{@parameters}","parameters":{"info":"Published","message":{"TestTransaction":{"sampleval":"10298684736384305384235533777352","EntryType":141,"CheckNumber":783562,"CheckCloseDate":"2020-02-04T22:16:08.0000000Z","CurrencyCode":"USD","Tenders":[{"Amount":5.3,"Description":"SBUX Card","TenderId":"SV5j8AtfYVm","SvcVal":6147524390259141,"CurrencyCode":null,"$type":"TestTender"}],"TotalAmount":5.3,"SubtotalAmount":4.95,"TaxAmount":0.35,"DiscountAmount":0.0,"Header":{"ServiceType":null,"Number":22,"PosRequestDate":"2020-02-04T22:16:08.0000000Z","$type":"TestHeader"},"Preparation":"ConsumeOutOfStore","TestDetails":{"Discounts":[],"Items":[{"Qty":1.0,"Sku":null,"Price":4.95,"Discounts":[],"Description":null,"Price":null,"Suffix":null,"ChildItems":[],"Commerce":{"Sku":"11105767","edSku":null,"PosStatus":null,"Value":null,"$type":"Commerce"},"Product":{"ProductTypeId":11,"ProductType":"Beverage","ProductNumber":2123078,"FormCode":"salty","SizeCode":"test","LocalDescription":"test","$type":"Product"},"IsRefunded":false,"IsTaxed":false,"Summary":{"TotalPrice":4.95,"DiscountAmount":0,"SubtotalAmount":4.95,"$type":"TestItemSummary"},"$type":"TestItem"}],"Taxes":[{"Name":"State+Local Meals Tax 7%","Amount":35,"$type":"TestTax"}],"ReceiptLines":[],"Delivery":null,"$type":"TestDetails"},"$type":"TestTransaction"},"RequestId":"pos-200204141619-prodrh50592773796","MessageId":"BTxnApi_MID_b6cea268-af3c-4334-85df-c34108e81705","$type":"UpsertTestTransaction"}}}
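The usual SPL answer is | spath input=TestVal, which explodes the JSON into fields with dotted names (and {} for array elements). To preview what field names to expect from a payload like the one above, here is a small Python sketch of the same flattening (a simplification: the last value wins when an array repeats a key):

```python
import json

def flatten(obj, prefix=""):
    """Flatten nested JSON into dotted field names, using {} for list
    elements, roughly the way spath names multivalue fields."""
    fields = {}
    if isinstance(obj, dict):
        for key, val in obj.items():
            fields.update(flatten(val, prefix + "." + key if prefix else key))
    elif isinstance(obj, list):
        for item in obj:
            fields.update(flatten(item, prefix + "{}"))
    else:
        fields[prefix] = obj
    return fields
```

So a value like parameters.message.TestTransaction.Tenders{}.Amount is the kind of field name spath would produce for the Tenders array.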
I need an SPL search that takes input from the Authentication dataset in the Authentication datamodel while also taking the Expired_Identities dataset from the Identity_Management datamodel. I want only the matches; then I need the event time from when the authentication happened and when the identity expired, then eval. Below is what I have so far. It appends both data sources together, and foreach is supposed to look for matches. It does, but I think it is only comparing the columns side by side, not searching the entire column for each entry in users. Thanks, any help would be greatly appreciated. | datamodel Authentication "Authentication" search | stats count by Authentication.user | rename Authentication.user as user | appendcols [| datamodel Identity_Management "Expired_Identities" search | stats count by All_Identities.LoginID] | foreach user [eval match=if(user=All_Identities.LoginID, user, NULL)] | table user All_Identities.LoginID count match
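The root issue is that appendcols pastes the two result sets together row by row, so the eval compares row 1 of user against row 1 of All_Identities.LoginID, row 2 against row 2, and so on. What you want is set membership over the whole column, which in SPL is usually a join on the user field (after renaming All_Identities.LoginID to user) or a subsearch filter. The intended matching logic, sketched in Python:

```python
def expired_logins_that_authenticated(auth_users, expired_ids):
    """Return users present in BOTH lists - a membership check over the
    whole column, not a positional row-by-row comparison."""
    expired = set(expired_ids)
    return sorted(u for u in set(auth_users) if u in expired)
```

Once you join rather than appendcols, each matched row carries both events' fields, so the two timestamps land on the same row for the eval.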
We have a large number of hosts reporting to Splunk, and sometimes (rarely) some of them stop sending events. Is there an elegant search for hosts that last reported anything more than T ago? I'd like to make an alert for T being above, say, 6 hours or so...
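One lightweight approach is the metadata command, which reads index metadata instead of scanning events, along the lines of | metadata type=hosts index=* | where now() - recentTime > 21600 (recentTime is the most recent event time per host; 21600 seconds = 6 hours). The threshold logic itself, sketched in Python:

```python
import time

def silent_hosts(recent_times, threshold_secs=6 * 3600, now=None):
    """Given {host: most_recent_event_epoch}, return hosts whose newest
    event is older than the threshold, longest-silent first."""
    now = time.time() if now is None else now
    stale = {h: now - t for h, t in recent_times.items()
             if now - t > threshold_secs}
    return sorted(stale, key=stale.get, reverse=True)
```

Scheduling that search every 15 minutes or so with an alert on result count > 0 gives the "host went quiet" alarm.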