All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


When inputting a custom field of date/time type in a container, is there any way to provide input hints and input validation? Currently custom fields only support text/select in 4.10.x, and validation done by a playbook or an action is not real-time.
Failure to open the Phantom (4.10.x) GUI after setting up warm standby; there is no error message when setting up warm standby and starting Phantom. Any further troubleshooting steps or logs to check?
Hi there, I have two separate queries that I built using rex.

1. This query captures the log-on and log-off status of the service.

Query:
index=windows_log host=abc-05-hiddencam logged*
| rex field=_raw "(?<Date>\w{3}\s+\d+ \d+:\d+:\d+)\s(?<hostname>\w+-\w+-\w+).+Audit\S+\s\w+\s\w+\s(?<status>.+).\s\s\s\sSub.*"
| eval "Hidden Cam Monitoring" = Date + " : " + hostname + " " + status
| table "Hidden Cam Monitoring"

1. Sample output:
Dec 10 13:35:12 : abc-05-hiddencam successfully logged on
Dec 10 06:19:24 : abc-05-hiddencam successfully logged on
Dec 10 06:17:01 : abc-05-hiddencam logged off
Dec 10 06:11:55 : abc-05-hiddencam logged off

2. This query captures the service entering the running or stopped state.

Query:
index=windows_log host=abc-05-hiddencam entered*
| rex field=_raw "(?<Date>\w{3}\s+\d+ \d+:\d+:\d+)\s(?<hostname>\w+-\d+-\w+).*(?<status>service\s\w+\s\w+\s\w+\s\w+)"
| eval "Hidden Cam Monitoring" = Date + " : " + hostname + " " + status
| table "Hidden Cam Monitoring"

2. Sample output:
Dec 10 16:10:04 : abc-05-hiddencam service entered the stopped state
Dec 10 15:31:31 : abc-05-hiddencam service entered the stopped state
Dec 10 15:28:19 : abc-05-hiddencam service entered the running state
Dec 10 15:28:18 : abc-05-hiddencam service entered the running state

My issue is that I want to combine the above queries into a single query and get the output in one table, as shown below.

3. Expected sample results:
Dec 10 13:35:12 : abc-05-hiddencam successfully logged on
Dec 10 16:10:04 : abc-05-hiddencam service entered the stopped state
Dec 10 06:19:24 : abc-05-hiddencam successfully logged on
Dec 10 15:28:18 : abc-05-hiddencam service entered the running state
Dec 10 06:17:01 : abc-05-hiddencam logged off
Dec 10 15:28:19 : abc-05-hiddencam service entered the running state
Dec 10 06:11:55 : abc-05-hiddencam logged off
Dec 10 15:31:31 : abc-05-hiddencam service entered the stopped state

(The actual results will differ from the above depending on the timestamps and events. What I mean is that events from both searches come mixed together in a single table, as and when they take place.)

Thank you heaps in advance.
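One way to sketch a combined search, assuming both event types live in index=windows_log and each rex only matches its own event type: search both keywords at once, run both extractions with distinct field names, and coalesce whichever one matched.

index=windows_log host=abc-05-hiddencam (logged* OR entered*)
| rex field=_raw "(?<Date1>\w{3}\s+\d+ \d+:\d+:\d+)\s(?<hostname1>\w+-\w+-\w+).+Audit\S+\s\w+\s\w+\s(?<status1>.+).\s\s\s\sSub.*"
| rex field=_raw "(?<Date2>\w{3}\s+\d+ \d+:\d+:\d+)\s(?<hostname2>\w+-\d+-\w+).*(?<status2>service\s\w+\s\w+\s\w+\s\w+)"
| eval "Hidden Cam Monitoring" = coalesce(Date1, Date2) + " : " + coalesce(hostname1, hostname2) + " " + coalesce(status1, status2)
| table "Hidden Cam Monitoring"

Because both extractions run over the same event stream, the rows come out interleaved in event order; add | sort - _time before the table if you need the ordering to be explicit.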
Batch input is described in the documentation on file ingestion using inputs.conf. I do not see it mentioned in "Monitor files and directories in Splunk Enterprise with Splunk Web" and cannot find a button for it in the GUI. Is there an option to do so?
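For reference, a minimal batch stanza in inputs.conf, with a hypothetical path, sourcetype, and index; note that batch requires move_policy = sinkhole, which deletes each file once it is indexed.

# inputs.conf -- hypothetical path, sourcetype, and index
[batch:///var/log/staging/*.log]
move_policy = sinkhole
sourcetype = my_sourcetype
index = main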
Our SHs and indexers are clustered. Not sure if this has to do with AWS going down yesterday, but I noticed the error early today, and it had a red exclamation on it by the end of the day. Restarting the cluster master did not help. Thanks, and happy holidays.
Hi, I am new to SPL and have figured out how to do one rex field extraction, like this:

index=xxxxx "PUT /app/1/projects"
| rex field=_raw "HTTP\/1\.1\" (?P<Status_Code>[^\ ]*)"

This is from the following log line in the search results:

HTTP/1.1" 200 44 188

This gives me the status code, which I can sort and report on, for example 200, 201, 400 or 500. I now need to use the last field (the 2- or 3-digit number) to get the speed. How would I do that? I am stuck with the formatting. Thanks in advance.
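A sketch of one way to do it, assuming the three trailing numbers are the status code, a byte count, and the response time you are after: extend the rex to capture all three, then report on the last one.

index=xxxxx "PUT /app/1/projects"
| rex field=_raw "HTTP\/1\.1\" (?P<Status_Code>\d+)\s+(?P<Bytes>\d+)\s+(?P<Response_Time>\d+)"
| stats avg(Response_Time) AS avg_response_time by Status_Code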
Hello community,

I have an issue in my environment and I have been trying for a while to catch the root cause, and I feel I am not even close. I am receiving this message frequently [screenshot omitted], and I don't know where it comes from. I checked %iowait at the OS level and it never goes above 0.02, but the IOWait alert keeps coming for the search heads and the indexers as well. I checked the resources and there is no issue there [screenshot omitted]. I also checked the CPU, both with a search and via the Monitoring Console, and there is no heavy CPU use for the last 4 hours [screenshot omitted]. So I am really confused; I don't know if I am missing something. Version is 8.2.2, clustered environment. Can you please help me with this?

Kind regards.
We have two inputlookup files, one with All-users and another with Disabled-users. Is there a way to remove the records from the All-users inputlookup file if the user matches/exists in the Disabled-users file, and, if needed, generate a new outputlookup file with the results? Both files have the same field name, sAMAccountName. We've tried dedup and append=f with no luck so far. We also tried uniq, which I thought should return only unique records, but unfortunately could not get it to work. Thanks in advance for your help.
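A common pattern for this, sketched with hypothetical file names All-users.csv, Disabled-users.csv, and Active-users.csv: use the disabled list as a subsearch filter, then write the remainder back out.

| inputlookup All-users.csv
| search NOT [ | inputlookup Disabled-users.csv | fields sAMAccountName ]
| outputlookup Active-users.csv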
Hi all,

This is a sample Azure NSG log ingested from Azure Log Analytics:

"aaaedbb3-407b-4d6c-9f11-dc4640e9acf4", "Azure", "", "", "2021-12-10T19:06:17.001Z", "", "", "", "", "", "", "", "", "", "", "2", "2021-12-10T18:00:00Z", "2021-12-10T19:00:00Z", "2021-12-10T18:09:01Z", "2021-12-10T18:36:26Z", "S2S", "", "10.115.1.77", "34.206.244.234", "", 54443, "T", "Unknown", "O", false, "A", "d88af0da-cfee-4f3e-bb50-58341fe4e132/c-hal-it-ss-prod-eus-rg/cap-subnet1-nsg", "0|cap_mgmt_to_hal|O|A|4", "cap_mgmt_to_hal", "UserDefined", "d88af0da-cfee-4f3e-bb50-58341fe4e132", "", "eastus", "", "c-halazops-connectivity-eus-criticalassetprotection-rg/np1caps009v-nic1", "c-halazops-connectivity-eus-criticalassetprotection-rg/np1caps009v-nic1", "", "c-halazops-connectivity-eus-criticalassetprotection-rg/np1caps009v", "c-halazops-connectivity-eus-criticalassetprotection-rg/np1caps009v", "", "c-hal-it-ss-prod-eus-rg/c-hal-it-ss-prod-eus-vnet1/cap-subnet1", "", "", "", "", "", "", "", "", "d88af0da-cfee-4f3e-bb50-58341fe4e132/c-hal-it-ss-prod-scus-rg/c-hal-it-ss-prod-scus-er2", "AzurePrivatePeering", "d88af0da-cfee-4f3e-bb50-58341fe4e132/c-hal-it-ss-prod-eus-rg/c-hal-it-ss-prod-eus-scus-conn2", "", "", "", 0, 0, 4, 0, 4, 39, 34, 26863, 4706, 4, "", "", "", null, "", "", "", "", "", "", "", null, "", "", "", "", "", "", "ExpressRoute", null, "", null, "", "", null, "", "", "", "", "", "", "", "", "", "", "", "", "", "", "", "c-hal-it-ss-prod-eus-rg/c-hal-it-ss-prod-eus-vnet1/cap-subnet1", "", "", "", "", "", "", null, null, "", null, "", "", "", "", null, null, "", "", "", null, null, "", "", null, null, "", null, "", "", "", null, "", "", "", "", "eastus", "", "FlowLog", "d88af0da-cfee-4f3e-bb50-58341fe4e132", "", "2021-12-10T19:06:11.622Z", "", "", "", "", "", "", "", null, "", "", "", null, "", "", "", "", "", "", null, "00-0D-3A-1A-C0-F7", "", "", "", "", null, "", "", null, null, null, null, "", "", "AzureNetworkAnalytics_CL", ""

Can anybody please help me parse this into meaningful data?
Hi, hoping to get some more insight on my current problem. I am using a where clause to capture data for a specific field value. If the specific value does not exist for the current time period, I get the message "No results found. Try expanding the time range." Instead of the no-results message, I would like to display something else. The following is an example:

index=sample_idex sourcetype="smf001"
| fields _time, FIELD
| lookup sample_lookup.csv system as FIELD output sample_env
| eval e=if(in(sample_env, "env"), 1, 0)
| where e=1
| where FIELD=="value"
| table FIELD

I was thinking of doing something like the following, with proper syntax:

| eval where FIELD=="value" else
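One common pattern, sketched against the search above: appendpipe a placeholder row that only survives when the pipeline produced zero results.

index=sample_idex sourcetype="smf001"
| fields _time, FIELD
| lookup sample_lookup.csv system as FIELD output sample_env
| where in(sample_env, "env") AND FIELD=="value"
| table FIELD
| appendpipe
    [ stats count
    | eval FIELD="no data for the selected time period"
    | where count=0
    | fields FIELD ]

The stats count inside appendpipe yields 0 only when the main search returned nothing, so the placeholder row appears exactly in the no-results case.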
Hi all,

We have Splunk on-prem and have recently started using Duo for authentication. We are interested in knowing whether anyone has configured SSO for their Splunk deployment and how they did it. The Duo documentation points at having to install a Duo Access Gateway. At the moment we have an AD sync to Duo but no DAG. We also have access to Azure and wondered if we could use that as an identity provider instead?

Thanks!
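Not an authoritative recipe, but for orientation: if you use Azure AD (or any SAML identity provider), Splunk's side of the SSO configuration lives in authentication.conf, roughly along these lines. The stanza name, entity ID, and URL below are placeholders.

# authentication.conf -- placeholder values
[authentication]
authType = SAML
authSettings = saml_idp

[saml_idp]
entityId = splunk-prod
idpSSOUrl = https://login.microsoftonline.com/<tenant-id>/saml2
idpCertPath = $SPLUNK_HOME/etc/auth/idpCerts/
signAuthnRequest = true
signedAssertion = true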
Hi @Anonymous / @Anonymous,

I have recently started using your "File/Directory Information Input" app. I believe that it does not work with Splunk's Python 3, which is the default version in Splunk 8. Is this something that you still work on and maintain?

I have been able to get it working if I set "python.version = python2" in Splunk's system/server.conf. It would be better, though, if I could set this within the app rather than system-wide.

In general it has been working for me when I use it on a UF that has the latest version of Python 2, so 2.7.5-89 works on Linux. It does have some issues around the 'file_filter' setting when filtering; this again worked closer to expectations for me once Python 2 was patched to the latest minor release, 2.7.5-89.

But when it works it is great and does exactly what I want, so thank you very much.

Regards
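A possible narrower workaround, assuming the app's data collection is a scripted or modular input: inputs.conf supports a per-stanza python.version setting, so something like this in the app's local/inputs.conf might scope the interpreter to just that input (the stanza path is a placeholder).

# local/inputs.conf in the app -- placeholder script path
[script://$SPLUNK_HOME/etc/apps/file_dir_info/bin/file_dir_info.py]
python.version = python2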
We can send emails to recipients, but they do not include the host name that generated the alert.
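One way to sketch a fix, assuming the alert's search returns a host field: email alert actions accept result tokens in the subject and message, so the first result's host can be embedded like this.

Subject:  Splunk Alert: $name$ on $result.host$
Message:  The alert "$name$" fired for host $result.host$.

If the search aggregates away the host field (e.g. a stats without "by host"), add it back in the search so $result.host$ has something to resolve to.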
We are just now beginning to deploy the Splunk universal forwarder for Linux in our large organization. We are running the agent as a systemd service. The file /opt/splunkforwarder/etc/apps/<redacted>/local/deploymentclient.conf has these settings:

[deployment-client]
clientName = <redacted>

[target-broker:depolymentServer]
targetUri = splunkuf-<redacted>:8089
#
# The splunk UF will phone home every 14400 sec = 4hrs
# 300sec=5min 900sec=15min
phoneHomeIntervalInSecs = 900

On RedHat 8.4 systems there are no issues when I run "systemctl status SplunkForwarder -l", but on RedHat 7.9 I get the following:

Splunk> Australian for grep.
Checking prerequisites...
Management port has been set disabled; cli support for this configuration is currently incomplete.
Invalid key in stanza [target-broker:depolymentServer] in /opt/splunkforwarder/etc/apps/<redacted>/local/deploymentclient.conf, line 12: phoneHomeIntervalInSecs (value: 900).
Your indexes and inputs configurations are not internally consistent. For more information, run 'splunk btool check --debug'
Checking conf files for problems...
Done
Checking default conf files for edits...

If I move the phoneHomeIntervalInSecs entry to under [deployment-client], I don't get the error:

[deployment-client]
clientName = <redacted>
phoneHomeIntervalInSecs = 900

[target-broker:depolymentServer]
targetUri = splunkuf-<redacted>:8089
#
# The splunk UF will phone home every 14400 sec = 4hrs
# 300sec=5min 900sec=15min
# phoneHomeIntervalInSecs = 900

Please advise on the correct location for this setting.

Thanks
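For what it's worth, a sketch of the placement as it is usually documented: phoneHomeIntervalInSecs belongs under [deployment-client], while targetUri belongs under [target-broker:deploymentServer]. Note the conventional stanza spelling, deploymentServer, which differs from the depolymentServer spelling in the file above.

# deploymentclient.conf -- documented stanza placement
[deployment-client]
clientName = <redacted>
# phone home every 900 sec = 15 min
phoneHomeIntervalInSecs = 900

[target-broker:deploymentServer]
targetUri = splunkuf-<redacted>:8089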
Aloha,

We have a reporting requirement to create a pie chart using two input files. So far we have successfully created bar charts with inputlookup files. Could you please advise the best way to create a pie chart using two inputlookup files?

Thanks in advance.
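A minimal sketch, assuming hypothetical file names fileA.csv and fileB.csv that share a category field: append the second lookup to the first, then aggregate into the single series that a pie chart expects.

| inputlookup fileA.csv
| inputlookup append=true fileB.csv
| stats count by category

Rendered with the pie chart visualization, each category value becomes a slice sized by its count.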
Hi, I'm trying to execute a query in the SQL editor, but the editor doesn't let me write or execute any query, even though I have the admin permission level. This is the first time I'm having this issue.
I hate hardcoding dynamic things. Sooner or later those things break. I have data with fields

... forecast_2020=400, forecast_2021=500, forecast_2022=650, forecast_2023=800 ...

and in some search I need to use the correct forecast for the current year. What I could do is

... | eval year=strftime(now(),"%Y"), forecast=case(year==2021, forecast_2021, year==2022, forecast_2022, year==2023, forecast_2023, 1==1, 0)

This definitely results in problems in 2024; by then I will have a field forecast_2024, but nobody will remember to update the search. I'd rather use something along these lines:

... | eval year=strftime(now(),"%Y"), forecast=coalesce(forecast_{year}, 0)

However, the {} trick can only be used on the left-hand side in eval. Is there any similar cool trick that works on the right-hand side?
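One pattern that may fit, sketched with the field names from the question: foreach can iterate over the wildcard fields and compare the matched suffix against the year, which effectively gives you a dynamic right-hand side.

| eval year=strftime(now(),"%Y"), forecast=0
| foreach forecast_*
    [ eval forecast=if("<<MATCHSTR>>"==year, '<<FIELD>>', forecast) ]

Here <<MATCHSTR>> expands to the part matched by the wildcard (e.g. 2024) and '<<FIELD>>' to that field's value, so a forecast_2024 field is picked up automatically when the year rolls over.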
I have a dashboard with a date filter in DD/MM/YYYY format and a table which shows data for the date selected in the dropdown, filtering based on that date. Now I have a requirement to additionally show the data from 7 days back: the data currently shown for the selected date, along with the data from 7 days earlier. For example: if the date selected in the dropdown is 07/01/2021, then the 1st table should show data for 7th Jan and the 2nd table should show data for 1st Jan. My fields are HOST (server hostname) and RESULT (two values, either PASS or FAIL), so the table I have created is

index=XXX | stats count(eval(searchmatch("PASS"))) AS PASS count(eval(searchmatch("FAIL"))) AS FAIL by HOST

This gives me PASS and FAIL counts per HOST for the date selected. My requirement is to merge both dates' data into one table, but even two separate tables would do. Can anyone help guide me?
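A sketch of the two-table route, assuming the dropdown sets a hypothetical token date_tok holding the DD/MM/YYYY string: the second panel derives its own one-day window 7 days earlier via a subsearch that returns earliest/latest.

index=XXX
    [ | makeresults
    | eval earliest=relative_time(strptime("$date_tok$", "%d/%m/%Y"), "-7d@d"),
           latest=relative_time(earliest, "+1d")
    | return earliest latest ]
| stats count(eval(searchmatch("PASS"))) AS PASS count(eval(searchmatch("FAIL"))) AS FAIL by HOST

The first panel keeps the existing search; only the subsearch shifts the time window, so both tables share the same stats logic.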
I need to declare in SPL: include only those files whose names end with the date, not .bz2 (I don't want to use NOT).

Here is the SPL:

index="myindex" source="/data/app/20211209/CUS/app.log.*"
| dedup source
| table source

It returns:

/data/app/20211209/CUS/app.log.2021-12-09.bz2
/data/app/20211209/CUS/app.log.2021-12-09

I tried the SPL below but it doesn't return results:

source="/data/app/20211209/CUS/app.log.*."

Any idea? Thanks
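One sketch that avoids NOT: anchor a regular expression on the source field so that only names ending in the bare date survive.

index="myindex" source="/data/app/20211209/CUS/app.log.*"
| regex source="app\.log\.\d{4}-\d{2}-\d{2}$"
| dedup source
| table source

The trailing $ is what excludes app.log.2021-12-09.bz2, since that name continues past the date.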
We use Splunk for storing and analyzing Windows security events. We now want to start storing firewall events related to management ports.

I plan to use the following for retrieving the relevant data from the Windows security log:

whitelist9 = EventCode="(?:515[67])" Message="(?i)Direction\:\t+Inbound" Message="Destination\sPort\:\t+(135|139|445|3389|5985|5986)"

I would like to store these events using a different sourcetype than the other events from [WinEventLog://Security]. How can I achieve this?
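A sketch of one common way to do this, with a hypothetical transform name and target sourcetype: an index-time transform on the parsing tier (indexer or heavy forwarder, not a universal forwarder) can rewrite the sourcetype of matching events.

# props.conf -- adjust the stanza to the sourcetype your security events actually carry
[WinEventLog:Security]
TRANSFORMS-fw_mgmt = set_fw_mgmt_sourcetype

# transforms.conf
[set_fw_mgmt_sourcetype]
REGEX = EventCode=515[67]
DEST_KEY = MetaData:Sourcetype
FORMAT = sourcetype::WinEventLog:Security:FwMgmt

If your deployment tags these events as sourcetype WinEventLog with source WinEventLog:Security instead, use a [source::WinEventLog:Security] stanza in props.conf.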