All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


I have Splunk 7.3.1. I have a dashboard with a column chart that displays _time and metric_value. I want to add additional data that gets displayed but is not related to metric_value or _time. 1) When publishing the event attributes, do I use the key "dimensions" and add an arbitrary dictionary? 2) Will the "dimensions" values be displayed in the column chart toolbar?
Enable alerts and reports on real-time searches seen in the internal audit index.
Hi, I am working on a query for an alert that needs to monitor a few pages for 500 errors. There are currently investigations underway to fix those errors, so we usually get a few every 30 minutes. At the moment not every 5xx is a problem, and it's hard for us to see which of them are "real problems" and which are not. What I want to do is adjust the thresholds in the Splunk search so that it only recognises peaks that are surely something "going wrong", and only then triggers the alert. I have an alert like this at the moment, which runs every 30 minutes: index=myindex sourcetype=ssl_access_combined requested_content="/myapp*" NOT images status=50* | stats count by status | where (status=500 AND count > 10)   Right now it triggers whenever the count of 500 errors reaches 10 in the last 30 minutes, but I don't want that, as it unnecessarily floods the mailbox. Is there a way to achieve what I need here, please?
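One way to alert only on unusual peaks rather than a fixed count is to compare the current 30-minute window against a historical baseline. This is a sketch, not a tested answer: the index, sourcetype, and filters come from the question above, while the 7-day baseline window and the 3-standard-deviation multiplier are assumptions to tune:

```
index=myindex sourcetype=ssl_access_combined requested_content="/myapp*" NOT images status=500 earliest=-7d
| bin _time span=30m
| stats count by _time
| eventstats avg(count) as avg_count stdev(count) as stdev_count
| where _time >= relative_time(now(), "-30m")
| where count > avg_count + (3 * stdev_count)
```

Scheduled every 30 minutes, this only returns a row (and so only triggers) when the latest bucket is well above the past week's typical rate, so the alert adapts as the baseline of "known" errors changes.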
Hi, how can I find PII data in our email dashboard? Thank you. Personally Identifiable Information Detected: detects personally identifiable information (PII) in the form of payment card data in machine-generated data. Some systems or applications inadvertently include sensitive information in logs, thus exposing it in unexpected ways. No specific data model: system log files, application log files, network traffic payloads, etc.
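A minimal starting point for the payment-card case described above is a rex over the raw events. This is only a sketch: the index and sourcetype are placeholders for your email data, and the pattern just catches 16-digit card-like numbers (with optional space/dash separators), so expect false positives:

```
index=email sourcetype=your_email_sourcetype
| rex "(?<possible_card>\b(?:\d[ -]?){15}\d\b)"
| where isnotnull(possible_card)
| table _time, possible_card, _raw
```

A production detection would additionally validate candidates with a Luhn check and restrict the leading digits to known card prefixes.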
Hello, ignoring commas and spaces, how do I grab just the name string from the below log? The regex below kept returning the value of FirstName. It is not matching the literal key "Name=" on its own; it also matches the tail of "FirstName=" and "LastName=". Regex:   "Name=(?<nameCaptured>[^\,]*)"   Log:   FirstName=Hello, LastName=World, Name=HelloWorld, Address=NoAddress
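Anchoring the match so that "Name" cannot be preceded by a word character avoids the FirstName/LastName hits. A sketch against the sample log from the question:

```
... | rex "(?:^|[,\s])Name=(?<nameCaptured>[^,]*)"
```

The `(?:^|[,\s])` requires `Name=` to start the string or follow a comma or space, so `FirstName=` and `LastName=` no longer qualify; a negative lookbehind such as `(?<!\w)Name=` would work as well.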
Hello all, I am having a problem with my Splunk install: it has stopped accepting syslogs from my Cisco ASA. It was working until 2 days ago. I have the ASA sending the logs to the Splunk server, and I am trying to get it back up and running so that I can work on my ASA access lists. The Splunk install is on a Windows 2016 server, and I do not know much about Linux. I really do not want to blow it away and have to redo it all from scratch... I know the server is getting the logs, as I have Kiwi on the box as well and it shows the logs as they come in. The firewall is off on the Windows server too. Can anyone point me in the right direction to find out what is going on and get it fixed? Thanks!
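As a first check from the Splunk side (a sketch; this assumes the syslog arrives on a UDP network input), splunkd's own internal logs and metrics show whether the input is listening and receiving:

```
index=_internal source=*splunkd.log* (udp OR tcpin) (ERROR OR WARN)
```

```
index=_internal source=*metrics.log* group=udpin_connections
```

No results in the second search while Kiwi still sees the traffic would point at the Splunk input configuration (wrong port, or a disabled stanza) rather than the network.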
What I am trying to accomplish with the command is to find the events with EventCode "4624" and Logon_Type "10" or "2", and to count them as "RDP"; however, I get the following error. Here is the query: index=wineventlogsecurity source=xmlWinEventLog:Security | stats count(eval(EventCode="4624") AND (Logon_Type="10")) AS RDP Then I get this error:  Error in 'stats' command: The eval expression for dynamic field 'eval(EventCode="4624") AND (Logon_Type="10")' is invalid. Error='The operator at ') AND (Logon_Type="10"' is invalid.'. Thanks in advance for any help, and apologies for the newbie question, as I am rather new to Splunk.
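The closing parenthesis of eval() lands too early in that query, leaving the AND outside the eval expression, which is what the error message points at. Moving the whole condition inside eval() — and adding the OR for Logon_Type "2" mentioned in the question — should parse:

```
index=wineventlogsecurity source=xmlWinEventLog:Security
| stats count(eval(EventCode="4624" AND (Logon_Type="10" OR Logon_Type="2"))) AS RDP
```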
I'm using the Splunk Add-on for Microsoft Cloud Services to import our ATP / Microsoft Defender for Endpoint data into Splunk. I've succeeded in getting the data in, but the events aren't being separated correctly. Below is a screenshot of a single event; each record should be an individual Splunk event. My question: should the Splunk Add-on for Microsoft Cloud Services automatically parse this out, or is this something I should work through in props.conf with line breaking? Here's the information I used to set this up:  ATP to Event Hub: https://docs.microsoft.com/en-us/windows/security/threat-protection/microsoft-defender-atp/raw-data-export-event-hub Event Hub to Splunk: https://www.splunk.com/en_us/blog/platform/splunking-azure-event-hubs.html
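If the payload is the usual Event Hub JSON envelope with a "records" array, one common props.conf route looks like the sketch below. Everything here is an assumption to verify against your raw events: the sourcetype name is a placeholder, and the regexes assume a `{"records": [ {...}, {...} ]}` shape:

```
[mscs:azure:eventhub]
SHOULD_LINEMERGE = false
# break between array elements; the captured comma is discarded
LINE_BREAKER = \}(\s*,\s*)\{
# strip the envelope so each event is a bare JSON object
SEDCMD-strip_header = s/^\{"records":\s*\[//g
SEDCMD-strip_footer = s/\]\s*\}$//g
```

Whether the add-on should already be doing this itself is a separate question; this only illustrates the props.conf approach mentioned above.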
I would like to see instances with the source 'test*', that is, everything that starts with 'test', but eliminate 'testn' occurrences. I am fine with everything else, so 'test1' and 'test2' are okay, but not 'testn1' and not 'testn2'. Do I need regex, or is there another way of doing this?
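Plain wildcards can express this without regex; a sketch using the prefixes from the question:

```
source=test* NOT source=testn*
```

The first term keeps everything starting with 'test', and the NOT clause removes the 'testn' subset, so 'test1' and 'test2' survive while 'testn1' and 'testn2' are dropped.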
Hi, I want to find all "Error Message" occurrences in my log file and extract everything after that, with field extraction. Here is my log: 2021-01-26 23:55:55,265 ERROR APP-02-3452345 [Processor] Error Message, [00000000000000] was not validated with [n{0,18}] format 2021-01-26 23:55:55,264 ERROR APP-02-2431234 [CPI] Error List: Severity: [0], Type: [FORMAT_ERROR], data: [message.body], object: [ID], message: [Error Message, [000000000000D] was not validated with [n{0,18}] format] Data: root: message 1: message.header ... Here is the expectation: [0000000000000000] was not validated with [n{0,18}] format [000000000000000D] was not validated with [n{0,18}] format   Thanks,
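A rex that anchors on the literal "Error Message, " and captures the rest of the bracketed message should cover both sample lines. This is a sketch built only from the two examples above; adjust it if real messages can contain ']' inside the brackets:

```
... | rex max_match=0 "Error Message, (?<errDetail>\[[^\]]*\] was not validated with \[[^\]]*\] format)"
| mvexpand errDetail
| table errDetail
```

`max_match=0` lets rex capture every occurrence in an event, which matters for the second sample line, where the message is nested inside the Error List.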
Hi, I have installed the new TA-MS-AAD version (3.0.1) and I can see several errors:

01-27-2021 15:12:23.900 +0000 ERROR ExecProcessor - message from "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/TA-MS-AAD/bin/azure_comp.py" ERROR'access_token'
01-27-2021 15:12:23.853 +0000 ERROR ExecProcessor - message from "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/TA-MS-AAD/bin/azure_comp.py" KeyError: 'access_token'
01-27-2021 15:12:23.853 +0000 ERROR ExecProcessor - message from "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/TA-MS-AAD/bin/azure_comp.py" return response['access_token']
01-27-2021 15:12:23.853 +0000 ERROR ExecProcessor - message from "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/TA-MS-AAD/bin/azure_comp.py" File "/opt/splunk/etc/apps/TA-MS-AAD/bin/ta_azure_utils/auth.py", line 59, in _get_access_token
01-27-2021 15:12:23.853 +0000 ERROR ExecProcessor - message from "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/TA-MS-AAD/bin/azure_comp.py" raise e
01-27-2021 15:12:23.853 +0000 ERROR ExecProcessor - message from "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/TA-MS-AAD/bin/azure_comp.py" File "/opt/splunk/etc/apps/TA-MS-AAD/bin/ta_azure_utils/auth.py", line 61, in _get_access_token
01-27-2021 15:12:23.853 +0000 ERROR ExecProcessor - message from "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/TA-MS-AAD/bin/azure_comp.py" return _get_access_token(endpoint, helper, payload)
01-27-2021 15:12:23.853 +0000 ERROR ExecProcessor - message from "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/TA-MS-AAD/bin/azure_comp.py" File "/opt/splunk/etc/apps/TA-MS-AAD/bin/ta_azure_utils/auth.py", line 51, in get_mgmt_access_token
01-27-2021 15:12:23.853 +0000 ERROR ExecProcessor - message from "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/TA-MS-AAD/bin/azure_comp.py" raise e
01-27-2021 15:12:23.853 +0000 ERROR ExecProcessor - message from "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/TA-MS-AAD/bin/azure_comp.py" File "/opt/splunk/etc/apps/TA-MS-AAD/bin/ta_azure_utils/auth.py", line 53, in get_mgmt_access_token
01-27-2021 15:12:23.853 +0000 ERROR ExecProcessor - message from "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/TA-MS-AAD/bin/azure_comp.py" access_token = azauth.get_mgmt_access_token(client_id, client_secret, tenant_id, environment, helper)
01-27-2021 15:12:23.853 +0000 ERROR ExecProcessor - message from "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/TA-MS-AAD/bin/azure_comp.py" File "/opt/splunk/etc/apps/TA-MS-AAD/bin/input_module_azure_comp.py", line 47, in collect_events
01-27-2021 15:12:23.853 +0000 ERROR ExecProcessor - message from "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/TA-MS-AAD/bin/azure_comp.py" input_module.collect_events(self, ew)
01-27-2021 15:12:23.853 +0000 ERROR ExecProcessor - message from "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/TA-MS-AAD/bin/azure_comp.py" File "/opt/splunk/etc/apps/TA-MS-AAD/bin/azure_comp.py", line 108, in collect_events
01-27-2021 15:12:23.853 +0000 ERROR ExecProcessor - message from "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/TA-MS-AAD/bin/azure_comp.py" self.collect_events(ew)
01-27-2021 15:12:23.853 +0000 ERROR ExecProcessor - message from "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/TA-MS-AAD/bin/azure_comp.py" File "/opt/splunk/etc/apps/TA-MS-AAD/bin/ta_ms_aad/aob_py3/modinput_wrapper/base_modinput.py", line 128, in stream_events
01-27-2021 15:12:23.853 +0000 ERROR ExecProcessor - message from "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/TA-MS-AAD/bin/azure_comp.py" Traceback (most recent call last):
01-27-2021 15:09:30.187 +0000 ERROR ExecProcessor - message from "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/TA-MS-AAD/bin/azure_comp.py" ERROR'access_token'
01-27-2021 15:09:30.126 +0000 ERROR ExecProcessor - message from "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/TA-MS-AAD/bin/azure_comp.py" Key Error: 'access_token'
01-27-2021 15:09:30.126 +0000 ERROR ExecProcessor - message from "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/TA-MS-AAD/bin/azure_comp.py" return response['access_token']

I checked the settings and everything seems right, but it does not index the compute logs. Nothing comes back into Splunk from the compute input. The older version produced the same error logs. Can you help me?
Hello, can you help? I am not the most technical, and I have signed up to the UAT of Splunk. Logged in as an administrator, I built a dashboard simply from a CSV file using the Add Data functionality as the path to creating a dashboard. Now I have a LIVE environment, and when I go to add data, this is not available; I see I have to add data using a Lookup functionality? I only want to create a dashboard based on a CSV, that is it. I am a little confused: I was trained one way and do not have this functionality available to me in Live. I might be confusing myself; can you help? Do I need a lookup? Also, when I uploaded the flat CSV file I want to use via the Lookup method, I got the error below. The file does not contain any invalid characters. I am a little bit nervous about what I have to do and confused about how this system works. Please can you help... Encountered the following error while trying to save: Invalid name: only alphanumeric characters, '-', '_', and '.' are allowed.
Hello, I would like to search our email data for sensitive info, i.e. Social Security numbers etc. I have an email dashboard created to ingest our Exchange info. Thank you
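A starting point for SSN-shaped strings is a rex over the email events. This is a sketch: the index and sourcetype are placeholders, sender and recipient are assumed field names from your Exchange add-on, and the pattern will also match harmless 9-digit numbers, so treat hits as candidates to review:

```
index=email sourcetype=your_exchange_sourcetype
| rex "(?<possible_ssn>\b\d{3}-\d{2}-\d{4}\b)"
| where isnotnull(possible_ssn)
| table _time, sender, recipient, possible_ssn
```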
I am using the app below to pull alerts from ATP to Splunk; it provides functionality to pull data directly from ATP — alerts with evidence, with associated users, or any data supported by an Advanced Hunting query. https://splunkbase.splunk.com/app/4623/#/details But this is not consistent: it stops in between, and then I need to disable and re-enable the inputs to get it working again. The setup is pretty simple: set the ID and then set the Advanced Hunting query from ATP. This app is really nice and can fulfil a lot of use cases for pulling data from ATP beyond just alerts, so I really want to get it working consistently, as I do not want to miss alerts from ATP to Splunk, which our entire Ops team relies on to take further action. I have also raised a case with Splunk support, but this add-on is not covered by support, so I am raising the concern here in case anyone has had the same issue and solved it. @jorritf, if possible can you please help here, as I can see you developed the application. Thank you in advance.
I have a query to detect missing forwarders (hosts):   | metadata type=hosts | eval age = now() - lastTime | search host=* | search age > 10 | sort age d | convert ctime(lastTime) | fields age,host,lastTime     This works and, obviously, reveals the age, host, and last time they were seen. I need to also include the index the host is sending its data to. Since this query uses metadata, that information doesn't appear to be available. How can I modify this search to also include the actual index to which a host is reporting?
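tstats can produce the same age calculation while also grouping by index (a sketch; note it scans the event indexes rather than the metadata catalog, so it can be heavier over long time ranges):

```
| tstats max(_time) as lastTime where index=* by host, index
| eval age = now() - lastTime
| where age > 10
| sort - age
| convert ctime(lastTime)
| table age, host, index, lastTime
```

A host that writes to several indexes will appear once per index here, which is usually what you want when tracking where each forwarder's data lands.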
How can I run an ldapsearch command from Splunk to get the list of user attribute names only — not the values — available in an AD directory? I am using the SA-ldapsearch add-on but don't see the parameters I need to get the attribute name list. Looking back into the AD documentation, there is an option --typesOnly, but I am not sure how to use it in a Splunk ldapsearch query. My query is as follows:  | ldapsearch domain=abc search="(&(objectClass=user)(!(objectClass=computer)))"
Hello, We're running Splunk 8.0.3 with a 2G/day license and want to load a CSV with 332928 lines so that we can use it to enrich our events:

[root@splunk lookups]# pwd
/opt/splunk/etc/apps/search/lookups
[root@splunk lookups]# wc -l int_des.csv
332928 int_des.csv
[root@splunk lookups]# ls -l int_des.csv
-rw------- 1 root root 23997247 Jan 26 15:51 int_des.csv

The problem we're facing is apparently the CSV is not loaded completely. When we query like this: | inputlookup int_des we only get 173201 records. Is there some limit we're hitting?

This is our limits.conf from /opt/splunk/etc/system/local/limits.conf:

[root@splunk lookups]# cat /opt/splunk/etc/system/local/limits.conf
[search]
allow_batch_mode = 1
allow_inexact_metasearch = 0
always_include_indexedfield_lispy = 0
default_allow_queue = 1
disabled = 0
enable_conditional_expansion = 1
enable_cumulative_quota = 0
enable_datamodel_meval = 1
enable_history = 1
enable_memory_tracker = 0
force_saved_search_dispatch_as_user = 0
load_remote_bundles = 0
log_search_messages = 0
read_final_results_from_timeliner = 1
record_search_telemetry = 1
remote_timeline = 1
search_retry = 0
timeline_events_preview = 0
track_indextime_range = 1
track_matching_sourcetypes = 1
truncate_report = 0
unified_search = 0
use_bloomfilter = 1
use_metadata_elimination = 1
use_search_evaluator_v2 = 1
write_multifile_results_out = 1

############################################################################
# Concurrency
############################################################################
# This section contains settings for search concurrency limits.
# The total number of concurrent searches is
# base_max_searches + #cpus*max_searches_per_cpu

# The base number of concurrent searches.
base_max_searches = 6

# Max real-time searches = max_rt_search_multiplier x max historical searches.
max_rt_search_multiplier = 10

# The maximum number of concurrent searches per CPU.
max_searches_per_cpu = 10

[lookup]
# Maximum size of static lookup file to use a in-memory index for.
max_memtable_bytes = 262144000

# Maximum reverse lookup matches (for search expansion).
max_reverse_matches = 50

# Default setting for if non-memory file lookups (for large files)
# should batch queries.
# Can be overridden using a lookup table's stanza in transforms.conf.
batch_index_query = true

# When doing batch request, what's the most matches to retrieve?
# If more than this limit of matches would otherwise be retrieved,
# we will fall back to non-batch mode matching.
batch_response_limit = 5000000

# Maximum number of lookup error messages that should be logged.
max_lookup_messages = 20

# time to live for an indexed csv
indexed_csv_ttl = 300

# keep alive token file period
indexed_csv_keep_alive_timeout = 30

# max time for the CSV indexing
indexed_csv_inprogress_max_timeout = 300

Where should we look for relevant logs? What can we do to troubleshoot further? Thanks
Hi, I looked through the documentation and Splunk Answers but did not find the reason/root cause for the following observation: we have an index with 2 sourcetypes; one is JSON, the other plain text. Event examples:

Sourcetype 1, JSON notation:
{ [-] context: xyz criteria: { [+] } device: desktop results: [ [-] 50832171 ] searchType: QuickSearch }
Raw text notation:
{"context":"xyz","device":"desktop","searchType":"QuickSearch","results":["50832171"],"criteria":{"Item name":"Example"}}

Sourcetype 2, non-JSON event:
2021-01-27 10:27:39.000, timestamp="2021-01-27 10:27:39.0", context="abc", searchType="Advanced Search", device="Mobile", criteria="Item Name", results="93751371"

I want to do a lookup of the values in the results field. Problem: in the JSON event this is an array; in the non-JSON event it is a string. So I tried to use spath to extract results{} into the field results and then do the lookup with that common field name for both sourcetypes:

<base search> | spath output=results path=results{} | lookup myLookup id as results

The problem is, when I do this, the results field in the non-JSON event disappears. Without the spath I have an auto-extracted results{} field in the JSON event and a results field in the non-JSON event; adding the spath removes the results field from the non-JSON event. Why? I have found a way to work around this, but I would like to understand the technical reason behind the behaviour. My workaround is:

| spath output=result path=results{} | eval results=coalesce(results, result) | lookup myLookup id as results
Hello to all, I have the following question: currently we are using SLES12 SP1 and Splunk Enterprise version 7.2.9. Now we want to upgrade both the Linux machine and the Splunk version: Linux to SLES12 SP5, and Splunk first to 7.3.8, later to 8.x. Are that SLES version and Splunk compatible? I have looked in the docs but found nothing. I am grateful for any help. Many greetings