All Topics

Hi, I am trying to send an alert from Splunk to Telegram using the default webhook alert action, since the Telegram alert action app (https://splunkbase.splunk.com/app/4917) is no longer supported. Has anyone managed to get it working? There is no documentation on the Splunk site about using the webhook for this. I am using the string below, which works in a browser but not in Splunk, so I am not sure how to format the string.

https://api.telegram.org/bot#########/sendMessage?chat_id=-########&text=test

Thanks
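For context on why the browser string fails: Splunk's built-in webhook alert action always issues an HTTP POST carrying Splunk's own JSON payload to the configured URL, so Telegram's sendMessage endpoint never receives the chat_id and text parameters it expects. One workaround is a small custom alert action (or relay) that reads Splunk's payload and re-posts it in Telegram's format. A minimal sketch, assuming a custom alert action script invoked with --execute (the bot token and chat ID below are placeholders):

#!/usr/bin/env python3
# Hedged sketch of a custom alert action: Splunk passes a JSON payload
# describing the alert on stdin when it runs the script with --execute.
import json
import sys
import urllib.parse
import urllib.request

BOT_TOKEN = "<bot_token>"  # placeholder; store real secrets via Splunk's credential store
CHAT_ID = "-<chat_id>"     # placeholder

def main():
    payload = json.load(sys.stdin)
    text = "Splunk alert fired: " + payload.get("search_name", "unknown search")
    url = "https://api.telegram.org/bot%s/sendMessage" % BOT_TOKEN
    # Telegram's Bot API accepts form-encoded POST bodies on sendMessage
    data = urllib.parse.urlencode({"chat_id": CHAT_ID, "text": text}).encode()
    urllib.request.urlopen(url, data=data)

if __name__ == "__main__":
    main()

Registering the script in alert_actions.conf and packaging it as an app is left out of this sketch.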
I tried the regexes below as blacklists to ignore EventCode 4688 events produced by *.exe binaries under the Splunk Universal Forwarder path/directory, but none of them work: they either drop 4688 events from both Splunk and non-Splunk paths, or keep sending events from both. I am looking for a blacklist regex that ignores 4688 events only when they come from *.exe files under the Splunk Universal Forwarder path/directory.

blacklist = EventCode="4688" Message="New Process Name: (?i)(?:[C-F]:\Program Files\Splunk(?:UniversalForwarder)?\bin\(?:btool|splunkd|splunk|splunk-(?:MonitorNoHandle|admon|netmon|perfmon|powershell|regmon|winevtlog|winhostinfo|winprintmon|wmi)).exe)"
blacklist = EventCode="4688" Message="New Process Name: (?:[a-zA-Z]:\\Program Files\\Splunk(?:\\UniversalForwarder)?\\bin\\.+\.exe)"
blacklist = EventCode="4688" Message="New Process Name: (?:[a-zA-Z]:\\\\Program Files\\\\Splunk(?:\\\\UniversalForwarder)?\\\\bin\\\\.+\\.exe)"
blacklist = EventCode="4688" Message="New Process Name: C:\\\\Program Files\\\\SplunkUniversalForwarder\\\\bin\\\\"
blacklist = EventCode="4688" Message="New Process Name: C:\\Program Files\\SplunkUniversalForwarder\\bin\\"
blacklist = EventCode="4688" Message="New Process Name: (?i)[A-Z]:\\Program Files\\Splunk(?:\\UniversalForwarder)?\\bin\\.*\\.exe)"
blacklist = EventCode="4688" Message="New Process Name:\s*[A-Z]:\\Program Files\\Splunk(?:\\UniversalForwarder)?\\bin\\.+\\.exe)"
blacklist = EventCode="4688" Message="New Process Name:\s*[A-Z]:\\\\Program Files\\\\SplunkUniversalForwarder\\\\bin\\\\.*\\.exe"
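Two things stand out in the attempts above, offered as a sketch rather than a tested fix. First, repeated plain blacklist = keys in one inputs.conf stanza override each other, so only the last line takes effect; multiple rules need distinct keys (blacklist1, blacklist2, ...). Second, in the regex each literal backslash in the path must be written as \\ and the dot in .exe as \., which the first attempt is missing. A version modeled on the example in Splunk's inputs.conf documentation (verify the stanza name and drive letters against your deployment):

[WinEventLog://Security]
blacklist1 = EventCode="4688" Message="New Process Name: (?i)(?:[C-F]:\\Program Files\\Splunk(?:UniversalForwarder)?\\bin\\(?:btool|splunkd|splunk(?:-(?:MonitorNoHandle|admon|netmon|perfmon|powershell|regmon|winevtlog|winhostinfo|winprintmon|wmi))?)\.exe)"

Also note that several of the attempts end with an unbalanced closing parenthesis after .exe, which makes those regexes invalid on their own.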
I'm looking to export notable events from the Incident Review dashboard in Splunk Enterprise Security to a CSV/Excel file. I need to include the following details:

- Notable Name (Rule Name)
- Notable Triggered Time
- Time Assigned for Investigation
- Conclusion (e.g., True Positive (TP), False Positive (FP), Benign True Positive (BTP))
- Open/Closed Status

What would be the best SPL query or method to extract this information? Also, is there a way to automate this export on a scheduled basis? Currently I am using this SPL query and getting results:

`notable`
| eval original_time=strftime(orig_time,"%c")
| eval reviewing_time=strftime(review_time,"%c")
| table search_name, comment, disposition_label, original_time, reviewing_time, owner, reviewer, status, status_description, status_label, urgency, username

However, I'm not getting an ID I could use to locate and go through an individual notable if I wanted to. How can I search for a specific notable? Is there a tracking number for a notable? I'd like to include it in my table as well.
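A sketch of one way to get a per-notable identifier, assuming a recent ES version in which the `notable` macro exposes an event_id field (the same ID Incident Review uses to address a single notable):

`notable`
| eval original_time=strftime(orig_time,"%c"), reviewing_time=strftime(review_time,"%c")
| table event_id, search_name, disposition_label, original_time, reviewing_time, owner, reviewer, status_label, urgency

For the scheduled export, the usual routes are saving this as a scheduled report with an email action that attaches the results as a CSV, or appending | outputlookup notable_export.csv to write a file that can be picked up downstream.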
Hi, does anyone know where we can download the app for the BMC AMI Defender logs? Splunkbase provides a link that leads to a BMC "404 page not found" error. I looked around on the BMC site while logged in, but couldn't find the app: BMC AMI Defender | Splunkbase. Is there another place we can grab this app? The BMC documentation for the app is here: BMC AMI Defender App for Splunk 2.9 - BMC Documentation. Thanks!
Hello, I am attempting to forward data from an older indexer to a new indexer so that I can decommission the server the old indexer currently sits on. These indexers are not clustered; the old one is set up to forward to the new one (so the indexes are all mirrored), but this only sends new data, not any of the previously indexed data on the old indexer. What are my options? Am I able to forcibly forward the old data to the new indexer? Or do I need to sync the old and new manually by passing the old buckets to the new indexer? Ideally I'd like to migrate the data over time (there's a fair amount), but from my research so far that doesn't appear feasible.
The current version is not available for Splunk Cloud. According to conversations with Splunk Support, the update addresses a skipped-jobs issue that occurs when the Salesforce REST API tool's status is idle. Please make version 1.0.6 available with cloud compatibility, and ensure future updates are cloud compatible.
Hi Community, I have the following challenge. I have different events, and for each event I want to generate a summary with different values. Which values go into the summary is defined in a lookup table. For example:

E1: id=1, dest_ip=1.1.1.1, src_ip=2.2.2.2, ...
E2: id=2, user=bob, domain=microsoft
E3: id=3, country=usa, city=seattle
E4: id=4, company=cisco, product=splunk

Lookup table (potentially more field names):

ID  Field1   Field2
1   dest_ip  src_ip
2   user     domain
3   country
4   company  product

Expected output:

id1: Summary dest_ip=1.1.1.1 src_ip=2.2.2.2
id2: Summary user=bob domain=microsoft
id3: Summary country=usa
id4: Summary company=cisco product=splunk

The solution could use a case() function, but that doesn't scale well because I would need to add a new line for each case, and the number of cases could potentially grow to 1000. I tried to solve it with foreach, but I am unable to retrieve the values from the event. Here's the query I tried:

index=events
| lookup cases.csv id OUTPUT field1, field2
| foreach field* [ eval summary = summary + "<<FIELD>>" + ":" + <<ITEM>> ]
| table id, summary

Thanks for your help! Alesyo
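A sketch that avoids one case() branch per id, assuming the lookup's output columns are named field1 and field2 as above: collect the wanted field names into a multivalue field, then walk every field on the event with foreach and keep the ones whose name is wanted.

index=events
| lookup cases.csv id OUTPUT field1, field2
``` gather the field names this id wants in its summary ```
| eval wanted=mvappend(field1, field2)
``` for each field on the event, append "name=value" when the name is in wanted ```
| foreach * [ eval summary=if(isnotnull(mvfind(wanted, "^<<FIELD>>$")), mvappend(summary, "<<FIELD>>"."=".'<<FIELD>>'), summary) ]
| eval summary="Summary ".mvjoin(summary, " ")
| table id, summary

With more lookup columns (field3, field4, ...), only the mvappend() line grows; the foreach stays the same.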
Dear Splunk community, I have the following sample input data, containing JSON snippets in MV fields:

| makeresults count=5
| streamstats count as a
| eval _time = _time + (60*a)
| eval json1="{\"id\":1,\"attrib_A\":\"A1\"}#{\"id\":2,\"attrib_A\":\"A2\"}#{\"id\":3,\"attrib_A\":\"A3\"}#{\"id\":4,\"attrib_A\":\"A4\"}#{\"id\":5,\"attrib_A\":\"A5\"}", json2="{\"id\":2,\"attrib_B\":\"B2\"}#{\"id\":3,\"attrib_B\":\"B3\"}#{\"id\":4,\"attrib_B\":\"B4\"}#{\"id\":6,\"attrib_B\":\"B6\"}"
| makemv delim="#" json1
| makemv delim="#" json2
| table _time, json1, json2

The lists of ids in json1 and json2 may be disjoint, identical, or overlapping. For example, in the above data, id=1 and id=5 only exist in json1, id=6 only exists in json2, and the other ids exist in both. Attributes can be null values, but may then be treated as if the id didn't exist. For each event, I would like to merge the data from json1 and json2 into a single table with columns id, attrib_A and attrib_B. The expected output for the sample data would be:

_time  id  attrib_A  attrib_B
t      1   A1        null
t      2   A2        B2
t      3   A3        B3
t      4   A4        B4
t      5   A5        null
t      6   null      B6
...    ..  ...       ...
t+5    1   A1        null
t+5    2   A2        B2
t+5    3   A3        B3
t+5    4   A4        B4
t+5    5   A5        null
t+5    6   null      B6

How can I achieve this in a straightforward way? The following works for the sample data, but it seems overly complicated and I am not sure if it works in all cases:

```insert after above sample data generation:```
```extract and expand JSONs```
| mvexpand json2
| spath input=json2
| rename id as json2_id
| mvexpand json1
| spath input=json1
| rename id as json1_id
| table _time, json1_id, attrib_A, json2_id, attrib_B
```create mv fields containing the subsets of IDs from json1 and json2```
| eventstats values(json1_id) as json1, values(json2_id) as json2 by _time
| eval only_json1=mvmap(json1, if(isnull(mvfind(json2, json1)), json1, null()))
| eval only_json2=mvmap(json2, if(isnull(mvfind(json1, json2)), json2, null()))
| eval both=mvmap(json1, if(isnotnull(mvfind(json2, json1)), json1, null()))
| table _time, json1_id, attrib_A, json2_id, attrib_B, json1, json2, only_json1, only_json2, both
```keep json2 record if a) json2_id equals json1_id or b) json2_id does not appear in json1```
| eval attrib_B=if(json2_id==json1_id or isnull(mvfind(json1, json2_id)), attrib_B, null())
| eval json2_id=if(json2_id==json1_id or isnull(mvfind(json1, json2_id)), json2_id, null())
```keep json1 record if a) json1_id equals json2_id or b) json1_id does not appear in json2```
| eval attrib_A=if(json1_id==json2_id or isnull(mvfind(json2, json1_id)), attrib_A, null())
| eval json1_id=if(json1_id==json2_id or isnull(mvfind(json2, json1_id)), json1_id, null())
```remove records where json1 and json2 are both null```
| where isnotnull(json1_id) or isnotnull(json2_id)
| table _time, json1_id, attrib_A, json2_id, attrib_B
| dedup _time, json1_id, attrib_A

Thank you!
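A shorter route, sketched under the assumption that each id appears at most once per json field per event: merge the two multivalue fields, expand once, extract with spath, and let stats re-join the rows by _time and id.

```insert after the sample data generation above```
| eval json=mvappend(json1, json2)
| mvexpand json
| spath input=json
| stats values(attrib_A) as attrib_A, values(attrib_B) as attrib_B by _time, id
| sort 0 _time, id

Ids that only exist in one of the two fields simply come out with an empty column, which matches the expected output.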
Commands used to run the Docker image:

docker run -d -p 9997:9997 -p 8080:8080 -p 8089:8089 -e "SPLUNK_START_ARGS=--accept-license" -e "SPLUNK_PASSWORD=test12345" --name uf splunk/universalforwarder:latest

Seeing the error below when the Splunk Universal Forwarder image is starting up in Docker:

2025-03-05 14:47:58 included: /opt/ansible/roles/splunk_universal_forwarder/tasks/../../../roles/splunk_common/tasks/check_for_required_restarts.yml for localhost
2025-03-05 14:47:58 Wednesday 05 March 2025 09:17:58 +0000 (0:00:00.044) 0:00:30.316 *******
2025-03-05 14:48:31 FAILED - RETRYING: [localhost]: Check for required restarts (5 retries left).
2025-03-05 14:48:31 FAILED - RETRYING: [localhost]: Check for required restarts (4 retries left).
2025-03-05 14:48:31 FAILED - RETRYING: [localhost]: Check for required restarts (3 retries left).
2025-03-05 14:48:31 FAILED - RETRYING: [localhost]: Check for required restarts (2 retries left).
2025-03-05 14:48:31 FAILED - RETRYING: [localhost]: Check for required restarts (1 retries left).
2025-03-05 14:48:31 TASK [splunk_universal_forwarder : Check for required restarts] ****************
2025-03-05 14:48:31 fatal: [localhost]: FAILED! => {
2025-03-05 14:48:31 "attempts": 5,
2025-03-05 14:48:31 "changed": false,
2025-03-05 14:48:31 "changed_when_result": "The conditional check 'restart_required.status == 200' failed. The error was: error while evaluating conditional (restart_required.status == 200): 'dict object' has no attribute 'status'. 'dict object' has no attribute 'status'"
2025-03-05 14:48:31 }
2025-03-05 14:48:31 MSG:
2025-03-05 14:48:31 GET/services/messages/restart_required?output_mode=jsonadmin********8089NoneNoneNone[200, 404];;; failed with NO RESPONSE and EXCEP_STR as Not supported URL scheme http+unix

splunkd is running fine and the ports are open as well. I tried to curl http://localhost:8089/services/messages/restart_required?output_mode=json
Hi All, I am new to Power BI. My question is: how do we integrate Splunk with Power BI? Is there an official guide or manual from Splunk on how to configure this integration on both sides? Cheers, Zamir
Hi All, in an SPL2 Ingest Pipeline I want to assemble a regular expression and then use it in a rex command, but I am having trouble. In a simple test, specifying the regex as a text string directly on the rex command works (screenshot not reproduced here), but the version that passes the assembled regex doesn't (screenshot not reproduced here). Any idea what I am doing wrong? Thanks
Hi, I am new to Ingest Processor and have had some success, but I am having an issue with the rex command, so I created a very simple example copied from the manual here: https://docs.splunk.com/Documentation/SCS/current/SearchReference/RexCommandExamples#2._Regular_expressions_with_character_classes But I am getting this error (screenshot not reproduced here). Any ideas why? Thanks
We are using the Splunk Add-on for Google Workspace (GWS) version 3.0.3 on Splunk Cloud and are receiving the error below when attempting to pull in the (user) identities portion. I have tried both 'admin_view' and 'domain_public' in the inputs config, with the same error. All other functions are working fine. I need to bring in the "gws_users_identity" sourcetype to populate our identities lookup. Has anyone else encountered this? Maybe you found a fix?

ERROR pid=<redacted> tid=MainThread file=log.py:log_exception:351 | exc_l="User Identity Error" Exception raised while ingesting data for users: <HttpError 400 when requesting https[:]//admin.googleapis.com/admin/directory/v1/users?customer=<redacted>&orderBy=email&maxResults=500&viewType=domain_public&alt=json returned "Bad Request". Details: "[{'message': 'Bad Request', 'domain': 'global', 'reason': 'badRequest'}]">.
Traceback (most recent call last):
File "/opt/splunk/etc/apps/Splunk_TA_Google_Workspace/bin/gws_user_identity.py", line 139, in stream_events
service.users()
What is the best practice for having a Splunk heavy forwarder call out to a third-party API and pull logs into Splunk? Most of the solutions I use have apps on Splunkbase, but this one does not. Do I have to build a custom add-on using something like the Add-on Builder?
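For framing the options: without a prebuilt add-on, the common pattern is a scripted (or modular) input on the heavy forwarder that polls the API on an interval and writes events to stdout; the Add-on Builder wraps the same idea in a UI and adds checkpointing and credential storage. A minimal scripted-input sketch, where the endpoint, token handling, and file names are hypothetical:

inputs.conf:

[script://./bin/pull_vendor_logs.py]
interval = 300
sourcetype = vendor:api
index = main

bin/pull_vendor_logs.py:

#!/usr/bin/env python3
# Hypothetical scripted input: poll a REST endpoint and emit one JSON event per record.
import json
import urllib.request

API_URL = "https://api.example.com/v1/logs"  # hypothetical endpoint
API_TOKEN = "change-me"                      # in practice, use Splunk's encrypted credential store

def main():
    req = urllib.request.Request(API_URL, headers={"Authorization": "Bearer " + API_TOKEN})
    with urllib.request.urlopen(req, timeout=30) as resp:
        records = json.load(resp)
    for rec in records:
        # Splunk indexes whatever the script prints: one event per line
        print(json.dumps(rec))

if __name__ == "__main__":
    main()

A real input would also keep a checkpoint (e.g., the last event timestamp) so each poll only fetches new records.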
I have a file I'm monitoring that changes several times a day. Sometimes the file contents will likely be the same as a previous iteration, but that is not guaranteed (the file name does not change). The file is in text format and is a few dozen lines long. I want to process the file every time the modtime changes, even if the content is 100% the same, and I want to create a single event with the contents each time.

props.conf:

[my_sourcetype]
DATETIME_CONFIG = current
BREAK_ONLY_AFTER = nevereverbreak

[source::/path/to/file-to-be-read]
CHECK_METHOD = modtime
sourcetype = my_sourcetype

inputs.conf:

[monitor:///path/to/file-to-be-read]
disabled = 0
sourcetype = my_sourcetype
crcSalt = some_random_value_to_try_to_make_it_always_read

If I update file-to-be-read manually by adding new lines to the end, it gets read in immediately and I get an event just like I want. But when the automated process creates the file (with an updated modtime), Splunk seems not to be interested in it. Permissions are correct, and splunkd.log reflects that the modtime is different and that it is re-reading the file... but it doesn't create a new event. I'm sure I'm missing something obvious, but I'd appreciate any advice. Cheers.
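Two observations, offered as assumptions to check rather than a diagnosis. A fixed crcSalt string never varies between writes of the same path, so it cannot by itself force a re-read; re-reading on every modification is exactly what CHECK_METHOD = modtime is for, and that setting must live in props.conf on the instance that actually tails the file. And BREAK_ONLY_AFTER only takes effect while SHOULD_LINEMERGE is true; a common alternative for whole-file-as-one-event is a LINE_BREAKER that can never match. A props sketch along those lines:

[my_sourcetype]
DATETIME_CONFIG = current
SHOULD_LINEMERGE = false
# (?!) is an empty negative lookahead that always fails, so the regex never
# matches and the whole file stays one event
LINE_BREAKER = ((?!))
TRUNCATE = 0

TRUNCATE = 0 lifts the default 10,000-byte event limit, which a whole-file event can otherwise exceed.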
Hello, I am trying to write a search query for responding byte sizes that is a catch-all. Currently I have:

index=index 8.8.8.8
| stats sum(resp_bytes) as resp_bytes
| eval resp_bytes=if(resp_bytes=0, "0B",if(resp_bytes<1000000,resp_bytes/1024 . "KB",if(resp_bytes>1000000,resp_bytes/1024/1024 . "MB", null())))

I have tested this and it works, but now I am trying to add in a round() to the 2nd decimal place, and I'm not sure where it would go.
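round() wraps the division before the unit string is concatenated. A sketch of the same eval with rounding, using case() instead of nested if() so each branch stays readable:

index=index 8.8.8.8
| stats sum(resp_bytes) as resp_bytes
| eval resp_bytes=case(
    resp_bytes=0, "0B",
    resp_bytes<1000000, round(resp_bytes/1024, 2)."KB",
    true(), round(resp_bytes/1024/1024, 2)."MB")

The true() branch also catches a value of exactly 1000000, which the original nested if() left unhandled.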
Hello team, in my distributed Splunk lab created on a VMware client virtual machine, I am facing the issue below. The distributed environment consists of the following components with Splunk free licenses:

- 4 Indexers (part of an indexer cluster)
- 1 Cluster Manager (for managing the indexer cluster)
- 2 Universal Forwarders (UFs) sending data
- 1 DS/LM/MC (Deployment Server + License Manager + Monitoring Console combined on one server)
- 1 Search Head (for searching and dashboards)

I cannot enable Splunk monitoring for the /opt/log directory. I have verified that /var/log can be monitored successfully, whereas the Splunk forwarder fails to monitor /opt/log. I have checked for permission issues and other things, but no luck.
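A minimal stanza and the checks that most often surface the cause, sketched on the assumption that the difference between /var/log and /opt/log is ownership or security context (the index and paths below are examples):

inputs.conf on the UF:

[monitor:///opt/log]
disabled = 0
index = main

Then, as the OS user that runs splunkd:

sudo -u splunk ls -lR /opt/log
/opt/splunkforwarder/bin/splunk list monitor

splunk list monitor shows whether the path was picked up at all, and the ls run shows whether the Splunk user can actually traverse and read the directory. On SELinux-enforcing hosts, /var/log carries a log-file context that a custom /opt/log directory does not, which is a frequent culprit.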
Hi, from reading the documentation, it seems the Splunk ESCU app is built with contentctl from its Git content: GitHub - splunk/security_content: Splunk Security Content. I tried with several releases, the latest included: Release v5.1.0 · splunk/security_content · GitHub. The build constantly fails with a whole bunch of:

"Error: 1 validation error for Detection Value error, Found 1 issues when resolving references Security Content Object names: - Failed to find the following 'DataSource'"

Did I miss something? I tried to find a switch to ignore the errors and build the app anyway, without success. The dist directory remains empty. I used a clean Ubuntu 24.04.2 LTS and ran:

apt update
apt full-upgrade
reboot now
apt update
apt install pipx
pipx ensurepath
reboot now
pipx install contentctl
wget https://github.com/splunk/security_content/archive/refs/tags/v5.1.0.tar.gz
tar -xzf v5.1.0.tar.gz
cd security_content-5.1.0/
contentctl build
I am trying to ingest a CSV file whose headers contain double quotes (") and %. The fields are separated by commas. After ingestion, if two field names are identical except that one contains # and the other contains %, Splunk merges both into one field in table output. How do I fix this issue? If Splunk doesn't support such CSV headers, I will have to clean them up before ingesting. Any ideas?
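A possible explanation, offered as an assumption: during header-based extraction Splunk normalizes non-alphanumeric characters in field names (typically to _), so headers that differ only in # vs % collapse into the same field name. One workaround sketch is to skip the header line and supply safe, explicit field names in props.conf (the sourcetype and field names here are hypothetical; verify the settings against props.conf.spec for your version):

[my_csv]
INDEXED_EXTRACTIONS = csv
# treat line 1 as the header so it is not indexed as data...
HEADER_FIELD_LINE_NUMBER = 1
# ...but use these names instead of the raw header values
FIELD_NAMES = rate_hash, rate_pct, other_field

If that combination does not behave as expected in your version, renaming the headers before ingestion remains the safe fallback.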
We are planning to onboard Akamai platform logs to Splunk and are following this link to implement it: SIEM Splunk connector. In the process we have installed this Akamai add-on: Akamai SIEM Integration | Splunkbase. When we go to Settings > Data Inputs as described in the SIEM Splunk connector guide, we are unable to find the expected data input, "Akamai Security Incident Event Manager API", and we are getting the following error in Splunk after installing the add-on (screenshots from the deployer and the search head not reproduced here). Can you help us with this challenge? We are stuck at "data inputs". I think we need to perform the prerequisites to get this Akamai add-on (a modular input) to work: please help us with installing Java on our Splunk instance, and with confirming whether KVStore is installed and working fine.