All Topics

Hi everyone! My goal is to create an alert that monitors ALL saved searches for any alert email address that no longer exists (mainly colleagues who have left the company or similar). My idea was to search for the same Mail Delivery Subsystem pattern that you get when sending an email from Gmail (or any other provider) to a non-existent address. But I didn't find anything in the _internal index, nor with a REST call to the saved searches, and index=mail is empty. Any ideas?
Hi, looking at the activity of the splunkd threads on the indexers, I've seen in the monitoring console that sometimes there is no activity for a period of one minute. Is this normal? Evidence: (screenshot). Regards, thank you very much.
Hey, so I have a playbook that fetches multiple files and adds them to the SOAR vault. I can then send each individual file to Jira by specifying the file's vault_id in the update_ticket action of the Jira app. Ideally I would like to send only one file over to Jira: an archive containing each of the other files. I can create a file and add it to the archive after seeing this post: https://community.splunk.com/t5/Splunk-SOAR/SOAR-Create-File-from-Artifacts/m-p/581662 However, I don't know how I could take each individual file from the vault and add it to this archive before I send it over. Any help would be appreciated! Thanks
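A minimal sketch of the archiving step, in plain Python outside SOAR: given the local file paths of the vault items, bundle them into one zip. The SOAR-specific calls mentioned in the comments (phantom.vault_info, phantom.vault_add) are assumptions to verify against your SOAR version; the zipping itself is standard library.

```python
import os
import zipfile

def bundle_files(paths: list[str], archive_path: str) -> str:
    """Zip the given files into one archive and return the archive path.

    In a SOAR playbook the input paths would come from the vault (e.g.
    phantom.vault_info(vault_id=...) reports a local 'path' per item; an
    assumption, check your SOAR version), and the resulting archive would
    be re-added with phantom.vault_add(...) so that only its vault_id
    needs to be passed to the Jira update_ticket action.
    """
    with zipfile.ZipFile(archive_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for p in paths:
            # Store by basename so the zip has no absolute directory paths.
            zf.write(p, arcname=os.path.basename(p))
    return archive_path
```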
I am building a Splunk dashboard that displays a table of content. Once it's displayed, I want to have a couple of buttons, Stop All and Start All; clicking them should in turn execute a search that invokes Python code to perform the actions. Can someone please advise whether that's possible?
Hi, wondering if there is a document or guidance on how to estimate the volume of data ingested into Splunk when pulling data from DNA Center using the Splunk Add-on: Cisco DNA Center Add-on. Cheers, Ahmed.
Hi all, I just started a trial for Splunk Cloud; my URL looks similar to this: https://prd-p-s8qvw.splunkcloud.com/en-GB/app/launcher/home

I want to get data in with the HEC. I have read the following documentation: https://docs.splunk.com/Documentation/SplunkCloud/9.3.2408/Data/UsetheHTTPEventCollector#Configure_HTTP_Event_Collector_on_Splunk_Cloud_Platform

According to the documentation, my URL should look like this: https://http-inputs-prd-p-s8qvw.splunkcloud.com:8088/services/collector/event

However this does not work. It seems the DNS name cannot be resolved; my NodeJS client gives "ENOTFOUND". I have tried every combination of the following variants, over both http:// and https://, on both port 8088 and port 443, all with the path /services/collector/event:

- Prefixed hosts: http-inputs-prd-p-s8qvw.splunkcloud.com, http-inputs.prd-p-s8qvw.splunkcloud.com, http-inputs-p-s8qvw.splunkcloud.com, http-inputs-s8qvw.splunkcloud.com, http-inputs.s8qvw.splunkcloud.com
- Unprefixed hosts: prd-p-s8qvw.splunkcloud.com, p-s8qvw.splunkcloud.com, s8qvw.splunkcloud.com

None of these work. All give one of the following errors:

- Error: getaddrinfo ENOTFOUND http-inputs-prd-p-s8qvw.splunkcloud.com
- Error: read ECONNRESET
- HTTP 400 Sent HTTP to port 443
- HTTP 404 Not Found

Can anybody help me get this working?

Regards,

Lawrence
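For reference, a hedged sketch of how a HEC request is commonly assembled for a Splunk Cloud stack. Assumptions: the standard AWS-stack form uses the http-inputs- prefix on port 443 (not 8088, which is the Splunk Enterprise default), the docs list different host formats per cloud environment and trial type, and the stack name and token below are placeholders.

```python
import json

# Placeholder stack name; replace with your own. On standard Splunk Cloud
# stacks HEC is typically reached with the "http-inputs-" prefix on port
# 443 (an assumption to verify against your trial's environment).
STACK = "prd-p-s8qvw"
HEC_URL = f"https://http-inputs-{STACK}.splunkcloud.com:443/services/collector/event"

def build_hec_request(token: str, event: dict) -> tuple[str, dict, str]:
    """Return (url, headers, body) for a HEC event POST."""
    headers = {
        "Authorization": f"Splunk {token}",   # HEC token, not a session key
        "Content-Type": "application/json",
    }
    body = json.dumps({"event": event, "sourcetype": "_json"})
    return HEC_URL, headers, body

url, headers, body = build_hec_request("00000000-0000-0000-0000-000000000000",
                                       {"msg": "hello"})
```

Any HTTP client (NodeJS included) can then POST `body` to `url` with those headers; a 400 "Sent HTTP to port 443" response means the scheme must be https.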
Hi Splunk Community, I am looking to edit the inputs.conf file programmatically via the Splunk API. Specifically, I want to know:
- Is there an API endpoint available to update the inputs.conf file?
- If yes, what would be the correct method to achieve this (e.g., required endpoint, parameters, or payload)?

I understand that inputs.conf primarily configures data inputs, and certain operations might have to be performed via the REST API or directly through configuration file updates. I would appreciate any documentation or examples regarding:
- Supported Splunk API endpoints for modifying input configurations.
- Best practices for editing inputs.conf programmatically.
- Any necessary permissions or prerequisites to perform such updates.
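A sketch of the usual REST route: monitor stanzas in inputs.conf map to the data/inputs/monitor endpoint on the management port (8089 by default, Splunk Enterprise). The host, app, and token below are placeholders; the code only builds the requests without sending them.

```python
from urllib.parse import urlencode, quote
from urllib.request import Request

# Placeholder management host; other input types (tcp, udp, script, http)
# live under sibling endpoints beneath /services/data/inputs/.
BASE = "https://splunk.example.com:8089"

def create_monitor_input(app: str, path: str, index: str, token: str) -> Request:
    """Build (not send) a POST that creates a [monitor://<path>] stanza."""
    url = f"{BASE}/servicesNS/nobody/{app}/data/inputs/monitor"
    body = urlencode({"name": path, "index": index}).encode()
    return Request(url, data=body, method="POST",
                   headers={"Authorization": f"Bearer {token}"})

def update_monitor_input(app: str, path: str, index: str, token: str) -> Request:
    """Build a POST to an existing input; the path is URL-encoded into the URL."""
    url = f"{BASE}/servicesNS/nobody/{app}/data/inputs/monitor/{quote(path, safe='')}"
    body = urlencode({"index": index}).encode()
    return Request(url, data=body, method="POST",
                   headers={"Authorization": f"Bearer {token}"})
```

The edit lands in the app's local/inputs.conf; permissions-wise the calling user needs the relevant edit capabilities (e.g. edit_monitor), and on Splunk Cloud this endpoint is generally not directly accessible.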
I have been using the Splunk API from within a Python script to retrieve information about saved searches, using a call to the endpoint:

hxxps://<splunk_server>/-/-/saved/searches/<name_of_saved_search>?output_mode=json

The <name_of_saved_search> has been URL-encoded to deal with some punctuation (including '/'), using the Python function:

name_of_saved_search = urllib.parse.quote(search_name, safe='')

It has been working so far, but recently I encountered an issue when the name of the saved search contains square brackets (e.g. "[123] My Search"). Even after URL encoding, Splunk's API does not accept the API call at the endpoint:

hxxps://<splunk_server>/-/-/saved/searches/%5B123%5D%20My%20Search?output_mode=json

and returns a response with an HTTP status code of 404 (Not Found). I am not sure what else I should be doing to handle the square brackets in the name of the saved search to make the API call work.
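A quick sanity check of the encoding itself (host is a placeholder): with safe='', quote does percent-encode square brackets and spaces, so the URL above is already correctly encoded and the 404 is likely coming from something other than the encoding step.

```python
from urllib.parse import quote

name = "[123] My Search"
# '[' -> %5B, ']' -> %5D, ' ' -> %20; safe='' also encodes '/'.
encoded = quote(name, safe="")
url = (f"https://splunk.example.com:8089/servicesNS/-/-/saved/searches/"
       f"{encoded}?output_mode=json")
```

If the encoding checks out, one workaround worth trying is the list endpoint with a filter instead of the entity URL, e.g. GET /services/saved/searches?search=<name fragment>, which avoids putting the bracketed name in the path at all (hedged: verify the filter behavior against your Splunk version).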
I encountered an issue while trying to integrate a Python script into my Splunk dashboard to export Zabbix logs to a Splunk index. When I click the button on the dashboard, the following error is logged in splunkd.log:

01-16-2025 12:01:24.958 +0530 ERROR ScriptRunner [40857 TcpChannelThread] - stderr from '/opt/splunk/bin/python3.9 /opt/splunk/bin/runScript.py zabbix_handler.Zabbix_handler': Traceback (most recent call last):
01-16-2025 12:01:24.958 +0530 ERROR ScriptRunner [40857 TcpChannelThread] - stderr from '/opt/splunk/bin/python3.9 /opt/splunk/bin/runScript.py zabbix_handler.Zabbix_handler': File "/opt/splunk/bin/runScript.py", line 72, in <module>
01-16-2025 12:01:24.958 +0530 ERROR ScriptRunner [40857 TcpChannelThread] - stderr from '/opt/splunk/bin/python3.9 /opt/splunk/bin/runScript.py zabbix_handler.Zabbix_handler': os.chdir(scriptDir)
01-16-2025 12:01:24.958 +0530 ERROR ScriptRunner [40857 TcpChannelThread] - stderr from '/opt/splunk/bin/python3.9 /opt/splunk/bin/runScript.py zabbix_handler.Zabbix_handler': FileNotFoundError: [Errno 2] No such file or directory: ''

Setup:
- Python script: /opt/splunk/etc/apps/search/bin/zabbix_handler.py. Exports Zabbix logs to a Splunk index using the HEC endpoint.
- JavaScript code: /opt/splunk/etc/apps/search/appserver/static/. Adds a button to the dashboard, which triggers the Python script.

Observed behavior: when the button is clicked, the error indicates that the scriptDir variable in runScript.py is empty, causing the os.chdir(scriptDir) call to fail.

Questions:
1. Why might scriptDir be empty when runScript.py is executed?
2. Is there a specific configuration required in the Splunk dashboard or app structure to ensure the script path is correctly passed to ScriptRunner?
3. How can I debug or fix this issue to ensure the Python script is executed properly?

Any help or guidance would be greatly appreciated. Thank you!
I am referencing the following to create a custom command: https://github.com/splunk/splunk-app-examples/tree/master/custom_search_commands/python/reportingsearchcommands_app I downloaded the app and ran it. With makeresults, even if I generate 200,000 rows, only 1 result comes out. However, if I put the content in an index or a lookup and run it, the number of results is 7 to 10 or so. The desired result is 1, but multiple results come out. Is it not possible to make it return only one?
Hello! I am getting this error when trying to authenticate to Splunk Enterprise. Could someone help me with this error? Screenshot below.
I would like to understand if the following scenario would be possible:
1. Security detection queries/analytics relying on Sysmon logs are onboarded and enabled.
2. When the logs of a certain endpoint match a security analytic, an alert is created and sent to a case management system for an analyst to investigate.
3. At this point, the analyst is not able to view the Sysmon logs of that particular endpoint. He needs to manually trigger the Sysmon logs to be indexed from the case management platform; only then is he able to search the Sysmon logs in Splunk for the past X number of days.
4. However, the analyst will not be able to search the Sysmon logs of other, unrelated endpoints.

In summary, is there a way we can deploy the security detection analytics to monitor and detect across all endpoints, yet only allow the security analyst the ability to search the Sysmon logs of the endpoint which triggered the alert, based on an ad-hoc request via the case management system?
What's the difference between "Splunk VMware OVA for ITSI" and "Splunk OVA for VMware"?   The Splunk OVA for VMware appears to be more recent. Do they serve the same function? Can the "Splunk OVA for VMware" be used with ITSI? 
Is there a command or app that will decode base64 and detect the correct charset for the output? Currently I'm unable to decode to UTF-16LE; Splunk wants to decode UTF-8. In my current role I cannot edit any .conf files; those are administered by a server team. If there is an app, I can request that it be installed; otherwise I'm working solely in SPL.
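To illustrate the charset problem outside SPL, a plain-Python sketch (the detection heuristic is an assumption, not a standard): UTF-16LE text made of ASCII characters has a NUL byte after every character, so the decoded bytes can be inspected before choosing a codec.

```python
import base64

def decode_b64_text(b64: str) -> str:
    """Decode base64 bytes, then guess whether they are UTF-16LE or UTF-8."""
    raw = base64.b64decode(b64)
    # A UTF-16LE byte-order mark (FF FE) is decisive when present.
    if raw[:2] == b"\xff\xfe":
        return raw.decode("utf-16")            # BOM-aware decode
    # Heuristic: many NUL bytes at odd offsets suggests UTF-16LE ASCII text.
    if raw and raw[1::2].count(0) > len(raw) // 4:
        return raw.decode("utf-16-le")
    return raw.decode("utf-8")
```

This is what a charset-aware decoder has to do internally; a base64 app for Splunk would need equivalent logic (or an explicit encoding option) to produce UTF-16LE output.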
Hi all, my issue is that I have Logstash data coming into Splunk; the sourcetype is "Http Events" and the logs arrive in JSON format. I need to know how I can use this data to find something meaningful. With Windows forwarders we get event codes, so I can block unwanted event codes that carry repeated information; how can I do something similar with the Logstash data? How do I extract the information we can actually use in Splunk?
I have an existing search head that is peered to 2 cluster managers. This SH has the ES app on it. I am looking to add data from remote indexers. Do I just need to add the remote cluster manager as a peer to my existing SH so that I can access the data in ES?
Hello everyone, I'm trying to collect data in JSON format from Splunk Cloud, and I understand that one of the options is using the REST API. However, I'm not sure which endpoint I should use, or if there's another recommended way to achieve this directly from Splunk Cloud. I've been testing with the following endpoints:
- /services/search/jobs/
- /servicesNS/admin/search/search/jobs

But in both cases, I only get a 404 error indicating that the URL is not valid. Could you guide me on how to configure data collection in this format? What would be the correct endpoint? Which key parameters should I include in my request? Or, if there's an easier or more direct method, I'd appreciate it if you could explain. The version of Splunk I'm using is 9.3.2408.104. Thank you in advance for your help!
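A hedged sketch of the export route, which streams finished results as JSON in one call instead of creating and polling a job. Assumptions: the stack host is a placeholder, and on Splunk Cloud the REST management port (8089) generally has to be enabled/allow-listed by Splunk support first, which might explain the 404s. The code only builds the request.

```python
from urllib.parse import urlencode
from urllib.request import Request

# Placeholder Splunk Cloud stack; REST calls go to the management port.
BASE = "https://yourstack.splunkcloud.com:8089"

def export_search(query: str, token: str, earliest: str = "-24h") -> Request:
    """Build (not send) a POST to the search export endpoint."""
    if not query.lstrip().startswith(("search", "|")):
        # The endpoint requires a full SPL pipeline, i.e. a leading command.
        query = "search " + query
    body = urlencode({
        "search": query,
        "output_mode": "json",      # results come back as JSON objects
        "earliest_time": earliest,
    }).encode()
    return Request(f"{BASE}/services/search/jobs/export", data=body,
                   method="POST", headers={"Authorization": f"Bearer {token}"})

req = export_search("index=_internal | head 5", "<token>")
```

Sending `req` with urllib or any HTTP client then yields newline-delimited JSON result objects.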
Hi all, I've seen older posts on this topic but nothing in the past couple of years, so here goes. Is there a way to export the application interactions/dependencies seen on the Flow Map? e.g. Tier A calls Tier B over HTTP; Tier C calls these specific backends on nnn ports. Or is there some utility that recursively "walks" the tree of Tiers/Nodes/Backends using the Application Model API calls?
Hi everyone, I've recently tested the new Splunk AI feature within Splunk ITSI to define thresholds based on historic data/KPI points. ("Test" as in I literally created very obvious dummy data for the AI to process and find thresholds for; a sort of trust test of whether the AI really does find usable thresholds.)

Example: every 5 minutes the KPI takes the latest value, which I've set to correspond with the current weekday (plus minimal variance). For example: all KPI values on Mondays are within the range 100-110, Tuesdays 200-210, Wednesdays 300-310, and so forth. (Preview of the data: screenshot.)

Now, after a successful backfill of 30 days, I would have expected the AI to see that each weekday needs its own time policy and thresholds. However, the result was this (screenshot): no weekdays detected; instead it finds time policies for every 4 hours, regardless of days.

By now I've tried all the possible adjustments I could think of (increasing the number of data points, greater differences between data points, another algorithm, waiting a day in the hope it would recalibrate itself over midnight, etc.). Hardly any improvement at all, and the thresholds are not usable like this, as outliers on Mondays would not be detected (expected values 100-110; an outlier of 400 would be missed because it still falls within the thresholds).

Thus my questions to the community: Does anyone have ideas or suggestions for how I could make the AI understand the simple idea of "weekly time policies", and how I could tweak it (aside from doing everything manually and ditching the AI idea as a whole)? Does anyone have good experience with Splunk AI defining thresholds, and if so, what were the use cases?
Description: Hello, I am experiencing an issue with the "event_id" field when transferring notable events from Splunk Enterprise Security (ES) to Splunk SOAR. Details: When sending the event to SOAR using an Adaptive Response Action (Send to SOAR), the event is sent successfully, but the "event_id" field does not appear in the data received in SOAR. Any assistance or guidance to resolve this issue would be greatly appreciated. Thank you