Hi, looking at the activity of the splunkd threads on the indexers, I've seen in the Monitoring Console that sometimes there is no activity for a period of one minute. Is this normal? Evidence:

Regards, thank you very much
Hey, so I have a playbook that fetches multiple files and adds them to the SOAR vault. I can then send each individual file to Jira by specifying the file's vault_id in the update_ticket action of the Jira app. Ideally, I would like to send only one file over to Jira: an archive containing each of the other files. I can create a file and add it to the archive after seeing this post: https://community.splunk.com/t5/Splunk-SOAR/SOAR-Create-File-from-Artifacts/m-p/581662 However, I don't know how I could take each individual file from the vault and add it to this archive before I send it over. Any help would be appreciated! Thanks
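Assuming each vault_id can be resolved to a path on disk (in SOAR this metadata is available from the vault APIs, for example inside a custom function), the bundling itself can be done with Python's standard zipfile module. A minimal sketch — build_archive and its arguments are hypothetical names, not part of any SOAR app:

```python
import os
import zipfile

def build_archive(file_paths, archive_path):
    """Bundle the given files into a single zip archive.

    file_paths: local paths on disk (in SOAR, these would come from
    the vault metadata for each vault_id); archive_path: where to
    write the combined archive.
    """
    with zipfile.ZipFile(archive_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in file_paths:
            # arcname keeps just the file name, not the full tmp path
            zf.write(path, arcname=os.path.basename(path))
    return archive_path
```

The resulting archive could then be added back to the vault and its new vault_id passed to the Jira update_ticket action, along the lines of the post linked above.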
I am building a Splunk dashboard that displays a table of content. Once it's displayed, I want to have a couple of buttons, Stop All and Start All; clicking them would in turn execute a search that invokes Python code to perform the actions. Please can someone guide me on whether that's possible?
Hi, I'm wondering if there is a document or guidance on how to estimate the volume of data ingested into Splunk by pulling data from Cisco DNA Center using the Splunk add-on: Cisco DNA Center Add-on. Cheers, Ahmed.
Hi all, I just started a trial for Splunk Cloud. My URL looks similar to this: https://prd-p-s8qvw.splunkcloud.com/en-GB/app/launcher/home

I want to get data in with the HEC, and I have read the following documentation: https://docs.splunk.com/Documentation/SplunkCloud/9.3.2408/Data/UsetheHTTPEventCollector#Configure_HTTP_Event_Collector_on_Splunk_Cloud_Platform

According to the documentation, my URL should look like this: https://http-inputs-prd-p-s8qvw.splunkcloud.com:8088/services/collector/event

However, this does not work. It seems the DNS name cannot be resolved; my Node.js client gives "ENOTFOUND". I have tried many combinations of scheme (HTTP/HTTPS), host prefix ("http-inputs-", "http-inputs.", or none), port (8088 or 443) and stack name ("prd-p-s8qvw", "p-s8qvw", "s8qvw"), for example:

http://http-inputs-prd-p-s8qvw.splunkcloud.com:8088/services/collector/event
https://http-inputs-prd-p-s8qvw.splunkcloud.com:8088/services/collector/event
http://http-inputs.prd-p-s8qvw.splunkcloud.com:8088/services/collector/event
https://http-inputs.prd-p-s8qvw.splunkcloud.com:8088/services/collector/event
http://http-inputs-prd-p-s8qvw.splunkcloud.com:443/services/collector/event
https://http-inputs-prd-p-s8qvw.splunkcloud.com:443/services/collector/event
https://http-inputs.prd-p-s8qvw.splunkcloud.com:443/services/collector/event
https://http-inputs-p-s8qvw.splunkcloud.com:443/services/collector/event
https://http-inputs-s8qvw.splunkcloud.com:443/services/collector/event
http://prd-p-s8qvw.splunkcloud.com:8088/services/collector/event
https://prd-p-s8qvw.splunkcloud.com:8088/services/collector/event
https://prd-p-s8qvw.splunkcloud.com:443/services/collector/event
https://p-s8qvw.splunkcloud.com:443/services/collector/event
https://s8qvw.splunkcloud.com:443/services/collector/event

None of these work. All give one of the following errors:

Error: getaddrinfo ENOTFOUND http-inputs-prd-p-s8qvw.splunkcloud.com
Error: read ECONNRESET
HTTP 400 Sent HTTP to port 443
HTTP 404 Not Found

Can anybody help me get this working?

Regards,

Lawrence
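One thing worth double-checking against that same docs page: if I'm reading it correctly, Splunk Cloud free trials use a different HEC URL pattern than paid stacks (an "inputs." prefix on port 8088, versus "http-inputs-" on port 443). A small sketch of both documented patterns, purely as an illustration — verify against the docs for your stack type before relying on it:

```python
def build_hec_url(stack, trial=False):
    """Build the HEC event endpoint URL for a Splunk Cloud stack.

    stack: the stack name from the web URL, e.g. "prd-p-s8qvw".
    trial=True uses the free-trial pattern from the docs; the default
    uses the standard Splunk Cloud Platform pattern.
    """
    if trial:
        return f"https://inputs.{stack}.splunkcloud.com:8088/services/collector/event"
    return f"https://http-inputs-{stack}.splunkcloud.com:443/services/collector/event"
```

If the trial-pattern hostname also fails to resolve, it may be worth opening a case, since the DNS record for the stack's HEC endpoint is provisioned on Splunk's side.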
Hi Splunk Community, I am looking to edit the inputs.conf file programmatically via the Splunk API. Specifically, I want to know:

- Is there an API endpoint available to update the inputs.conf file?
- If yes, what would be the correct method to achieve this (e.g., required endpoint, parameters, or payload)?

I understand that inputs.conf primarily configures data inputs, and certain operations might have to be performed via the REST API or directly through configuration file updates. Any documentation or examples regarding the following would be appreciated:

- Supported Splunk API endpoints for modifying input configurations.
- Best practices for editing inputs.conf programmatically.
- Any necessary permissions or prerequisites to perform such updates.
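For monitor inputs, the REST endpoint services/data/inputs/monitor on the management port (typically 8089) creates and updates stanzas that are persisted to inputs.conf. As an illustration, here is a hedged sketch that only builds the request without sending it — the base URL, token, and index are placeholder values, and actually sending it needs a valid session or bearer token plus the relevant capabilities (e.g. edit_monitor):

```python
import urllib.parse
import urllib.request

def build_monitor_input_request(base_url, token, path, index="main"):
    """Build (but do not send) a POST that creates a file-monitor
    input via the REST endpoint services/data/inputs/monitor.

    base_url: management URL, e.g. https://localhost:8089
    token:    session or bearer token (placeholder in this sketch)
    path:     file or directory to monitor; becomes the stanza name
    """
    data = urllib.parse.urlencode({"name": path, "index": index}).encode()
    return urllib.request.Request(
        base_url + "/services/data/inputs/monitor",
        data=data,
        headers={"Authorization": "Bearer " + token},
        method="POST",
    )
```

Updating an existing stanza works similarly by POSTing to services/data/inputs/monitor/<url-encoded-path> with the changed parameters.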
I have been using the Splunk API from within a Python script to retrieve information about saved searches using a call to the endpoint:

hxxps://<splunk_server>/-/-/saved/searches/<name_of_saved_search>?output_mode=json

The <name_of_saved_search> has been URL-encoded to deal with some punctuation (including '/'), using the Python function:

name_of_saved_search = urllib.parse.quote(search_name, safe='')

It has been working so far, but recently I encountered an issue when the name of the saved search contains square brackets (e.g. "[123] My Search"). Even after URL encoding, Splunk's API does not accept the API call at the endpoint:

hxxps://<splunk_server>/-/-/saved/searches/%5B123%5D%20My%20Search?output_mode=json

and returns a response with an HTTP status code of 404 (Not Found). I am not sure what else I should be doing to handle the square brackets in the name of the saved search to make the API call work.
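For what it's worth, urllib.parse.quote with safe='' does percent-encode square brackets correctly, so the encoding step itself looks right — which suggests the 404 may come from the server side instead (for example the namespace in the URL, or the search being owned by a different user/app than the one the request runs against). A quick check of the encoding:

```python
from urllib.parse import quote

def encode_search_name(name):
    # safe='' also percent-encodes '/', '[' and ']'
    return quote(name, safe="")
```

If the encoding matches what you already send, a next debugging step might be listing all saved searches (the collection endpoint with count=0) and comparing the exact name Splunk reports against the one being encoded.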
I encountered an issue while trying to integrate a Python script into my Splunk dashboard to export Zabbix logs to a Splunk index. When I click the button on the dashboard, the following error is logged in splunkd.log:

01-16-2025 12:01:24.958 +0530 ERROR ScriptRunner [40857 TcpChannelThread] - stderr from '/opt/splunk/bin/python3.9 /opt/splunk/bin/runScript.py zabbix_handler.Zabbix_handler': Traceback (most recent call last):
01-16-2025 12:01:24.958 +0530 ERROR ScriptRunner [40857 TcpChannelThread] - stderr from '/opt/splunk/bin/python3.9 /opt/splunk/bin/runScript.py zabbix_handler.Zabbix_handler': File "/opt/splunk/bin/runScript.py", line 72, in <module>
01-16-2025 12:01:24.958 +0530 ERROR ScriptRunner [40857 TcpChannelThread] - stderr from '/opt/splunk/bin/python3.9 /opt/splunk/bin/runScript.py zabbix_handler.Zabbix_handler': os.chdir(scriptDir)
01-16-2025 12:01:24.958 +0530 ERROR ScriptRunner [40857 TcpChannelThread] - stderr from '/opt/splunk/bin/python3.9 /opt/splunk/bin/runScript.py zabbix_handler.Zabbix_handler': FileNotFoundError: [Errno 2] No such file or directory: ''

Setup:

- Python script: /opt/splunk/etc/apps/search/bin/zabbix_handler.py — exports Zabbix logs to a Splunk index using the HEC endpoint.
- JavaScript code: /opt/splunk/etc/apps/search/appserver/static/ — adds a button to the dashboard, which triggers the Python script.

Observed behavior: when the button is clicked, the error indicates that the scriptDir variable in runScript.py is empty, leading to the os.chdir(scriptDir) call failing.

Questions:

1. Why might scriptDir be empty when runScript.py is executed?
2. Is there a specific configuration required in the Splunk dashboard or app structure to ensure the ScriptPath is correctly passed to the ScriptRunner?
3. How can I debug or fix this issue to ensure the Python script is executed properly?

Any help or guidance would be greatly appreciated. Thank you!
I am referencing the following to create a custom command: https://github.com/splunk/splunk-app-examples/tree/master/custom_search_commands/python/reportingsearchcommands_app I downloaded the app and ran it. With makeresults, even if I generate 200,000 rows, only one result comes out. However, if I put the content in an index or a lookup and run it, the number of results is 7 to 10, etc. The desired result is 1, but multiple results come out. Is it not possible to make it so that only one is shown?
Hello! I am getting this error when trying to authenticate to Splunk Enterprise. Could someone help me with this error? Screenshot below.
I would like to understand whether the following scenario would be possible:

1. Security detection queries/analytics relying on Sysmon logs are onboarded and enabled.
2. When the logs of a certain endpoint match a security analytic, it creates an alert that is sent to a case management system for an analyst to investigate.
3. At this point, the analyst is not able to view the Sysmon logs of that particular endpoint. They need to manually trigger the Sysmon logs to be indexed from the case management platform; only then will they be able to search the Sysmon logs in Splunk for the past X number of days.
4. However, the analyst will not be able to search the Sysmon logs of other, unrelated endpoints.

In summary, is there a way we can deploy the security detection analytics to monitor and detect across all endpoints, yet only allow the security analyst the ability to search the Sysmon logs of the endpoint which triggered the alert, based on an ad hoc request via the case management system?
What's the difference between "Splunk VMware OVA for ITSI" and "Splunk OVA for VMware"?   The Splunk OVA for VMware appears to be more recent. Do they serve the same function? Can the "Splunk OVA for VMware" be used with ITSI? 
Register Here. This thread is for the Community Office Hours session on Splunk AI: Ask Me Anything on Thursday, March 20, 2025 at 1pm PT / 4pm ET.

Ask the experts at Community Office Hours! An ongoing series where technical Splunk experts answer questions and provide how-to guidance on various Splunk product and use case topics.

What can I ask in this AMA?

- How does Splunk use generative AI?
- What are some common use cases to get started with the Machine Learning Toolkit (MLTK) app?
- How can Splunk help me monitor and secure my generative AI applications?
- How do the Splunk AI Assistants keep customer data confidential?
- How can I leverage ML for anomaly detection?
- Can I develop and package a custom machine learning model in Splunk?
- Anything else you'd like to learn!

Please submit your questions at registration. You can also head to the #office-hours user Slack channel to ask questions (request access here).

Pre-submitted questions will be prioritized. After that, we will open the floor up to live Q&A with meeting participants.

Look forward to connecting!
At Splunk Education, we are committed to providing a robust learning experience for all users, regardless of skill level or learning preference. Whether you're just starting your journey with Splunk or sharpening advanced skills, our broad range of educational resources ensures you're prepared for every step.

Our Portfolio

We offer Free eLearning to kickstart your learning, eLearning with Labs for hands-on practice, Instructor-led courses for interactive, expert guidance, and Splunk Certifications to validate your expertise. For quick tips and insights, explore our Splunk YouTube How-Tos and Splunk Lantern, where you'll find up-to-date guidance and best practices that reflect the latest in Splunk's capabilities.

New Courses Available

Every month, we release new courses designed to empower learners with the tools and knowledge they need to stay ahead in the evolving tech landscape. Whether you prefer self-paced eLearning or the structure of live instruction, there's a course to fit your style. This month, we are excited to announce a new instructor-led course, a new eLearning with Labs course, and a new free eLearning course to help you advance your Splunk skills.

These courses provide targeted insights into security operations and observability, essential for anyone looking to enhance their data-driven capabilities. Explore them today to stay ahead in your field! All courses are available through the Splunk Course Catalog, accessible via our banner or directly on our platform.

Expanding Global Learning Access

As part of our commitment to accessibility and inclusion, we continue to translate eLearning courses into multiple languages and add non-English captions. This effort ensures that learners worldwide can grow their Splunk expertise in their preferred language, supporting our vision of an inclusive educational ecosystem.
Each month presents new opportunities to expand your knowledge, boost your career, and enhance your contributions to enterprise resilience. Stay updated with the latest courses and continue your journey toward Splunk mastery – your next big career move could be just a course away. See you next month!  - Callie Skokos on behalf of the Splunk Education Crew
Is there a command or app that will decode base64 and detect the correct charset for output? Currently, I'm unable to decode to UTF-16LE; Splunk wants to decode UTF-8. In my current role, I cannot edit any .conf files; those are administered by a server team. If there is an app, I can request that it be installed; otherwise I'm working solely in SPL.
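If an app with a custom command is an option to request, the decode itself is straightforward in Python: base64-decode, then pick a charset. A rough sketch — this is not an existing Splunk command, and the NUL-byte heuristic is an assumption that works for ASCII-range text, which is typical of the UTF-16LE-encoded strings (e.g. from PowerShell logging) that usually need this:

```python
import base64

def decode_base64_guess_charset(b64_text):
    """Decode base64, then guess between UTF-8 and UTF-16LE.

    Heuristic: a UTF-16LE encoding of ASCII-range text has a NUL in
    every second byte, so a high count of NULs at odd offsets
    suggests UTF-16LE; otherwise fall back to UTF-8.
    """
    raw = base64.b64decode(b64_text)
    if raw and raw[1::2].count(0) > len(raw) // 4:
        return raw.decode("utf-16-le")
    return raw.decode("utf-8")
```

Wrapped as a streaming custom command, this could populate a decoded field per event; pure SPL has no built-in way to choose the target charset for base64 output as far as I know.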
Hi all, my issue is that I have Logstash data coming into Splunk. The sourcetype is HTTP Events and the logs are coming in JSON format. I need to know how I can use this data to find something meaningful. For example, with Windows forwarders we get event codes, so I can block unwanted event codes that give repeated information; what can I do if I want to do something similar with the Logstash data? How do I extract information that we can use in Splunk?
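As a starting point, since the events are JSON, spath can extract the fields, after which you can filter repeated codes much as you would with Windows event codes. A rough sketch — the sourcetype name and the event_code field/values are placeholders to adjust to your data:

```
sourcetype=httpevent
| spath
| search NOT event_code IN (4634, 5156)
| stats count by event_code
```

Running `| spath` followed by `| fieldsummary` on a small sample is a quick way to see which JSON fields exist and which carry useful signal before deciding what to filter.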
I have an existing search head that is peered to two cluster managers. This SH has the ES app on it. I am looking to add additional data from a remote indexer cluster. Do I just need to add the remote cluster manager as a peer to my existing SH so that I can access the data in ES?
A step-by-step guide to running the standalone on-premises Controller as a service in a Linux environment.

When you run the standalone on-premises Controller manually, you can follow the steps described in the documentation below:
https://docs.appdynamics.com/appd/onprem/24.x/latest/en/controller-deployment/administer-the-controller/start-or-stop-the-controller

However, there might be situations where you need to run the standalone on-premises Controller as a service in a Linux environment. If so, you can follow the steps below.

1. Change the user to root:

sudo -i

2. Install the library below (optional):

apt install libxml2-utils -y

OR

yum install libxml2 -y

3. Move to the directory below:

cd /opt/appdynamics/platform/product/controller/controller-ha

4. Set up the controller DB password and validate it:

./set_mysql_password_file.sh -p <controller-db-password>

Output:

Checking if db credential is valid...

5. Move to the directory below:

cd /opt/appdynamics/platform/product/controller/controller-ha/init

6. Run the script below:

./install-init.sh -s

Output:

update-rc.d will be used for installing init
installed /etc/sudoers.d/appdynamics
installing /etc/init.d/appdcontroller-db
installing /etc/default/appdcontroller-db
installing /etc/init.d/appdcontroller
installing /etc/default/appdcontroller

7. Run the commands below to enable and start the newly created services:

systemctl enable appdcontroller
systemctl enable appdcontroller-db
systemctl restart appdcontroller
systemctl restart appdcontroller-db
systemctl status appdcontroller
systemctl status appdcontroller-db

Additionally, you might create your own unit file with the start/stop commands to run the standalone on-premises Controller as a service in a Linux environment without using our script.
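For that last point, a sketch of what such a systemd unit file might look like — the paths, user, and timeouts here are assumptions for a default install, so adjust them to your environment and to the controller.sh commands in the start/stop documentation linked above:

```ini
[Unit]
Description=AppDynamics on-premises Controller
After=network.target

[Service]
Type=forking
User=appdynamics
ExecStart=/opt/appdynamics/platform/product/controller/bin/controller.sh start
ExecStop=/opt/appdynamics/platform/product/controller/bin/controller.sh stop
# the Controller can take several minutes to start
TimeoutStartSec=900
TimeoutStopSec=300

[Install]
WantedBy=multi-user.target
```

Save it as /etc/systemd/system/appdcontroller.service, then run systemctl daemon-reload followed by systemctl enable --now appdcontroller.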
Hello everyone, I'm trying to collect data in JSON format from Splunk Cloud, and I understand that one of the options is using the REST API. However, I'm not sure which endpoint I should use, or whether there's another recommended way to achieve this directly from Splunk Cloud.

I've been testing with the following endpoints:

/services/search/jobs/
/servicesNS/admin/search/search/jobs

But in both cases, I only get a 404 error indicating that the URL is not valid. Could you guide me on how to configure data collection in this format? What would be the correct endpoint? Which key parameters should I include in my request? Or, if there's an easier or more direct method, I'd appreciate it if you could explain. The version of Splunk I'm using is 9.3.2408.104.

Thank you in advance for your help!
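A 404 on those paths often just means the request is hitting the web interface rather than the management API, which listens on a different port (typically 8089; on Splunk Cloud, REST API access may need to be enabled or allowlisted by support first). The streaming endpoint /services/search/jobs/export runs a search and returns results as JSON in a single call, with no job polling. A sketch that only builds the request — the base URL and token below are placeholders:

```python
import urllib.parse
import urllib.request

def build_export_request(base_url, token, spl):
    """Build (but do not send) a POST to the streaming export
    endpoint, which returns search results as JSON in one response.

    base_url: management API URL, e.g. https://<stack>.splunkcloud.com:8089
    """
    if not spl.lstrip().startswith("search"):
        spl = "search " + spl  # the endpoint expects a full SPL string
    data = urllib.parse.urlencode({"search": spl, "output_mode": "json"}).encode()
    return urllib.request.Request(
        base_url + "/services/search/jobs/export",
        data=data,
        headers={"Authorization": "Bearer " + token},
        method="POST",
    )
```

Authentication can use a bearer token created under Settings > Tokens, which avoids session handling entirely.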
Hi all, I've seen older posts on this topic but nothing in the past couple of years, so here goes. Is there a way to export the application interactions/dependencies seen on the Flow Map? E.g. Tier A calls Tier B with HTTP, Tier C calls these specific backends on nnn ports. Or is there some utility that recursively "walks" the tree of Tiers/Nodes/Backends using the Application Model API calls?
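I'm not aware of a built-in flow-map export, but the Application Model REST endpoints do expose the inventory you would walk. A sketch of the URLs involved — the controller host and application name are placeholders, and note these endpoints list tiers/nodes/backends, while the call edges themselves (who calls whom) may need other sources such as metric paths:

```python
def app_model_urls(controller, app_name):
    """URLs for the Application Model REST endpoints needed to
    walk an application's tiers, nodes and backends.

    controller: base URL, e.g. https://mycontroller.example.com:8090
    """
    base = f"{controller}/controller/rest/applications/{app_name}"
    return {
        "tiers": f"{base}/tiers?output=JSON",
        "nodes": f"{base}/nodes?output=JSON",
        "backends": f"{base}/backends?output=JSON",
    }
```

A small script could fetch /controller/rest/applications first, then loop over each application calling these three endpoints and stitch the results into an inventory; reconstructing the actual dependency edges would then mean querying the relevant cross-tier metrics per tier.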