All Topics

How can I filter my table data using a text box that accepts multiple comma-separated keywords? This is my query:

index=atvi_test sourcetype=ncc
| rename hostname as Host component as Component filename as FileName
| eval source_list=split("*ORA*", ",")
| search Environment=QTEST Component IN (*)
| search NOT Message IN (null)
| table PST_Time Environment Host Component FileName Message
| sort PST_Time
| search [| makemv delim="," source_list | eval search_condition=mvjoin(source_list, " OR Message=*") | eval search_condition="Message=*" . search_condition | return $search_condition]
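One common pattern (a sketch only: it assumes a dashboard text input whose token is named $keywords$ and that each keyword should match the Message field) is to expand the comma-separated token into OR-ed conditions inside a subsearch:

index=atvi_test sourcetype=ncc Environment=QTEST NOT Message IN (null)
    [| makeresults
    | eval keyword=split("$keywords$", ",")
    | mvexpand keyword
    | eval Message="*" . trim(keyword) . "*"
    | fields Message
    | format]
| rename hostname as Host component as Component filename as FileName
| table PST_Time Environment Host Component FileName Message
| sort PST_Time

The format command turns the subsearch rows into an explicit ( Message="*a*" ) OR ( Message="*b*" ) clause, so each comma-separated keyword becomes its own wildcard match.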
Hi, Perhaps this question has been asked before... Is it possible to store events coming from the same source in different indexes, depending on their content? The use case is that some events are more sensitive than others and need to be sent to different indexes. In our case, the index name would appear within the event as a formatted field, like [index: SENSITIVE]. The input is a TCP port. Any help would be appreciated; I would rather take no for an answer than be led into some intricate solution. Thank you, Jean
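Index-time routing can do this on the parsing tier (indexer or heavy forwarder). A minimal sketch, assuming the TCP input is assigned a sourcetype of my_tcp_feed and the sensitive index is called sensitive (both names are assumptions):

props.conf:
[my_tcp_feed]
TRANSFORMS-route_sensitive = route_sensitive

transforms.conf:
[route_sensitive]
# Events matching the marker get their destination index overwritten;
# everything else keeps the index defined on the input.
REGEX = \[index:\s*SENSITIVE\]
DEST_KEY = _MetaData:Index
FORMAT = sensitive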
Hi, Splunk is a new tool to me, so I apologize for the very basic question. Could you please provide a query that shows the email delivery status with a reason (detailed information on whether each message was delivered or not), as well as filtering on multiple specific subjects/sources, from Postfix?
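A starting-point sketch, assuming Postfix syslog is already indexed (index=mail and sourcetype=postfix_syslog are assumptions; adjust both to your environment). Postfix delivery lines contain to=<addr> and status=... (reason):

index=mail sourcetype=postfix_syslog "status="
| rex "to=<(?<recipient>[^>]+)>"
| rex "status=(?<status>\w+)\s+\((?<reason>[^\)]+)\)"
| table _time recipient status reason

Note that Postfix does not log the Subject header by default; that requires header_checks logging plus correlating on the queue ID, so subject-based filtering is a second step.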
Hey everyone, I know that Wazuh can send data to Splunk, but I want to do the reverse: send data from Splunk to Wazuh. In my case I have a threat-intelligence source whose API can send data to Splunk, and I then want to forward that data on to Wazuh. Maybe via a third party like Logstash / Elastic / etc.? Does anyone know about this? I have never read about it before. Thanks
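One Splunk-native option (a sketch, not a tested integration) is the built-in syslog output on a heavy forwarder, since Wazuh can receive syslog; all names below are assumptions:

outputs.conf:
[syslog:wazuh_out]
server = wazuh.example.com:514
type = udp

props.conf:
[ti_sourcetype]
TRANSFORMS-to_wazuh = send_to_wazuh

transforms.conf:
[send_to_wazuh]
# Route every event of this sourcetype to the Wazuh syslog output.
REGEX = .
DEST_KEY = _SYSLOG_ROUTING
FORMAT = wazuh_out

Alternatively, a scripted or custom alert action could POST matching events to a Wazuh API endpoint.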
Hello, starting with which Splunk Universal Forwarder version is Windows Server 2025 supported? Best regards, and thanks for your cooperation. M2024X_Ray
Having installed Splunk to this point, what do I have to do next to get it running?
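For a first start on *nix, the usual commands are (a sketch assuming a default install under /opt/splunk):

/opt/splunk/bin/splunk start --accept-license
/opt/splunk/bin/splunk enable boot-start

After that, Splunk Web is reachable on port 8000 by default.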
Hello. I'm having trouble listing all my saved searches from a SHC using a command-line REST API GET. I'm asking Splunk to list all saved searches of user "admin" in the "MYAPP" app. For some strange reason I can't pin down, the list also includes other apps. Here is the call:

curl -skL -u 'usr:pwd' 'https://SHC_NODE:8089/servicesNS/admin/MYAPP/saved/searches?count=-1' | egrep 'name="app"' | sort -u

... and here is what came back:

<s:key name="app">MYAPP</s:key>
<s:key name="app">MYAPP_backup</s:key>
<s:key name="app">ANOTHER_APP</s:key>
<s:key name="app">search</s:key>

I expect only "<s:key name="app">MYAPP</s:key>" entries, or not? What's wrong? Linux OS, Splunk Enterprise 8.2.12, SHC with 3 nodes (all nodes return the same output). Thanks.
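The servicesNS/<user>/<app> namespace also returns knowledge objects shared globally from other apps, which is why MYAPP_backup, ANOTHER_APP, and search show up. One way to restrict the listing to objects actually owned by the app (a sketch) is the search parameter on the collection endpoint:

curl -skL -u 'usr:pwd' 'https://SHC_NODE:8089/servicesNS/admin/MYAPP/saved/searches?count=-1&search=eai:acl.app%3DMYAPP' | egrep 'name="app"' | sort -u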
Hi Splunk Community, is there a way to capture the host of a UF as the data passes through a HF, adding the host right before the log message is processed? I have tried a few things with no luck, but I am asking here while I dig through the documentation. Thanks!
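The UF's host value normally arrives in the event metadata, so on the HF you can copy it into an indexed field at parse time with INGEST_EVAL (a sketch; the sourcetype and field name are assumptions):

props.conf:
[my_sourcetype]
TRANSFORMS-capture_host = capture_uf_host

transforms.conf:
[capture_uf_host]
# Copies the metadata host into a new indexed field before further processing.
INGEST_EVAL = orig_host=host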
New! Upcoming Cloud event alert, register today! Accelerate Digital Resilience with Splunk's AI-Driven Cloud Platform | Thursdays, January 30 & February 6 | 10 AM PT / 1 PM ET. Join us to hear top experts share how migrating to Splunk Cloud enhances data security and visibility, while empowering your organization to boost digital resilience and stay ahead in the AI era. Don't miss this opportunity to learn from the best!
I need to forward data from a heavy forwarder to two different indexer clusters. One of the clusters needs to have a field removed. If I use SEDCMD in props.conf on the HF, it removes the field for both, and putting SEDCMD in props.conf on one of the indexers doesn't work (it does work if I bypass the HF). Is there a way to do this? Edit: I was thinking of using an intermediate forwarder (heavy forwarder -> another heavy forwarder -> indexer cluster), but the intermediate heavy forwarder's props.conf does not work either.
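Once the first HF has parsed ("cooked") the data, downstream Splunk instances do not re-run parsing-phase settings like SEDCMD, which is why the indexer and intermediate-HF attempts have no effect. One documented pattern (a sketch; all stanza, field, and group names are assumptions) is to clone the events on the first HF, redact only the clone, and route original and clone to different output groups:

props.conf (HF):
[my_sourcetype]
TRANSFORMS-0clone = clone_redacted
TRANSFORMS-1route = route_to_a

[my_sourcetype_redacted]
SEDCMD-dropfield = s/secret_field=\S+//g
TRANSFORMS-route = route_to_b

transforms.conf (HF):
[clone_redacted]
# Clone every event into the redacted sourcetype.
REGEX = .
CLONE_SOURCETYPE = my_sourcetype_redacted

[route_to_a]
REGEX = .
DEST_KEY = _TCP_ROUTING
FORMAT = clusterA

[route_to_b]
REGEX = .
DEST_KEY = _TCP_ROUTING
FORMAT = clusterB

outputs.conf then defines [tcpout:clusterA] and [tcpout:clusterB] pointing at the two clusters. The clone carries a different sourcetype; if that matters, it can be renamed back at search time.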
Hello Splunk experts, I'm currently trying to create a search using the multisearch command, where I need to dynamically apply regex patterns from a lookup file to the Web.url field in a tstats search. With my current approach, the regex value is added as a literal search condition instead of being applied as a regex filter; for example, instead of dynamically matching URLs with the regex, it ends up searching for the literal pattern. I have a lookup that contains fields like url_regex and other filter parameters, and I need to:
1. Dynamically use these regex patterns in the search, so that only URLs matching the regex from the lookup get processed further.
2. Ensure that the logic integrates correctly within a multisearch, where the base search is filtered dynamically based on these values from the lookup.
I've shared some screenshots showing the query and the resulting issue, where the regex appears to be used incorrectly. How can I properly use these regex values to match URLs instead of treating them as literal strings?
Search:

| inputlookup my_lookup_file
| search Justification="Lookup Instructions"
| fields url_regex, description
| fillnull value="*"
| eval url_regex="Web.url=\"" . url_regex . "\""
| eval filter="source=\"my_sourcetype\" " . "filter_field=" . " \""
| eval search="| tstats `summariesonly` prestats=true count from datamodel=Web where sourcetype=\"" . filter . " by Web.url Web.user"
| stats values(search) as search
| eval search=multisearch [ mvjoin(search, " | ") ] . " | stats count by search"

As highlighted in yellow above, I want the regex to be used for matching instead of being searched for literally. Also, once the multisearch query generates another search as output, how can I automatically execute that resulting search within my main query? Any guidance would be greatly appreciated!
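tstats cannot apply a regex in its where clause, so one workaround (a sketch; the lookup and field names follow the question, maxsearches is an assumption) is to drive one tstats search per lookup row with map and post-filter with the regex command:

| inputlookup my_lookup_file
| search Justification="Lookup Instructions"
| fields url_regex
| map maxsearches=20 search="| tstats summariesonly=true count from datamodel=Web by Web.url Web.user | rename Web.url as url Web.user as user | regex url=\"$url_regex$\" | eval matched_pattern=\"$url_regex$\""

map substitutes $url_regex$ from each input row and runs the quoted search, which also addresses the last question: map is the mechanism that executes the generated search automatically.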
Hi there, I'm using this API: https://splunk.github.io/splunk-add-on-for-amazon-web-services/APIreference/ Whenever I send a POST request to create metadata inputs that already exist, I get a 500 Internal Server Error. Error: Unable to create metadata inputs: Unexpected HTTP status: 500 Internal Server Error (500). Expected behaviour: do not return 500; return a payload that indicates that the resource already exists.
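Until the add-on changes this, a client-side workaround (a sketch; $INPUT_URL stands for whichever Splunk_TA_aws inputs endpoint you are POSTing to, and all variables are placeholders) is to probe for the resource first and only create it on a 404:

status=$(curl -sk -u "$USER:$PASS" -o /dev/null -w '%{http_code}' "$INPUT_URL/$INPUT_NAME")
if [ "$status" = "404" ]; then
  # ...plus whatever body fields this input type requires
  curl -sk -u "$USER:$PASS" -X POST "$INPUT_URL" -d name="$INPUT_NAME"
fi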
I want my alert to trigger when the result count is between 250 and 500. I am trying to use the custom trigger condition in the alert setup with

search count => 250 AND search count <= 500

but this is not working as expected. Even trying to use the custom trigger condition for one condition, like search count => 250, is not working. What is the right way to do this?
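Two things to check (a sketch): the comparison operator is >= (not =>), and the custom trigger condition is itself a search run over the results, so it needs a field to compare against. A common alternative is to move the bounds into the search and trigger on any result:

... your base search ...
| stats count
| where count >= 250 AND count <= 500

with the alert's trigger condition set to Number of Results is greater than 0.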
The goal here is that Windows logs moved off a system can be added to a NAS location that I can mount on the Splunk instance. I could then ingest the logs as normal, keeping the same sourcetype, windows:security. However, this input is stated to use an API call, so I am not sure whether applying the following stanza would work:

[WinEventLog://Security]
disabled = 0
index = test01

Some other details: the logs are coming off a Windows system that is isolated and not connected to Splunk. Splunk says you can't monitor .evtx files with a monitor stanza. The NAS location is Linux-based, so the logs would be dropped in a directory such as /Nas/Windows/Hostname. Any best practices to make this work?
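The WinEventLog:// input reads the live Event Log API on the local Windows host, so it cannot point at exported .evtx files on a share. Exported .evt/.evtx files can only be parsed by a Splunk instance running on Windows, so one pattern (a sketch; paths and index are assumptions) is a small Windows heavy forwarder that mounts the NAS as a drive and monitors the archive:

inputs.conf (on the Windows HF, assuming the NAS is mapped as E:):
[monitor://E:\Windows\*\*.evtx]
disabled = 0
index = test01
host_segment = 2

host_segment derives the host from the Hostname directory in the path (adjust the segment number to your layout). If a Windows instance is not an option, converting the .evtx to text first on any Windows box (for example with wevtutil qe ... /f:text) and monitoring the text output is the usual fallback.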
Hello guys, we are getting this message in splunkd.log on one heavy forwarder; we are using TCP-SSL in inputs.conf:

11-14-2024 16:59:44.129 +0100 WARN SSLCommon [53742 FwdDataReceiverThread] - Received fatal SSL3 alert. ssl_state='SSLv3 read client certificate A', alert_description='unknown CA'.

How do you identify the source host? Is it blocking incoming data or just a warning? Maybe this can help?

index=_* host=myhf1 source="/OPT/splunk/var/log/splunk/metrics.log" tcp_Kprocessed="0.000"

Thanks for your help.
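The unknown CA alert means a connecting client presented a certificate this HF's CA bundle cannot validate, so that client's TLS handshake fails and its data is not received (other connections are unaffected). The WARN itself does not name the peer, but nearby TcpInputProc messages usually do; a sketch for finding it (the rex pattern is an assumption about the message format, so check the raw events if it extracts nothing):

index=_internal host=myhf1 sourcetype=splunkd component=TcpInputProc (WARN OR ERROR)
| rex "src=(?<src_ip>\d+\.\d+\.\d+\.\d+):(?<src_port>\d+)"
| stats count by src_ip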
We have currently configured the logs to go to Splunk Cloud, and we are also setting up a DR on-prem server. The question is how to configure the UF to send to both the cloud and the DR (on-prem). There are no issues with the cloud environment. Is it possible to send to both? On the UF the certificate is for Splunk Cloud, and I am not sure how to add our on-prem certificate.
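A UF can clone its output to several tcpout groups, each with its own TLS settings, so the cloud group keeps the cloud certificate and the DR group gets yours. A sketch (group, server, and file names are assumptions; the cloud group's real name comes from the splunkclouduf.spl credentials app):

outputs.conf:
[tcpout]
defaultGroup = splunkcloud, onprem_dr

[tcpout:onprem_dr]
server = dr-indexer.example.com:9997
# TLS settings for the DR side; adjust to your certificate layout.
clientCert = /opt/splunkforwarder/etc/auth/onprem_client.pem
sslVerifyServerCert = false

Listing two groups in defaultGroup sends a full copy of the data to each.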
Hi, our app is built upon Splunk Add-on Builder. Builder's code is responsible for most of the input and output for our app. We modified the pulling module to reach out to our server to pull data; Builder then sends the pulled data into the Splunk engine to process. The Splunk Cloud store has updated its inspection criteria a few times in past years. Almost every time, Builder needed an update to comply with the new criteria, and we were told to import our app into the new Builder and export it again to take in Builder's updates. Until last month. We have now received another notice from the Splunk store saying our app no longer complies with the updated criteria and will be removed from the store by the 18th of this month. Only this time, Splunk Add-on Builder has not done its part and updated to satisfy the same rules in the same store. Here is the cause:

check_python_sdk_version
If your app relies on the Splunk SDK for Python, we require you to use an acceptably-recent version in order to avoid compatibility issues between your app and the Splunk Platform or the Python language runtime used to execute your app's code. Please update your Splunk SDK for Python version to at least 2.0.2. More information is available on this project's GitHub page: https://github.com/splunk/splunk-sdk-python
Versions affected by this check are: 1.6.1

We would like to seek some information about:
1. Why Builder can violate the Splunk Cloud criteria but stay on the Splunk store.
2. If Builder does follow the new rules like everyone else, when will it update to a new version that passes the inspection test?
3. If Builder does NOT update, are there any instructions for apps built upon Builder to fix Builder's issue and still be allowed on the Splunk store?

Thanks for any feedback and information. Lixin
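For point 3, one manual approach (a sketch, not official guidance; the path to the bundled library varies by Builder version) is to replace the splunklib that Builder vendored into the app with a current release before repackaging:

# Assumes the vendored SDK lives directly under the app's bin directory.
cd $SPLUNK_HOME/etc/apps/your_app/bin
rm -rf splunklib
pip install --no-deps --target . splunk-sdk==2.0.2

The splunk-sdk package on PyPI provides the splunklib module that the check inspects.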
Trying to get a count of servers sending logs to an index, cloud_servers. I am running this command to get the count:

index=cloud_servers | search host="*server_name-h-nk01-*" | dedup host | stats count

The problem is that some servers are counted twice, because the server name appears with and without an FQDN depending on the type of log being sent, so dedup doesn't work since each is technically a unique host. Example of the same server appearing with two host names:

host buffnybd1-h-nk01-555
host buffnybd1-h-nk01-555.nyny.os1.com

Is there a way to count the server using just the first 20 or so characters, so it will ignore the FQDN? Thank you
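Rather than a fixed character count, one sketch is to strip everything from the first dot and count distinct short names:

index=cloud_servers host="*server_name-h-nk01-*"
| eval short_host=mvindex(split(host, "."), 0)
| stats dc(short_host) as server_count

mvindex(split(host, "."), 0) keeps the part before the first dot, so the bare name and the FQDN collapse to the same value.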
Hello Community, I need a regex that can extract only the following fields from event 4702:
1. <EventID></EventID>
2. <TimeCreated SystemTime='2024-12-05T14:59:44.9923272Z'/>
3. <Computer>Host</Computer>
4. <Data Name='TaskName'>\Microsoft\Windows\SoftwareProtectionPlatform\SvcRestartTask</Data>
from the following raw event:
<Event xmlns='http://schemas.microsoft.com/win/2004/08/events/event'><System><Provider Name='Microsoft-Windows-Security-Auditing' Guid='{54849625-5478-4994-a5ba-3e3b0328c30d}'/><EventID>4702</EventID><Version>1</Version><Level>0</Level><Task>12804</Task><Opcode>0</Opcode><Keywords>0x8020000000000000</Keywords><TimeCreated SystemTime='2024-12-05T14:59:44.9923272Z'/><EventRecordID>2470365</EventRecordID><Correlation ActivityID='{625186de-46eb-0000-1689-5162eb46db01}'/><Execution ProcessID='1408' ThreadID='1600'/><Channel>Security</Channel><Computer>Host</Computer><Security/></System><EventData><Data Name='SubjectUserSid'>S-1-5-20</Data><Data Name='SubjectUserName'> Host $</Data><Data Name='SubjectDomainName'> Host </Data><Data Name='SubjectLogonId'>0x3e4</Data><Data Name='TaskName'>\Microsoft\Windows\SoftwareProtectionPlatform\SvcRestartTask</Data><Data Name='TaskContentNew'>&lt;?xml version="1.0" encoding="UTF-16"?&gt; &lt;Task version="1.6" xmlns="http://schemas.microsoft.com/windows/2004/02/mit/task"&gt; &lt;RegistrationInfo&gt; &lt;Source&gt;$(@%systemroot%\system32\sppc.dll,-200)&lt;/Source&gt; &lt;Author&gt;$(@%systemroot%\system32\sppc.dll,-200)&lt;/Author&gt; &lt;Version&gt;1.0&lt;/Version&gt; &lt;Description&gt;$(@%systemroot%\system32\sppc.dll,-201)&lt;/Description&gt; &lt;URI&gt;\Microsoft\Windows\SoftwareProtectionPlatform\SvcRestartTask&lt;/URI&gt; &lt;SecurityDescriptor&gt;D:P(A;;FA;;;SY)(A;;FA;;;BA)(A;;FA;;;S-1-5-80-123231216-2592883651-3715271367-3753151631-4175906628)(A;;FR;;;S-1-5-87-2912274048-3994893941-1669128114-1310430903-1263774323)&lt;/SecurityDescriptor&gt; &lt;/RegistrationInfo&gt; &lt;Triggers&gt; &lt;CalendarTrigger&gt; &lt;StartBoundary&gt;2024-12-10T07:54:44Z&lt;/StartBoundary&gt; &lt;Enabled&gt;true&lt;/Enabled&gt; &lt;ScheduleByDay&gt; &lt;DaysInterval&gt;1&lt;/DaysInterval&gt; &lt;/ScheduleByDay&gt; &lt;/CalendarTrigger&gt; &lt;/Triggers&gt; &lt;Principals&gt; &lt;Principal id="NetworkService"&gt; &lt;UserId&gt;S-1-5-20&lt;/UserId&gt; &lt;RunLevel&gt;LeastPrivilege&lt;/RunLevel&gt; &lt;/Principal&gt; &lt;/Principals&gt; &lt;Settings&gt; &lt;MultipleInstancesPolicy&gt;IgnoreNew&lt;/MultipleInstancesPolicy&gt; &lt;DisallowStartIfOnBatteries&gt;true&lt;/DisallowStartIfOnBatteries&gt; &lt;StopIfGoingOnBatteries&gt;true&lt;/StopIfGoingOnBatteries&gt; &lt;AllowHardTerminate&gt;false&lt;/AllowHardTerminate&gt; &lt;StartWhenAvailable&gt;true&lt;/StartWhenAvailable&gt; &lt;RunOnlyIfNetworkAvailable&gt;false&lt;/RunOnlyIfNetworkAvailable&gt; &lt;IdleSettings&gt; &lt;StopOnIdleEnd&gt;true&lt;/StopOnIdleEnd&gt; &lt;RestartOnIdle&gt;false&lt;/RestartOnIdle&gt; &lt;/IdleSettings&gt; &lt;AllowStartOnDemand&gt;true&lt;/AllowStartOnDemand&gt; &lt;Enabled&gt;true&lt;/Enabled&gt; &lt;Hidden&gt;true&lt;/Hidden&gt; &lt;RunOnlyIfIdle&gt;false&lt;/RunOnlyIfIdle&gt; &lt;DisallowStartOnRemoteAppSession&gt;false&lt;/DisallowStartOnRemoteAppSession&gt; &lt;UseUnifiedSchedulingEngine&gt;true&lt;/UseUnifiedSchedulingEngine&gt; &lt;WakeToRun&gt;false&lt;/WakeToRun&gt; &lt;ExecutionTimeLimit&gt;PT0S&lt;/ExecutionTimeLimit&gt; &lt;Priority&gt;7&lt;/Priority&gt; &lt;RestartOnFailure&gt; &lt;Interval&gt;PT1M&lt;/Interval&gt; &lt;Count&gt;3&lt;/Count&gt;
&lt;/RestartOnFailure&gt; &lt;/Settings&gt; &lt;Actions Context="NetworkService"&gt; &lt;ComHandler&gt; &lt;ClassId&gt;{B1AEBB5D-EAD9-4476-B375-9C3ED9F32AFC}&lt;/ClassId&gt; &lt;Data&gt;&lt;![CDATA[timer]]&gt;&lt;/Data&gt; &lt;/ComHandler&gt; &lt;/Actions&gt; &lt;/Task&gt;</Data><Data Name='ClientProcessStartKey'>26177172834095606</Data><Data Name='ClientProcessId'>2408</Data><Data Name='ParentProcessId'>1368</Data><Data Name='RpcCallClientLocality'>0</Data><Data Name='FQDN'>Host</Data></EventData></Event>

I need to be able to validate it via | makeresults ... | rex mode=sed ..... Thanks in advance
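Plain rex extractions can pull those four values (a sketch; EventCode=4702 assumes the usual Windows TA field, so adjust the base search to your data):

your_search EventCode=4702
| rex "<EventID>(?<EventID>\d+)</EventID>"
| rex "<TimeCreated SystemTime='(?<TimeCreated>[^']+)'/>"
| rex "<Computer>(?<Computer>[^<]+)</Computer>"
| rex "<Data Name='TaskName'>(?<TaskName>[^<]+)</Data>"
| table EventID TimeCreated Computer TaskName

To validate with makeresults, replace the base search with | makeresults | eval _raw="...paste the event here with inner double quotes escaped..." and keep the rex lines; rex reads _raw by default, so no sed mode is needed for extraction.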
Hi Team, For some VMs, we need to install and configure the Splunk Universal Forwarder using Terraform since backend access is unavailable, and all actions must be automated. Typically, we follow the Splunk documentation to install the credentials package on client machines: [Install the forwarder credentials on individual forwarders in *nix](https://docs.splunk.com/Documentation/Forwarder/9.1.0/Forwarder/ConfigSCUFCredentials#Install_the_forwarder_credentials_on_individual_forwarders_in_.2Anix). We use the following command to install the credentials package: $SPLUNK_HOME/bin/splunk install app /tmp/splunkclouduf.spl After executing this command, it prompts for a username and password. Once the credentials are provided, the package is installed, and after restarting the Splunk services, the internal logs from the client begin flowing (provided the relevant ports are open). Current Challenge: Since we are automating the deployment of the credentials package app via Terraform, we require a single command that: 1. Installs the credentials app. 2. Includes the username and password directly in the command. 3. Automatically restarts the Splunk services after installation. This will enable us to deploy the package via Terraform without manual intervention. As this involves a Linux machine, we need a command that can be executed in this environment. Could you kindly assist us in crafting this single automated command?
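The CLI accepts credentials inline via the standard -auth flag, so a sketch of a single chained command (the password should come from a secured Terraform variable rather than be hard-coded):

$SPLUNK_HOME/bin/splunk install app /tmp/splunkclouduf.spl -auth "admin:${SPLUNK_ADMIN_PASSWORD}" && $SPLUNK_HOME/bin/splunk restart

-auth answers the username/password prompt non-interactively, and the chained restart only runs if the install succeeds.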