Try something like this:

| rex mode=sed "s/(?ms).*(?<ei>\<EventID\>\d+\<\/EventID>).*(?<TimeCreated>\<TimeCreated SystemTime='[^']+'\/>).*(?<Computer>\<Computer\>[^\<]+\<\/Computer\>).*(?<TaskName>\<Data Name='TaskName'\>[^\<]+\<\/Data\>).*/\1\2\3\4/g"

Caveat: XML sometimes has namespace aliases, either embedded or used or both, which a proper XML parser would understand, but these are not shown in your sample and are therefore not catered for in the regex.
Hi @narenpg , yes, it's possible, but you pay the Splunk license twice. You have to modify outputs.conf to create a fork. For more info, see https://docs.splunk.com/Documentation/Splunk/9.3.2/Forwarding/Routeandfilterdatad Ciao. Giuseppe
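A minimal outputs.conf sketch for such a fork might look like the following (group names, hosts, and certificate paths are placeholders, not from the original post; verify the TLS setting names against your Splunk version):

```
# outputs.conf on the UF -- hypothetical group and host names
[tcpout]
defaultGroup = splunkcloud, onprem_dr

[tcpout:splunkcloud]
# server and TLS settings from the splunkclouduf credentials app stay as-is
server = inputs.example.splunkcloud.com:9997

[tcpout:onprem_dr]
server = dr-indexer.example.local:9997
# on-prem TLS material, if the DR receiver requires it
sslRootCAPath = /opt/splunkforwarder/etc/auth/onprem_ca.pem
clientCert = /opt/splunkforwarder/etc/auth/onprem_client.pem
```

As noted above, both destinations index the data, so every event counts against the license twice.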
We have currently configured the logs to be sent to Splunk Cloud, and we are also setting up a DR on-prem server. The question now is how to configure the UF to send to both the cloud and the DR (on-prem). No issues with the cloud environment. Is it possible to send to both? On the UF the certificate is for Splunk Cloud, and I am not sure how to add our on-prem certificate.
Hi @Thomas2 , first of all, don't use the search command after the main search. Then, anyway, you can use the rex command to extract the first part of the field, or eval to take the first 20 chars:

index=cloud_servers host="*server_name-h-nk01-*" | rex field=host "^(?<host>[^\.]+)" | stats dc(host) AS count

Ciao. Giuseppe
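An alternative sketch with eval and substr, if a fixed-length cut is preferred (the 20-character limit is taken from the question; adjust as needed):

```
index=cloud_servers host="*server_name-h-nk01-*"
| eval host_short=substr(host, 1, 20)
| stats dc(host_short) AS count
```

Splitting on the first "." (as in the rex above) is usually safer than a fixed length, since hostnames may vary in length.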
Hi,

Our app is built upon Splunk Add-on Builder. Builder's code is responsible for most of the input and output of our app. We modified the pulling module to reach out to our server to pull data. Builder then sends the pulled data into the Splunk engine to process.

The Splunk cloud store has updated its inspection criteria a few times in past years. Almost every time, Builder needed an update to comply with the new criteria. We were told to import our app into the new Builder and export it again, to take in Builder's updates. Until last month.

We have received another notice from the Splunk store, saying our app no longer complies with the updated criteria and will be removed from the Splunk store by the 18th of this month. Only this time, Splunk Add-on Builder no longer does its part to comply with the same rules in the same store.

Here is the cause:

check_python_sdk_version
If your app relies on the Splunk SDK for Python, we require you to use an acceptably-recent version in order to avoid compatibility issues between your app and the Splunk Platform or the Python language runtime used to execute your app's code. Please update your Splunk SDK for Python version to at least 2.0.2. More information is available on this project's GitHub page: https://github.com/splunk/splunk-sdk-python
Versions affected by this check are: 1.6.1

We would like to seek some information:
1. Why can Builder violate the Splunk cloud criteria but stay on the Splunk store?
2. If Builder does follow the new rules like everyone else, when will it be updated to a new version that passes the inspection test?
3. If Builder does NOT update, are there any instructions for apps built upon Builder to fix Builder's issue and still be allowed to be hosted on the Splunk store?

Thanks for any feedback and information.

Lixin
Hi @richgalloway, thank you for your reply. Apologies, I should have been a bit more descriptive. I am trying to implement a SEDCMD in transforms.conf to reduce a single raw event's size, specifically by removing elements that will never be used while keeping the event intact for compliance purposes. My intent is not to extract fields but to ensure that only the necessary elements remain in the raw event. A single regex that can clean up the event by removing unused parts while leaving the required fields would be ideal. Thanks in advance for your guidance! Best regards, D Alex
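For reference, SEDCMD is a props.conf setting (keyed by sourcetype, source, or host), not transforms.conf. A sketch that strips the large TaskContentNew element from an event like the sample might look like this (the sourcetype name is a placeholder, and the regex is an example target only):

```
# props.conf -- hypothetical sourcetype name
[XmlWinEventLog]
SEDCMD-trim_taskcontent = s/(?s)<Data Name='TaskContentNew'>.*?<\/Data>//g
```

Each SEDCMD-<class> runs one sed expression against the raw event at index time, so several unwanted elements can be removed with several classes.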
Trying to get a count of servers sending logs to an index "cloud_servers". Running this command to get the count:

index=cloud_servers | search host="*server_name-h-nk01-*" | dedup host | stats count

The problem is, some servers are counted twice because the server names appear with and without an FQDN, depending on the type of log being sent. So dedup doesn't work, since technically each is a unique host. Example of the same server appearing with two host names:

host buffnybd1-h-nk01-555
host buffnybd1-h-nk01-555.nyny.os1.com

Is there a way to count each server using just the first 20 or so characters, so it will ignore the FQDN? Thank you
It would help to know what you've tried already and how those efforts failed to meet expectations. Are you looking for a single regex or one for each field? Do you plan to extract the fields at search time or index time?  If search time, have you tried using spath to parse the event? rex mode=sed does not extract fields so it cannot be used to validate expressions.
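If search-time extraction is acceptable, a spath sketch along these lines may work (the paths assume the Event/System layout visible in the raw event and are untested against the poster's data):

```
| spath input=_raw path=Event.System.EventID output=EventID
| spath input=_raw path=Event.System.Computer output=Computer
| spath input=_raw path=Event.System.TimeCreated{@SystemTime} output=TimeCreated
```

The Data Name='TaskName' value is harder, since EventData holds a list of Data elements distinguished only by their Name attribute, so it usually needs an extra filtering or mvzip/mvfind step.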
What do you mean? Performance is limited by hardware.
Hello Community,

I need a regex that can extract the following fields only from event 4702:

1. <EventID></EventID>
2. <TimeCreated SystemTime='2024-12-05T14:59:44.9923272Z'/>
3. <Computer>Host</Computer>
4. <Data Name='TaskName'>\Microsoft\Windows\SoftwareProtectionPlatform\SvcRestartTask</Data>

from the following raw event:

<Event xmlns='http://schemas.microsoft.com/win/2004/08/events/event'><System><Provider Name='Microsoft-Windows-Security-Auditing' Guid='{54849625-5478-4994-a5ba-3e3b0328c30d}'/><EventID>4702</EventID><Version>1</Version><Level>0</Level><Task>12804</Task><Opcode>0</Opcode><Keywords>0x8020000000000000</Keywords><TimeCreated SystemTime='2024-12-05T14:59:44.9923272Z'/><EventRecordID>2470365</EventRecordID><Correlation ActivityID='{625186de-46eb-0000-1689-5162eb46db01}'/><Execution ProcessID='1408' ThreadID='1600'/><Channel>Security</Channel><Computer>Host</Computer><Security/></System><EventData><Data Name='SubjectUserSid'>S-1-5-20</Data><Data Name='SubjectUserName'> Host $</Data><Data Name='SubjectDomainName'> Host </Data><Data Name='SubjectLogonId'>0x3e4</Data><Data Name='TaskName'>\Microsoft\Windows\SoftwareProtectionPlatform\SvcRestartTask</Data><Data Name='TaskContentNew'>&lt;?xml version="1.0" encoding="UTF-16"?&gt; &lt;Task version="1.6" xmlns="http://schemas.microsoft.com/windows/2004/02/mit/task"&gt; &lt;RegistrationInfo&gt; &lt;Source&gt;$(@%systemroot%\system32\sppc.dll,-200)&lt;/Source&gt; &lt;Author&gt;$(@%systemroot%\system32\sppc.dll,-200)&lt;/Author&gt; &lt;Version&gt;1.0&lt;/Version&gt; &lt;Description&gt;$(@%systemroot%\system32\sppc.dll,-201)&lt;/Description&gt; &lt;URI&gt;\Microsoft\Windows\SoftwareProtectionPlatform\SvcRestartTask&lt;/URI&gt; &lt;SecurityDescriptor&gt;D:P(A;;FA;;;SY)(A;;FA;;;BA)(A;;FA;;;S-1-5-80-123231216-2592883651-3715271367-3753151631-4175906628)(A;;FR;;;S-1-5-87-2912274048-3994893941-1669128114-1310430903-1263774323)&lt;/SecurityDescriptor&gt; &lt;/RegistrationInfo&gt; &lt;Triggers&gt; 
&lt;CalendarTrigger&gt; &lt;StartBoundary&gt;2024-12-10T07:54:44Z&lt;/StartBoundary&gt; &lt;Enabled&gt;true&lt;/Enabled&gt; &lt;ScheduleByDay&gt; &lt;DaysInterval&gt;1&lt;/DaysInterval&gt; &lt;/ScheduleByDay&gt; &lt;/CalendarTrigger&gt; &lt;/Triggers&gt; &lt;Principals&gt; &lt;Principal id="NetworkService"&gt; &lt;UserId&gt;S-1-5-20&lt;/UserId&gt; &lt;RunLevel&gt;LeastPrivilege&lt;/RunLevel&gt; &lt;/Principal&gt; &lt;/Principals&gt; &lt;Settings&gt; &lt;MultipleInstancesPolicy&gt;IgnoreNew&lt;/MultipleInstancesPolicy&gt; &lt;DisallowStartIfOnBatteries&gt;true&lt;/DisallowStartIfOnBatteries&gt; &lt;StopIfGoingOnBatteries&gt;true&lt;/StopIfGoingOnBatteries&gt; &lt;AllowHardTerminate&gt;false&lt;/AllowHardTerminate&gt; &lt;StartWhenAvailable&gt;true&lt;/StartWhenAvailable&gt; &lt;RunOnlyIfNetworkAvailable&gt;false&lt;/RunOnlyIfNetworkAvailable&gt; &lt;IdleSettings&gt; &lt;StopOnIdleEnd&gt;true&lt;/StopOnIdleEnd&gt; &lt;RestartOnIdle&gt;false&lt;/RestartOnIdle&gt; &lt;/IdleSettings&gt; &lt;AllowStartOnDemand&gt;true&lt;/AllowStartOnDemand&gt; &lt;Enabled&gt;true&lt;/Enabled&gt; &lt;Hidden&gt;true&lt;/Hidden&gt; &lt;RunOnlyIfIdle&gt;false&lt;/RunOnlyIfIdle&gt; &lt;DisallowStartOnRemoteAppSession&gt;false&lt;/DisallowStartOnRemoteAppSession&gt; &lt;UseUnifiedSchedulingEngine&gt;true&lt;/UseUnifiedSchedulingEngine&gt; &lt;WakeToRun&gt;false&lt;/WakeToRun&gt; &lt;ExecutionTimeLimit&gt;PT0S&lt;/ExecutionTimeLimit&gt; &lt;Priority&gt;7&lt;/Priority&gt; &lt;RestartOnFailure&gt; &lt;Interval&gt;PT1M&lt;/Interval&gt; &lt;Count&gt;3&lt;/Count&gt; &lt;/RestartOnFailure&gt; &lt;/Settings&gt; &lt;Actions Context="NetworkService"&gt; &lt;ComHandler&gt; &lt;ClassId&gt;{B1AEBB5D-EAD9-4476-B375-9C3ED9F32AFC}&lt;/ClassId&gt; &lt;Data&gt;&lt;![CDATA[timer]]&gt;&lt;/Data&gt; &lt;/ComHandler&gt; &lt;/Actions&gt; &lt;/Task&gt;</Data><Data Name='ClientProcessStartKey'>26177172834095606</Data><Data Name='ClientProcessId'>2408</Data><Data Name='ParentProcessId'>1368</Data><Data 
Name='RpcCallClientLocality'>0</Data><Data Name='FQDN'>Host</Data></EventData></Event>

I need to be able to validate it via | makeresults | rex mode=sed..... Thanks in advance
It's not necessary to use the install app command to install an app.  Simply untar the .spl file into $SPLUNK_HOME/etc/apps and restart the forwarder.  No credentials needed (except to access the forwarder). Even better, use the Splunky method.  Put the app in your Deployment Server's $SPLUNK_HOME/etc/deployment-apps directory and each forwarder will download and install it automatically.
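The untar route can be sketched as follows (a .spl file is just a gzipped tarball; the app path is a placeholder, and paths assume a default *nix install):

```
# on the forwarder -- /tmp/myapp.spl is a hypothetical path
tar -xzf /tmp/myapp.spl -C $SPLUNK_HOME/etc/apps
$SPLUNK_HOME/bin/splunk restart
```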
Hi Team, For some VMs, we need to install and configure the Splunk Universal Forwarder using Terraform, since backend access is unavailable and all actions must be automated. Typically, we follow the Splunk documentation to install the credentials package on client machines: [Install the forwarder credentials on individual forwarders in *nix](https://docs.splunk.com/Documentation/Forwarder/9.1.0/Forwarder/ConfigSCUFCredentials#Install_the_forwarder_credentials_on_individual_forwarders_in_.2Anix). We use the following command to install the credentials package:

$SPLUNK_HOME/bin/splunk install app /tmp/splunkclouduf.spl

After executing this command, it prompts for a username and password. Once the credentials are provided, the package is installed, and after restarting the Splunk services the internal logs from the client begin flowing (provided the relevant ports are open).

Current challenge: since we are automating the deployment of the credentials package app via Terraform, we require a single command that:
1. Installs the credentials app.
2. Includes the username and password directly in the command.
3. Automatically restarts the Splunk services after installation.

This will enable us to deploy the package via Terraform without manual intervention. As this involves a Linux machine, we need a command that can be executed in this environment. Could you kindly assist us in crafting this single automated command?
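Assuming the splunk CLI's global -auth flag is accepted by install app on your version (it works with most CLI commands; verify first), a single chained command might look like this. The credentials shown are placeholders and putting them inline is a security trade-off; prefer injecting them from Terraform variables:

```
# hypothetical credentials; && runs the restart only if the install succeeds
$SPLUNK_HOME/bin/splunk install app /tmp/splunkclouduf.spl -auth 'admin:changeme' && \
$SPLUNK_HOME/bin/splunk restart
```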
Thanks, but how does the performance compare between EC2 and container deployment?
What business problem are you trying to solve? There has lately been an "outbreak" of ideas along the lines of "I want to send my events to two destinations, but not as a precise copy". Sending the same events to two different indexers (or indexer clusters) induces extra license consumption, but it also blocks one output when the other one is blocked, so it makes your environment sensitive to any problems. So, back to the original question - what problem are you trying to solve?
A golden shovel award goes to you, sir/madam. This is a thread from 11 years ago. And the answer to the original question is probably "someone made a typo and mistakenly multiplied by 364 instead of 365".
The lookup table 'ucd_count_chars_lookup' does not exist or is not available. [...] [ucd_category_lookup] Are you sure those shouldn't match?
ES depends heavily on the kvstore. If - for some reason - your installation didn't complete correctly and your kvstore isn't fully configured, it won't run properly. If your kvstore is not running, ES (and much of "general Splunk") won't run either. But as far as I remember, you might get this kind of problem as well if your user simply doesn't have the capabilities and permissions required to run ES.
There is nothing special about AWS as such. You need to deploy your machines in AWS as described in "normal" Linux Installation Manual or using docker containers.
Hi Ryan, Support provided a query for which we are unable to create metrics. We have tweaked it as below:

SELECT (toInt(tokenExpirationDateTime - now()) / (24*60*60*1000)) AS daysleft FROM intune_dep WHERE tokenName = "Wipro-EY-Intune" AND (toInt(tokenExpirationDateTime - now()) / (24*60*60*1000)) >= 30

When we create the metric, we get the error:

Following fields (toInt(tokenExpirationDateTime - now()) / (24*60*60*1000)) AS daysleft in the select clause are not supported.

Could you help with the query?
It sounds like you only have two fields? The first one will be used for the x-axis and the second will be a series (of that name) using the (numeric) values for the height of the columns. If this is not what you have, please share your search in full (obfuscated as necessary to obscure sensitive information).