All Topics


I have a string in this form:

sub = 13433 cf-ipcountry = US mail = abc.test@gmail.com ct-remote-user = testaccount elevatedsession = N iss = www.google.com user-agent = Apache-HttpClient/4.5.8 (Java/1.8.0_322)

I want to extract the iss field's value. I tried this, but it did not work:

| rex max_match=0 field=_raw "\/sub \/user-agent \/(?<temp>.*)"
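A minimal sketch of one possible extraction, assuming iss is always followed by a single space-delimited value:

| rex field=_raw "iss = (?<iss>\S+)"

For the sample event above this would capture www.google.com, stopping at the whitespace before the next key.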
My log is like this:

Time: 3/23/22 11:00:00.000 AM
Event: Application 'AAA' is running Application 'BBB' is stopped Database 'CCC' is running Database 'DDD' is running

Time: 3/23/22 11:10:00.000 AM
Event: Application 'AAA' is running Application 'BBB' is running Database 'CCC' is stopped Database 'DDD' is running

I want to extract a table like this:

Time                       Server       Host   Status
3/23/22 11:00:00.000 AM    Application  AAA    running
3/23/22 11:00:00.000 AM    Application  BBB    stopped
3/23/22 11:00:00.000 AM    Database     CCC    running
3/23/22 11:00:00.000 AM    Database     DDD    running
3/23/22 11:10:00.000 AM    Application  AAA    running
3/23/22 11:10:00.000 AM    Application  BBB    running
3/23/22 11:10:00.000 AM    Database     CCC    stopped
3/23/22 11:10:00.000 AM    Database     DDD    running

How can I do this? Does anyone have an idea?
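A minimal sketch, assuming each event holds several lines of the form Type 'Name' is status, and that Server, Host, and Status map to the type keyword, the quoted name, and the final word:

| rex max_match=0 "(?<entry>(?:Application|Database) '[^']+' is \w+)"
| mvexpand entry
| rex field=entry "(?<Server>\w+) '(?<Host>[^']+)' is (?<Status>\w+)"
| table _time Server Host Status

The first rex collects every matching line into a multivalue field, mvexpand turns each value into its own result row, and the second rex splits out the three columns.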
Dear Splunk Experts and Community, We are interested in receiving notifications as often as possible when an event is received into Splunk. We have currently set up a saved search with a webhook action that sends us alerts every few minutes, which is working OK for us. However, as we are new to this system, we aren't sure whether there is a better way to implement a feed from Splunk to our API. Any additional suggestions? Thanks!
Hello. As you can see, I use a table with a one-hour bin span, and I need a drilldown on every row in order to display more details in another dashboard. How can I do this, please?
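A Simple XML sketch of a row drilldown, assuming a target dashboard named details_dashboard that accepts a form.time token; both names are assumptions:

<table>
  <search>...</search>
  <drilldown>
    <link target="_blank">/app/search/details_dashboard?form.time=$row._time$</link>
  </drilldown>
</table>

$row._time$ passes the clicked row's time value into the target dashboard's token.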
Hello Team, What capabilities are required for enabling and disabling maintenance mode? Based on the following link https://community.splunk.com/t5/Security/Capabilities-needed-for-a-service-account-to-enable-Maintenance/m-p/345744 , I granted the edit_indexer_cluster capability, but had no breakthrough.
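For reference, the operations in question are the maintenance-mode toggles on the cluster manager; their CLI forms are:

splunk enable maintenance-mode
splunk disable maintenance-mode

Whether edit_indexer_cluster alone covers these may vary by Splunk version.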
I have a customer (IHAC) using SH/IDX on AWS who wants to enable encryption on the volume (SSD disk) where Splunk is installed, while the system is running, but the customer wonders whether it will impact the current system. The volume includes indexed data and installed files. Are there any side effects of, or conditions for, enabling encryption of the disk volume while in running mode? Thank you.
Hello. I would like to know whether there is any command, in the dashboard code or in the address bar (Chrome URL bar), to automatically put a dashboard in full screen. I saw a command for a shortcut in Chrome, but it does not show me the dashboard in full screen; I would like to avoid hitting the button every time I open the dashboard. Thanks.
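A sketch of the URL-parameter approach for Simple XML dashboards; these parameters hide the Splunk chrome rather than invoking the browser's own full-screen mode, and the app/dashboard names here are placeholders:

https://<host>:8000/en-US/app/search/my_dashboard?hideSplunkBar=true&hideAppBar=true&hideEdit=true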
Fields cannot be retrieved after field extraction. If the fields are delimited with a backtick (`), no search works after the field extraction. However, if I delimit with a pipe (|) in the same way, extraction and search work normally. Is there a problem?

_raw: field1`field2`field3`field4`field5`

Example regex:
(?P<field_name>[^\`]*)`(?P<field_name2>[^\`]*)`(?P<field_name3>[^\`]*)`(?P<field_name4>[^\`]*)`(?P<field_name5>[^\`]*)`
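A minimal transforms.conf sketch using delimiter-based extraction instead of a regex, assuming the five field names below; DELIMS treats the backtick purely as a separator, which sidesteps any quoting issues in the regex:

[backtick_fields]
DELIMS = "`"
FIELDS = field1, field2, field3, field4, field5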
Search query:

index=winevent source="WinEventLog:Security" EventCode="4624" | stats count by user Source_Network_Address

Output utilizing the Sankey visualization:

User A:
  Target 10.20.30.40, Count 26
10.20.30.40:
  Source User A, Count 26
  Source User B, Count 30
User B:
  Target 10.20.30.40, Count 30
  Target 10.20.30.50, Count 10
10.20.30.50:
  Source User B, Count 10

How do I identify only users that have connected to more than one box? I attempted to use the where command, but I may be writing the syntax incorrectly, since it returns no results.
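A minimal sketch using distinct count, assuming each Source_Network_Address represents a distinct box:

index=winevent source="WinEventLog:Security" EventCode="4624"
| stats dc(Source_Network_Address) as box_count values(Source_Network_Address) as boxes by user
| where box_count > 1

where operates on the fields produced by the preceding stats, so box_count must be created before it can be filtered on.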
I'm trying to set up the Splunk IT Essentials Work app in a single-instance environment, and when I open the app it says it is unable to retrieve subscription data. None of the documentation mentions anything related to a subscription. I was under the impression that I just needed to download and install this app; what subscription is it looking for?
Hi, I am trying to create a simple app to onboard data from the THOR application. First I deployed the UF on my Windows 10 machine, and I see the client in the forwarder management console (phone home: 1 second ago). Then I created a new app (barebones) with a simple inputs.conf in the local folder, moved the folder from apps to deployment-apps, created a server class containing only my computer, and assigned the new app to it. The final result is that the application is never deployed; the UI shows a deployment error, but no other information is shown. Another odd issue is that I can't find the UF service on my workstation: I have the path, and I can open a CMD prompt and restart Splunk, but for some reason there is no visible service in the Services console. Could the two issues be related? Could someone tell me what the reason might be? Thanks a lot.
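A minimal serverclass.conf sketch for comparison, assuming an app folder named thor_inputs under deployment-apps and a client hostname of MY-W10-HOST; both names are placeholders:

[serverClass:thor_clients]
whitelist.0 = MY-W10-HOST

[serverClass:thor_clients:app:thor_inputs]
restartSplunkd = true

If the whitelist entry does not match the client name shown in forwarder management (hostname, IP, or clientName), the app is silently never deployed.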
The Workday add-on 1.1.0 shows a blank page or stays on loading on a Splunk HF running 8.2.2. I tried restarting several times; see the attached screenshot. Did anyone face a similar issue?
I have a KV store that I am writing the results of a search to. One field in the KV store, ASC_IDX, is defined as a number. If I write a large number (like 1647976533000037808725) to this field with outputlookup, I get the following number in the KV store: -9223372036854775808. This same number appears for every large number I try to put into the KV store. Is there a limit to the number of digits for a number field in a KV store?
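For context: -9223372036854775808 is the minimum value of a signed 64-bit integer, and 1647976533000037808725 exceeds the signed 64-bit maximum of 9223372036854775807, so the symptom looks like 64-bit overflow. A minimal workaround sketch, assuming the value can acceptably be stored as a string (the collection's field type would need to be string as well; my_kvstore_lookup is a placeholder):

... | eval ASC_IDX = tostring(ASC_IDX) | outputlookup my_kvstore_lookup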
I have a Splunk indexer cluster with a single search head. I'm taking data in via HEC directly to the cluster. The events themselves are not JSON and look like this:

host_name audit[86]: key1=val1, key2=val2, key3=val3, key4=[ subkey1: subval1 subkey2: subval2 subkey3: sub val3 subkey4: subval4]

Note that the value of "subkey3" does have a space; it is intentional. Splunk by default grabs all the =-delimited fields, so key1, key2, key3, and key4 are extracted nicely. But it won't grab the subkeys, as they are :-delimited and spaces are apparently allowed. I have a regex that will parse them for me:

(?<key>\w+):\s+(?<value>.*?)(?=\s+\w+:|]$)

It leverages a lookahead for the next field. I tried putting this in props/transforms as follows:

props.conf:
[my_sourcetype]
REPORT-key4 = parse_key4

transforms.conf:
[parse_key4]
REGEX = (\w+):\s+(.*?)(?=\s+\w+:|]$)
FORMAT = $1::$2
REPEAT_MATCH = true

I deploy both files in their own app to the cluster master (and then apply the cluster bundle) and to the search head (and either use debug/refresh or restart Splunk). But it is not extracting the fields. Any ideas on why it isn't doing the extraction? Note: the goal is a search-time extraction; I don't need or want it at index time.
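A quick way to sanity-check the regex at search time, independent of props/transforms (max_match=0 plays the role of REPEAT_MATCH; index=my_index is a placeholder):

index=my_index sourcetype=my_sourcetype
| rex max_match=0 "(?<subkey>\w+):\s+(?<subvalue>.*?)(?=\s+\w+:|]$)"

If this extracts the pairs but the REPORT version does not, that points at the app not being picked up on the search head rather than at the regex.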
We have an on-prem Splunk Enterprise instance using a deployment server, indexers, search head, etc. The environment sits on Windows 2019 and Splunk is version 8.2.3. We have recently set up an HTTP Event Collector token for HEC collection. It is working correctly from curl, both for the health check and for absorbing manual calls from curl and Postman, such as:

curl -k http://deploymentserver.local:8088/services/collector/raw -H "Authorization:Splunk 1920123a-f2b1-4c46-b848-6fba456789fe7" -d '{"Sourcetype":"log4j","event":"test"}'

These particular calls are ingested and searchable within Splunk. We have opened a ticket with Splunk, but it has been less than helpful, as they just direct us to the token setup, which, as noted above, is working. The issue is that it is not ingesting any of our actual application logs. There are no errors, and it has the correct token; it is as if the data is not getting there or is being rejected. We've changed the sourcetype so there are no criteria, and also set it to json, with no difference either way. So we suspect the formatting of our JSON is incorrect. Are there any good samples out there? This is the log4j2 configuration that sends the events:

<?xml version="1.0" encoding="utf-8"?>
<Configuration status="INFO" name="cloudhub" packages="com.appforsplunk.ch.logging.appender, com.splunk.logging, org.apache.logging.log4j">
  <Appenders>
    <Console name="Console" target="SYSTEM_OUT">
      <PatternLayout pattern="%-5p %d [%t] [event: %X{correlationId}] %c: %m%n" />
    </Console>
    <Console name="ConsoleLogUtil" target="SYSTEM_OUT">
      <PatternLayout pattern="%m%n" />
    </Console>
    <RollingFile name="file" fileName="${sys:splunkapp.home}${sys:file.separator}logs${sys:file.separator}splunkapp-custom-logging-api.log" filePattern="${sys:splunkapp.home}${sys:file.separator}logs${sys:file.separator}splunkapp-custom-logging-api-%i.log">
      <PatternLayout pattern="%-5p %d [%t] [processor: %X{processorPath}; event: %X{correlationId}] %c: %m%n" />
      <SizeBasedTriggeringPolicy size="10 MB" />
      <DefaultRolloverStrategy max="10"/>
    </RollingFile>
    <SplunkHttp name="splunk" url="http://deploymentserver.local:8088/services/collector/raw" token="Splunk 50817720-52e2-4481-a2cf-eb519716354c" disableCertificateValidation="true">
      <PatternLayout pattern="%-5p %d [%t] [event: %X{correlationId}] %c: %m%n"/>
    </SplunkHttp>
    <Log4J2CloudhubLogAppender name="CLOUDHUB" addressProvider="com.appforsplunk.ch.logging.DefaultAggregatorAddressProvider" applicationContext="com.appforsplunk.ch.logging.DefaultApplicationContext" appendRetryIntervalMs="${sys:logging.appendRetryInterval}" appendMaxAttempts="${sys:logging.appendMaxAttempts}" batchSendIntervalMs="${sys:logging.batchSendInterval}" batchMaxRecords="${sys:logging.batchMaxRecords}" memBufferMaxSize="${sys:logging.memBufferMaxSize}" journalMaxWriteBatchSize="${sys:logging.journalMaxBatchSize}" journalMaxFileSize="${sys:logging.journalMaxFileSize}" clientMaxPacketSize="${sys:logging.clientMaxPacketSize}" clientConnectTimeoutMs="${sys:logging.clientConnectTimeout}" clientSocketTimeoutMs="${sys:logging.clientSocketTimeout}" serverAddressPollIntervalMs="${sys:logging.serverAddressPollInterval}" serverHeartbeatSendIntervalMs="${sys:logging.serverHeartbeatSendIntervalMs}" statisticsPrintIntervalMs="${sys:logging.statisticsPrintIntervalMs}">
      <PatternLayout pattern="[%d{MM-dd HH:mm:ss}] %-5p %c{1} [%t]: %m%n" />
    </Log4J2CloudhubLogAppender>
  </Appenders>
  <Loggers>
    <AsyncLogger name="org.splunkapp.service.http" level="WARN"/>
    <AsyncLogger name="org.splunkapp.extension.http" level="WARN"/>
    <!-- splunkapp logger -->
    <AsyncLogger name="org.splunkapp.runtime.core.internal.processor.LoggerMessageProcessor" level="INFO"/>
    <AsyncRoot level="INFO">
      <AppenderRef ref="splunk" />
      <AppenderRef ref="CLOUDHUB" />
      <AppenderRef ref="Console"/>
      <AppenderRef ref="file" />
    </AsyncRoot>
  </Loggers>
</Configuration>

Thanks in advance.
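For comparison, a sketch of what a payload for the /services/collector/event endpoint looks like (the /raw endpoint, by contrast, treats the entire body as raw event text rather than as this envelope); values here are placeholders:

curl -k http://deploymentserver.local:8088/services/collector/event -H "Authorization: Splunk <token-guid>" -d '{"event": "test message", "sourcetype": "log4j", "source": "my-app", "host": "my-host"}'

HEC's documented metadata keys are lowercase (event, sourcetype, source, host, index, time); "Sourcetype" with a capital S, as in the test call above, is not one of them.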
I am looking to create a search that will send an alert when the feed has been delayed for 70 minutes or more.
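A minimal sketch, assuming "the feed" can be identified by an index (index=my_feed is a placeholder) and that a delay means no new events have arrived:

| tstats latest(_time) as last_event where index=my_feed
| eval minutes_delayed = (now() - last_event) / 60
| where minutes_delayed >= 70

Scheduled as an alert that triggers when the number of results is greater than zero, this fires only when the feed is at least 70 minutes behind.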
Hi Folks, I'm using a query like the one below, but since the subsearch returns more than 10K events, I'm not getting the expected result. Can someone advise me whether there is an alternate way to replace the subsearch and achieve the expected result?

index="foo" sourcetype="xyz" user!="abc" method=POST (url="*search*aspx*" AND code!=302 AND code!=304 AND code!=401 AND code!=403 AND code!=0) [search index="foo" method_name=pqr message="*Response Time for method pqr*" | fields uniqid]
| eval hour=strftime(_time,"%H")
| where hour >= 7 AND hour <= 19
| timechart span=1d count(eval(time_took)) as Total, count(eval(time_took<2000)) as Success, count(eval(time_took>2000)) as misses
| sort - _time

Thanks in advance for the help.
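A sketch of one subsearch-free alternative, assuming uniqid is present in both event sets, so the two searches can be combined with OR and correlated per uniqid; the final where keeps only the original POST events:

index="foo" ((sourcetype="xyz" user!="abc" method=POST url="*search*aspx*" code!=302 code!=304 code!=401 code!=403 code!=0) OR (method_name=pqr message="*Response Time for method pqr*"))
| eventstats count(eval(method_name=="pqr")) as pqr_count by uniqid
| where pqr_count > 0 AND method=="POST"
| ...

eventstats has no 10K subsearch cap, though it does hold results in memory, so very large result sets may need a tstats- or lookup-based approach instead.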
I have some API response logs separated by pipes. There is already a field extraction for the API response time; the field value is something like "100 ms". I want to create a table of windowed counts of the API response time. The final table I want to create is something like:

responseTime    count
5 ms            {count where responseTime <= 5 ms}
10 ms           {count where responseTime <= 10 ms but > 5 ms}

I can create a simple count table with base search | stats count by responseTime, which results in:

responseTime    count
1 ms            100
2 ms            30

How can I create these windowed stats?
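A minimal sketch, assuming responseTime always has the form "<number> ms": strip the unit to get a number, bucket it, then count per bucket:

base search
| eval ms = tonumber(replace(responseTime, "\s*ms$", ""))
| bin ms span=5
| stats count by ms

bin span=5 produces uniform ranges such as 0-5 and 5-10; for non-uniform windows, an eval with case(ms<=5, "5 ms", ms<=10, "10 ms", true(), ">10 ms") can replace the bin step.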
Hello all, It would seem a swift migration to the Splunk Add-on for Microsoft Security is highly recommended: "Customers currently utilizing Microsoft 365 Defender Add-on for Splunk are strongly recommended to migrate to this new Splunk supported add-on after reading the migration section of the documentation." I haven't been able to get this app to work with GCC; has anyone else? Does anyone know when that support is coming?
Hi, Can the existing Splunk app(s) be read out with a search? I would like to assign a service to an app via a dropdown in a service request (in another tool), so as a first step I have to read the apps out of Splunk and then integrate them into the other program via an interface. Is this possible, or would it involve too much effort? Thanks
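A minimal sketch using the rest search command, which lists the apps installed on the search head; /services/apps/local is the standard endpoint for this:

| rest /services/apps/local
| table title label version disabled

The same endpoint can also be called directly from the external tool over the management port, e.g. https://<splunk-host>:8089/services/apps/local?output_mode=json, which avoids going through a search at all.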