All Posts


How do you break the KOs (knowledge objects) out of the Search & Reporting app into a custom app? Can you give an example?
This is something normally handled by Professional Services as part of the migration. You can do it yourself, however.  Break the KOs out of the Search & Reporting app into a custom app.  Then upload the app to Splunk Cloud.
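For illustration, a minimal sketch of what such a custom app could look like, assuming a hypothetical app name of my_migrated_kos; the knowledge objects are copied into the app's default/ configuration files and metadata before packaging:

# $SPLUNK_HOME/etc/apps/my_migrated_kos/default/app.conf  (app name is hypothetical)
[install]
is_configured = true

[ui]
is_visible = true
label = Migrated Knowledge Objects

[launcher]
author = your_team
description = Knowledge objects moved out of Search & Reporting
version = 1.0.0

# Alongside app.conf, the moved knowledge objects typically live in:
#   default/savedsearches.conf   reports and alerts
#   default/macros.conf          macros
#   default/transforms.conf      lookup definitions
#   lookups/                     CSV lookup files
#   metadata/default.meta        sharing and permissions

The packaged app directory can then be uploaded to Splunk Cloud (it will typically need to pass app vetting first).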
Hi @power12, you could try using the API: https://community.splunk.com/t5/Splunk-Search/How-to-change-sharing-and-permissions-for-a-lookup-table-using/m-p/163257
Hello @asimsk84, welcome! Depending on your retention policy in indexes.conf, old buckets won't be moved to frozen storage; they will simply be deleted: https://community.splunk.com/t5/Getting-Data-In/Do-I-need-to-define-coldToFrozenDir-in-indexes-conf-to-move-old/m-p/247162 Docs: https://docs.splunk.com/Documentation/Splunk/9.2.0/Admin/Indexesconf (see "coldToFrozenDir"). You should not delete any files yourself; let Splunk manage them.
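For illustration, a rough indexes.conf sketch (index name and values are hypothetical; adjust the retention settings to your own requirements rather than deleting files by hand):

# indexes.conf (example values only)
[my_index]
homePath   = $SPLUNK_DB/my_index/db
coldPath   = $SPLUNK_DB/my_index/colddb
thawedPath = $SPLUNK_DB/my_index/thaweddb

# Buckets older than ~90 days are frozen; with no coldToFrozenDir set, frozen means deleted
frozenTimePeriodInSecs = 7776000

# Optional: archive buckets here instead of deleting them when they are frozen
# coldToFrozenDir = /opt/splunk/frozen/my_index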
For me, it turned out to be an incorrect FMC IP. After configuring the proper IP, it worked. Splunk: 9.2.0.1, eStreamer: 5.2.9
How do I clean up Splunk disk space through a script? I don't have a script. How can I create a script to help clean up Splunk space and delete old files? We are not getting reporting; it's not uploading any logs.
Hi Team, Our application is having a jvm deadlock and stopping indefinetly after splunk-library-javalogging upgrade from 1.6.1 to 1.11.1. Below is the logback, slf4j versions used. compile 'ch.qos.logback:logback-classic:1.2.3' compile 'ch.qos.logback:logback-core:1.2.3' compile 'org.slf4j:slf4j-api:1.7.36' compile 'net.logstash.logback:logstash-logback-encoder:6.6' compile "uk.org.lidalia:sysout-over-slf4j:1.0.2" And below is error deadlock thread stack default task-340 Name[default task-340]Thread ID[13667] Deadlocked on Lock[ch.qos.logback.core.spi.LogbackLock@31309b21] held by thread [splunk-tcp-appender] Thread ID[209] Thread stack [ ch.qos.logback.core.BasicStatusManager.fireStatusAddEvent(BasicStatusManager.java:85) ch.qos.logback.core.BasicStatusManager.add(BasicStatusManager.java:59) ch.qos.logback.core.spi.ContextAwareBase.addStatus(ContextAwareBase.java:80) ch.qos.logback.core.spi.ContextAwareBase.addInfo(ContextAwareBase.java:85) com.splunk.logging.TcpAppender.append(TcpAppender.java:291) com.splunk.logging.TcpAppender.append(TcpAppender.java:40) ch.qos.logback.core.AppenderBase.doAppend(AppenderBase.java:82) ch.qos.logback.core.spi.AppenderAttachableImpl.appendLoopOnAppenders(AppenderAttachableImpl.java:51) ch.qos.logback.classic.Logger.appendLoopOnAppenders(Logger.java:270) ch.qos.logback.classic.Logger.callAppenders(Logger.java:257) ch.qos.logback.classic.Logger.buildLoggingEventAndAppend(Logger.java:421) ch.qos.logback.classic.Logger.filterAndLog_0_Or3Plus(Logger.java:383) ch.qos.logback.classic.Logger.info(Logger.java:579) uk.org.lidalia.sysoutslf4j.context.LogLevel$3.log(LogLevel.java:62) uk.org.lidalia.sysoutslf4j.context.LoggerAppenderImpl.log(LoggerAppenderImpl.java:81) uk.org.lidalia.sysoutslf4j.context.LoggerAppenderImpl.logOrPrint(LoggerAppenderImpl.java:71) uk.org.lidalia.sysoutslf4j.context.LoggerAppenderImpl.appendAndLog(LoggerAppenderImpl.java:58) uk.org.lidalia.sysoutslf4j.system.SLF4JPrintStreamDelegate.appendAndLog(SLF4JPrintStreamDelegate.java:76) uk.org.lidalia.sysoutslf4j.system.SLF4JPrintStreamDelegate.delegatePrintln(SLF4JPrintStreamDelegate.java:56) uk.org.lidalia.sysoutslf4j.system.SLF4JPrintStreamImpl.println(SLF4JPrintStreamImpl.java:111) com.lehman.elmo.admin.mvc.controller.ELMRestController.getFolder(ELMRestController.java:120) sun.reflect.GeneratedMethodAccessor116.invoke(Unknown Source) sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) java.lang.reflect.Method.invoke(Method.java:498) org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:205) org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:150) org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:117) org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:895) org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:808) org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:87) org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:1067) org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:963) org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1006) 
org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:898) javax.servlet.http.HttpServlet.service(HttpServlet.java:503) org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:883) javax.servlet.http.HttpServlet.service(HttpServlet.java:590) io.undertow.servlet.handlers.ServletHandler.handleRequest(ServletHandler.java:74) io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:129) com.lehman.admin.servlet.EtgAdminServletFilter.doFilter(EtgAdminServletFilter.java:141) io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:67) io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131) org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:201) org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:117) io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:67) io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131) org.springframework.web.filter.HiddenHttpMethodFilter.doFilterInternal(HiddenHttpMethodFilter.java:94) org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:117) io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:67) io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131) com.lehman.elmo.admin.SimpleCORSFilter.doFilter(SimpleCORSFilter.java:22) io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:67) io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131) com.lehman.elmo.admin.ui.filter.ElmoRootFilter.doFilter(ElmoRootFilter.java:94) io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:67) io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131) io.undertow.servlet.handlers.Filte default task-355 Name[default task-355]Thread ID[14086] Deadlocked on Lock[com.splunk.logging.TcpAppender@7a29da17] held by thread [default task-340] Thread ID[13667] Thread stack [ ch.qos.logback.core.AppenderBase.doAppend(AppenderBase.java:63) ch.qos.logback.core.spi.AppenderAttachableImpl.appendLoopOnAppenders(AppenderAttachableImpl.java:51) ch.qos.logback.classic.Logger.appendLoopOnAppenders(Logger.java:270) ch.qos.logback.classic.Logger.callAppenders(Logger.java:257) ch.qos.logback.classic.Logger.buildLoggingEventAndAppend(Logger.java:421) ch.qos.logback.classic.Logger.filterAndLog_0_Or3Plus(Logger.java:383) ch.qos.logback.classic.Logger.log(Logger.java:765) org.apache.logging.slf4j.SLF4JLogger.logMessage(SLF4JLogger.java:234) org.apache.logging.log4j.spi.AbstractLogger.log(AbstractLogger.java:2117) org.apache.logging.log4j.spi.AbstractLogger.tryLogMessage(AbstractLogger.java:2205) org.apache.logging.log4j.spi.AbstractLogger.logMessageTrackRecursion(AbstractLogger.java:2159) org.apache.logging.log4j.spi.AbstractLogger.logMessageSafely(AbstractLogger.java:2142) org.apache.logging.log4j.spi.AbstractLogger.logMessage(AbstractLogger.java:2017) org.apache.logging.log4j.spi.AbstractLogger.logIfEnabled(AbstractLogger.java:1983) org.apache.logging.log4j.spi.AbstractLogger.info(AbstractLogger.java:1320) com.lehman.elmo.admin.ui.filter.ElmoRootFilter.doFilter(ElmoRootFilter.java:90) io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:67) io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131) 
io.undertow.servlet.handlers.FilterHandler.handleRequest(FilterHandler.java:84) io.undertow.servlet.handlers.security.ServletSecurityRoleHandler.handleRequest(ServletSecurityRoleHandler.java:62) io.undertow.servlet.handlers.ServletChain$1.handleRequest(ServletChain.java:68) io.undertow.servlet.handlers.ServletDispatchingHandler.handleRequest(ServletDispatchingHandler.java:36) org.wildfly.extension.undertow.security.SecurityContextAssociationHandler.handleRequest(SecurityContextAssociationHandler.java:78) io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43) io.undertow.servlet.handlers.RedirectDirHandler.handleRequest(RedirectDirHandler.java:68) io.undertow.servlet.handlers.security.SSLInformationAssociationHandler.handleRequest(SSLInformationAssociationHandler.java:117) io.undertow.servlet.handlers.security.ServletAuthenticationCallHandler.handleRequest(ServletAuthenticationCallHandler.java:57) io.undertow.server.handlers.DisableCacheHandler.handleRequest(DisableCacheHandler.java:33) io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43) io.undertow.security.handlers.AuthenticationConstraintHandler.handleRequest(AuthenticationConstraintHandler.java:53) io.undertow.security.handlers.AbstractConfidentialityHandler.handleRequest(AbstractConfidentialityHandler.java:46) io.undertow.servlet.handlers.security.ServletConfidentialityConstraintHandler.handleRequest(ServletConfidentialityConstraintHandler.java:64) io.undertow.servlet.handlers.security.ServletSecurityConstraintHandler.handleRequest(ServletSecurityConstraintHandler.java:60) io.undertow.security.handlers.AuthenticationMechanismsHandler.handleRequest(AuthenticationMechanismsHandler.java:60) io.undertow.servlet.handlers.security.CachedAuthenticatedSessionHandler.handleRequest(CachedAuthenticatedSessionHandler.java:77) io.undertow.security.handlers.NotificationReceiverHandler.handleRequest(NotificationReceiverHandler.java:50) io.undertow.security.handlers.AbstractSecurityContextAssociationHandler.handleRequest(AbstractSecurityContextAssociationHandler.java:43) io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43) org.wildfly.extension.undertow.security.jacc.JACCContextIdHandler.handleRequest(JACCContextIdHandler.java:61) io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43) org.wildfly.extension.undertow.deployment.GlobalRequestControllerHandler.handleRequest(GlobalRequestControllerHandler.java:68) io.undertow.servlet.handlers.SendErrorPageHandler.handleRequest(SendErrorPageHandler.java:52) io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43) io.undertow.server.handlers.MetricsHandler.handleRequest(MetricsHandler.java:64) io.undertow.servlet.core.MetricsChainHandler.handleRequest(MetricsChainHandler.java:59) io.undertow.servlet.handlers.ServletInitialHandler.handleFirstRequest(ServletInitialHandler.java:275) io.undertow.servlet.handlers.ServletInitialHandler.access$100(ServletInitialHandler.java:79) io.undertow.servlet.handlers.ServletInitialHandler$2.call(ServletInitialHandler.java:134) io.undertow.servlet.handlers.ServletInitialHandler$2.call(ServletInitialHandler.java:131) io.undertow.servlet.core.ServletRequestContextThreadSetupAction$1.call(ServletRequestContextThreadSetupAction.java:48) io.undertow.servlet.core.ContextClassLoaderSetupAction$1.call(ContextClassLoaderSetupAction.jav splunk-tcp-appender Name[splunk-tcp-appender]Thread ID[209] Deadlocked on 
Lock[uk.org.lidalia.sysoutslf4j.system.SLF4JPrintStreamImpl@5a3259b] held by thread [default task-340] Thread ID[13667] Thread stack [ uk.org.lidalia.sysoutslf4j.system.SLF4JPrintStreamImpl.print(SLF4JPrintStreamImpl.java:246) ch.qos.logback.core.status.OnPrintStreamStatusListenerBase.print(OnPrintStreamStatusListenerBase.java:52) ch.qos.logback.core.status.OnPrintStreamStatusListenerBase.addStatusEvent(OnPrintStreamStatusListenerBase.java:58) ch.qos.logback.core.BasicStatusManager.fireStatusAddEvent(BasicStatusManager.java:87) ch.qos.logback.core.BasicStatusManager.add(BasicStatusManager.java:59) ch.qos.logback.core.spi.ContextAwareBase.addStatus(ContextAwareBase.java:80) ch.qos.logback.core.spi.ContextAwareBase.addInfo(ContextAwareBase.java:85) com.splunk.logging.TcpAppender.run(TcpAppender.java:130) java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) java.lang.Thread.run(Thread.java:750) ]
Hi Team, can someone help me with what I should do to migrate items from an on-premise SH to Splunk Cloud? For example: lookups, ES use cases, alerts created in Search & Reporting, reports, and dashboards. Any step-based guidance would be helpful. Can I do it alone, or would it require on-demand support from Splunk to do this migration?
Does the HEC support allow/deny for specific event types?  We have a LOT of data that we do not want to capture/forward from the cloud.
Request: dynamically start and stop the Java agent. We are running version 20.10.0.31173 on Java 8. Please note that we know we are behind on versions; that is a separate issue. ------------------- What we have observed when starting WildFly 26.0.1 is the following: the alert condition for 'keycloak.war failed to deploy' was triggered. Alert: Keycloak.war failed to deploy. Search String: sourcetype=btierservice "parameter 0 of method repositorySchemaController". Trigger Time: 07:02:34 -0400 on March 26, 2024. This is an intermittent problem at this time. The investigation considers several aspects of the environment, and we would like to control the one pertaining to the start/stop of AppD. So, is there a way to start AppD after the application has been initialized, thus ruling AppD out?
I have tried installing Splunk Enterprise 9.2.0.1 on my Linux host to use for the forwarder tier, but when I configure deployment clients from the Universal Forwarder, the device list is not found on the Forwarder Management page. So I tried to test it by uninstalling Splunk Enterprise 9.2.0.1 on Linux and installing Splunk Enterprise 9.1.3; then I configured deployment clients from the Universal Forwarder and found a list of devices on the Forwarder Management page. So I'm wondering why I can't find the device entries on the Forwarder Management page on Splunk Enterprise 9.2.0.1.
It worked! Thank you so much, I've been struggling with that problem for so long! Have a wonderful day.
Thanks, @Ryan.Paredez, for trying the suggested solution. It seems we're still encountering the same error even after referencing the additional documentation. Regarding the issue with accessing the secret file '/opt/appdynamics/cluster-agent/secret-volume/api-user', it appears the file might be missing or inaccessible. Additionally, I attempted the examples provided in the link you shared (https://docs.appdynamics.com/appd/23.x/latest/en/infrastructure-visibility/monitor-kubernetes-with-the-cluster-agent/auto-instrument-applications-with-the-cluster-agent/auto-instrumentation-configuration-examples)
The first version  depends="$t1_token$,$t2_token$" should work (it does for me). Which version of Splunk are you using?
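For reference, a minimal Simple XML sketch (token names match the snippet above; the panel contents are hypothetical) where the panel only appears once both tokens are set:

<dashboard>
  <fieldset>
    <input type="text" token="t1_token">
      <label>Token 1</label>
    </input>
    <input type="text" token="t2_token">
      <label>Token 2</label>
    </input>
  </fieldset>
  <row>
    <!-- The panel stays hidden until both t1_token and t2_token have values -->
    <panel depends="$t1_token$,$t2_token$">
      <table>
        <search>
          <query>index=_internal | head 5</query>
        </search>
      </table>
    </panel>
  </row>
</dashboard>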
Hi, your case() should end with `, 1=1, 100)` and not `1==1,100`.
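For illustration, a minimal run-anywhere sketch (field and values are hypothetical) showing `1=1` as the catch-all default branch of case():

| makeresults
``` hypothetical status value; replace with your own field ```
| eval status="maintenance"
``` 1=1 acts as the default branch, so unmatched values return 100 ```
| eval code=case(status="success", 200, status="failure", 500, 1=1, 100)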
Thank You
^([A-Za-z0-9]\.|[A-Za-z0-9][A-Za-z0-9-]{0,61}[A-Za-z0-9]\.){1,3}[A-Za-z]{2,6}$
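For illustration, a minimal run-anywhere sketch (the field name domain and its value are hypothetical) applying this pattern with match() in SPL:

| makeresults
``` hypothetical sample value; replace with your own field ```
| eval domain="www.example.com"
``` match() returns true when the whole value looks like a valid FQDN per the pattern above ```
| eval is_valid=if(match(domain, "^([A-Za-z0-9]\.|[A-Za-z0-9][A-Za-z0-9-]{0,61}[A-Za-z0-9]\.){1,3}[A-Za-z]{2,6}$"), "yes", "no")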
Hi Everyone, has anyone managed to successfully use the "Akamai Prolexic DNS GTM and SIEM API (Unofficial)" app? I keep getting this error when testing the Prolexic API data input:

Traceback (most recent call last):
  File "C:\Program Files\Splunk\etc\apps\akamai-api-integration\bin\akamai_api_integration\aob_py3\urllib3\connection.py", line 175, in _new_conn
    (self._dns_host, self.port), self.timeout, **extra_kw
  File "C:\Program Files\Splunk\etc\apps\akamai-api-integration\bin\akamai_api_integration\aob_py3\urllib3\util\connection.py", line 72, in create_connection
    for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
  File "C:\Program Files\Splunk\Python-3.7\lib\socket.py", line 752, in getaddrinfo
    for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
socket.gaierror: [Errno 11001] getaddrinfo failed

The official Akamai SIEM app was not designed to ingest the Prolexic API, so unfortunately it is of no use to me. Many thanks.
Thanks so much! This is exactly what I was trying to achieve. Apologies about the wrongly formatted data, but your dummy data is correct. By wildcard, I meant a field name that appears multiple times but can have any number of different subfields (i.e. `wildcard_field.*`); I wasn't sure if this was the correct terminology, but your answer works exactly for this field. Thanks again for your answer. It solves my problem, and I have also learnt a bit more about searching in Splunk, which I really appreciate.
Depending on what you are trying to do with the StartTime, you may need one or both of the evals

eventtype=windows_index_windows eventtype=hostmon_windows host="///" (Name="///*") OR (Name="///*") OR (Name="///*") StartTime="*"
| table Name, StartTime
``` Parse strptime the StartTime into an epoch time value ```
| eval epochStartTime=strptime(StartTime,"%Y%m%d%H%M%S.%6N%z")
``` Format strftime the epoch time into a string ```
| eval stringStartTime=strftime(epochStartTime,"%F %T.%6N")