
All Posts

Hi @Felipe.Windmoller, It seems the community was not able to jump in and help. Did you happen to find a solution or workaround you can share? 
Hi @Osama.Abbas, Thanks for letting me know. I'm still working with the Docs team to see if we can get any information on that Docs page clarified. 
Hello, thanks for the reply. These are security events, such as Windows events and events from perimeter devices. Do we need to go through Elastic to get the data into Splunk, or forward the data from Splunk to Elastic? Is it possible to visualize more data than what is indexed? And if that is not possible, the goal would be to take the events shown in Splunk and view them in Elastic.
I would like to ask about the server.conf and web.conf configuration files. How should they be placed in a clustered environment with three indexers and a standalone cluster master? Thanks for the answers.
Assuming you have CLI access, it's easy to do by editing .conf files.
1. Create the new app directory $SPLUNK_HOME/etc/apps/<new app>/default
2. Edit $SPLUNK_HOME/etc/apps/search/local/*.conf and move your custom stanzas from the search app to the corresponding file in the new app.
3. Create the new app directory $SPLUNK_HOME/etc/apps/<new app>/metadata
4. Edit $SPLUNK_HOME/etc/apps/search/metadata/local.meta and move your custom stanzas from the search app to the default.meta file in the new app.
5. Restart the SH and test your KOs.
6. Package the new app and upload it to Splunk Cloud.
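As a rough illustration of steps 2 and 4 (the app name my_custom_app and the saved search name "My Report" are placeholders, not anything from the original post), the moved stanzas might end up looking like this:

$SPLUNK_HOME/etc/apps/my_custom_app/default/savedsearches.conf

[My Report]
search = index=main sourcetype=access_combined | stats count by status
dispatch.earliest_time = -24h
dispatch.latest_time = now

$SPLUNK_HOME/etc/apps/my_custom_app/metadata/default.meta

[savedsearches/My%20Report]
access = read : [ * ], write : [ admin ]
export = system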
You can get cleaner results by adding a table. | rest /services/search/jobs | search eventSorting=realtime | table label, author, dispatchState, eai:acl.owner, isRealTimeSearch, performance.dispatch.stream.local.duration_secs, runDuration, splunk_server, title
| streamstats count as row | eval group=floor((row - 1) / 6) | sort 0 group S_no | fields - group row
Done. Can you please run the below search in Splunk and confirm this is what you want?
| makeresults | eval data="aaa,1 ccc,3 bbb,2 ddd,4 eee,5 fff,6 ggg,1 iii,3 hhh,2 jjj,4 kkk,5 lll,6 mmm,1 ooo,3 nnn,2 ppp,4 qqq,5 rrr,6" | makemv data delim=" " | mvexpand data | rex field=data "(?<Name>\w+),(?<S_no>\d+)" | streamstats count as row_num | eval GroupNum = floor((row_num - 1) / 6) | sort GroupNum S_no | fields - _time data row_num GroupNum
Output: (see screenshot)
Please accept the solution and hit Karma, if this helps!
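If your events already contain the Name and S_no fields, a rough equivalent without the makeresults scaffolding would be something like the following (the index and sourcetype names are placeholders):
index=your_index sourcetype=your_sourcetype
| streamstats count as row_num
| eval GroupNum = floor((row_num - 1) / 6)
| sort 0 GroupNum S_no
| table Name S_no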
Hello Splunk Experts, let's say I have a table that contains 2 columns as shown below:

Name  S_no
aaa   1
ccc   3
bbb   2
ddd   4
eee   5
fff   6
ggg   1
iii   3
hhh   2
jjj   4
kkk   5
lll   6
mmm   1
ooo   3
nnn   2
ppp   4
qqq   5
rrr   6

Now, I need to sort every 6 rows of the 'S_no' column and populate the table, something like this:

Name  S_no
aaa   1
bbb   2
ccc   3
ddd   4
eee   5
fff   6
ggg   1
hhh   2
iii   3
jjj   4
kkk   5
lll   6
mmm   1
nnn   2
ooo   3
ppp   4
qqq   5
rrr   6

Could you please help me with the query? Much appreciated!
How do I break the KOs out of the Search & Reporting app into a custom app? Can you give an example?
This is something normally handled by Professional Services as part of the migration. You can do it yourself, however.  Break the KOs out of the Search & Reporting app into a custom app.  Then upload the app to Splunk Cloud.
Hi @power12, you could try using the API: https://community.splunk.com/t5/Splunk-Search/How-to-change-sharing-and-permissions-for-a-lookup-table-using/m-p/163257
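As a rough, untested sketch of that approach (the host, credentials, app, and lookup file name below are placeholders), setting a lookup file's permissions through the REST ACL endpoint looks something like:
curl -k -u admin:changeme https://localhost:8089/servicesNS/nobody/search/data/lookup-table-files/mylookup.csv/acl -d owner=nobody -d sharing=app -d perms.read=* -d perms.write=admin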
Hello @asimsk84, welcome! Depending on your retention policy in indexes.conf, old buckets won't be archived when they are frozen and will simply be deleted: https://community.splunk.com/t5/Getting-Data-In/Do-I-need-to-define-coldToFrozenDir-in-indexes-conf-to-move-old/m-p/247162 Docs: https://docs.splunk.com/Documentation/Splunk/9.2.0/Admin/Indexesconf (see "coldToFrozenDir"). You should not delete any files yourself; let Splunk manage them.
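For example, the main retention controls in indexes.conf look roughly like this (the index name and values are illustrative only, not a recommendation):

[your_index]
# buckets are frozen (deleted by default) once their newest event is older than ~90 days
frozenTimePeriodInSecs = 7776000
# cap on total index size; the oldest buckets are frozen first when the cap is reached
maxTotalDataSizeMB = 500000
# only set this if you want frozen buckets archived to disk instead of deleted
# coldToFrozenDir = /opt/splunk_frozen/your_index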
For me, it turned out to be an incorrect FMC IP. After configuring the proper IP, it worked. Splunk 9.2.0.1, eStreamer 5.2.9.
How do I clean up Splunk disk space through a script? I don't have a script. How do I create a script to help clean up Splunk space and delete old files? We are not getting reporting; it's not uploading any logs.
Hi Team, our application is hitting a JVM deadlock and hanging indefinitely after upgrading splunk-library-javalogging from 1.6.1 to 1.11.1. Below are the logback and slf4j versions used:
compile 'ch.qos.logback:logback-classic:1.2.3'
compile 'ch.qos.logback:logback-core:1.2.3'
compile 'org.slf4j:slf4j-api:1.7.36'
compile 'net.logstash.logback:logstash-logback-encoder:6.6'
compile "uk.org.lidalia:sysout-over-slf4j:1.0.2"
Below is the deadlocked thread stack:
default task-340 Name[default task-340]Thread ID[13667] Deadlocked on Lock[ch.qos.logback.core.spi.LogbackLock@31309b21] held by thread [splunk-tcp-appender] Thread ID[209] Thread stack [ ch.qos.logback.core.BasicStatusManager.fireStatusAddEvent(BasicStatusManager.java:85) ch.qos.logback.core.BasicStatusManager.add(BasicStatusManager.java:59) ch.qos.logback.core.spi.ContextAwareBase.addStatus(ContextAwareBase.java:80) ch.qos.logback.core.spi.ContextAwareBase.addInfo(ContextAwareBase.java:85) com.splunk.logging.TcpAppender.append(TcpAppender.java:291) com.splunk.logging.TcpAppender.append(TcpAppender.java:40) ch.qos.logback.core.AppenderBase.doAppend(AppenderBase.java:82) ch.qos.logback.core.spi.AppenderAttachableImpl.appendLoopOnAppenders(AppenderAttachableImpl.java:51) ch.qos.logback.classic.Logger.appendLoopOnAppenders(Logger.java:270) ch.qos.logback.classic.Logger.callAppenders(Logger.java:257) ch.qos.logback.classic.Logger.buildLoggingEventAndAppend(Logger.java:421) ch.qos.logback.classic.Logger.filterAndLog_0_Or3Plus(Logger.java:383) ch.qos.logback.classic.Logger.info(Logger.java:579) uk.org.lidalia.sysoutslf4j.context.LogLevel$3.log(LogLevel.java:62) uk.org.lidalia.sysoutslf4j.context.LoggerAppenderImpl.log(LoggerAppenderImpl.java:81) uk.org.lidalia.sysoutslf4j.context.LoggerAppenderImpl.logOrPrint(LoggerAppenderImpl.java:71) uk.org.lidalia.sysoutslf4j.context.LoggerAppenderImpl.appendAndLog(LoggerAppenderImpl.java:58) uk.org.lidalia.sysoutslf4j.system.SLF4JPrintStreamDelegate.appendAndLog(SLF4JPrintStreamDelegate.java:76) uk.org.lidalia.sysoutslf4j.system.SLF4JPrintStreamDelegate.delegatePrintln(SLF4JPrintStreamDelegate.java:56) uk.org.lidalia.sysoutslf4j.system.SLF4JPrintStreamImpl.println(SLF4JPrintStreamImpl.java:111) com.lehman.elmo.admin.mvc.controller.ELMRestController.getFolder(ELMRestController.java:120) sun.reflect.GeneratedMethodAccessor116.invoke(Unknown Source) sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) java.lang.reflect.Method.invoke(Method.java:498) org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:205) org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:150) org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:117) org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:895) org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:808) org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:87) org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:1067) org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:963) org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1006) 
org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:898) javax.servlet.http.HttpServlet.service(HttpServlet.java:503) org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:883) javax.servlet.http.HttpServlet.service(HttpServlet.java:590) io.undertow.servlet.handlers.ServletHandler.handleRequest(ServletHandler.java:74) io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:129) com.lehman.admin.servlet.EtgAdminServletFilter.doFilter(EtgAdminServletFilter.java:141) io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:67) io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131) org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:201) org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:117) io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:67) io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131) org.springframework.web.filter.HiddenHttpMethodFilter.doFilterInternal(HiddenHttpMethodFilter.java:94) org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:117) io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:67) io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131) com.lehman.elmo.admin.SimpleCORSFilter.doFilter(SimpleCORSFilter.java:22) io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:67) io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131) com.lehman.elmo.admin.ui.filter.ElmoRootFilter.doFilter(ElmoRootFilter.java:94) io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:67) io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131) io.undertow.servlet.handlers.Filte default task-355 Name[default task-355]Thread ID[14086] Deadlocked on Lock[com.splunk.logging.TcpAppender@7a29da17] held by thread [default task-340] Thread ID[13667] Thread stack [ ch.qos.logback.core.AppenderBase.doAppend(AppenderBase.java:63) ch.qos.logback.core.spi.AppenderAttachableImpl.appendLoopOnAppenders(AppenderAttachableImpl.java:51) ch.qos.logback.classic.Logger.appendLoopOnAppenders(Logger.java:270) ch.qos.logback.classic.Logger.callAppenders(Logger.java:257) ch.qos.logback.classic.Logger.buildLoggingEventAndAppend(Logger.java:421) ch.qos.logback.classic.Logger.filterAndLog_0_Or3Plus(Logger.java:383) ch.qos.logback.classic.Logger.log(Logger.java:765) org.apache.logging.slf4j.SLF4JLogger.logMessage(SLF4JLogger.java:234) org.apache.logging.log4j.spi.AbstractLogger.log(AbstractLogger.java:2117) org.apache.logging.log4j.spi.AbstractLogger.tryLogMessage(AbstractLogger.java:2205) org.apache.logging.log4j.spi.AbstractLogger.logMessageTrackRecursion(AbstractLogger.java:2159) org.apache.logging.log4j.spi.AbstractLogger.logMessageSafely(AbstractLogger.java:2142) org.apache.logging.log4j.spi.AbstractLogger.logMessage(AbstractLogger.java:2017) org.apache.logging.log4j.spi.AbstractLogger.logIfEnabled(AbstractLogger.java:1983) org.apache.logging.log4j.spi.AbstractLogger.info(AbstractLogger.java:1320) com.lehman.elmo.admin.ui.filter.ElmoRootFilter.doFilter(ElmoRootFilter.java:90) io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:67) io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131) 
io.undertow.servlet.handlers.FilterHandler.handleRequest(FilterHandler.java:84) io.undertow.servlet.handlers.security.ServletSecurityRoleHandler.handleRequest(ServletSecurityRoleHandler.java:62) io.undertow.servlet.handlers.ServletChain$1.handleRequest(ServletChain.java:68) io.undertow.servlet.handlers.ServletDispatchingHandler.handleRequest(ServletDispatchingHandler.java:36) org.wildfly.extension.undertow.security.SecurityContextAssociationHandler.handleRequest(SecurityContextAssociationHandler.java:78) io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43) io.undertow.servlet.handlers.RedirectDirHandler.handleRequest(RedirectDirHandler.java:68) io.undertow.servlet.handlers.security.SSLInformationAssociationHandler.handleRequest(SSLInformationAssociationHandler.java:117) io.undertow.servlet.handlers.security.ServletAuthenticationCallHandler.handleRequest(ServletAuthenticationCallHandler.java:57) io.undertow.server.handlers.DisableCacheHandler.handleRequest(DisableCacheHandler.java:33) io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43) io.undertow.security.handlers.AuthenticationConstraintHandler.handleRequest(AuthenticationConstraintHandler.java:53) io.undertow.security.handlers.AbstractConfidentialityHandler.handleRequest(AbstractConfidentialityHandler.java:46) io.undertow.servlet.handlers.security.ServletConfidentialityConstraintHandler.handleRequest(ServletConfidentialityConstraintHandler.java:64) io.undertow.servlet.handlers.security.ServletSecurityConstraintHandler.handleRequest(ServletSecurityConstraintHandler.java:60) io.undertow.security.handlers.AuthenticationMechanismsHandler.handleRequest(AuthenticationMechanismsHandler.java:60) io.undertow.servlet.handlers.security.CachedAuthenticatedSessionHandler.handleRequest(CachedAuthenticatedSessionHandler.java:77) io.undertow.security.handlers.NotificationReceiverHandler.handleRequest(NotificationReceiverHandler.java:50) io.undertow.security.handlers.AbstractSecurityContextAssociationHandler.handleRequest(AbstractSecurityContextAssociationHandler.java:43) io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43) org.wildfly.extension.undertow.security.jacc.JACCContextIdHandler.handleRequest(JACCContextIdHandler.java:61) io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43) org.wildfly.extension.undertow.deployment.GlobalRequestControllerHandler.handleRequest(GlobalRequestControllerHandler.java:68) io.undertow.servlet.handlers.SendErrorPageHandler.handleRequest(SendErrorPageHandler.java:52) io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43) io.undertow.server.handlers.MetricsHandler.handleRequest(MetricsHandler.java:64) io.undertow.servlet.core.MetricsChainHandler.handleRequest(MetricsChainHandler.java:59) io.undertow.servlet.handlers.ServletInitialHandler.handleFirstRequest(ServletInitialHandler.java:275) io.undertow.servlet.handlers.ServletInitialHandler.access$100(ServletInitialHandler.java:79) io.undertow.servlet.handlers.ServletInitialHandler$2.call(ServletInitialHandler.java:134) io.undertow.servlet.handlers.ServletInitialHandler$2.call(ServletInitialHandler.java:131) io.undertow.servlet.core.ServletRequestContextThreadSetupAction$1.call(ServletRequestContextThreadSetupAction.java:48) io.undertow.servlet.core.ContextClassLoaderSetupAction$1.call(ContextClassLoaderSetupAction.jav splunk-tcp-appender Name[splunk-tcp-appender]Thread ID[209] Deadlocked on 
Lock[uk.org.lidalia.sysoutslf4j.system.SLF4JPrintStreamImpl@5a3259b] held by thread [default task-340] Thread ID[13667] Thread stack [ uk.org.lidalia.sysoutslf4j.system.SLF4JPrintStreamImpl.print(SLF4JPrintStreamImpl.java:246) ch.qos.logback.core.status.OnPrintStreamStatusListenerBase.print(OnPrintStreamStatusListenerBase.java:52) ch.qos.logback.core.status.OnPrintStreamStatusListenerBase.addStatusEvent(OnPrintStreamStatusListenerBase.java:58) ch.qos.logback.core.BasicStatusManager.fireStatusAddEvent(BasicStatusManager.java:87) ch.qos.logback.core.BasicStatusManager.add(BasicStatusManager.java:59) ch.qos.logback.core.spi.ContextAwareBase.addStatus(ContextAwareBase.java:80) ch.qos.logback.core.spi.ContextAwareBase.addInfo(ContextAwareBase.java:85) com.splunk.logging.TcpAppender.run(TcpAppender.java:130) java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) java.lang.Thread.run(Thread.java:750) ]
Hi Team, can someone help me with what I should do to migrate items from an on-premise SH to Splunk Cloud? Like lookups, ES use cases, alerts created in Search & Reporting, reports, and dashboards? Any step-based resolution will be helpful. Can I do it alone, or would it require on-demand support from Splunk to do this migration?
Does the HEC support allow/deny for specific event types?  We have a LOT of data that we do not want to capture/forward from the cloud.
Request: dynamically start and stop the Java agent. We are running version 20.10.0.31173 on Java 8. Please note, we know we are behind on versions; that is a separate issue.
-------------------
What we have observed when starting Wildfly 26.0.1 is the following: the alert condition for 'keycloak.war failed to deploy' was triggered.
Alert: Keycloak.war failed to deploy
Search String: sourcetype=btierservice "parameter 0 of method repositorySchemaController"
Trigger Time: 07:02:34 -0400 on March 26, 2024
This is an intermittent problem at the moment. The investigation considers several aspects of the environment, and we would like to control one of them pertaining to the start/stop of AppD. So, is there a way to start AppD after the application has been initialized, thus ruling AppD out?
I have tried installing Splunk Enterprise 9.2.0.1 on my Linux host to use as a forwarder tier, but when I configure deployment clients from the Universal Forwarder, no devices show up on the Forwarder Management page. So I tested by uninstalling Splunk Enterprise 9.2.0.1 on Linux and installing Splunk Enterprise 9.1.3 instead; after configuring the deployment clients from the Universal Forwarder, the devices did appear on the Forwarder Management page. So I'm wondering why I can't see the device entries on the Forwarder Management page on Splunk Enterprise 9.2.0.1.