All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hello all, can someone help me extract the field 'CmdSet' from Cisco ISE accounting logs? Sample string: '[ CmdAV=show CmdArgAV=license CmdArgAV=usage CmdArgAV=<cr> ]'
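Not a definitive answer, just a minimal sketch of one way to pull that block apart with rex, assuming the raw event carries the bracketed segment after a CmdSet= attribute (the index and sourcetype below are placeholders, adjust them to your data):

index=your_ise_index sourcetype=cisco:ise:syslog
| rex field=_raw "CmdSet=\[(?<CmdSet>[^\]]*)\]"
| rex field=CmdSet "CmdAV=(?<CmdAV>\S+)"
| rex field=CmdSet max_match=0 "CmdArgAV=(?<CmdArgAV>\S+)"
| eval command=CmdAV." ".mvjoin(mvfilter(CmdArgAV!="<cr>"), " ")
| table _time CmdSet CmdAV CmdArgAV command

With the sample string above, command would come out as "show license usage". If your events do not literally contain "CmdSet=[", change the first rex to anchor on the opening "[ CmdAV=" instead.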
I have a file with a CRT extension from a third party. I am trying to convert the file to PEM but have been unable to get it done; we tried several approaches without success. Please suggest a way forward.
I would like to ask about the server.conf and web.conf configuration files: how should they be placed in a clustered environment with 3 indexers and a standalone cluster master? Thanks for the answers.
Hello Splunk Experts, let's say I have a table that contains 2 columns as shown below:

Name S_no
aaa 1
ccc 3
bbb 2
ddd 4
eee 5
fff 6
ggg 1
iii 3
hhh 2
jjj 4
kkk 5
lll 6
mmm 1
ooo 3
nnn 2
ppp 4
qqq 5
rrr 6

Now, I need to sort every 6 rows by the 's_no' column and populate the table, something like this:

Name S_no
aaa 1
bbb 2
ccc 3
ddd 4
eee 5
fff 6
ggg 1
hhh 2
iii 3
jjj 4
kkk 5
lll 6
mmm 1
nnn 2
ooo 3
ppp 4
qqq 5
rrr 6

Could you please help me with the query? Much appreciated!
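One possible approach (not the only one), assuming the rows arrive in the order shown and you want to sort within each consecutive block of 6 rows: number the rows with streamstats, derive a block id, then sort by block and S_no.

... your base search ...
| streamstats count as row
| eval block=ceiling(row/6)
| sort 0 block S_no
| fields - row block

sort 0 removes the default 10,000-row limit; drop the final fields command if you want to keep the helper columns while checking the result.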
How can I clean up Splunk disk space through a script? I don't have a script yet. How do I create one to help clean up Splunk space and delete old files? Also, I am not getting reporting; it's not uploading any logs.
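As a starting point before scripting anything: Splunk normally ages data out on its own based on the retention settings in indexes.conf (frozenTimePeriodInSecs, maxTotalDataSizeMB), so tightening those is usually safer than deleting bucket files with an external script. A small sketch to see where the space is going, assuming you can run REST searches against the indexer:

| rest /services/data/indexes splunk_server=local
| table title currentDBSizeMB maxTotalDataSizeMB frozenTimePeriodInSecs
| sort 0 -num(currentDBSizeMB)

That lists each index with its current size and retention settings, which should show which indexes to trim first. The "not uploading any logs" part sounds like a separate ingestion issue worth its own question.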
Hi Team, our application is hitting a JVM deadlock and hanging indefinitely after upgrading splunk-library-javalogging from 1.6.1 to 1.11.1. Below are the logback and slf4j versions used:
compile 'ch.qos.logback:logback-classic:1.2.3'
compile 'ch.qos.logback:logback-core:1.2.3'
compile 'org.slf4j:slf4j-api:1.7.36'
compile 'net.logstash.logback:logstash-logback-encoder:6.6'
compile "uk.org.lidalia:sysout-over-slf4j:1.0.2"
And below is the deadlocked thread stack:
default task-340 Name[default task-340]Thread ID[13667] Deadlocked on Lock[ch.qos.logback.core.spi.LogbackLock@31309b21] held by thread [splunk-tcp-appender] Thread ID[209] Thread stack [ ch.qos.logback.core.BasicStatusManager.fireStatusAddEvent(BasicStatusManager.java:85) ch.qos.logback.core.BasicStatusManager.add(BasicStatusManager.java:59) ch.qos.logback.core.spi.ContextAwareBase.addStatus(ContextAwareBase.java:80) ch.qos.logback.core.spi.ContextAwareBase.addInfo(ContextAwareBase.java:85) com.splunk.logging.TcpAppender.append(TcpAppender.java:291) com.splunk.logging.TcpAppender.append(TcpAppender.java:40) ch.qos.logback.core.AppenderBase.doAppend(AppenderBase.java:82) ch.qos.logback.core.spi.AppenderAttachableImpl.appendLoopOnAppenders(AppenderAttachableImpl.java:51) ch.qos.logback.classic.Logger.appendLoopOnAppenders(Logger.java:270) ch.qos.logback.classic.Logger.callAppenders(Logger.java:257) ch.qos.logback.classic.Logger.buildLoggingEventAndAppend(Logger.java:421) ch.qos.logback.classic.Logger.filterAndLog_0_Or3Plus(Logger.java:383) ch.qos.logback.classic.Logger.info(Logger.java:579) uk.org.lidalia.sysoutslf4j.context.LogLevel$3.log(LogLevel.java:62) uk.org.lidalia.sysoutslf4j.context.LoggerAppenderImpl.log(LoggerAppenderImpl.java:81) uk.org.lidalia.sysoutslf4j.context.LoggerAppenderImpl.logOrPrint(LoggerAppenderImpl.java:71) uk.org.lidalia.sysoutslf4j.context.LoggerAppenderImpl.appendAndLog(LoggerAppenderImpl.java:58) uk.org.lidalia.sysoutslf4j.system.SLF4JPrintStreamDelegate.appendAndLog(SLF4JPrintStreamDelegate.java:76) uk.org.lidalia.sysoutslf4j.system.SLF4JPrintStreamDelegate.delegatePrintln(SLF4JPrintStreamDelegate.java:56) uk.org.lidalia.sysoutslf4j.system.SLF4JPrintStreamImpl.println(SLF4JPrintStreamImpl.java:111) com.lehman.elmo.admin.mvc.controller.ELMRestController.getFolder(ELMRestController.java:120) sun.reflect.GeneratedMethodAccessor116.invoke(Unknown Source) sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) java.lang.reflect.Method.invoke(Method.java:498) org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:205) org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:150) org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:117) org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:895) org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:808) org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:87) org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:1067) org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:963) org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1006)
org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:898) javax.servlet.http.HttpServlet.service(HttpServlet.java:503) org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:883) javax.servlet.http.HttpServlet.service(HttpServlet.java:590) io.undertow.servlet.handlers.ServletHandler.handleRequest(ServletHandler.java:74) io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:129) com.lehman.admin.servlet.EtgAdminServletFilter.doFilter(EtgAdminServletFilter.java:141) io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:67) io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131) org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:201) org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:117) io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:67) io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131) org.springframework.web.filter.HiddenHttpMethodFilter.doFilterInternal(HiddenHttpMethodFilter.java:94) org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:117) io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:67) io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131) com.lehman.elmo.admin.SimpleCORSFilter.doFilter(SimpleCORSFilter.java:22) io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:67) io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131) com.lehman.elmo.admin.ui.filter.ElmoRootFilter.doFilter(ElmoRootFilter.java:94) io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:67) io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131) io.undertow.servlet.handlers.Filte default task-355 Name[default task-355]Thread ID[14086] Deadlocked on Lock[com.splunk.logging.TcpAppender@7a29da17] held by thread [default task-340] Thread ID[13667] Thread stack [ ch.qos.logback.core.AppenderBase.doAppend(AppenderBase.java:63) ch.qos.logback.core.spi.AppenderAttachableImpl.appendLoopOnAppenders(AppenderAttachableImpl.java:51) ch.qos.logback.classic.Logger.appendLoopOnAppenders(Logger.java:270) ch.qos.logback.classic.Logger.callAppenders(Logger.java:257) ch.qos.logback.classic.Logger.buildLoggingEventAndAppend(Logger.java:421) ch.qos.logback.classic.Logger.filterAndLog_0_Or3Plus(Logger.java:383) ch.qos.logback.classic.Logger.log(Logger.java:765) org.apache.logging.slf4j.SLF4JLogger.logMessage(SLF4JLogger.java:234) org.apache.logging.log4j.spi.AbstractLogger.log(AbstractLogger.java:2117) org.apache.logging.log4j.spi.AbstractLogger.tryLogMessage(AbstractLogger.java:2205) org.apache.logging.log4j.spi.AbstractLogger.logMessageTrackRecursion(AbstractLogger.java:2159) org.apache.logging.log4j.spi.AbstractLogger.logMessageSafely(AbstractLogger.java:2142) org.apache.logging.log4j.spi.AbstractLogger.logMessage(AbstractLogger.java:2017) org.apache.logging.log4j.spi.AbstractLogger.logIfEnabled(AbstractLogger.java:1983) org.apache.logging.log4j.spi.AbstractLogger.info(AbstractLogger.java:1320) com.lehman.elmo.admin.ui.filter.ElmoRootFilter.doFilter(ElmoRootFilter.java:90) io.undertow.servlet.core.ManagedFilter.doFilter(ManagedFilter.java:67) io.undertow.servlet.handlers.FilterHandler$FilterChainImpl.doFilter(FilterHandler.java:131) 
io.undertow.servlet.handlers.FilterHandler.handleRequest(FilterHandler.java:84) io.undertow.servlet.handlers.security.ServletSecurityRoleHandler.handleRequest(ServletSecurityRoleHandler.java:62) io.undertow.servlet.handlers.ServletChain$1.handleRequest(ServletChain.java:68) io.undertow.servlet.handlers.ServletDispatchingHandler.handleRequest(ServletDispatchingHandler.java:36) org.wildfly.extension.undertow.security.SecurityContextAssociationHandler.handleRequest(SecurityContextAssociationHandler.java:78) io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43) io.undertow.servlet.handlers.RedirectDirHandler.handleRequest(RedirectDirHandler.java:68) io.undertow.servlet.handlers.security.SSLInformationAssociationHandler.handleRequest(SSLInformationAssociationHandler.java:117) io.undertow.servlet.handlers.security.ServletAuthenticationCallHandler.handleRequest(ServletAuthenticationCallHandler.java:57) io.undertow.server.handlers.DisableCacheHandler.handleRequest(DisableCacheHandler.java:33) io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43) io.undertow.security.handlers.AuthenticationConstraintHandler.handleRequest(AuthenticationConstraintHandler.java:53) io.undertow.security.handlers.AbstractConfidentialityHandler.handleRequest(AbstractConfidentialityHandler.java:46) io.undertow.servlet.handlers.security.ServletConfidentialityConstraintHandler.handleRequest(ServletConfidentialityConstraintHandler.java:64) io.undertow.servlet.handlers.security.ServletSecurityConstraintHandler.handleRequest(ServletSecurityConstraintHandler.java:60) io.undertow.security.handlers.AuthenticationMechanismsHandler.handleRequest(AuthenticationMechanismsHandler.java:60) io.undertow.servlet.handlers.security.CachedAuthenticatedSessionHandler.handleRequest(CachedAuthenticatedSessionHandler.java:77) io.undertow.security.handlers.NotificationReceiverHandler.handleRequest(NotificationReceiverHandler.java:50) io.undertow.security.handlers.AbstractSecurityContextAssociationHandler.handleRequest(AbstractSecurityContextAssociationHandler.java:43) io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43) org.wildfly.extension.undertow.security.jacc.JACCContextIdHandler.handleRequest(JACCContextIdHandler.java:61) io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43) org.wildfly.extension.undertow.deployment.GlobalRequestControllerHandler.handleRequest(GlobalRequestControllerHandler.java:68) io.undertow.servlet.handlers.SendErrorPageHandler.handleRequest(SendErrorPageHandler.java:52) io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43) io.undertow.server.handlers.MetricsHandler.handleRequest(MetricsHandler.java:64) io.undertow.servlet.core.MetricsChainHandler.handleRequest(MetricsChainHandler.java:59) io.undertow.servlet.handlers.ServletInitialHandler.handleFirstRequest(ServletInitialHandler.java:275) io.undertow.servlet.handlers.ServletInitialHandler.access$100(ServletInitialHandler.java:79) io.undertow.servlet.handlers.ServletInitialHandler$2.call(ServletInitialHandler.java:134) io.undertow.servlet.handlers.ServletInitialHandler$2.call(ServletInitialHandler.java:131) io.undertow.servlet.core.ServletRequestContextThreadSetupAction$1.call(ServletRequestContextThreadSetupAction.java:48) io.undertow.servlet.core.ContextClassLoaderSetupAction$1.call(ContextClassLoaderSetupAction.jav splunk-tcp-appender Name[splunk-tcp-appender]Thread ID[209] Deadlocked on 
Lock[uk.org.lidalia.sysoutslf4j.system.SLF4JPrintStreamImpl@5a3259b] held by thread [default task-340] Thread ID[13667] Thread stack [ uk.org.lidalia.sysoutslf4j.system.SLF4JPrintStreamImpl.print(SLF4JPrintStreamImpl.java:246) ch.qos.logback.core.status.OnPrintStreamStatusListenerBase.print(OnPrintStreamStatusListenerBase.java:52) ch.qos.logback.core.status.OnPrintStreamStatusListenerBase.addStatusEvent(OnPrintStreamStatusListenerBase.java:58) ch.qos.logback.core.BasicStatusManager.fireStatusAddEvent(BasicStatusManager.java:87) ch.qos.logback.core.BasicStatusManager.add(BasicStatusManager.java:59) ch.qos.logback.core.spi.ContextAwareBase.addStatus(ContextAwareBase.java:80) ch.qos.logback.core.spi.ContextAwareBase.addInfo(ContextAwareBase.java:85) com.splunk.logging.TcpAppender.run(TcpAppender.java:130) java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) java.lang.Thread.run(Thread.java:750) ]
Hi Team, can someone help me with what I should do to migrate items from an on-premise search head to Splunk Cloud? For example: lookups, ES use cases, alerts created in Search & Reporting, reports, and dashboards. Any step-by-step guidance would be helpful. Can I do it alone, or would it require on-demand support from Splunk to do this migration?
Request: dynamically start and stop the Java agent. We are running version 20.10.0.31173 on Java 8. Please note, we know we are behind on versions; that is a separate issue.
-------------------
What we have observed when starting WildFly 26.0.1 is the following: the alert condition for 'keycloak.war failed to deploy' was triggered.
Alert: Keycloak.war failed to deploy
Search String: sourcetype=btierservice "parameter 0 of method repositorySchemaController"
Trigger Time: 07:02:34 -0400 on March 26, 2024
This is an intermittent problem at the moment. The investigation considers several aspects of the environment, and we would like to control one pertaining to the start/stop of AppD. So, is there a way to start AppD after the application has been initialized, thus ruling out AppD?
I have tried installing Splunk Enterprise 9.2.0.1 on my Linux host to use as a forwarder tier, but when I configure deployment clients from the Universal Forwarder, the device list is not found on the Forwarder Management page. So I tested by uninstalling Splunk Enterprise 9.2.0.1 on Linux and installing Splunk Enterprise 9.1.3; after configuring the deployment clients from the Universal Forwarder, the list of devices did appear on the Forwarder Management page. So I'm wondering why I can't find the device entries on the Forwarder Management page on Splunk Enterprise 9.2.0.1.
Hi Everyone, has anyone managed to successfully use the "Akamai Prolexic DNS GTM and SIEM API (Unofficial)" app? I keep getting this error when testing the Prolexic API data input:

Traceback (most recent call last):
  File "C:\Program Files\Splunk\etc\apps\akamai-api-integration\bin\akamai_api_integration\aob_py3\urllib3\connection.py", line 175, in _new_conn
    (self._dns_host, self.port), self.timeout, **extra_kw
  File "C:\Program Files\Splunk\etc\apps\akamai-api-integration\bin\akamai_api_integration\aob_py3\urllib3\util\connection.py", line 72, in create_connection
    for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
  File "C:\Program Files\Splunk\Python-3.7\lib\socket.py", line 752, in getaddrinfo
    for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
socket.gaierror: [Errno 11001] getaddrinfo failed

The official Akamai SIEM app was not designed to ingest the Prolexic API, so unfortunately it is of no use to me. Many thanks.
Hello! I am new to Splunk and attempting the BOTS workshop, Hunting an APT with Splunk - Reconnaissance, and have encountered an issue. Following the video, I tried to access the identity centre, asset centre and the Frothly environment network diagram; however, none of these are working for me. The Frothly environment shows a blank screen, and the Identity and Asset centres show an error from the 'inputlookup' command: External command based lookup 'identity_lookup_expanded' is not available because KV store initialisation has failed. Does anyone have any idea how to get around this, or has anyone else encountered this error?
Hello! I have a date showing in this format (string): 20240324090045.560961-240. I've been trying to convert it into a readable date format for my dashboards, with no success so far. Does someone have a solution? I'm getting nowhere with the Splunk docs or other solved questions. Thanks!
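That value looks like a WMI/CIM-style timestamp (YYYYMMDDhhmmss.microseconds followed by the local offset from UTC in minutes). A sketch of one way to convert it, assuming the value sits in a hypothetical field called mydate (rename to match yours) and that the trailing number really is a minute offset:

| rex field=mydate "^(?<ts>\d{14}\.\d+)(?<offset_min>[+-]\d+)$"
| eval epoch=strptime(ts."+0000", "%Y%m%d%H%M%S.%6Q%z") - tonumber(offset_min)*60
| eval readable=strftime(epoch, "%Y-%m-%d %H:%M:%S.%3Q")

The "+0000" suffix pins the parse to UTC and the subtraction applies the minute offset to get a true UTC epoch. If you only need a readable rendering and don't care about the offset, you can skip the offset handling and just run strptime(ts, "%Y%m%d%H%M%S.%6Q") on the part before the sign.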
I'm trying to achieve the following search and hoped others might have some helpful suggestions. I have two event types in a summary index: `type_A` and `type_B`. They share a common field `entity_id` that may or may not match. I want to get all events of `type_B` where there is an event of `type_A` with a matching `entity_id`. From this result, in `type_B` I have some wildcard fields (a common `wildcard_field` name with different sub-fields, such as `wildcard_field.field1`, `wildcard_field.field2`) and I want to extract the data for those fields into a table for visualisation. Example of event structure:

{ event: type_A; entity_id: 123; }
{ event: type_B; entity_id: 123; // Matches a type_A event
  wildcard_field.field1: val1; wildcard_field.field2: val2; }
{ event: type_B; entity_id: 345; // This one won't have a matching type_A event
  wildcard_field.field1: val1; wildcard_field.field2: val2; }

Thank you for any suggestions.
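A sketch of one stats-based way to do the correlation without a join, assuming the events live in an index called summary and the type name is in a field called event (both are assumptions from the example, adjust to your data):

index=summary (event=type_A OR event=type_B)
| eventstats count(eval(event="type_A")) as type_A_count by entity_id
| where event="type_B" AND type_A_count > 0
| table entity_id wildcard_field.*

eventstats attaches, to every event sharing an entity_id, the number of type_A events seen for that id, so the where clause keeps only type_B events whose id also appeared on a type_A event; the trailing table picks up all the wildcard sub-fields.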
I have two dropdown lists. The second dropdown list should show/hide based on the first dropdown list's value (based on two values). With one value it works fine:
<input type="dropdown" token="sourceToken" depends="$t1_token$" searchWhenChanged="false">
I have tried the following for two values, but it is not working:
<input type="dropdown" token="sourceToken" depends="$t1_token$,$t2_token$" searchWhenChanged="false">
<input type="dropdown" token="sourceToken" depends="t1_token,t2_token" searchWhenChanged="false">
Please advise.
I am trying to get the count of hits to a particular API, and based on a field called execution-time I am calculating an SLA. I am able to see the number of requests coming to the API but not able to get the SLA count using the query below. Can someone help me with where I am going wrong?

index=* uri=validate | eval SLA=1000| stats count as total_calls count(eval(execution-time < SLA)) as sla_compliant_count
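One likely culprit is the hyphen in execution-time: left unquoted inside eval, SPL reads it as the expression "execution minus time". A sketch with the field name quoted, assuming execution-time is numeric and in the same unit as the 1000 threshold:

index=* uri=validate
| eval SLA=1000
| stats count as total_calls, count(eval('execution-time' < SLA)) as sla_compliant_count

If execution-time is extracted as a string, wrap it as tonumber('execution-time') inside the eval.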
This is the query. I'm still new to this world (so I'm sorry if there are any silly mistakes that might drive you crazy when you read this query). However, I'm trying to join the Source Process Id (from event code 10) with the Process Id (from event code 1) and then print the command line. I tried to use `type=inner` but it gave me nothing, which is weird, because when I run the first query there are results, and the same for the inner query.

index="main" sourcetype="WinEventLog:Sysmon" EventCode=10 lsass SourceImage="C:\\Windows\\system32\\rundll32.exe" | join left=L right=R type=left where L.SourceProcessId=R.ProcessId [search EventCode=1 lsass "C:\\Windows\\system32\\rundll32.exe"] | table L.TargetImage, R.ProcessId, R.commandLine
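Not the only fix, but one thing to note is that the subsearch has no index or sourcetype, so it may be searching the wrong data entirely. A common alternative to join is to pull both EventCodes into one search and correlate them with stats on a shared pid; this is a sketch assuming the usual Sysmon TA field names (ProcessId, SourceProcessId, CommandLine, TargetImage):

index="main" sourcetype="WinEventLog:Sysmon" ((EventCode=10 lsass SourceImage="C:\\Windows\\system32\\rundll32.exe") OR (EventCode=1 Image="C:\\Windows\\system32\\rundll32.exe"))
| eval pid=coalesce(SourceProcessId, ProcessId)
| stats values(EventCode) as EventCodes values(TargetImage) as TargetImage values(CommandLine) as CommandLine by pid
| where mvcount(EventCodes) > 1

The where clause keeps only pids that appear in both event codes, which is the same effect as an inner join on SourceProcessId = ProcessId, and values(CommandLine) carries the command line over from the EventCode 1 side.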
Hi Community, please help me out. I am trying to monitor a path on the Splunk search head in a Splunk Enterprise environment. What would be the best practice to implement this? Would it be advisable to install a UF on the search head server? If not, what are the other ways by which we can monitor a path on the Splunk search head server? Thanks,
We need to update the threshold of a KPI. The KPI is used by 100+ services, and some of these services have their thresholding unlinked from the service template. Is there a macro or saved search that we can use to do a bulk update of the KPI threshold settings? This is for the services whose thresholding is already unlinked from the service template, to avoid manually opening each service to edit the KPI thresholds. TIA.
Hey all, tech stack: Next.js 13 (pages router). I've been following the guide https://docs.appdynamics.com/display/GOVAPM234/Add+Custom+User+Data+to+a+Page+Browser+Snapshot to set custom attributes. On the initial page I load the AppDynamics script provided below:

window['adrum-start-time'] = new Date().getTime()
;((config) => {
  config.appKey = 'XXX'
  config.adrumExtUrlHttp = 'http://cdn.appdynamics.com'
  config.adrumExtUrlHttps = 'https://cdn.appdynamics.com'
  config.beaconUrlHttp = 'http://syd-col.eum-appdynamics.com'
  config.beaconUrlHttps = 'https://syd-col.eum-appdynamics.com'
  config.useHTTPSAlways = true
  config.xd = { enable: true }
  config.resTiming = { bufSize: 200, clearResTimingOnBeaconSend: true }
  config.maxUrlLength = 512;
  config.userEventInfo = {
    PageView: getAppDynamicsUserInfo(),
    VPageView: getAppDynamicsUserInfo(),
  }
})(window['adrum-config'] || (window['adrum-config'] = {}))

getAppDynamicsUserInfo is a function attached to window that always returns the attribute sessionId and, if available, another attribute called customerId. On the initial page load, the sessionId is sent and viewable in the AppDynamics Analyze view. When I get to the page where the customerId is available, it is not sent to AppDynamics. If I inspect window["adrum-config"] or use ADRUM.conf.userConf, I can see both sessionId and customerId. In the above script I've tried setting just PageView and just VPageView. In terms of methods of loading the above script, I've used the Next.js Script component and tried the following:
- Load the above as an external script file on different pages (different React components)
- Load the above in different versions of the same script file (different names) on different pages
- Add the above script into a React component and load the component on different pages

I've also tried to use the AJAX method to intercept HTTP calls; it intercepts the HTTP call but does not result in sending the user data to AppDynamics. In addition to trying to set it via config.userEventInfo as above, I've tried the following options as well:

(function (info) {
  info.PageView = getAppDynamicsUserInfo
  info.VPageView = getAppDynamicsUserInfo
})(config.userEventInfo || (config.userEventInfo = {}))

(function (info) {
  info.PageView = getAppDynamicsUserInfo()
  info.VPageView = getAppDynamicsUserInfo()
})(config.userEventInfo || (config.userEventInfo = {}))

Any help is appreciated, thank you.
From the query below, I ran it for 02:00 to 03:00 and posted the output, then ran the same query for 03:00 to 04:00 and posted that output. I want a query where I can compare the previous hour's data (02:00 to 03:00) with the 03:00 to 04:00 data and calculate the difference percentage.

|mstats sum(transaction) as Trans where index=host-metrics service=login application IN(app1, app2, app3, app4) span=1h by application

Output for 02:00 to 03:00:

_time application Trans
2022-01-22 02:00 app1 3456.000000
2022-01-22 02:00 app2 5632.000000
2022-01-22 02:00 app3 5643.000000
2022-01-22 02:00 app4 16543.00000

Output for 03:00 to 04:00:

_time application Trans
2022-01-22 03:00 app1 8753.000000
2022-01-22 03:00 app2 342.000000
2022-01-22 03:00 app3 87653.000000
2022-01-22 03:00 app4 8912.00000
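A sketch of one way to do this in a single search, assuming you run it over a time range covering both hours (e.g. 02:00 to 04:00) so each application gets both hourly buckets:

| mstats sum(transaction) as Trans where index=host-metrics service=login application IN(app1, app2, app3, app4) span=1h by application
| sort 0 application _time
| streamstats current=f window=1 last(Trans) as prev_Trans by application
| eval pct_change=round((Trans - prev_Trans) / prev_Trans * 100, 2)
| table _time application prev_Trans Trans pct_change

streamstats carries each application's previous hourly value forward, so the 03:00 rows show the percentage change versus 02:00; the first hour has no prev_Trans, so pct_change is empty there.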