I have a deployment where two HFs are acting as deployment servers (DS), and both are connected to the MC for licensing on port 8089. On HF 1 I connected some deployment clients (DC) and they successfully connected to the HF. On HF 2 I tried the same method, but the DCs are connecting to the Monitoring Console instead of the DS. Why is this behaviour happening?
I would like to calculate the success rate of the Top-up transaction per Channel (APP or Web) across 4 API calls (i.e. 4 levels: the request is submitted at level 1, which does validation and passes it to level 2; level 2 does business validation and passes the transaction to the next level, and so on). A transaction may fail at level 1/2/3/4. The Channel field is available only at Level 1, not at the other levels. Transaction ID is the only field common to all levels. If I filter on Channel, the output is only the list of transactions at Level 1, since the Channel field exists only there.

1. If I filter on the Web/APP channel, I should get the list of transaction IDs for that channel.
2. Taking those transaction IDs as input, it should validate the status of the transaction at each level (2/3/4).

Note: at levels 2/3/4 the log contains both APP and Web entries; they can only be differentiated via the transaction ID from level 1. HTTP status: 200 (Success); 500 (Failure).
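No answer is given in the thread, but the per-channel success-rate logic described above can be sketched in Python. This is an illustrative sketch only: the event tuples and transaction IDs below are hypothetical stand-ins for the parsed log fields, not data from the post.

```python
from collections import defaultdict

# Hypothetical parsed events: (level, txn_id, channel_or_None, http_status).
# Channel is only present on level-1 events, as described in the question.
events = [
    (1, "T1", "APP", 200), (1, "T2", "Web", 200), (1, "T3", "APP", 200),
    (2, "T1", None, 200), (2, "T2", None, 500), (2, "T3", None, 200),
    (3, "T1", None, 200), (3, "T3", None, 500),
    (4, "T1", None, 200),
]

# Step 1: channel is only known at level 1, so build a txn_id -> channel map.
channel_of = {txn: ch for lvl, txn, ch, _ in events if lvl == 1}

# Step 2: a transaction succeeds only if every level it reached returned 200.
failed = {txn for _, txn, _, status in events if status != 200}

# Step 3: success rate per channel.
totals, successes = defaultdict(int), defaultdict(int)
for txn, ch in channel_of.items():
    totals[ch] += 1
    if txn not in failed:
        successes[ch] += 1

rates = {ch: successes[ch] / totals[ch] for ch in totals}
# rates == {'APP': 0.5, 'Web': 0.0}
```

In SPL the same idea would typically be expressed by carrying the level-1 Channel onto the other levels grouped by transaction ID (e.g. with stats values() by the transaction ID field) before computing the rate.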
Hello, As an admin, I deleted a user in Splunk Web, but when I try to add a user during an investigation, I still see the deleted user in the list. Why is this happening? Is there a conflict between deleting users in Splunk Enterprise and Splunk ES?  
We recently tried installing the Machine Agent on an Azure Linux machine, using the Linux ZIP bundle with bundled JRE (Linux Install Using ZIP with Bundled JRE, appdynamics.com). The installation was successful and the AppD Machine Agent service is running and active at the OS end, but we noticed that the registration request failed:

yyyyyyyy000==> [system-thread-0] 29 Aug 2024 16:33:08,128 INFO ApacheClientImpl - Sending registration request: POST https://xxxxxxx.saas.appdynamics.com:443/controller/sim/v2/agent/machines HTTP/1.1
yyyyyyyy000==> [system-thread-0] 29 Aug 2024 16:33:08,193 ERROR ManagedHttpClient - Request failed with exception javax.net.ssl.SSLHandshakeException: Remote host terminated the handshake

From the server we are able to reach the SaaS endpoint without any proxy; the default SSL settings are active and no certificates were manually imported on either side. Any direction on this issue, please?
Hi, I want to ingest data from Cisco IPS into Splunk. Can the following add-on (https://splunkbase.splunk.com/app/1903) still be used, and how do I configure it? Thank you.
I have sample data pushed to Splunk as below. Help me with a Splunk query where I want only unique server names, with a final status as the second column. Compare the Status values across all rows for each server: if any of that server's Status values is No, consider No the final status for that server; if all of its Status values are Yes, consider Yes the final status.

sample.csv:

ServerName,Status,Department,Company,Location
Server1,Yes,Government,DRDO,Bangalore
Server1,No,Government,DRDO,Bangalore
Server1,Yes,Government,DRDO,Bangalore
Server2,No,Private,TCS,Chennai
Server2,No,Private,TCS,Chennai
Server3,Yes,Private,Infosys,Bangalore
Server3,Yes,Private,Infosys,Bangalore
Server4,Yes,Private,Tech Mahindra,Pune
Server5,No,Government,IncomeTax India,Mumbai
Server6,Yes,Private,Microsoft,Hyderabad
Server6,No,Private,Microsoft,Hyderabad
Server6,Yes,Private,Microsoft,Hyderabad
Server6,No,Private,Microsoft,Hyderabad
Server7,Yes,Government,GST Council,Delhi
Server7,Yes,Government,GST Council,Delhi
Server7,Yes,Government,GST Council,Delhi
Server7,Yes,Government,GST Council,Delhi
Server8,No,Private,Apple,Bangalore
Server8,No,Private,Apple,Bangalore
Server8,No,Private,Apple,Bangalore
Server8,No,Private,Apple,Bangalore

The output should look similar to:

ServerName,FinalStatus
Server1,No
Server2,No
Server3,Yes
Server4,Yes
Server5,No
Server6,No
Server7,Yes
Server8,No

The status of any server should also be computable based on a search of any of the fields Department, Company, or Location. The Department, Company, and Location values won't change for a given server; only the Status value changes. I already have a query that gives me the unique status of each server:
| eval FinalStatus = if(Status="Yes", 1, 0)
| eventstats min(FinalStatus) as FinalStatus by ServerName
| stats min(FinalStatus) as FinalStatus by ServerName
| eval FinalStatus = if(FinalStatus=1, "Yes", "No")
| table ServerName, FinalStatus

But what I want is: whenever I search by Department, Company, or Location, I need to get the final status of each server based on that field's search. Say, based on a Location search, I need the final status for the matching servers; if I search a Company, I should get the final status for the servers in that company. I think it's something like:

| search Department="$department$" Company="$Company$" Location="$Location$"

Please help with the Splunk query.
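The "all Yes means Yes, any No means No" aggregation the query above implements, with the Department/Company/Location filter applied first, can be sketched in Python for clarity (field names are taken from the sample CSV; the data below is a subset of the sample):

```python
import csv
import io

sample = """ServerName,Status,Department,Company,Location
Server1,Yes,Government,DRDO,Bangalore
Server1,No,Government,DRDO,Bangalore
Server3,Yes,Private,Infosys,Bangalore
Server3,Yes,Private,Infosys,Bangalore
Server8,No,Private,Apple,Bangalore
"""

def final_status(rows, **filters):
    # Apply the Department/Company/Location filter first (mirroring the
    # `| search ...` clause), then aggregate: a server is "Yes" only if
    # every one of its remaining rows has Status=Yes.
    status = {}
    for row in rows:
        if all(row[k] == v for k, v in filters.items()):
            ok = row["Status"] == "Yes"
            status[row["ServerName"]] = status.get(row["ServerName"], True) and ok
    return {s: ("Yes" if ok else "No") for s, ok in status.items()}

rows = list(csv.DictReader(io.StringIO(sample)))
print(final_status(rows, Location="Bangalore"))
# {'Server1': 'No', 'Server3': 'Yes', 'Server8': 'No'}
```

In SPL the analogous fix is usually to place the `search` filter before the `stats min(...) by ServerName` step, so the min is taken only over the filtered rows.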
How can I always hide a panel unconditionally (for instance, a base search panel)?
Hi Splunk experts, I've been trying to group "WARN" logs, but they contain dynamic argument values. I'm aware of rex, but I don't want to manually write rex for thousands of such different events. I've also tried cluster, but that doesn't suit my use case well. Any assistance would be much appreciated! Thanks in advance.

2024-08-31 12:34:56 WARN ConfigurationLoader - Deprecated configuration detected in path /xx/yy/zz. Please update your settings to use the latest configuration options.
2024-08-31 12:34:56 WARN ConfigurationLoader - Deprecated configuration detected in path /aa/dd/jkl. Please update your settings to use the latest configuration options.
2024-08-31 14:52:34 WARN QueryExecutor - Query execution time exceeded the threshold: 12.3 seconds. Query: SELECT * FROM users WHERE last_login > '2024-01-01'.
2024-08-31 14:52:34 WARN QueryExecutor - Query execution time exceeded the threshold: 21.9 seconds. Query: SELECT * FROM contacts WHERE contact_id > '252'.
2024-08-31 14:52:34 WARN QueryExecutor - Query execution time exceeded the threshold: 9.5 seconds. Query: SELECT * FROM users WHERE user_id = '123024001'.
2024-08-31 13:45:10 WARN MemoryMonitor - High memory usage detected: 85% of allocated memory is in use. Consider increasing the available memory.
2024-08-31 13:45:10 WARN MemoryMonitor - High memory usage detected: 58% of allocated memory is in use. Consider increasing the available memory.
2024-08-31 14:52:34 WARN QueryExecutor - Query execution time exceeded the threshold: 32.3 seconds. Query: SELECT * FROM users WHERE last_login > '2024-01-01'.

I wish to group similar events, something like below:

WARN ConfigurationLoader Deprecated configuration detected in path. Please update your settings to use the latest configuration options.  2
WARN QueryExecutor Query execution time exceeded the threshold: . Query:  4
WARN MemoryMonitor High memory usage detected: of allocated memory is in use. Consider increasing the available memory.  2
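One common approach, sketched here in Python rather than SPL, is to normalize each message into a template by replacing the variable parts (numbers, quoted literals, paths) with placeholders and then counting templates. The replacement rules below are tuned to the sample events in the post and would need extending for other message shapes:

```python
import re
from collections import Counter

logs = [
    "WARN ConfigurationLoader - Deprecated configuration detected in path /xx/yy/zz. Please update your settings to use the latest configuration options.",
    "WARN ConfigurationLoader - Deprecated configuration detected in path /aa/dd/jkl. Please update your settings to use the latest configuration options.",
    "WARN QueryExecutor - Query execution time exceeded the threshold: 12.3 seconds. Query: SELECT * FROM users WHERE last_login > '2024-01-01'.",
    "WARN QueryExecutor - Query execution time exceeded the threshold: 21.9 seconds. Query: SELECT * FROM contacts WHERE contact_id > '252'.",
    "WARN MemoryMonitor - High memory usage detected: 85% of allocated memory is in use. Consider increasing the available memory.",
    "WARN MemoryMonitor - High memory usage detected: 58% of allocated memory is in use. Consider increasing the available memory.",
]

def template(msg):
    # Replace the variable pieces with placeholders so similar events collapse.
    msg = re.sub(r"'[^']*'", "'*'", msg)         # quoted literals
    msg = re.sub(r"(/[\w.-]+)+", "/*", msg)      # filesystem paths
    msg = re.sub(r"\d+(\.\d+)?%?", "N", msg)     # numbers / percentages
    msg = re.sub(r"Query: .*", "Query: *", msg)  # trailing SQL text
    return msg

counts = Counter(template(m) for m in logs)
```

In Splunk the same normalization can be done with a chain of `| eval msg=replace(...)` calls (or sed-mode rex) followed by `| stats count by msg`.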
Hi all, can anybody help us with a regex expression to extract the field Channel? The values will be either APP or Web, as highlighted in the sample logs below.

Sample Log 1:
\\\":\\\"8E4B3815425627\\\",\\\"channel\\\":\\\"APP\\\"}\"","call_res_body":{},

Sample Log 2:
4GksYUB7HGIfhfvs_iLtSc8EFCzOzbAJBze8wjXSDnwmgdhwjjxjsghqsxvhv\\\",\\\"channel\\\":\\\"web\\\"}\"","call_res_body":{},"additional_fields":{}}
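The quotes around the key and value are backslash-escaped in the raw event, so a pattern that tolerates any run of backslashes and quotes between "channel", the colon, and the value fits both samples. A sketch, sanity-checked in Python's re before porting to Splunk's rex (the sample strings are copied from the post):

```python
import re

log1 = r'\\\":\\\"8E4B3815425627\\\",\\\"channel\\\":\\\"APP\\\"}\"","call_res_body":{},'
log2 = r'4GksYUB7HGIfhfvs_iLtSc8EFCzOzbAJBze8wjXSDnwmgdhwjjxjsghqsxvhv\\\",\\\"channel\\\":\\\"web\\\"}\"","call_res_body":{},"additional_fields":{}}'

# Allow any run of backslashes/quotes between the key, the colon, and the value.
pattern = re.compile(r'channel[\\"]*:[\\"]*(?P<Channel>\w+)')

print(pattern.search(log1)["Channel"])  # APP
print(pattern.search(log2)["Channel"])  # web
```

The same pattern idea should work inside Splunk's rex command, though the backslashes must be escaped again for SPL string quoting, so test it against the raw events.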
Java version:

openjdk 21-ea 2023-09-19
OpenJDK Runtime Environment (build 21-ea+23-1988)
OpenJDK 64-Bit Server VM (build 21-ea+23-1988, mixed mode, sharing)

Startup flags:

java -Dappdynamics.jvm.shutdown.mark.node.as.historical=true -Dappdynamics.agent.log4j2.disabled=true -javaagent:/appdynamics/javaagent.jar

From what I understand this version of the agent should work with OpenJDK 21, but please correct me if I'm wrong. Any suggestions on what I can do to get this to start up? At startup I see the log below, which to me means the agent can't start up because of an incompatible Java version.

Class with name [com.ibm.lang.management.internal.ExtendedOperatingSystemMXBeanImpl] is not available in classpath, so will ignore export access.
java.lang.ClassNotFoundException: Unable to load class io.opentelemetry.sdk.autoconfigure.spi.ResourceProvider
    at com.singularity.ee.agent.appagent.kernel.classloader.Post19AgentClassLoader.findClass(Post19AgentClassLoader.java:88)
    at com.singularity.ee.agent.appagent.kernel.classloader.AgentClassLoader.loadClassInternal(AgentClassLoader.java:456)
    at com.singularity.ee.agent.appagent.kernel.classloader.Post17AgentClassLoader.loadClassParentLast(Post17AgentClassLoader.java:81)
    at com.singularity.ee.agent.appagent.kernel.classloader.AgentClassLoader.loadClass(AgentClassLoader.java:354)
    at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:526)
    at java.base/java.lang.Class.forName0(Native Method)
    at java.base/java.lang.Class.forName(Class.java:497)
    at java.base/java.lang.Class.forName(Class.java:476)
    at com.singularity.ee.agent.appagent.AgentEntryPoint.createJava9Module(AgentEntryPoint.java:800)
    at com.singularity.ee.agent.appagent.AgentEntryPoint.premain(AgentEntryPoint.java:639)
    at java.base/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:103)
    at java.base/java.lang.reflect.Method.invoke(Method.java:578)
    at java.instrument/sun.instrument.InstrumentationImpl.loadClassAndStartAgent(InstrumentationImpl.java:491)
    at java.instrument/sun.instrument.InstrumentationImpl.loadClassAndCallPremain(InstrumentationImpl.java:503)
[AD Agent init] Fri Aug 30 20:35:48 UTC 2024[DEBUG]: JavaAgent - Setting AgentClassLoader as Context ClassLoader
java.lang.IllegalArgumentException: Unsupported class file major version 65
    at com.appdynamics.appagent/com.singularity.asm.org.objectweb.asm.ClassReader.<init>(ClassReader.java:199)
    at com.appdynamics.appagent/com.singularity.asm.org.objectweb.asm.ClassReader.<init>(ClassReader.java:180)
    at com.appdynamics.appagent/com.singularity.asm.org.objectweb.asm.ClassReader.<init>(ClassReader.java:166)
    at com.appdynamics.appagent/com.singularity.ee.agent.appagent.services.bciengine.asm.PreTransformer.preTransform(PreTransformer.java:49)
    at com.appdynamics.appagent/com.singularity.ee.agent.appagent.kernel.JavaAgent.preloadAgentClassesForDeadlockProneJVM(JavaAgent.java:656)
    at com.appdynamics.appagent/com.singularity.ee.agent.appagent.kernel.JavaAgent.initialize(JavaAgent.java:404)
    at com.appdynamics.appagent/com.singularity.ee.agent.appagent.kernel.JavaAgent.initialize(JavaAgent.java:347)
    at java.base/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:103)
    at java.base/java.lang.reflect.Method.invoke(Method.java:578)
    at com.singularity.ee.agent.appagent.AgentEntryPoint$1.run(AgentEntryPoint.java:656)
Hi, suppose a server with a Splunk forwarder on it, where lots of logs haven't yet shipped to Splunk. Is there any way to get output listing the files/dirs, their current status (e.g. 50% sent to Splunk), etc.? I know I can see a list of files that are being monitored, but I'd like to get an idea of how much data the forwarder has yet to ship.
I have a standard printed statement that shows something like this:

[29/Aug/2024:23:59:48 +0000] "GET /rest/LMNOP
[29/Aug/2024:23:59:48 +0000] "POST /rest/LMNOP
[29/Aug/2024:23:59:48 +0000] "PUT /rest/LMNOP
[29/Aug/2024:23:59:48 +0000] "DELETE /rest/LMNOP

I don't have a defined field called "ActionTaken", in the sense of whether the user was doing a PUT, POST, GET, etc. Is there a simple regex I could add to a query to define a field called "ActionTaken"? I tried this:

rex "\//rest/s*(?<ActionTaken>\d{3})"

But it comes back with nothing.
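The attempted pattern looks for three digits after /rest, but the method name actually sits before the URI, right after the opening quote. A pattern along these lines could work, sketched and sanity-checked here in Python's re (the sample lines are copied from the post):

```python
import re

lines = [
    '[29/Aug/2024:23:59:48 +0000] "GET /rest/LMNOP',
    '[29/Aug/2024:23:59:48 +0000] "POST /rest/LMNOP',
    '[29/Aug/2024:23:59:48 +0000] "PUT /rest/LMNOP',
    '[29/Aug/2024:23:59:48 +0000] "DELETE /rest/LMNOP',
]

# The method is the word right after the opening quote, before the URI.
pattern = re.compile(r'"(?P<ActionTaken>GET|POST|PUT|DELETE|PATCH|HEAD|OPTIONS)\s+/rest/')

methods = [pattern.search(l)["ActionTaken"] for l in lines]
# methods == ['GET', 'POST', 'PUT', 'DELETE']
```

The same regex body should drop into a Splunk rex command, with the leading double quote escaped as needed for SPL string quoting.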
I am trying to use a lookup of "known good" filenames that appear in FTP transfer logs, both to add extra data to files that are found in the logs and to show when expected files are not found in the logs. The lookup has a lookup definition so that FileName can contain wildcards, and this works for matching the wildcarded filename to existing events with other SPL.

Lookup definition with wildcard on FileName, for csv FTP-Out:

FileName     Type              Direction  weekday
File1.txt    fixedfilename     Out        monday
File73*.txt  variablefilename  Out        thursday
File95*.txt  variablefilename  Out        friday

Example events:

8/30/24 9:30:14.000AM FTPFileName=File1.txt Status=Success Size=14kb
8/30/24 9:35:26.000AM FTPFileName=File73AABBCC.txt Status=Success Size=15kb
8/30/24 9:40:11.000AM FTPFileName=File73XXYYZZ.txt Status=Success Size=23kb
8/30/24 9:45:24.000AM FTPFileName=garbage.txt Status=Success Size=1kb

Current search (simplified):

| inputlookup FTP-Out
| join type=left FileName
    [ search index=ftp_logs sourcetype=log:ftp
    | rename FTPFileName as FileName ]

Results I get:

8/30/24 9:30:14.000AM File1.txt fixedfilename Out monday Success 14kb
File73*.txt variablefilename Out thursday
File95*.txt variablefilename Out friday

Desired output:

8/30/24 9:30:14.000AM File1.txt fixedfilename Out monday Success 14kb
8/30/24 9:35:26.000AM File73AABBCC.txt variablefilename Out thursday Success 15kb
8/30/24 9:40:11.000AM File73XXYYZZ.txt variablefilename Out thursday Success 23kb
File95*.txt variablefilename Out friday

Essentially I want the full filename and results for anything the wildcard in the lookup matches, but I also want to show any time a wildcarded filename in the lookup doesn't match an event in the search window. I've tried various other queries with append/appendcols and transaction, and the closest I've gotten so far is still the left join; however, join doesn't appear to match wildcarded fields from a lookup, and a where clause with like() after a join off a lookup doesn't seem to work either. I'm hoping someone has an idea of how to get both the matched files and the missing files in an output similar to the desired output above. This is in a Splunk Cloud deployment, not Enterprise.
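The join semantics being asked for (every event matching a wildcard row, plus the lookup row itself when nothing matched) can be made concrete in Python with fnmatch. This is only a sketch of the desired behavior using the post's sample data, not a Splunk solution; in SPL it would more likely be built with the lookup command (which honors WILDCARD match types) followed by an append of unmatched lookup rows.

```python
from fnmatch import fnmatch

# Wildcard lookup rows (FileName may contain *), as in the FTP-Out lookup.
lookup = [
    {"FileName": "File1.txt",   "Type": "fixedfilename",    "weekday": "monday"},
    {"FileName": "File73*.txt", "Type": "variablefilename", "weekday": "thursday"},
    {"FileName": "File95*.txt", "Type": "variablefilename", "weekday": "friday"},
]

events = [
    {"FTPFileName": "File1.txt",        "Status": "Success", "Size": "14kb"},
    {"FTPFileName": "File73AABBCC.txt", "Status": "Success", "Size": "15kb"},
    {"FTPFileName": "File73XXYYZZ.txt", "Status": "Success", "Size": "23kb"},
    {"FTPFileName": "garbage.txt",      "Status": "Success", "Size": "1kb"},
]

# Outer join driven by the lookup: every event that matches a wildcard row,
# plus the bare lookup row when no event matched it at all.
results = []
for row in lookup:
    matched = [e for e in events if fnmatch(e["FTPFileName"], row["FileName"])]
    if matched:
        results.extend({**row, **e} for e in matched)
    else:
        results.append(dict(row))  # expected file never seen in the window
```

Note that garbage.txt falls out naturally, matching the desired output, because no lookup row matches it.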
On a Dashboard Studio dashboard I have a dropdown input and a rectangle that can be clicked. When the rectangle is clicked, the dropdown input's token value should be changed to a specified value. Is that possible in Dashboard Studio?
Is there an option in Dashboard Studio to set/reset a token that was previously set by a click event to a new value when a specific search in the dashboard has finished running?

Just to clarify: I know that I can access tokens from the search with $search name:result.<field>$ or other tokens like $search name:job.done$. What I need is to set a token when a search is done.

Example:
1. Token "tok_example" has the default value 0.
2. With the click on a button (click event) in the dashboard, the token "tok_example" is set to value 1.
3. This (the value 1 of the token "tok_example") triggers a search in the dashboard to run.
4. After the search is finished, I want to set the token "tok_example" back to its original value 0 (without any additional interaction by the user with the dashboard).

Step 4 is the part I don't know how to do in Dashboard Studio. Is there a solution for that?
The configuration page failed to load with:

Something went wrong!
Unable to xml-parse the following data: %s

I have installed the updated Splunk Add-on for Microsoft Cloud Services on a Splunk Enterprise free trial, but I get this error during configuration. Your response will help resolve this issue.
I am working with ServiceNow logs in Splunk. The ticket data has a field called "sys_created" which gives the ticket creation time in "%Y-%m-%d %H:%M:%S" format. When I run the query for the last 7 days, tickets raised more than 7 days ago also appear because of another field, sys_updated. The sys_updated field stores every update to a ticket, so if an old ticket was updated within the last 7 days, it shows up when I set the time range picker to last 7 days. Is there a way to use "sys_created" as "_time"?
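A common SPL approach here is to override the event time with eval _time=strptime(sys_created, "%Y-%m-%d %H:%M:%S") and then filter on the new _time, so the time range reflects creation rather than update. The underlying conversion and filter can be sketched in Python (the ticket numbers and timestamps below are hypothetical):

```python
from datetime import datetime, timedelta

tickets = [
    {"number": "INC001", "sys_created": "2024-08-01 10:00:00", "sys_updated": "2024-08-30 09:00:00"},
    {"number": "INC002", "sys_created": "2024-08-29 12:30:00", "sys_updated": "2024-08-29 13:00:00"},
]

fmt = "%Y-%m-%d %H:%M:%S"
now = datetime(2024, 8, 31)

# Filter on creation time, not update time, so old-but-recently-updated
# tickets are excluded from a "last 7 days" window.
recent = [t for t in tickets
          if datetime.strptime(t["sys_created"], fmt) >= now - timedelta(days=7)]
# recent contains only INC002: INC001 was created outside the window even
# though it was updated inside it.
```

Note that overriding _time in the search only changes how results are bucketed and filtered after retrieval; the time range picker still selects events by their indexed timestamp.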
Hi, I am testing the Security Essentials app 3.8.0 on Splunk 9.0.8, and I found the same issue while trying to activate the following content:

- Unknown Process Using The Kerberos Protocol
- Windows Steal or Forge Kerberos Tickets Klist
- ServicePrincipalNames Discovery with SetSPN
- Rubeus Command Line Parameters
- Mimikatz PassTheTicket CommandLine Parameters

In all the cases above, I get two errors:

1. "Must have data in data model Endpoint.Processes" is in red, even though I have installed several add-ons suggested as compatible, such as Splunk Add-on for Microsoft Windows 8.9.0 and Palo Alto Networks Add-on for Splunk 8.1.1.
2. Error in 'SearchParser': The search specifies a macro 'summariesonly_config' that cannot be found.

I searched for that missing macro and indeed it does not exist. Should I create it manually? With which value? Do you have any idea how to fix these two errors? Many thanks.
Office 365 connectors in Microsoft Teams are being retired. Has anyone successfully transitioned from Office 365 connectors to Workflows in a Splunk Enterprise deployment? Could anyone point me to documentation for doing this, or a Workflows template that works with Splunk Enterprise?
Hi all, we created a dashboard to monitor CCTV and it was working fine; however, data suddenly stopped populating. We have not made any changes.

My findings:
1. If I select last 30 days, the dashboard works fine.
2. If I select a time range of last 20 days, the dashboard does not work.
3. I started troubleshooting the issue and found the below.

SPL query (works fine when the time range is last 30 days):

index=test 1sourcetype="stream" NOT upsModel=*1234*
| rename Device AS "UPS "
| rename Model AS "UPS Model"
| rename MinRemaining AS "Runtime Remaining"
| replace 3 WITH Utility, 4 WITH Bypass IN "Input Source"
| sort "Runtime Remaining"
| dedup "UPS Name"
| table "UPS Name" "UPS Model" "Runtime Remaining" "Source" "Location"

Note: the same SPL query does not work when the time range is last 20 days.

Troubleshooting: Splunk is receiving data to date; however, I have noticed a few things. When I select last 30 days, I can see the fields UPS Name, UPS Model, Runtime Remaining, and Source in the search. When I select last 20 days, those fields are missing, and I'm not sure why. Because of the missing fields, the query above (the highlighted table command in particular) does not show any data.

Thanks