Hi, how do I call an external URL from a Splunk search and read the JSON results in Splunk? Basically I want to hit the URL, which returns its results in JSON format, and then read that JSON result in Splunk.
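One common approach is a scripted (or modular) input that fetches the URL and prints one JSON line per record, so Splunk indexes each record as an event. A minimal sketch, assuming a hypothetical endpoint; the URL and field names are placeholders, not from the original post:

```python
# Scripted-input sketch (hypothetical endpoint): fetch JSON from an
# external URL and emit one serialized line per record so Splunk can
# index each record as its own event.
import json
from urllib.request import urlopen

def to_events(payload):
    """Flatten a JSON document into one serialized line per record."""
    records = payload if isinstance(payload, list) else [payload]
    return [json.dumps(r, sort_keys=True) for r in records]

def fetch_events(url, timeout=10):
    """Fetch the URL and return its JSON payload as event lines."""
    with urlopen(url, timeout=timeout) as resp:
        return to_events(json.load(resp))
```

Run it on an interval as a scripted input (via inputs.conf), and set KV_MODE=json on the sourcetype so Splunk auto-extracts the JSON fields at search time.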
Hi all, we are looking for an alternative to the Website Monitoring app on Splunkbase. Could someone please let us know if there is another way to monitor website availability in Splunk? Thanks!
Hi all, I am uploading a CSV which has two columns, Status and Flag. The Flag field is being populated with the value from the Status field even when Flag is blank, i.e. if Status is O and Flag is blank, then Flag is populated with O as well. Can you help?
We are running an integration with Azure (app TA-MS-AAD, sourcetype azure:eventhub) and are getting the RoleAssignment as an ID only from the activity logs. What is the best way to import the RoleAssignment display name?
As the title says, I can't find a way to delete an old post.
I currently have multiple entries in the VALUES column for each host. The table currently looks like:

hostname  VALUES
HOST1     ENV1 APP1 LOC1
HOST2     ENV2 APP2 LOC2

I would like the table to read as:

hostname  ENV   APP   LOC
HOST1     ENV1  APP1  LOC1
HOST2     ENV2  APP2  LOC2

I am essentially trying to transpose the column "VALUES" and create 3 separate columns with the custom headings ENV, APP and LOC.
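If VALUES is a multivalue field whose entries always arrive in ENV, APP, LOC order (an assumption), one sketch is:

```
| eval ENV=mvindex(VALUES,0), APP=mvindex(VALUES,1), LOC=mvindex(VALUES,2)
| table hostname ENV APP LOC
```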
Hi all, I wrote a dummy Java/Quarkus app that fetches HTML data from any web page. My goal is to see the tier pointing to the endpoints in AppDynamics' flowmap.

my code

@GET
@Path("/1")
@Produces(MediaType.TEXT_PLAIN)
public String test1() {
    try {
        CloseableHttpClient httpClient = HttpClients
            .custom()
            .setSSLContext(new SSLContextBuilder().loadTrustMaterial(null, TrustAllStrategy.INSTANCE).build())
            .build();
        HttpGet request = new HttpGet("https://nylen.io/d3-spirograph/");
        CloseableHttpResponse response = httpClient.execute(request);
        System.out.println(response.getProtocolVersion());               // HTTP/1.1
        System.out.println(response.getStatusLine().getStatusCode());    // 200
        System.out.println(response.getStatusLine().getReasonPhrase());  // OK
        System.out.println(response.getStatusLine().toString());         // HTTP/1.1 200 OK
        HttpEntity entity = response.getEntity();
        if (entity != null) {
            String result = EntityUtils.toString(entity);
            response.close();
            return result;
        }
    } catch (ClientProtocolException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    } catch (NoSuchAlgorithmException e) {
        e.printStackTrace();
    } catch (KeyStoreException e) {
        e.printStackTrace();
    } catch (KeyManagementException e) {
        e.printStackTrace();
    }
    return "ok";
}

path /2

@GET
@Path("/2")
@Produces(MediaType.TEXT_PLAIN)
public String test2() throws Exception {
    SSLContext sslcontext = SSLContext.getInstance("TLS");
    sslcontext.init(null, new TrustManager[]{ new X509TrustManager() {
        public void checkClientTrusted(X509Certificate[] arg0, String arg1) throws CertificateException {}
        public void checkServerTrusted(X509Certificate[] arg0, String arg1) throws CertificateException {}
        public X509Certificate[] getAcceptedIssuers() { return new X509Certificate[0]; }
    }}, new java.security.SecureRandom());

    // Client client = ClientBuilder.newClient()
    Client client = ClientBuilder.newBuilder()
        .sslContext(sslcontext)
        .hostnameVerifier((s1, s2) -> true)
        .build();

    String ssb = "https://self-signed.badssl.com/";
    String response = client.target(ssb)
        //.queryParam("query", "q")
        .request()
        .accept("text/html")
        .get(String.class);
        // .post(Entity.entity("e", "text/plain"), String.class);
    client.close();
    return response;
}

start app

java -javaagent:/opt/appdynamics-agent/ver21.8.0.32958/javaagent.jar \
  -jar /root/quarkus/vintageStore/rest-book/target/quarkus-app/quarkus-run.jar

starting logs

...
Agent runtime conf directory set to /opt/appdynamics-agent/ver21.8.0.32958/conf
[AD Agent init] Tue Oct 19 01:26:51 BRT 2021[INFO]: AgentInstallManager - Agent runtime conf directory set to /opt/appdynamics-agent/ver21.8.0.32958/conf
[AD Agent init] Tue Oct 19 01:26:51 BRT 2021[INFO]: JavaAgent - JDK Compatibility: 1.8+
[AD Agent init] Tue Oct 19 01:26:51 BRT 2021[INFO]: JavaAgent - Using Java Agent Version [Server Agent #21.8.0.32958 v21.8.0 GA compatible with 4.4.1.0 r38646896978b0b95298354a38b015eaede619691 release/21.8.0]
[AD Agent init] Tue Oct 19 01:26:51 BRT 2021[INFO]: JavaAgent - Running IBM Java Agent [No]
[AD Agent init] Tue Oct 19 01:26:51 BRT 2021[INFO]: JavaAgent - Java Agent Directory [/opt/appdynamics-agent/ver21.8.0.32958]
[AD Agent init] Tue Oct 19 01:26:51 BRT 2021[INFO]: JavaAgent - Java Agent AppAgent directory [/opt/appdynamics-agent/ver21.8.0.32958]
Agent logging directory set to [/opt/appdynamics-agent/ver21.8.0.32958/logs]
[AD Agent init] Tue Oct 19 01:26:51 BRT 2021[INFO]: JavaAgent - Agent logging directory set to [/opt/appdynamics-agent/ver21.8.0.32958/logs]
getBootstrapResource not available on ClassLoader
Registered app server agent with Node ID[234307] Component ID[94762] Application ID [55102]
Started AppDynamics Java Agent Successfully.

The problem is I cannot see "Service Endpoints" being discovered.
Hello Splunk Community, can anyone help me build a query based on the below? I have built a query which calculates the total duration of multiple events over a period of time. What I am trying to do now is create a timechart showing _time on the x-axis and duration on the y-axis. I think I need to convert the duration from hours to minutes, but I am not sure how to do this. Below is an example of the output from my original query that I am trying to visualise in a timechart:

_time       duration (in hours)
2021-10-12  03:56:30
2021-10-13  04:27:25
2021-10-14  04:21:03
2021-10-18  07:11:04

THANK YOU (in advance)
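Assuming the duration field is a string in HH:MM:SS form (as in the sample output), one sketch converts it to minutes before charting:

```
| eval parts=split(duration, ":")
| eval dur_min=round(tonumber(mvindex(parts,0))*60 + tonumber(mvindex(parts,1)) + tonumber(mvindex(parts,2))/60, 2)
| timechart span=1d max(dur_min) as "duration (minutes)"
```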
Hi all, after a bit of googling I've come up empty with regards to identifying the security issues that have been addressed in each Splunk Enterprise version update. Does anyone have a link, or can you provide some details on where I can find this information? I have looked through the release notes pages, but those seem to only list functional improvements and fixes. I appreciate anyone who can help with this.
Hello, I have an issue writing a props configuration for a text source file whose first 2 lines (including the "----" line) are header info. Please see 3 sample events along with the 2 header lines below. I have also included the props that I wrote for this source file, but it is not working as expected; I am getting the error message "failed to parse timestamp". Any help will be highly appreciated. Thank you so much.

Sample data

Event_id  user_id  group_id  create_date  create_login  company_event_id  event_name
----------------- ----------- ----------- ----------------------- ------------ ------------------------- --------------
105  346923   NULL  2021-10-07 14:13:21.160  783923  45655234  User Login
250  165223   NULL  2021-10-07 15:33:54.857  566923  92557239  User Login
25   1168923  NULL  2021-10-07 16:44:05.257  346923  34558242  User Login

props config file I wrote

SHOULD_LINEMERGE=false
INDEXED_EXTRACTIONS=csv
TIMESTAMP_FIELDS=create_date
TIME_FORMAT=%Y-%m-%d  %H:%M:%S.%3N
HEADERFIELD_LINE_NUMBER=1
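For comparison, a hedged props sketch worth trying: note the single space in TIME_FORMAT (the data has one space between date and time) and the underscores in HEADER_FIELD_LINE_NUMBER; the stanza name and FIELD_DELIMITER setting are assumptions, so verify everything against the props.conf spec for your Splunk version:

```
[your_sourcetype]
SHOULD_LINEMERGE = false
INDEXED_EXTRACTIONS = csv
FIELD_DELIMITER = whitespace
HEADER_FIELD_LINE_NUMBER = 1
TIMESTAMP_FIELDS = create_date
TIME_FORMAT = %Y-%m-%d %H:%M:%S.%3N
```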
Hi, this is my first time setting up Splunk in Kubernetes using the Splunk Operator. I have set up the cluster just fine. One challenge I'm having now is deploying my Splunk apps to our search head cluster. Here are the docs I followed: https://splunk.github.io/splunk-operator/AppFramework.html

The issues are:
1. My deployer keeps getting undeployed every time I make changes to the SHC CRD. I don't know why.
2. The app is simply not getting deployed, even though the app's .tgz file is already in my S3 bucket.

Here's the spec of my SHC:

...
spec:
  appRepo:
    appSources:
      - location: searchHeadApps/
        name: assettrackerapp.tgz
    appsRepoPollIntervalSeconds: 30
    defaults:
      scope: cluster
      volumeName: volume_app_repo_us
    volumes:
      - endpoint: https://dev-splunk-operator.s3.amazonaws.com
        name: volume_app_repo_us
        path: dev-splunk-operator
        provider: aws
        secretRef: s3-secret
        storageType: s3
...

Here are some of the splunk-operator logs:

{"level":"info","ts":1634593053.3164997,"logger":"splunk.enterprise.ValidateAppFrameworkSpec","msg":"App framework configuration is valid"}
{"level":"info","ts":1634593053.3165247,"logger":"splunk.enterprise.initAndCheckAppInfoStatus","msg":"Checking status of apps on remote storage...","name":"sh","namespace":"splunk"}
{"level":"info","ts":1634593053.3165333,"logger":"splunk.enterprise.GetAppListFromS3Bucket","msg":"Getting the list of apps from remote storage...","name":"sh","namespace":"splunk"}
{"level":"info","ts":1634593053.3198195,"logger":"splunk.enterprise.GetRemoteStorageClient","msg":"Creating the client","name":"sh","namespace":"splunk","volume":"volume_app_repo_us","bucket":"dev-splunk-operator","bucket path":"searchHeadApps/"}
{"level":"info","ts":1634593053.3199255,"logger":"splunk.client.InitAWSClientSession","msg":"AWS Client Session initialization successful.","region":"","TLS Version":"TLS 1.2"}
{"level":"info","ts":1634593053.319938,"logger":"splunk.client.GetAppsList","msg":"Getting Apps list","AWS S3 Bucket":"dev-splunk-operator"}
{"level":"error","ts":1634593053.3199534,"logger":"splunk.client.GetAppsList","msg":"Unable to list items in bucket","AWS S3 Bucket":"dev-splunk-operator","error":"MissingRegion: could not find region configuration"}

Please advise, thank you.
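The MissingRegion error (and the empty "region" field in the session-init log) suggests the S3 client could not derive an AWS region from the bucket-style endpoint. One thing to try is a region-qualified S3 endpoint in the volume spec; a sketch, where us-east-1 is an assumption to replace with your bucket's actual region:

```yaml
volumes:
  - name: volume_app_repo_us
    storageType: s3
    provider: aws
    endpoint: https://s3-us-east-1.amazonaws.com
    path: dev-splunk-operator
    secretRef: s3-secret
```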
I have some data like the following:

NAME  Code
Suzy  0
John  0
Adam  1
Suzy  1
John  0
Adam  1

I am trying to calculate the ratio of code=1 to code=0, by Name, and display these ratios by hour. The name values are dynamic and unknown at query time. I can get halfway there, using a dynamic eval field name, like this:

index=SOME_INDEX sourcetype=SOME_SOURCETYPE code
| eval counterCode0{name} = if(code=0, 1, 0)
| eval counterCode1{name} = if(code=1, 1, 0)
| bin _time span=1m
| stats sum(counterCode0*), sum(counterCode1*) by _time

But I can't figure out how to get the ratios of counterCode1* to counterCode0*. Any ideas? Or do I need to approach this problem differently?
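One sketch that avoids dynamic field names entirely is to group by name and pivot at the end (hourly bins, per the stated goal):

```
index=SOME_INDEX sourcetype=SOME_SOURCETYPE code
| bin _time span=1h
| stats count(eval(code=0)) as c0, count(eval(code=1)) as c1 by _time, name
| eval ratio=if(c0=0, null(), round(c1/c0, 3))
| xyseries _time name ratio
```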
| inputlookup file1.csv

field1  field2
1       a
2       b
3       c

I need the output transposed, like so:

1  2  3
a  b  c

Help! Thanks
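One sketch using transpose, assuming field1 holds the desired column headers:

```
| inputlookup file1.csv
| transpose 0 header_field=field1
| search column=field2
| fields - column
```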
Why am I getting a “PKIX path building failed” error when my extension tries to connect to an application server?

Sometimes a “PKIX path building failed” error is reported in Machine Agent logs for extensions that are trying to connect to an HTTPS endpoint.

Contents
What does the PKIX error mean?
How do I resolve a PKIX error?
How do I manually import the certificates?
If I continue to see PKIX errors, how else can I troubleshoot?
Additional resources

What does the PKIX error mean?

PKIX stands for Public Key Infrastructure X.509. Whenever Java attempts to connect to another application over SSL, the connection will only succeed if it can trust that application. If the extension is not able to establish trust with the configured server, it returns the “PKIX path building failed” error.

How do I resolve a PKIX error?

The most convenient resolution for this error is to configure SSL parameters in the extension's config.yml file. You can add a “connection” property to the config file with the relevant SSL parameters. Make sure that you have the correct certificates in the truststore and keystore before configuring the path.

How do I manually import the certificates?

Follow these steps to manually import the certificates required for SSL configuration in the extension.

Download the full certification path:
echo | openssl s_client -showcerts -connect <host>:<port> 2>&1 | sed -ne '/-BEGIN CERTIFICATE-/,/-END CERTIFICATE-/p' > cert.pem

Import the CA hierarchy into the truststore:
keytool -import -alias <Alias_for_your_certificate> -file "<Path_to_certificate_in_quotes>" -keystore cacerts.jks -storepass <truststore_password>

You can configure the “connection” property in the config file only for HTTP-based extensions. More details on the HTTP client and “connection” properties can be found in the Advanced Troubleshooting Document for HTTP Client.
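As a hedged illustration only of the “connection” property discussed above (the exact property names vary by extension and version, so check your extension's README before using them), a connection block typically looks something like:

```yaml
connection:
  sslCertCheckEnabled: true
  sslVerifyHostname: true
  sslTrustStorePath: "/path/to/cacerts.jks"
  sslTrustStorePassword: "changeit"
```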
JMX-based extension import options

For JMX-based extensions (for which an mbeans configuration is required), you can either follow extension-specific SSL configurations, if any SSL guidelines are provided in the documentation, or pass SSL parameters to the Machine Agent startup command, as below:

java -Djavax.net.ssl.trustStore=/path/to/truststore/cacert.jks -Djavax.net.ssl.trustStorePassword=changeit -jar machineagent.jar

Note that two flags are available in JMX:

Securing server communication to use SSL: This is the default SSL configuration (com.sun.management.jmxremote.ssl) that must be set to true. Setting this configuration to true secures communications via SSL by using a server certificate.

JMX RMI registry SSL secured: Starting with JDK 6, an additional parameter (com.sun.management.jmxremote.registry.ssl) was added to force the creation of an SSL-secured Remote Method Invocation (RMI) registry.

The extension does not support SSL encryption of the RMI registry, but it does support SSL encryption of the JMX connections themselves.

If I continue to see PKIX errors, how else can I troubleshoot?

The error might continue to appear if incorrect certificates are imported and configured, or if the correct SSL parameters are not supplied in the extension. Reach out to your application team for the correct certificates, and configure them in the extension. Also, verify that all the relevant SSL parameters are correctly configured in the extension or provided as Java arguments.

Additional resources
Why am I getting PKIX 'path building failed' errors in the Agent log file?
JMX-based extension connectivity: How do I troubleshoot issues?
Hi, how can I extract a table like this? (“myserver” is a field that is already extracted.)

source     destination  duration  V
server1    myserver     0.001     9288
myserver   server2      0.002     9288
server2    myserver     0.032     0298
myserver   server1      0.004     9298

FYI: duration is calculated as described below:

Line 1 (duration 00:00:00.001) = (12:00:59.853) - (12:00:59.852)
Line 2 (duration 00:00:00.002) = (start_S 12:00:59.855) - (start_S 12:00:59.853)
Line 3 (duration 00:00:00.110) = (forWE_APP_AS: G 12:00:59.994) - (forWE_APP_AS: P 12:00:59.884)
Line 4 (duration 00:00:00.004) = (end_E 12:01:00.007) - (end_E 12:01:00.003)

Here is the log (G=get, P=push):

12:00:59.852 app     module1: G[server1]Q[000]V[9288]
12:00:59.853 app     start_S: A_B V[9288]X[000000]G[0]L:
12:00:59.855 app     module2: A_B V[9288]X[000000]G[0]L:
12:00:59.855 app     start_S: C_D V[9288]X[000000]G[0]L:
12:00:59.881 app     module3: A_B V[9288]X[000000]G[0]L:
12:00:59.884 app     forWE_APP_AS: P[server2]K[000]V[0288]
12:00:59.994 app     forWE_APP_AS: G[server2]K[000]V[0298]
12:00:59.995 app     module2: A_B V[9298]X[000000]G[0]K:
12:01:00.003 app     end_E: A_B V[9298]X[000000]G[0]K:
12:01:00.007 app     module1: P[server1]K[458]V[9298]
12:01:00.007 app     end_E: C_D V[9298]X[000000]G[0]K:

Any ideas? Thanks
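A starting-point sketch for the field extraction, with the regex inferred from the sample lines (an assumption, so test it against real events); the pairing of matched lines into the per-row durations would still need streamstats or transaction on top of this:

```
| rex "^(?<ts>\d{2}:\d{2}:\d{2}\.\d{3})\s+app\s+(?<module>[^:]+):\s+(?<op>[GP])\[(?<server>[^\]]+)\]\w\[\d+\]V\[(?<V>\d+)\]"
| eval etime=strptime(ts, "%H:%M:%S.%3N")
```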
Have any of you upgraded Windows-based Universal Forwarders using WSUS? If so, what kind of syntax did you use when deploying it? I've tried this once in the past and don't believe it was totally successful: msiexec.exe /i <file path><file name>.msi AGREETOLICENSE=Yes DEPLOYMENT_SERVER=<our_deployment_server>:8089 /quiet Is there anything wrong in this syntax, or anything that I may have missed that should be corrected for next time?
Is it possible to have a Splunk sandbox in the cloud, and to occasionally refresh it with a few weeks of data from an on-premises instance in a physical data center?
Hi, say we have an action (let's call it Action1) that returns this under data:

[ {"type": "type1", "target": "target value1"}, {"type": "type2", "target": "target value2"} ]

I want to pass the target to another action (Action2) as a parameter, so I use the action_result.data.*.target datapath to do it. That action returns this:

[ {"result_from_action": "result_for target value1"}, {"result_from_action": "result_for target value2"} ]

Each row corresponds to the input row. We have a third action (let's call it Action3) that accepts two parameters, the type from Action1 and the result_from_action from Action2, so I pass:

action_result.data.*.type from Action1
action_result.data.*.result_from_action from Action2

I want Action3 to be executed 2 times, for the two pairs ("type1", "result_for target value1") and ("type2", "result_for target value2"), but in reality the action is executed 4 times, for all possible permutations. I understand why this is happening, but I'm curious if there's a good way to force the platform to do what I need (without using custom functions to build another list and use it as input).

Thanks!
I am trying to integrate Dashboard Studio with our external app using Splunk React components. I am able to see graphs and other components. The only problem is the time range component, which is giving the following error:

"Cannot access splunkweb."

Below is my definition.json:

{
  "visualizations": {},
  "dataSources": {},
  "inputs": {
    "input_1": {
      "type": "input.timerange",
      "title": "Select Time",
      "options": {
        "defaultValue": "-5m,now",
        "token": "trp"
      }
    }
  },
  "layout": {
    "type": "absolute",
    "options": {},
    "structure": [],
    "globalInputs": ["input_1"]
  },
  "description": "",
  "title": "TRP Input Dash"
}

Thanks
Shailendra
I am trying to get the 14-day free trial of Splunk Cloud and keep getting the "An internal error was detected when creating the stack" error. I saw that this has been an issue for several other people. How do I get this trial? I need it for a school assignment.