All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

Hi @PoojaDevi, could you give more detail about your request? Nothing you said is clear to me: what product are you using (Splunk Enterprise, an app, something else)? Are you speaking of the input phase or of search? What do you mean by "dictionary"? What do you mean by "data validation"? What do you mean by "configuration time"? Ciao. Giuseppe
I've done some further testing, and the "Search is waiting for input.." message is caused by the query below, not by the count part in the XML:

| head $row_count_tok$
| streamstats count as Page
| eval Page = case(
    $row_count_tok$=24 AND Page<=12, 0,
    $row_count_tok$=24 AND Page>12, 1,
    $row_count_tok$=40 AND Page<=20, 0,
    $row_count_tok$=40 AND Page>20, 1
)

But I'm not sure why, as the token is in a title and I can see its value being extracted correctly (24 in landscape mode, 40 in portrait). If I refresh the browser the issue persists; if I go into edit-XML mode and cancel without making any changes, the panel displays fine.
I have a custom validator class in which, based on the input selected by the customer, I update the inputs.conf file during configuration. But I found that while the account name is sent under the name field during configuration, it is not present in the dictionary during data validation. Basically, what I want is to have the account name available at configuration time.
Splunk is in GMT and the server is in EST. When displayed in Dashboard Studio, the date is formatted according to the server's time, e.g. 2025-06-30T20:00:00-04:00. But the same date, when displayed in Classic dashboards, shows as received from the events. I want Dashboard Studio to show the exact date, without the T.
A bit less ugly is to use `| sed -n '3,$p' > export.csv `
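The sed trick above can be sketched end to end. The sample file below is hypothetical (real exports vary); the point is that `-n '3,$p'` prints only from line 3 to the end, dropping the first two lines:

```shell
# Create a sample export whose first two lines are unwanted preamble
# (hypothetical content for illustration only).
cat > /tmp/raw_export.csv <<'EOF'
#preamble line 1
#preamble line 2
host,count
web01,42
web02,7
EOF

# -n suppresses default printing; '3,$p' prints lines 3 through end-of-file.
sed -n '3,$p' /tmp/raw_export.csv > /tmp/export.csv
cat /tmp/export.csv
```

The same effect can be had with `tail -n +3`, which some find more readable.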
Hi @malix_la_harpe, Again, many thanks for your help. Unfortunately, adding "| search event_id=1001" won't resolve the issue, as it will show only successful logins. I'm looking for a query which will also show failed logins, which are determined as follows: 1. If there is event_id 1000 and no event_id 1001 follows within 1s for the same username, the login is a FAILURE. 2. If there is event_id 1000 and an event_id 1001 follows within 1s for the same username, the login is a SUCCESS. The query doesn't take the second condition into account: it displays FAILURE even when an event_id 1001 follows within 1s for the same username. In other words, the FAILURE part doesn't work. I hope I've explained this clearly.
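The two rules above can be sketched in SPL; the index name and field names are assumptions, and `transaction` is only one of several ways to pair the events (a `streamstats`-based approach would scale better on large data):

```
index=auth event_id IN (1000, 1001)
| transaction username startswith=eval(event_id=1000) endswith=eval(event_id=1001) maxspan=1s keeporphans=true
| eval status=if(eventcount=2, "SUCCESS", "FAILURE")
| table _time username status
```

Here `maxspan=1s` enforces the 1-second window, `keeporphans=true` keeps 1000 events with no matching 1001, and a transaction containing both events (`eventcount=2`) is marked SUCCESS.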
Hello @PiotrAp, In the query you can see the "SUCCESS" status on the second step of the authentication, i.e. on event 1001. The status of the event 1000 corresponding to this auth is still FAILURE because the second step hasn't happened yet. If you add "| search event_id=1001" at the end of the search, does that solve your problem? You will have only the success event for user "test", but you will not have the 1000 events that have no 1001 after them. What exactly do you want to keep?
Thanks @PrewinThomas , we are going to use the workaround and let you know. Nicola
btool is its own program in $SPLUNK_HOME/bin. It is a bit tricky to use because you have to be in the Splunk environment. I successfully tested the following procedure on UF 9.2.2:

. /opt/splunkforwarder/bin/setSplunkEnv
btool inputs list

Without sourcing the Splunk environment, you get a missing-libraries error:

/opt/splunkforwarder/bin/btool inputs list
/opt/splunkforwarder/bin/splunkd: error while loading shared libraries: libmongoc-1.0.so.0: cannot open shared object file: No such file or directory
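The key detail above is the leading dot: it sources setSplunkEnv in the current shell so its exports (library and PATH settings) persist for the btool call. A generic stand-in showing why sourcing, not executing, matters (the env file below is hypothetical, not the real setSplunkEnv):

```shell
# On a real UF host you would run (commands from the post):
#   . /opt/splunkforwarder/bin/setSplunkEnv
#   btool inputs list

# Stand-in env script that exports a library path, as setSplunkEnv does.
cat > /tmp/setenv.sh <<'EOF'
export LD_LIBRARY_PATH="/opt/splunkforwarder/lib:${LD_LIBRARY_PATH:-}"
EOF

sh /tmp/setenv.sh   # runs in a subshell: the export is lost afterwards
. /tmp/setenv.sh    # sourced in the current shell: the export persists
case "$LD_LIBRARY_PATH" in
  */opt/splunkforwarder/lib*) echo "env sourced" ;;
esac
```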
Thank you for your advice. I ignored the token content and closed it.
Yep, I can use a user account and password to do it.
@sylviee_o It appears you're upgrading from a much older version of Splunk to 9.4.x, which is causing the issue shown in your screenshot. To resolve this, you need to follow the supported upgrade path to ensure all components, including KV Store, are properly updated. Skipping intermediate versions can result in compatibility problems and failed upgrades. Before upgrading to 9.4.x, verify that your KV Store server version is at least 4.2. If it isn't, first upgrade to an intermediate version (such as 9.3.x) that brings KV Store to the required level, then proceed to 9.4.x. Also refer to https://docs.splunk.com/Documentation/Splunk/9.4.2/Admin/MigrateKVstore Regards, Prewin Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving karma. Thanks!
@jessieb_83  Do your proxy log events include fields that identify a user or a device (such as src, dest, src_ip, dest_ip, host ...)? Typically, proxy logs should be mapped to the Web data model. Check that your logs contain the necessary fields for proper mapping. Regards, Prewin Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a Karma. Thanks!
@HA-01 Looks like the app doesn't support fetching Dynamic test data. Ref:
https://docs.appdynamics.com/appd/24.x/latest/en/end-user-monitoring/thousandeyes-integration-with-browser-real-user-monitoring/thousandeyes-network-metrics-in-browser-rum
https://docs.thousandeyes.com/product-documentation/integration-guides/custom-built-integrations/splunk-app
You can consider using ThousandEyes API calls to pull Dynamic data:
https://developer.cisco.com/docs/thousandeyes/create-endpoint-dynamic-test/
https://docs.thousandeyes.com/product-documentation/end-user-monitoring/viewing-data/endpoint-agent-automated-session-tests-view
Regards, Prewin Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving karma. Thanks!
1. Is it a fresh installation or an upgrade? 2. You have the immediate debugging steps on screen.
Could be. I didn't copy-paste it but wrote it here by hand, so there might have been a typo.
Thank you for your reply. I checked the internal log, but there were no errors related to ThousandEyes Dynamic tests. Therefore, I checked whether the App is configured to retrieve Dynamic test data in the first place. Upon reviewing the thousandeyes_constant.py file, I found that ENDPOINT_TEST_TYPES = ["agent-to-server", "http-server"] does not include “Dynamic.” This indicates that the current specification does not support retrieving Dynamic test data.
I have a feeling that using tokens in the count part of the XML config was broken at some point. It used to work, then it stopped working, but now that I've tested again, it does work. What version are you on?
Hello everyone, I use a Dell Windows laptop, and after downloading the Splunk Enterprise 9.4.3 installer for Windows, I'm unable to install it because of an error prompt. Please, can I get a step-by-step approach to fixing this?
This did work, but I had to remove the "s" on "optimizations", and presto. Thank you.