All Posts

@DineshElumalai  Are you using Splunk's native CSV export, or a script or the REST API to export the results? If you are using outputcsv, I agree with @gcusello: export the results to the Splunk folder and create a script to move them to your folder. You can also consider exporting the data via the REST API with curl:

curl -k -u <username>:<password> https://<splunk-host>:8089/services/search/jobs/export \
  -d search="search index=test sourcetype=test earliest=-7d@d latest=now" \
  -d output_mode=csv > /external/path/to/destination/results.csv

To append new results to an existing file, use >> instead of >:

curl -k -u <username>:<password> https://<splunk-host>:8089/services/search/jobs/export \
  -d search="| savedsearch test_weekly_export" \
  -d output_mode=csv >> /path/to/your/target/folder/test_report.csv

https://help.splunk.com/en/splunk-enterprise/search/search-manual/9.3/export-search-results/export-data-using-the-splunk-rest-api

Regards, Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving Karma. Thanks!
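Editor's note: the same export can be scripted without curl. Below is a minimal Python sketch against the same /services/search/jobs/export endpoint; the host, credentials, search string and destination path are placeholders to adapt to your environment.

# Minimal sketch: export search results via the Splunk REST API and append
# them to a local CSV. Host, credentials, search and destination path are
# placeholders, not values from the thread.
import requests

BASE_URL = "https://<splunk-host>:8089"                 # management port
AUTH = ("<username>", "<password>")                     # placeholder credentials
DEST = "/external/path/to/destination/results.csv"      # placeholder path

resp = requests.post(
    f"{BASE_URL}/services/search/jobs/export",
    auth=AUTH,
    data={
        "search": "search index=test sourcetype=test earliest=-7d@d latest=now",
        "output_mode": "csv",
    },
    verify=False,   # same effect as curl -k; prefer a CA bundle in production
    stream=True,
)
resp.raise_for_status()

# Open in append mode ("ab") to add to an existing file, or "wb" to replace it.
with open(DEST, "ab") as f:
    for chunk in resp.iter_content(chunk_size=8192):
        f.write(chunk)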
You can use this app - https://splunkbase.splunk.com/app/5738 - but it seems to support many destinations... except a local file. You can get around that by connecting back to the host your Splunk instance is running on.
Yes, let me explain it more clearly. I'm using a Python add-on. This page is shown during the configuration phase, where I collect the input data. Once the customer has chosen the input data, it is added to the inputs.conf file along with the global account provided. This all happens in the custom JSON validator file. I want to get the global account name so I can save the input details in inputs.conf, but I'm not able to get it from the data argument of def validate(self, value, data).
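Editor's note: a minimal sketch of the kind of validator being described, assuming only the validate(self, value, data) hook mentioned above. The field name "global_account" is hypothetical; in a UCC-style add-on this class would normally subclass the framework's validator base class rather than object.

import logging

class InputDataValidator(object):
    def validate(self, value, data):
        # "value" is the field this validator is attached to; "data" is a dict
        # of the other fields submitted on the same configuration form.
        logging.info("validator received fields: %s", sorted(data.keys()))

        # The account name is only present here if it is part of the same form
        # payload; a field configured on a different page/entity will not
        # appear in this dict.
        account = data.get("global_account")
        if not account:
            logging.warning("global_account not present in validation payload")
            return False
        return True

If the account is configured on a separate page, it usually has to be read back from the add-on's stored configuration rather than from the validation payload.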
Hi @DineshElumalai , I suppose you're speaking of exportcsv: the results are usually exported to the $SPLUNK_HOME/var/run/splunk/csv folder (the export folder isn't configurable) and then you can use the file from there. If you export using the same name, the file is overwritten; if the file ends up in a different folder, there is probably some customization in place (e.g. a script that moves the file). Ciao. Giuseppe
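Editor's note: a minimal sketch of the kind of "move" script mentioned above, assuming the exported CSV has a header row; all paths and file names are placeholders. It copies the export from $SPLUNK_HOME/var/run/splunk/csv to an external folder, appending to the target (without repeating the header) if it already exists, otherwise replacing it.

import os

SPLUNK_HOME = os.environ.get("SPLUNK_HOME", "/opt/splunk")
SOURCE = os.path.join(SPLUNK_HOME, "var", "run", "splunk", "csv", "test_report.csv")
TARGET = "/external/path/to/destination/test_report.csv"

with open(SOURCE, "r") as src:
    lines = src.readlines()

if os.path.exists(TARGET) and lines:
    # Target already has a header row, so skip the header of the new export.
    with open(TARGET, "a") as dst:
        dst.writelines(lines[1:])
else:
    with open(TARGET, "w") as dst:
        dst.writelines(lines)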
Hello everyone, I need to export search results to a folder outside of Splunk. For this we have exportresults in Splunk, which works fine. In my scenario it is a saved search that runs every week and exports the data to the folder, but it creates a new folder each time. I need to either append the search results to the existing file or replace the file with the new data. If I can get either of those working, I'm good. Thanks.
Hi @PoojaDevi , could you give more detail about your request? Nothing you said is clear to me:
what product are you using (Splunk Enterprise, an app, something else)?
are you speaking of the input phase or of search?
what do you mean by dictionary?
what do you mean by data validation?
what do you mean by configuration time?
Ciao. Giuseppe
Done some further testing: the "Search is waiting for input.." message is caused by the query below, not by the count part in the XML:

| head $row_count_tok$
| streamstats count as Page
| eval Page = case(
    $row_count_tok$=24 AND Page<=12, 0,
    $row_count_tok$=24 AND Page>12, 1,
    $row_count_tok$=40 AND Page<=20, 0,
    $row_count_tok$=40 AND Page>20, 1
)

But I'm not sure why, as the token is used in a title and I can see its value being extracted correctly (24 in landscape mode or 40 in portrait). If I refresh the browser the issue persists; if I go into edit XML mode and cancel without making any changes, the panel displays fine.
I have a custom validator class in which, based on the input selected by the customer, I update the inputs.conf file during configuration. The problem is that the account name is sent in the name field during configuration, but during data validation it is not present in the dictionary. Basically, what I want is to have the account name available at configuration time.
Splunk is in GMT and the server is in EST. When displayed in Dashboard Studio, the date is shown based on the server's time, e.g. 2025-06-30T20:00:00-04:00, but the same value displayed in a Classic dashboard is shown as received from the events. I want to see the exact date in Dashboard Studio, without any "T".
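Editor's note: for illustration only, the sketch below parses the ISO 8601 value quoted above and renders it without the "T" and offset. Within Splunk itself this kind of formatting is typically done in the search (e.g. with strftime) rather than in Python.

from datetime import datetime

raw = "2025-06-30T20:00:00-04:00"        # value as shown in Dashboard Studio
dt = datetime.fromisoformat(raw)

print(dt.strftime("%Y-%m-%d %H:%M:%S"))  # 2025-06-30 20:00:00 -- no "T", no offset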
A bit less ugly is to use `| sed -n '3,$p' > export.csv `
Hi @malix_la_harpe  Again, many thanks for your help. Unfortunately, adding "| search event_id=1001" won't resolve the issue, as it will show only successful logins. I'm looking for a query that also shows failed logins, which are determined as follows:
1. If there is event_id 1000 and no following event_id 1001 within 1s for the same username, the login is a FAILURE.
2. If there is event_id 1000 and a following event_id 1001 within 1s for the same username, the login is a SUCCESS.
The current query doesn't account for the second condition: it displays FAILURE even when there is a following event_id 1001 within 1s for the same username. In other words, the FAILURE part doesn't work. I hope I've explained this clearly.
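Editor's note: a small sketch of the pairing logic described in the two conditions above, expressed outside of SPL. The sample events and field names are hypothetical; an event_id 1000 counts as SUCCESS only if the same username produces an event_id 1001 within 1 second.

events = [
    {"time": 10.0, "event_id": 1000, "username": "test"},
    {"time": 10.4, "event_id": 1001, "username": "test"},
    {"time": 12.0, "event_id": 1000, "username": "alice"},
]

logins = []
for e in events:
    if e["event_id"] != 1000:
        continue
    followed = any(
        f["event_id"] == 1001
        and f["username"] == e["username"]
        and 0 <= f["time"] - e["time"] <= 1
        for f in events
    )
    logins.append((e["username"], "SUCCESS" if followed else "FAILURE"))

print(logins)  # [('test', 'SUCCESS'), ('alice', 'FAILURE')]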
Hello @PiotrAp , in the query you can see the "SUCCESS" status on the second step of the authentication, i.e. on the 1001 event. The status of the 1000 event corresponding to this auth is still FAILURE because the second step had not happened yet. If you add "| search event_id=1001" at the end of the search, does that solve your problem? You would have only the success event for user "test", but you would lose all the 1000 events that don't have a 1001 after them. What exactly do you want to keep?
Thanks @PrewinThomas , we are going to use the workaround and will let you know. Nicola
btool is its own program in $SPLUNK_HOME/bin. It is a bit trickier to use because you have to be in the Splunk environment. I successfully tested the following procedure on UF 9.2.2:

. /opt/splunkforwarder/bin/setSplunkEnv
btool inputs list

Without sourcing the Splunk environment you get a missing-libraries error:

/opt/splunkforwarder/bin/btool inputs list
/opt/splunkforwarder/bin/splunkd: error while loading shared libraries: libmongoc-1.0.so.0: cannot open shared object file: No such file or directory
Thank you for your advice. I ignored the token content and closed it.
Yep, I can use a user account & password to do it.
@sylviee_o  It appears you're upgrading from a much older version of Splunk to 9.4.x, which is causing the issue shown in your screenshot. To resolve this, you need to follow the supported upgrade path so that all components, including the KV Store, are properly updated. Skipping intermediate versions can result in compatibility problems and failed upgrades. Before upgrading to 9.4.x, verify that your KV Store server version is at least 4.2. If it isn't, first upgrade to an intermediate version (such as 9.3.x) that brings the KV Store to the required level, then proceed to 9.4.x. Also refer to https://docs.splunk.com/Documentation/Splunk/9.4.2/Admin/MigrateKVstore
Regards, Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving Karma. Thanks!
@jessieb_83  Do your proxy log events include fields that identify a user or a device (such as src, dest, src_ip, dest_ip, host, ...)? Typically, proxy logs should be mapped to the Web data model. Check that your logs contain the necessary fields for proper mapping.
Regards, Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving Karma. Thanks!
@HA-01  It looks like the app doesn't support fetching dynamic test data. Ref:
https://docs.appdynamics.com/appd/24.x/latest/en/end-user-monitoring/thousandeyes-integration-with-browser-real-user-monitoring/thousandeyes-network-metrics-in-browser-rum
https://docs.thousandeyes.com/product-documentation/integration-guides/custom-built-integrations/splunk-app
You can consider using ThousandEyes API calls to pull the dynamic data:
https://developer.cisco.com/docs/thousandeyes/create-endpoint-dynamic-test/
https://docs.thousandeyes.com/product-documentation/end-user-monitoring/viewing-data/endpoint-agent-automated-session-tests-view
Regards, Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving Karma. Thanks!
1. Is it a fresh installation or an upgrade?
2. You have the immediate debugging steps on screen.