All Posts

Hi @kfsplunk  What distinction do you need to make between the logs? You mention that they become hard to differentiate, but I think you could probably create an eventtype or use a field extraction to determine whether the FTD code is in the 43k range as you mentioned. I would avoid onboarding it as one sourcetype and then using props/transforms to overwrite the sourcetype, because you risk breaking the built-in field extractions and CIM mappings you get from the app's configuration. However, if you want to segregate the events into a separate index, or change the source to distinguish them, you could do that with props/transforms. The Cisco Security Cloud app does look a lot richer in terms of functionality and dashboards (if that helps you) and also gets much more frequent updates than the ASA app; that shouldn't necessarily sway your decision, but it might help!
Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing.
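For what it's worth, here is a minimal sketch of the index-routing variant mentioned above. The transform name and the destination index cisco_ftd are assumptions (the index must already exist), and the regex is taken from the question; this would need to live on the first full Splunk instance that parses the data (heavy forwarder or indexer):

# props.conf
[cisco:asa]
TRANSFORMS-route_ftd = route_ftd_43x_to_index

# transforms.conf
[route_ftd_43x_to_index]
REGEX = %FTD-\d-43\d+
DEST_KEY = _MetaData:Index
FORMAT = cisco_ftd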
Managed to get this resolved by ensuring the submitted token model was updated, by adding submittedTokenModel.set() and submittedTokenModel.trigger() to the code. The title displaying the token value was a bit of a red herring: it showed that the default model was being updated, but it didn't reflect the state of the submitted token model.
Alright, thank you for your answer!
Onboarding Cisco FTD firewalls presents the choice of which Add-On to use. Apparently Cisco FTD firewalls run both the ASA core and the FTD core, which means they send different types of events. The ASA events are best handled with the cisco:asa sourcetype, whereas the FTD events are handled by cisco:ftd:syslog. However, all events in our environment use %FTD to tag their events, which makes them harder to differentiate. Which Add-On is preferred (I'd expect the Cisco Security Cloud, but it still has some flaws)? And how should we get these events in with the correct sourcetype? My suggestion would be to send all events with the cisco:asa sourcetype and include a transform which checks whether the FTD code is in the 43k range, e.g. REGEX=%FTD-\d-43\d+.
python.version = python3.10 is not a valid setting.  Allowed values are default, python3, python3.7, or python3.9.  I don't know how you were able to add Python 3.10 to Splunk, but doing so does not change the validation of python.version settings.  I strongly recommend using only the versions of Python that ship with Splunk (3.7 or 3.9).
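For reference, a sketch of what the lookup stanza looks like with one of the supported interpreters; the script name and field list below are placeholders, not taken from the original configuration:

# transforms.conf
[externallookup]
external_cmd = my_external_lookup.py host ip
fields_list = host, ip
python.version = python3.9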
Hello, I am trying to use a different Python version for my external lookup. The global version is 3.7 and my custom one is 3.10; my /opt/splunk/bin contains both 3.7 and 3.10. In my transforms.conf I changed the Python version:
[externallookup]
python.version = python3.10
However, I am getting an error. When I use
[externallookup]
python.version = python3.7
it does not give the error. I am also able to use the new Python version when I change the symlink from 3.7 to my 3.10 (for debugging). But why doesn't it work when I set python.version to python3.10? Thanks in advance!
@DineshElumalai  Are you using Splunk's native CSV export, or a script or the REST API to export the results? If you are using outputcsv, I agree with @gcusello: export the result to the Splunk folder and create a script to move it to your folder. You can also consider exporting the data via the REST API with curl:

curl -k -u <username>:<password> https://<splunk-host>:8089/services/search/jobs/export \
 -d search="search index=test sourcetype=test earliest=-7d@d latest=now" \
 -d output_mode=csv > /external/path/to/destination/results.csv

To append new results to an existing file, use >> instead of >:

curl -k -u <username>:<password> https://<splunk-host>:8089/services/search/jobs/export \
 -d search="| savedsearch test_weekly_export" \
 -d output_mode=csv >> /path/to/your/target/folder/test_report.csv

#https://help.splunk.com/en/splunk-enterprise/search/search-manual/9.3/export-search-results/export-data-using-the-splunk-rest-api

Regards, Prewin
Splunk Enthusiast | Always happy to help!
If this answer helped you, please consider marking it as the solution or giving Karma. Thanks!
You can use this app - https://splunkbase.splunk.com/app/5738 But it seems to have support for many destinations... except local file. You can get around it by connecting back to the host you're running your Splunk instance on.  
Yes, let me explain it clearly. I'm using a Python add-on. This page is shown during the configuration phase, where I'm getting the input data. Once the customer has chosen the input data, it will be added to the inputs.conf file with the global account provided. It's all happening in the custom JSON validator file. I want to get the global account name for saving the input details in the inputs.conf file, but I am not able to get it from the data passed to def validate(self, value, data).
Hi @DineshElumalai , I suppose that you're speaking of exportcsv, which usually exports to the $SPLUNK_HOME/var/run/splunk/csv folder (the export folder isn't configurable), and then you can use the file from there. If you export using the same name, the file is overwritten; if the file is saved in a different folder, maybe there is some customization (e.g. a script that moves the file). Ciao. Giuseppe
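If the weekly job can write its results with outputcsv, its append option may cover the "add to the existing file" part of the question. A minimal sketch with an assumed base search and field list; the file still lands in $SPLUNK_HOME/var/run/splunk/csv, so a small script would be needed to move it outside Splunk:

index=test sourcetype=test earliest=-7d@d latest=now
| table _time host status
| outputcsv append=true test_report

Omitting append=true simply overwrites the file, which matches the "replace the file with the new data" alternative.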
Hello everyone, I need to export the search results to a folder outside Splunk. To do this job we've got exportresults in Splunk, which works fine. Basically, in my scenario it is a saved search which runs every week and the data has been exported to the folder, but it creates a new folder each time. I need to append the search results to the existing file, or else I need to replace the file with the new data. If I get either of the things mentioned above working, I'm good. Thanks.
Hi @PoojaDevi , could you better detail your request? Because nothing you said is clear to me:
- what is the product you're using (Splunk Enterprise, an app, what else)?
- are you speaking of the input phase or of search?
- what do you mean by dictionary?
- what do you mean by data validation?
- what do you mean by configuration time?
Ciao. Giuseppe
Done some further testing, and the "Search is waiting for input.." is caused by the query below and not by the count part in the XML:

| head $row_count_tok$
| streamstats count as Page
| eval Page = case(
    $row_count_tok$=24 AND Page<=12, 0,
    $row_count_tok$=24 AND Page>12, 1,
    $row_count_tok$=40 AND Page<=20, 0,
    $row_count_tok$=40 AND Page>20, 1
)

But I'm not sure why, as the token is in a title and I can see its value being extracted correctly (24 if in landscape mode or 40 if in portrait). If I refresh the browser the issue persists; if I go into edit XML mode and cancel without making any changes, the panel displays fine.
I have a custom validator class in which, based on the input selected by the customer, I update the inputs.conf file during configuration. But I've found that, during configuration, the account name is sent under the name field, yet during the data validation it is not present in the dictionary. Basically, what I want is to have the account name available at configuration time.
Splunk is in GMT and the server is in EST. When displayed in Dashboard Studio, the date format is shown based on the server's time, e.g. 2025-06-30T20:00:00-04:00. The same value, when displayed in Classic dashboards, is shown as received from the events. I want to see the exact date in Dashboard Studio, without any T.
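One workaround worth trying, purely as a sketch and assuming the column comes from _time (not a confirmed Dashboard Studio setting): format the timestamp into a plain string in the search itself, so the dashboard renders it verbatim instead of as an ISO timestamp:

<your base search>
| eval display_time = strftime(_time, "%Y-%m-%d %H:%M:%S")
| fields - _time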
A bit less ugly is to use `| sed -n '3,$p' > export.csv `
Hi @malix_la_harpe  Again, many thanks for your help. Unfortunately, adding a "| search event_id=1001" won't resolve the issue, as it will show only successful logins. I'm looking for a query which will also show failed logins, which are determined as follows:
1. If there is event_id 1000 and there is no following event_id 1001 within 1s for the same username, the login is a FAILURE.
2. If there is event_id 1000 and there is a following event_id 1001 within 1s for the same username, the login is a SUCCESS.
The query doesn't take the condition in the second point into account, as it displays FAILURE even when there is a following event_id 1001 within 1s for the same username. In other words, the FAILURE part doesn't work. I hope I've explained this clearly.
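A possible way to express that pairing logic in SPL, offered only as a sketch: it assumes the events carry fields named event_id and username (as in the posts above) and that <your base search> is whatever you already use. The idea is to sort each user's events newest-first so streamstats can see, for every 1000 event, the event that chronologically follows it:

<your base search> (event_id=1000 OR event_id=1001)
| sort 0 username -_time
| streamstats current=f window=1 global=f last(event_id) as next_event_id last(_time) as next_time by username
| search event_id=1000
| eval status=if(tonumber(next_event_id)==1001 AND next_time - _time <= 1, "SUCCESS", "FAILURE")
| table _time username status

The global=f keeps the one-event window per username, and a 1000 event whose next event isn't a 1001 within one second (or which has no next event at all) falls through to FAILURE.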
Hello @PiotrAp , In the query you can see the "SUCCESS" status on the second step of the authentication, so on the 1001 event. The status of the 1000 event corresponding to this auth is still FAILURE because the second step hadn't happened yet. If you add a "| search event_id=1001" at the end of the search, does that solve your problem? You would have only the success event for user "test", but you would not have all the 1000 events that don't have a 1001 after them. What do you want to keep, exactly?
Thanks @PrewinThomas , we are going to use the workaround and will let you know. Nicola
btool is its own program in $SPLUNK_HOME/bin. It is a bit more tricky to use because you have to be in the Splunk environment. I successfully tested the following procedure on UF 9.2.2:

. /opt/splunkforwarder/bin/setSplunkEnv
btool inputs list

Without sourcing the Splunk environment you get a missing-libraries error:

/opt/splunkforwarder/bin/btool inputs list
/opt/splunkforwarder/bin/splunkd: error while loading shared libraries: libmongoc-1.0.so.0: cannot open shared object file: No such file or directory
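As an aside, and only as a suggestion to try: invoking btool through the splunk CLI wrapper sets up the library path itself, so manually sourcing the environment shouldn't be needed, e.g.:

/opt/splunkforwarder/bin/splunk btool inputs list --debug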