Hi @Jayanthan There are a number of approaches you could take to do this, such as Edge Processor, Ingest Actions, props/transforms, or segregating at source. What tools/apps/processes are you currently using to bring the data into Splunk? The most effective way to reduce the amount of data ingested into Splunk is to omit it at source (i.e. not send/pull it at all)! Please let us know and we can hopefully drill further into the options for you.
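If props/transforms is an option in your setup, here is a minimal sketch of index-time filtering (the sourcetype name and regex are placeholders for your data). Events routed to nullQueue are discarded before indexing, so they do not count against your license:

# props.conf -- placeholder sourcetype for your cloud application's data
[my:cloud:app]
TRANSFORMS-filter_unwanted = drop_unrelated_events

# transforms.conf -- anything matching the regex is sent to nullQueue (dropped)
[drop_unrelated_events]
REGEX = unrelated-app-name
DEST_KEY = queue
FORMAT = nullQueue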
What is the correct format for domain users please? If I curl from a HF I get the desired 200 response using:

curl -v http://mywebsite.com --ntlm -u username@mydomain.ad.ltd.com.au

If I use this format in the TA I see an error message in the logs asking for the domain\\username format. I have tried several combinations of mydomain\\username (see the variants below) but have not been successful. What should be the format for this domain? Or is the issue with --ntlm? If we use the --negotiate flag or remove --ntlm we get a 401. Cheers
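For reference, these are the kinds of variants tried (placeholder credentials; note that the shell itself consumes backslashes, so quoting matters):

# single quotes preserve the backslash; an unquoted backslash must be doubled
curl -v http://mywebsite.com --ntlm -u 'MYDOMAIN\username'
curl -v http://mywebsite.com --ntlm -u "MYDOMAIN\\username"
curl -v http://mywebsite.com --ntlm -u username@mydomain.ad.ltd.com.au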
Thank you @livehybrid. But I am using Splunk Enterprise. Is there any way to add and filter logs from applications hosted in the cloud in Splunk Enterprise?
Hi Everyone, We are using Splunk Enterprise in our company. We want to ingest logs from applications hosted in the cloud, but when we try to connect we get a lot of logs which are unrelated to our application, which in turn causes high license utilization. Is there any method by which we can filter out only the logs we want (such as logs of a specific application or log source) before ingesting into Splunk, so as to reduce license utilization while still getting the required security logs for the application?
Hi @Jayanthan Data Manager is only available in Splunk Cloud. Please see https://help.splunk.com/en/splunk-cloud-platform/ingest-data-from-cloud-services/data-manager-user-manual/1.11/introduction/about-data-manager for more information. There is a useful page at https://lantern.splunk.com/Splunk_Success_Framework/Data_Management/GDI_-_Getting_data_in which links out to various methods for onboarding different data sources.
Hi everyone, I want to ingest logs from applications hosted in the cloud (such as AWS, Azure). In our company we are using Splunk Enterprise. Can Data Manager be used in Splunk Enterprise to ingest and filter only the logs pertaining to that application's security?
Yes, I'm using the UCC framework to build the app. I'm using a custom validator class to validate the configuration field. This is the data value sent to the validate function. As I debug the code, only the field data is set in the data argument during the validate method invocation. When I look at the account validator Python file, the name is set to None. Basically, my need is that during configuration validation I want to know the account name. How can I fetch this account name?
Hi @Vasavi29 Please can you share a screenshot of where/how you are seeing this? Can you confirm the timezone you have set in the user preferences?
Hi @PoojaDevi Are you using the UCC Framework to build this app? Can you provide a little more of your code? Have you tried adding logging to print out what is passed to the function (see the sketch below)? Can you share what is sent?
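As a rough sketch, assuming a splunktaucclib-style custom validator (the class name, logger name and error message are placeholders), logging both arguments of validate() shows exactly what the framework passes in:

import logging

from splunktaucclib.rest_handler.endpoint.validator import Validator

logger = logging.getLogger("my_ta_validator")  # placeholder logger name


class AccountNameValidator(Validator):
    """Placeholder custom validator that dumps its inputs for debugging."""

    def validate(self, value, data):
        # 'value' is the field being validated; 'data' is the payload of
        # the submitted form -- log both to see what is actually available.
        logger.info("validate() called with value=%r data=%r", value, data)
        if not value:
            self.put_msg("Value must not be empty")
            return False
        return True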
Hi @kfsplunk What distinction do you need to make between the logs? You mention that they become hard to differentiate, but I think you could probably create an eventtype or use a field extraction to determine whether the FTD code is in the 43k range, as you mentioned. I would avoid onboarding it as one sourcetype and then using props/transforms to overwrite the sourcetype, because you risk breaking the built-in field extractions and CIM mappings you get from the app's configuration. However, if you want to segregate into a separate index, or change the source to tell them apart, then you could do this with props/transforms (a sketch follows below). The Cisco Security Cloud app does look a lot richer in terms of functionality and dashboards (if that helps you) and also gets much more frequent updates than the ASA app; not that this should necessarily sway your decision, but it might help!
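For illustration, a minimal props/transforms sketch that routes events whose FTD message ID is in the 43k range to a separate index (the index name is a placeholder; the regex follows the pattern you suggested):

# props.conf
[cisco:asa]
TRANSFORMS-route_ftd = route_ftd_43k

# transforms.conf -- rewrite the destination index for 43xxx FTD events
[route_ftd_43k]
REGEX = %FTD-\d-43\d+
DEST_KEY = _MetaData:Index
FORMAT = ftd_events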
Managed to get this resolved by ensuring the submitted token model was updated, by adding submittedTokenModel.set() and submittedTokenModel.trigger() to the code (roughly as in the snippet below). The title displaying the token value was a bit of a red herring: it showed that the default model was being updated, but it didn't reflect the state of the submitted token model.
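Roughly, the fix looked like this (a sketch assuming a SimpleXML dashboard extension; the token name and value are placeholders):

require(["splunkjs/mvc"], function (mvc) {
    var defaultTokenModel = mvc.Components.get("default");
    var submittedTokenModel = mvc.Components.get("submitted");

    // Updating the default model alone only changes what widgets display...
    defaultTokenModel.set("my_token", "some_value");

    // ...so also push the value into the submitted model so searches
    // depending on $my_token$ actually see the new value.
    submittedTokenModel.set("my_token", "some_value");
    submittedTokenModel.trigger("change:my_token");
});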
Onboarding Cisco FTD firewalls presents the choice of which Add-On to use. Apparently Cisco FTD firewalls run both an ASA core and an FTD core, which means they send different types of events. The ASA events are best handled with the cisco:asa sourcetype, whereas the FTD events are handled by cisco:ftd:syslog. However, all events in our environment are tagged with %FTD, which makes them harder to differentiate. Which Add-On is the preferred one (I'd expect Cisco Security Cloud, but it still has some flaws)? And how should we get these events in with the correct sourcetype? My suggestion would be to send all events with the cisco:asa sourcetype and include a transform which checks whether the FTD code is in the 43k range, e.g. REGEX = %FTD-\d-43\d+.
python.version = python3.10 is not a valid setting. Allowed values are default, python3, python3.7, or python3.9. I don't know how you were able to add Python 3.10 to Splunk, but doing so does not change the validation of python.version settings. I strongly recommend using only the versions of Python that ship with Splunk (3.7 or 3.9).
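So, for the external lookup stanza in transforms.conf, the setting needs to use one of the allowed values, for example:

# transforms.conf
[externallookup]
python.version = python3.9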
Hello, I am trying to use a different Python version for my external lookup. The global version is 3.7 and my custom one is 3.10; my /opt/splunk/bin contains both 3.7 and 3.10. In my transforms.conf I changed the Python version:

[externallookup]
python.version = python3.10

However, I am getting the following error:

When I use python.version = python3.7, it does not give the error. I am also able to use the new Python version when I change the symlink from 3.7 to 3.10 (for debugging). But why doesn't it work when I set python.version to python3.10? Thanks in advance!
@DineshElumalai Are you using Splunk's native CSV export, or a script or the REST API to export the results? If you are using outputcsv, I agree with @gcusello: export the result to the Splunk folder and create a script to move it to your folder. You can also consider exporting the data via the REST API with curl:

curl -k -u <username>:<password> https://<splunk-host>:8089/services/search/jobs/export \
  -d search="search index=test sourcetype=test earliest=-7d@d latest=now" \
  -d output_mode=csv > /external/path/to/destination/results.csv

To append new results to an existing file, use >> instead of >. Note that a saved search is invoked with the | savedsearch command:

curl -k -u <username>:<password> https://<splunk-host>:8089/services/search/jobs/export \
  -d search="| savedsearch test_weekly_export" \
  -d output_mode=csv >> /path/to/your/target/folder/test_report.csv

# https://help.splunk.com/en/splunk-enterprise/search/search-manual/9.3/export-search-results/export-data-using-the-splunk-rest-api

Regards, Prewin
You can use this app - https://splunkbase.splunk.com/app/5738 But it seems to have support for many destinations... except local file. You can get around it by connecting back to the host you're running your Splunk instance on.
Yes, let me explain it clearly. I'm using a Python add-on. This page will be shown during the configuration phase, where I'm getting the input data. Once the customer has chosen the input data, it will be added to the inputs.conf file with the global account provided. It's all happening in the custom JSON validator file. I want to get the global account name in order to save the input details in the inputs.conf file, but I am not able to get it from the data argument of def validate(self, value, data).
Hi @DineshElumalai, I suppose that you're speaking of outputcsv, whose results are usually exported to the $SPLUNK_HOME/var/run/splunk/csv folder (the export folder isn't configurable), and from there you can use the file. If you export using the same name, the file is overwritten; if the file ends up in a different folder, maybe there is some customization (e.g. a script that moves the file, like the sketch below). Ciao. Giuseppe
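As an illustration of that kind of customization, a small scheduled script (all paths and file names are placeholders) could append each new export, minus its header row, to a file outside Splunk:

#!/bin/sh
# Append the latest export (skipping its header row) to the target file,
# then remove the source so the next run starts clean. Paths are placeholders.
SRC=/opt/splunk/var/run/splunk/csv/test_report.csv
DEST=/external/path/to/destination/test_report.csv

if [ -f "$SRC" ]; then
    tail -n +2 "$SRC" >> "$DEST"
    rm -f "$SRC"
fi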
Hello Everyone, I need to export search results to a folder outside of Splunk. To do this job we have exportresults in Splunk, which works fine. Basically, in my scenario it is a saved search which runs every week and the data is exported to the folder, but it creates a new folder each time. I need to either append the search results to the existing file or replace the file with the new data. If I can achieve either of those, I'm good. Thanks.