All Posts



@ASGrover  Can you check the bundle deployment status on the CM?

splunk show cluster-bundle-status

Verify your indexes.conf is placed correctly, e.g.:

$SPLUNK_HOME/etc/master-apps/<your_app>/local/indexes.conf

Verify the index config is available on the indexers; run this on one of the indexers:

splunk btool indexes list bmc --debug

Does your new index have any data? If not, try with some test data:

| makeresults | eval foo="bar" | collect index=bmc

Also, did you find any errors in the CM's _internal logs? Lastly, perform a restart on the CM as well. Regards, Prewin Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a Karma. Thanks!
Hi @CyberSamurai , you have two solutions: you can create a lookup (called e.g. perimeter.csv and containing two columns: host and sourcetype) listing all the sourcetypes and hosts to monitor (beware: in the lookup you have to list every couple of sourcetype and host to monitor), and then run a search like this:

index=sw tag=MemberServers sourcetype="windows PFirewall Log"
| stats count BY sourcetype host
| append [ | inputlookup perimeter.csv | eval count=0 | fields sourcetype host count ]
| stats sum(count) AS total BY sourcetype host
| where total=0

otherwise, if you don't want to manage a lookup, you could check the couples of sourcetype and host that were present e.g. in the last 30 days but aren't present in the last hour, running a search like this:

index=sw tag=MemberServers sourcetype="windows PFirewall Log"
| stats latest(_time) AS _time count BY sourcetype host
| where _time<now()-3600

obviously to be customized to your situation. Ciao. Giuseppe
@Andre_  Did you create database outputs first? The alert action does not prompt for parameters because it uses the mapping and connection you set up in the DB Connect app's Outputs.

https://help.splunk.com/en/splunk-cloud-platform/connect-relational-databases/deploy-and-use-splunk-db-connect/3.18/configure-and-manage-splunk-db-connect/create-and-manage-database-outputs#id_8af48766_8b49_4f27_8138_a2cdf208e86c__Create_a_database_output

If you want to test it manually, use this in your SPL:

| dbxoutput output="output_to_test_table"

Regards, Prewin Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a Karma. Thanks!
@CyberSamurai  Try with a lookup, e.g.:

| inputlookup memberservers.csv
| rename host as lookup_host
| join type=left lookup_host
    [ | tstats count as totalEvents count(eval(sourcetype=="windows PFirewall Log")) as fwCount WHERE index=sw tag=MemberServers BY host
      | rename host as lookup_host ]
| fillnull value=0 totalEvents fwCount
| where fwCount=0
| table lookup_host totalEvents fwCount
| rename lookup_host as host

Also try with tstats:

| tstats count as totalEvents count(eval(sourcetype=="windows PFirewall Log")) as fwCount WHERE index=sw tag=MemberServers BY host
| where fwCount=0
| table host, totalEvents, fwCount

Regards, Prewin Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a Karma. Thanks!
I use 'SEDCMD-rm<fieldname>'. Why is my SEDCMD not working?

SEDCMD-rm-appname = s/app_name\=.*/\s//
SEDCMD-rm_appsaas = s/app_saas\=\w+\s//
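A possible reason, sketched here as a guess from the sed syntax alone: in s/app_name\=.*/\s// the \s sits in the replacement position (so it is not matched at all), and the greedy .* would consume everything to the end of the line rather than just the field value. A corrected sketch, assuming the values contain no spaces (as in the sample events elsewhere in this thread, e.g. app_name=- app_saas=no), might look like:

```
# props.conf -- hedged sketch, not verified against your data;
# \S* stops at the first space instead of eating the rest of the line
[secui:fw]
SEDCMD-rm_appname = s/app_name=\S*\s?//
SEDCMD-rm_appsaas = s/app_saas=\w+\s?//
```

Note that SEDCMD rewrites _raw at index time, so it only affects newly indexed events, not data already on disk.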
@silverKi  Try the below config to remove the highlighted fields from the _raw event. Since they're not in the raw data, Splunk won't auto-extract them at search time.

props.conf

[secui:fw]
TRANSFORMS-removefields = remove_unwanted_fields

transforms.conf

[remove_unwanted_fields]
REGEX = \s?(fw_rule_name|app_saas|nat_rule_name|is_ssl|user_id|is_sslvpn|app_name|host|app_protocol|src_country|app_category|dst_country)=[^ ]*
FORMAT =
DEST_KEY = _raw

Regards, Prewin Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a Karma. Thanks!
To be specific: I want to disable the two buttons that appear at the top of my dashboard, and the export button under every stats display.
I am trying to exclude unnecessary fields from the firewall log collection. I am trying to delete the fields by excluding them, but the changes are not being reflected, so I am curious about the related collection-exclusion process.
_raw data

[fw4_deny] [ip-address] start_time="1998-07-07 11:21:09" end_time="1998-07-07 11:21:09" machine_name=test_chall_1 fw_rule_id=11290 fw_rule_name=auto_ruleId_1290 nat_rule_id=0 nat_rule_name= src_ip=1xx.1xx.0.x user_id=- src_port=63185 dst_ip=192.168.0.2 dst_port=16992 protocol=6 app_name=- app_protocol=- app_category=- app_saas=no input_interface=eth212 bytes_forward=70 bytes_backward=0 packets_total=1 bytes_total=70 flag_record=S terminate_reason=Denied by Deny Rule is_ssl=no is_sslvpn=no host=- src_country=X2 dst_country=X2

[resource_cnt] [10.10.10.10] time="1998-07-07 11:24:50" machine_name=test_boby_1 cpu_usage=7.0 mem_usage=19.8 disk_usage=5.6 cpu_count=32, cpu_per_usage=3.0-2.9-2.0-2.0-2.0-2.0-0.0-0.0-23.0-7.9-7.0-6.9-19.4-19.0-8.0-7.0-1.0-1.0-16.0-1.0-2.0-2.0-1.0-2.0-24.8-9.0-16.2-8.0-9.0-9.9-5.0-8.1

my props.conf

[secui:fw]
DATETIME_CONFIG =
LINE_BREAKER = ([\r\n]+)
NO_BINARY_CHECK = true
SEDCMD-duration = s/duration=\d+\s//
SEDCMD-fragment_info = s/fragment_info=\S*\s//
SEDCMD-ingres_if = s/ingres_if=\S*\s//
SEDCMD-input = s/input\sinterface/interface/
SEDCMD-packets_backward = s/packets_backward=\S*\s//
SEDCMD-packets_forward = s/packets_forward=\S*\s//
SEDCMD-pre = s/^[^\[]+//
SEDCMD-terminate_reason = s/\sterminate_reason=-//
SEDCMD-user_auth = s/user_auth=\S*\s//
SEDCMD-userid = s/user_id=\S*\s//
TRANSFORMS-secui_nullq = secui_nullq
TRANSFORMS-stchg7 = secui_resource
TRANSFORMS-stchg8 = secui_session
category = Custom
description = test
disabled = false
pulldown_type = true

Fields I want to exclude:
fw_rule_name, app_saas, nat_rule_name, is_ssl, user_id, is_sslvpn, app_name, host, app_protocol, src_country, app_category, dst_country

I want to exclude these fields from being extracted at index time. Currently, the fields I want to exclude are automatically extracted when searching for fields of interest. Is there a way to do this?
Suppose you have a lookup called myhosts.csv; it has a field called host.  You use this as the primary input, then find which host has zero count compared with the index search.

| inputlookup myhosts.csv
| append
    [ search index=sw tag=MemberServers sourcetype="windows PFirewall Log"
      | stats count by sourcetype, host ]
| stats values(sourcetype) as not_missing by host
| where isnull(not_missing)
Thank you for the reply. I've used lookup tables a little before and can probably figure out that piece of it. Once I have that comparison list working, how would I say where events for that sourcetype are zero? I've tried something like this without success:

... | stats count by sourcetype,host | where sourcetype="windows PFirewall Log" | where "count">="1"
Splunk is not good at reporting on things that don't exist. To get around this, you need to provide a list (of the hosts you are interested in), compare that to the number of events you have for each host, and then keep only those where the number of events is less than 1. This is often done using a lookup file, for example (if the hosts are "new"), or some historic data (if the hosts are "old").
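The lookup approach described above might be sketched like this; the lookup name expected_hosts.csv is a placeholder for whatever list of hosts you maintain, and the index/tag/sourcetype filters are taken from the search earlier in this thread:

```
| inputlookup expected_hosts.csv
| eval count=0
| append
    [ search index=sw tag=MemberServers sourcetype="windows PFirewall Log"
      | stats count BY host ]
| stats sum(count) AS total BY host
| where total=0
```

The eval count=0 seeds every expected host with a zero, so a host that contributes no real events still survives the stats and falls out of the final where clause.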
Hello, I have Database Connect set up and it's working all fine, but I can't wrap my head around how the Alert Action works.  The alert action "Output results to databases" has no parameters, so what am I missing? I have a DB table "test_table" with columns col1 and col2, and want to set up

| makeresults | eval col1 = "test", col2 = "result"

as an alert that pushes the results into "test_table". I would expect the alert action to at least need to know which DB output to use? Any help appreciated, Kind Regards Andre
Hi @Cleffa  Looking at the limited docs, it doesn't look like you can inject a UUID into the filename; however, it does accept timestamp variables, so you could perhaps add microseconds (%f) to your filename to make it more unique. Check out the timestamp docs at https://docs.python.org/3.7/library/datetime.html#strftime-strptime-behavior:~:text=Microsecond%20as%20a%20decimal%20number%2C%20zero%2Dpadded%20on%20the%20left.  Did this answer help you? If so, please consider: Adding karma to show it was useful Marking it as the solution if it resolved your issue Commenting if you need any clarification Your feedback encourages the volunteers in this community to continue contributing
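The %f idea above can be sketched in plain Python; the function names here are illustrative, not part of any tool's API, and simply show how a microsecond timestamp (and, for comparison, a UUID) makes filenames collision-resistant:

```python
from datetime import datetime
import uuid

def unique_filename(prefix="results"):
    # %H%M%S gives hours/minutes/seconds; %f appends zero-padded
    # microseconds, so two runs in the same second still differ
    ts = datetime.now().strftime("%H%M%S%f")
    return f"{prefix}_{ts}.json"

def unique_filename_uuid(prefix="results"):
    # A UUID guarantees uniqueness outright, if the tool ever
    # allows arbitrary strings in the filename template
    return f"{prefix}_{uuid.uuid4().hex}.json"
```

Calling unique_filename() yields names like results_142233123456.json, which is the same idea as changing the template to /results_%H%M%S%f.json.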
Hi, sometimes there are 3 new results and I need them in separate JSON files, but they get overwritten; I can find no way to add a UUID to the file name /results_%H%M%S.json
Hello Splunk Community. I'd like to use a query to find a host which is a member of a tag group and has 0 events for a specific sourcetype. Here's the search that gets me most of the way there:

index=sw tag=MemberServers sourcetype="windows PFirewall Log"
| stats count by sourcetype, host

But I'd like to return only hosts which have 0 events (i.e. are missing firewall data). How can I do this?
I have taken the file, deleted it, and repopulated it.  I have used a new file created in Notepad++ and another file created in Excel.  Still no luck.  I am beyond frustrated because I know it is something simple somewhere; I just cannot figure out where.
The data is a simple CSV file, so the props just need to specify that.

[sap:systemlog]
INDEXED_EXTRACTIONS = csv
DATETIME_CONFIG = CURRENT

No need for REPORT or EXTRACT.
Hi @Akhanda  That is fine; just create the file and then make sure the permissions allow access for whichever user Splunk runs as.  By default the local directory is empty (the defaults are in the "default" directory alongside the local directory), so I wouldn't necessarily expect it to exist already.  Regards Will
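The steps above can be sketched as shell commands; SPLUNK_HOME and the app name are placeholders (a real installation is typically under /opt/splunk, and "splunk" is only the conventional service account, so adjust both to your deployment):

```shell
# Placeholder paths for this sketch; point these at your real installation
SPLUNK_HOME=/tmp/splunk-demo
APP=my_app

# Create the (normally empty) local directory and the config file itself
mkdir -p "$SPLUNK_HOME/etc/apps/$APP/local"
touch "$SPLUNK_HOME/etc/apps/$APP/local/props.conf"

# Make the file readable by the user Splunk runs as; chown needs root,
# so it is allowed to fail silently in this sketch
chown splunk:splunk "$SPLUNK_HOME/etc/apps/$APP/local/props.conf" 2>/dev/null || true
chmod 644 "$SPLUNK_HOME/etc/apps/$APP/local/props.conf"
```

After creating the file, a Splunk restart (or a debug/refresh, depending on the config type) is usually needed for it to be picked up.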
Dear Splunk community, After successfully implementing the input from @afx : "How to Splunk the SAP Security Audit Log", I was encouraged to implement the SAP system log (SM21) on my own. So far, I have managed to send the log to Splunk, but given the log's encoding system, I am unable to process it correctly in Splunk. Most likely, my error lies in transforms.conf or props.conf.

props.conf

[sap:systemlog]
category = Custom
REPORT-SYS = REPORT-SYS
EXTRACT-fields = ^(?<Prefix>.{3})(?<Date>.{8})(?<Time>.{6})(?<Code>\w\w)(?<Field1>.{5})(?<Field2>.{2})(?<Field3>.{3})(?<Field4>.)(?<Field5>.)(?<Field6>.{8})(?<Field7>.{12})(?<Field8>.{20})(?<Field9>.{40})(?<Field10>.{3})(?<Field11>.)(?<Field12>.{64})(?<Field13>.{20})
LOOKUP-auto_sm21 = sm21 message_id AS message_id OUTPUTNEW area AS area subid AS subid ps_posid AS ps_posid

transforms.conf

[REPORT-SYS]
DELIMS = "|"
FIELDS = "message_id","date","time","term1","os_process_id","term2","work_process_number","type_process","term3","term4","user","term5","program","client","session","variable","term6","term7","term8","term9","id_tran","id_cont","id_cone"

[sm21]
batch_index_query = 0
case_sensitive_match = 1
filename = sm21.csv

Has anyone experienced a similar issue to mine?  Best Regards.