If I have understood right, this is doable on an IHF + DS combination too, but it can be tricky as those functions are different. Also, if you have more than 50 clients you should/must have a separate DS server for them. Since 9.2.x the DS server expects some local indexes where it stores information about DS actions. If you don't have those, or you are sending all events to your real indexers, this won't work. If I recall right there are instructions for how this can be done, but I'd prefer that you install one new dedicated server for the DS and use those local indexes for the DS function. That way it will be much easier to get it working. The other option is to look in the community, the docs and the Splunk usergroup Slack for how to do this on a combined IHF + DS. It needs some additional tuning in outputs.conf at least, and maybe some other conf files too.
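For reference, the outputs.conf tuning for a combined IHF + DS mostly revolves around keeping the DS-internal indexes local. A minimal sketch, assuming the 9.2+ internal index names (_dsphonehome, _dsclient, _dsappevent); [indexAndForward], defaultGroup and the forwardedindex filters are real outputs.conf settings, but the output group name and the filter number here are placeholders:

```ini
# outputs.conf on the combined IHF + DS instance -- a sketch, not a
# tested recipe; verify against your version's docs.

[indexAndForward]
# keep local indexing so the DS-internal indexes stay populated
index = true

[tcpout]
defaultGroup = my_indexers
# ensure the DS-internal indexes are not forwarded to the real
# indexers; internal indexes are filtered out by default, so this
# matters mainly if the default forwardedindex filters were overridden
forwardedindex.10.blacklist = (_dsphonehome|_dsclient|_dsappevent)
```

You can inspect the effective filter chain with `splunk btool outputs list tcpout --debug` before deciding what, if anything, to add.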
That is an interesting issue, since you will find everything OK in the configuration. The best guess is that you are hitting the maximum limit of knowledge bundle replication and max_content_length. Below is the recommended setting per the Splunk documentation, in distsearch.conf on the search head:

[replicationSettings]
maxBundleSize = 4096

You must also increase max_content_length in server.conf to at least 4 GB on the indexers (search peers), and also on the search head and the deployer:

[httpServer]
max_content_length = 4294967296
There may be something in splunkd.log (not sure); find it in $SPLUNK_HOME\var\log\splunk. What's the output of this? (I'm starting to think the root cacert.pem has something to do with this.)

openssl x509 -in "c:\Program Files\Splunk\etc\auth\cacert.pem" -text -noout

Does it show it's expired? Maybe that has something to do with it. Try renaming that file (cacert.pem, or it could be ca.pem) and do a restart.
Hi @joe06031990, it's a request from many of us; go to Splunk Ideas and vote for it: maybe someone on the Splunk project will consider the request! Ciao. Giuseppe
Hi @BB_MW, in the Interesting Fields you have fields contained in at least 20% of the events; if you add action=* to your search, do you see the field? Then, did you check whether the sourcetype that you associated with the field extraction is in the events that you're searching? The new field extraction applies only to the declared sourcetype. Ciao. Giuseppe
We have Splunk installed on a Linux machine under /opt/splunk. We have created an add-on and added Python code, which is saved in modalert_test_webhook_helper.py under "/opt/splunk/etc/apps/splunk_app_addon-builder/local/validation/TA-splunk-webhook-final/bin/ta_splunk_webhook_final".

We want to create one parameter in a config file whose value is a list of REST API endpoints, and read that in the Python code. Only if the REST API endpoint entered by the user while adding the action to the alert is present in the list in the config file should we proceed with the process_data action; otherwise display a message saying the REST API endpoint is not present.

So now we want to know: in which conf file should we define the parameter, what changes should we make in the Python code, and which Python file should be used, as there are many Python files under the /bin directory?

Also, after making changes in any conf or Python files and restarting, the changes are not being saved. How can we get them to persist after restarting Splunk?

PFA screenshots of the conf and Python files. Kindly help with any solution.
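To illustrate the conf-reading part: Splunk .conf files are INI-style, so a helper script can read a custom conf with Python's standard configparser. This is a sketch only; the file name ta_webhook_endpoints.conf, the [endpoints] stanza and the allowed_list key are made-up names, not anything the Add-on Builder generates:

```python
import configparser


def load_allowed_endpoints(conf_path):
    """Read the comma-separated allowed_list from an INI-style conf file.

    Expected file contents (names are illustrative, not AoB-generated):

        [endpoints]
        allowed_list = https://a.example.com/hook, https://b.example.com/hook
    """
    parser = configparser.ConfigParser()
    parser.read(conf_path)
    raw = parser.get("endpoints", "allowed_list", fallback="")
    return [item.strip() for item in raw.split(",") if item.strip()]


def endpoint_is_allowed(endpoint, conf_path):
    """True only if the user-supplied endpoint is in the configured list."""
    return endpoint in load_allowed_endpoints(conf_path)
```

In the modular alert helper you would call endpoint_is_allowed() with the endpoint the user entered and run process_data only when it returns True. Also note that edits under an app's default directory can be regenerated/overwritten; hand-maintained conf files normally go under the app's local directory so they survive.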
Hi there, thank you for your idea, but unfortunately it did not work. The path is correct. Is there any way to find out why the generation is failing? I checked some logs, but couldn't find anything helpful...
Hello @VatsalJagani, I have the lookup table file and definition with permissions set at the app level. When I run the search inside the app it fetches results, but when I run it in the main Search & Reporting app I get the error "this lookup table requires a .csv or kvstore lookup definition". How do we fix this error, any idea? Thanks
Hi Giuseppe, I am not talking about XML tags, but HTML tags. HTML tags are used to format the text and do not give any information about fields. Text between <b> and </b> will be formatted in bold and <br> is a line break. I would like to remove these unnecessary characters from my inputs. Ciao! Tommaso
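If the tags should go away at index time, the usual tool is a SEDCMD in props.conf on the parsing tier. A sketch only: the sourcetype name is a placeholder, the change affects only data indexed after it is applied, and the blanket <[^>]+> pattern strips any angle-bracketed run, so narrow it if your events contain literal < > that must survive:

```ini
# props.conf -- strip HTML tags such as <b>, </b> and <br> at index time
[my:html:sourcetype]
SEDCMD-strip_html_tags = s/<[^>]+>//g
```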
You could try to do it using the REST API, but I'd say it's not the best idea. If you enable too many searches, you're going to kill your servers. So it's best to enable only those you need, not just all there are.
Hi there, Sorry, I should also have added that I'm searching in Smart Mode. The results, though, are the same for Verbose Mode. I hadn't thought of doing a stats on the fields but I can confirm that count(action) is still 0 and the count(change_type) has a positive value.
Don't bother with the Interesting Fields sidebar. It contains only _extracted fields_ (so if you're searching in fast mode you'll get just the basic metadata fields or the ones explicitly used) which are present in at least 20% of the results. So this is not the way to verify if the field is properly extracted. Also remember that when using fast mode, only the fields explicitly used are extracted. BTW, try your search with

| stats count count(action) count(change_type)
Hi, I appreciate that there are numerous questions on here about similar problems but, after reading quite a few of them, nothing seems to quite fit my scenario/issue.

I am trying to extract a field from an event and call it 'action'. The entry in props.conf looks like:

EXTRACT-pam_action = (Action\: (?P<action>\[[^:\]]+]) )

I know that the extraction is working, as there is a field alias later in props.conf:

FIELDALIAS-aob_gen_syslog_alias_32 = action AS change_type

When I run a basic generating search on the index & sourcetype, the field 'action' does not appear in the 'Interesting Fields' but the 'change_type' alias does appear! The regex is fine, as I can create the 'action' field OK if I add the rex to the search. I have also added the exact same regex to the props.conf file but called the field 'action1', and that field is displayed OK.

Another test I tried was to create a field alias for the action1 field called 'action':

FIELDALIAS-aob_gen_syslog_alias_30 = action1 AS action
FIELDALIAS-aob_gen_syslog_alias_32 = action1 AS change_type

'change_type' is visible but, again, 'action' is not visible. Finally, my search "index=my_index action=*" produces 0 results whereas "index=my_index change_type=*" produces accurate output.

I have looked in the props and transforms configs across my search head and can't see anything that might be 'removing' my field extraction, but I guess my question is: how can I debug the creation (or not) of a field name? I have a deep suspicion that it is something to do with one of the Windows TA apps that we have installed, but I am struggling to locate the offending configuration.

Many thanks for any help. Mark
@isoutamo: Thanks for the links you provided. I see that my old DS lists all clients contacting it. It is running 9.0.2, whereas the new one I am trying to set up is running 9.2.1. I see from the links that this is because of the version difference. However, I tried the steps provided in the link; still no luck.

I should also mention that I am configuring this DS to act as a log forwarder as well, so both of these setups are making use of the same Splunk service. Does this have any effect on the proper working of the deployment server? Do you have any comments? Apart from the steps in the above link, do you have any other suggestions?

Thanks in advance, PNV
You're pretty much there with the first method using the eval. It's a calculated field you need, not a field extraction or a field transformation. Go to Settings > Fields > Calculated Fields > Create New, then set your scope for index/sourcetype:

Name: MacAddr
Eval Expression: replace(CL_MacAddr, "-", ":")
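For reference, the UI steps above are equivalent to an EVAL- entry in props.conf (that is what the Calculated Fields page writes for you); the sourcetype stanza name here is a placeholder:

```ini
# props.conf -- search-time calculated field, same as the UI setting
[your:sourcetype]
EVAL-MacAddr = replace(CL_MacAddr, "-", ":")
```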