All Posts
This app is supported by my company, and we have released the latest version, which adds support for pulling audit logs from Zabbix and includes additional details. If you are still facing issues with the add-on, please send us an email at splunk.support@dataelicit.com
Hi @Alex_Rus , yes, it's possible: modify inputs.conf in your Splunk_TA_Windows, adding whitelists and/or blacklists to filter your events. Otherwise, it's possible to filter events using props.conf and transforms.conf on the indexers, following the instructions at https://docs.splunk.com/Documentation/SplunkCloud/latest/Forwarding/Routeandfilterdatad#Filter_event_data_and_send_to_queues If possible, the first solution (inputs.conf) is better; otherwise, you can use the second one. Ciao. Giuseppe
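To illustrate the second approach, here is a minimal sketch, assuming you want to drop Security events with a hypothetical EventCode 4662 at parse time (the stanza, transform name, and regex are illustrative; adjust them to your sourcetype and fields):

props.conf (on the indexers):

[WinEventLog:Security]
# run the filtering transform on this sourcetype at parse time
TRANSFORMS-drop_noise = drop_eventcode_4662

transforms.conf (on the indexers):

[drop_eventcode_4662]
# events matching the regex are routed to the nullQueue, i.e. discarded
REGEX = EventCode=4662
DEST_KEY = queue
FORMAT = nullQueue

Both files go in the local directory of an app deployed to the indexers, for example $SPLUNK_HOME/etc/apps/<your_app>/local.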
The app is on the deployment server. I think filtering by props and transforms is better, but maybe I'm wrong.
Hi @msalghamdi , could you describe your requirement in more detail, possibly with an example? Ciao. Giuseppe
Hi @jroedel , if the Add Data feature doesn't allow it, I suppose it isn't possible, even if that's strange. I tried and got the same result. Ciao. Giuseppe
Hi @Alex_Rus , let me understand: you want to filter events on the Universal Forwarder, is that correct? See the blacklists and whitelists in the Splunk_TA_Windows documentation, which guides you through it: https://docs.splunk.com/Documentation/Splunk/latest/Admin/Inputsconf#Event_Log_filtering Ciao. Giuseppe
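For illustration, a minimal inputs.conf sketch on the Universal Forwarder, assuming you want to drop a hypothetical EventCode 4688 from the Security log (the event code is just an example):

[WinEventLog://Security]
disabled = 0
# drop matching events before they leave the forwarder
blacklist = 4688

Key-value forms such as blacklist1 = EventCode="4688" are also supported for matching on other fields; see the linked documentation.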
Hello Splunkers, how can I utilize a lookup in a correlation search so that the detected keyword is shown in the search result? It's a requirement that the analyst shouldn't have the capability to view lookups. Thanks in advance.
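For context, a hedged sketch of what such a search might look like, assuming a hypothetical lookup suspicious_keywords with a keyword field (index, sourcetype, and field names are all illustrative):

index=web sourcetype=proxy
| lookup suspicious_keywords keyword AS uri_path OUTPUT keyword AS detected_keyword
| where isnotnull(detected_keyword)
| table _time src user uri_path detected_keyword

Since scheduled correlation searches typically run under their owner's context, the lookup's read permissions can be restricted while analysts still see detected_keyword in the results.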
Thanks for your second attempt. I tried, but still no luck. Might it be that the "Add Data" web UI wizard does not support this correctly?
Hi Splunk community! I need to filter events from the Splunk_TA_Windows application by the EventCode, Account_Name, and Source_Network_Address fields. Please tell me: in what form should props.conf and transforms.conf be written, and in what folder should they be located?
Hi @jroedel , please try this:

TIME_FORMAT=%s,\n\s*\"nanoOfSecond\"\s*:\s*%9N
TIME_PREFIX=\"epochSecond\"\s*:\s*
MAX_TIMESTAMP_LOOKAHEAD=500

Ciao. Giuseppe
I tried, but still no luck  
Hi @jroedel , are you sure about the number of spaces? Please try this:

TIME_FORMAT=%s,\n\s*"nanoOfSecond"\s*:\s*%9N
TIME_PREFIX="epochSecond"\s*:\s*
MAX_TIMESTAMP_LOOKAHEAD=500

Ciao. Giuseppe
After upgrading Splunk from version 8 to 9, I've started to receive this message: "The Upgrade Readiness App detected 1 app with deprecated Python: splunk-rolling-upgrade". I can't find this app on Splunkbase. As far as I understand, it's a Splunk built-in app? Should I delete it, or how can I resolve this issue? Please help.
I have to parse the timestamp of JSON logs and I would like to include subsecond precision. My JSON events start like this:

{
  "instant" : {
    "epochSecond" : 1727189281,
    "nanoOfSecond" : 202684061
  },
...

Thus I tried this config in props.conf:

TIME_FORMAT=%s,\n "nanoOfSecond" : %9N
TIME_PREFIX="epochSecond" :\s
MAX_TIMESTAMP_LOOKAHEAD=500

Unfortunately, that did not work. What is the right way to parse this timestamp with subsecond precision?
How can we send a file as input to an API endpoint from custom SPL commands developed for both Splunk Enterprise and Splunk Cloud, ensuring the API endpoint returns the desired enrichment details?
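In case a concrete shape helps, here is a minimal sketch of one way this could look, assuming a Python streaming custom command built on splunklib and the requests library; the endpoint URL, option name, and response fields are all hypothetical:

#!/usr/bin/env python
# enrich.py - hypothetical custom command, invoked as: | enrich filepath="/tmp/sample.csv"
import sys
import requests
from splunklib.searchcommands import dispatch, StreamingCommand, Configuration, Option

@Configuration()
class EnrichCommand(StreamingCommand):
    # path of the local file to upload to the API
    filepath = Option(require=True)

    def stream(self, records):
        # hypothetical endpoint; in practice, read it from a conf file
        url = "https://api.example.com/enrich"
        with open(self.filepath, "rb") as f:
            # multipart upload of the file; assumes the API replies with JSON
            response = requests.post(url, files={"file": f}, timeout=30)
        details = response.json()
        for record in records:
            # attach an assumed "summary" field from the API response
            record["enrichment"] = details.get("summary", "")
            yield record

dispatch(EnrichCommand, sys.argv, sys.stdin, sys.stdout, __name__)

Note that on Splunk Cloud, outbound network access and app vetting requirements may constrain this, so treat it as a sketch rather than a drop-in solution.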
I agree with what @KendallW shared; it's hard to comment without checking the actual data, but this type of error mainly happens due to a mismatch in timestamps.
Hi, regarding test 1, your assumption is correct. Regarding test 2: if the test is executed at 11:00 am, for example, and fails at that time, the alert is triggered immediately after the failed execution, once the configured trigger threshold is reached. If the test is successful at 11:00 am and the next execution fails at 11:30 am, the alert is likewise triggered immediately after that failed execution, once the configured trigger threshold is reached.
I have provided the sample data. I have a huge amount of data, a few thousand lines, which is pushed to Splunk. The query should be generic enough to accept any data size; it's not just 10 values.
I faced this issue and found that server.pem under /etc/auth had expired.
1) Renamed server.pem
2) Ran splunk restart
3) A new cert was generated, with the expiry date extended by 3 years.
Do not change any Java settings if it was working before and suddenly stopped working; check cert expiry first.
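As a sketch of those steps on the command line (paths assume a default $SPLUNK_HOME; the backup filename is arbitrary):

# check the certificate's expiry date first
openssl x509 -enddate -noout -in $SPLUNK_HOME/etc/auth/server.pem

# move the expired cert aside; Splunk should regenerate a default one on restart
mv $SPLUNK_HOME/etc/auth/server.pem $SPLUNK_HOME/etc/auth/server.pem.expired
$SPLUNK_HOME/bin/splunk restart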