All Topics

I have been attempting to add a file path in data inputs, as well as in the inputs.conf file, as a "monitor". Each time I implement this, Splunk ingestion latency spikes to over 300ms and the service becomes effectively unusable. My intention is to monitor file additions, deletions, and modifications within a specific file path. Any ideas?
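For comparison, a minimal monitor stanza might look like the sketch below; the path, index, and sourcetype are placeholders, not taken from the original post. Note that a [monitor] input indexes file contents; if the goal is auditing additions, deletions, and modifications as discrete events, OS-level file auditing (e.g. Linux auditd) forwarded into Splunk may be a better fit.

```ini
# inputs.conf -- hypothetical example; path/index/sourcetype are placeholders
[monitor:///var/log/myapp]
disabled = false
index = main
sourcetype = myapp:log
# Narrow the match to reduce tailing overhead on large directories
whitelist = \.log$
```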
In Splunkd.service there are parameters such as:

TimeoutStopSec=360
LimitNOFILE=65536

Could you please explain how they are used? Can I double their values?
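Both settings are standard systemd [Service] options rather than Splunk-specific ones: TimeoutStopSec is how long systemd waits for splunkd to shut down cleanly before force-killing it, and LimitNOFILE is the open-file-descriptor limit (ulimit -n) for the process. Raising them is typically done with a drop-in override rather than editing the unit file in place; the doubled values below are illustrative, not a recommendation.

```ini
# /etc/systemd/system/Splunkd.service.d/override.conf  (hypothetical drop-in)
[Service]
TimeoutStopSec=720
LimitNOFILE=131072
```

After creating the drop-in, run `systemctl daemon-reload` and restart the service for the new limits to take effect.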
I am trying to set up the CloudTrail log data add-on/input and the "IT and performance data of my EC2 instance running Splunk" input. The second one isn't even coming up on the list, and for the first one there are no matches for the SQS queue name. How can I fix this?
Hello Community, I stumbled across a scenario where I have events in the following JSON format:

Event 1:
{
  "severity": "INFO",
  "message": "msg",
  "details": {
    "key1": "val1",
    "key2": "val2",
    "key3": "val3"
  }
}
...
Event n:
{
  "severity": "INFOn",
  "message": "msgn",
  "details": {
    "key1n": "val1",
    "key2n": "val2",
    "key3n": "val3"
  }
}

I want to list all the unique keys present under the "details" path across all events. I tried querying with mvexpand and json_keys, but nothing seems to work. The expected output:

uniqueKeys:
key1
key2
key3
...
key1n
key2n
key3n

I would greatly appreciate some assistance.
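Outside SPL, the desired result is just a set union over the keys of each event's "details" object; a minimal Python sketch of the logic, with the two sample events inlined as raw JSON strings:

```python
import json

raw_events = [
    '{"severity": "INFO", "message": "msg",'
    ' "details": {"key1": "val1", "key2": "val2", "key3": "val3"}}',
    '{"severity": "INFOn", "message": "msgn",'
    ' "details": {"key1n": "val1", "key2n": "val2", "key3n": "val3"}}',
]

# Union of the keys under "details" across all events
unique_keys = sorted({k for raw in raw_events
                      for k in json.loads(raw).get("details", {})})
print(unique_keys)
```

In SPL, something along the lines of extracting the details object with spath and applying json_keys before a stats values() may produce the same union, but the exact query depends on how the events were indexed, so treat that as a direction rather than a tested answer.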
Hi, I have a requirement to enhance the UI of a simple Splunk Classic dashboard: one color for the dashboard background and a different color for the dashboard panels. The reference image is attached below. Can anyone help me with this?
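One common community pattern for Classic (Simple XML) dashboards is a hidden HTML panel that only injects CSS. This is not an official mechanism, the CSS class names can change between Splunk versions, and the colors below are placeholders, so treat it as a sketch to verify:

```xml
<dashboard>
  <label>Styled dashboard (sketch)</label>
  <!-- Hidden row that only carries CSS; the $alwaysHidden$ token is never set -->
  <row depends="$alwaysHidden$">
    <panel>
      <html>
        <style>
          .dashboard-body { background: #0b1d33 !important; }
          .dashboard-panel { background: #16324f !important; }
        </style>
      </html>
    </panel>
  </row>
</dashboard>
```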
Hi expert Splunkers, I'd really appreciate it if you could take a look at the output below.

splunkforwarder running on Ubuntu:

Warning: Attempting to revert the SPLUNK_HOME ownership
Warning: Executing "chown -R root /opt/splunkforwarder"

Splunk> Now with more code!

Checking prerequisites...
Management port has been set disabled; cli support for this configuration is currently incomplete.
Checking conf files for problems...
Invalid key in stanza [webhook] in /opt/splunkforwarder/etc/system/default/alert_actions.conf, line 229: enable_allowlist (value: false).
Your indexes and inputs configurations are not internally consistent. For more information, run 'splunk btool check --debug'
Done
Checking default conf files for edits...
Validating installed files against hashes from '/opt/splunkforwarder/splunkforwarder-9.0.5-e9494146ae5c-linux-2.6-x86_64-manifest'
All installed files intact.
Done
All preliminary checks passed.
Starting splunk server daemon (splunkd)...
PYTHONHTTPSVERIFY is set to 0 in splunk-launch.conf disabling certificate validation for the httplib and urllib libraries shipped with the embedded Python interpreter; must be set to "1" for increased security
Done

1. Splunk was installed as the root user.
2. I keep getting the lines at the end starting with PYTHONHTTPSVERIFY (first time seeing them).
3. The inputs issue can be ignored.
We've moved from a testing/sandbox Workday tenant into production. Previously I had no issues with the Workday Add-on, and the company that manages our Workday instance has provided me with working credentials each time I've moved tenants. A few days ago I deleted the old credentials and re-pointed the add-on at our new production tenant. I'm now getting the following error:

ERROR pid=2108866 tid=MainThread file=base_modinput.py:log_error:309 | Request failed with error code (400), retries exhausted and requests.exceptions.HTTPError: 400 Client Error: Bad Request for url:

My guess is that the URL they're providing me is incorrect, but I wanted to confirm, as they insist all their info is accurate.
Hi Team, I have a field named domain with the value "www.microsoft.com"; how can I reverse it to "com.microsoft.www"?

domain=www.microsoft.com
required: domain=com.microsoft.www

Many thanks.
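The transformation itself is just splitting on dots, reversing the labels, and rejoining, sketched in Python below. In SPL, an eval chaining split, mvreverse (available in newer Splunk versions), and mvjoin may do the same, but verify mvreverse is available in your version before relying on it.

```python
def reverse_domain(domain: str) -> str:
    # Split into labels, reverse their order, and rejoin with dots
    return ".".join(reversed(domain.split(".")))

print(reverse_domain("www.microsoft.com"))  # → com.microsoft.www
```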
Hi! I'm currently working on a project where I aim to integrate the OpenCTI platform with Splunk in order to receive intelligence feeds. How can I configure the ingestion of these intelligence feeds? Any advice, tips, or resources would be highly appreciated.
A question regarding the Splunk Attack Range. I am using the attack_range repository rather than attack_range_local, since the local repository has been archived. Should I be using attack_range_local anyway, even though attack_range now supports a local install? I have specified the provider as local in the configuration, but I keep getting errors related to Azure or AWS. I can configure the Attack Range without a problem and I select the local option, but I get errors when trying to build it. Any help is much appreciated!
Hi, is there a known procedure/documentation for instrumenting AppD agents on on-prem DataPower?
Hi. Question: is there a way to get the classic /g regex option in an INLINE regex extraction in Splunk (props), without using the rex command or other transformations? Example:

SerialNumber=12345,SerialNumber=67890

With a classic regex, /SerialNumber=(?P<sn>\d+)/g, I can find "12345" and "67890". Same with SPL: rex max_match=0 "SerialNumber=(?P<sn>\d+)". But how do I do it in an INLINE extraction? I worked around the problem by extracting "sn1" and "sn2" fields and combining them with an eval transformation (sn = sn1.' , '.sn2), and it works fine. But if, tomorrow, I find something like

SerialNumber=12345,SerialNumber=67890,SerialNumber=09876,SerialNumber=54321

then without rex I would be in trouble! Thanks.
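The usual workaround discussed in the community (worth verifying against your version's props.conf/transforms.conf docs) is to move the extraction from an inline EXTRACT- to a REPORT- that references a transforms.conf stanza with MV_ADD enabled, so every match is collected into a multivalue field. The sourcetype and stanza names below are placeholders:

```ini
# props.conf (search time)
[my_sourcetype]
REPORT-sn = extract_all_serials

# transforms.conf
[extract_all_serials]
REGEX = SerialNumber=(?<sn>\d+)
MV_ADD = true
```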
Hi, I want sc4s to receive syslog and write the raw message into a directory. However, it doesn't write the raw message; only the export messages (JSON) are written into the archive folder. What is my mistake? And which directory should the raw messages be written to? Thank you.

########### env_file ############
SPLUNK_HEC_URL=https://xx.xx.xx.xx:8088
SPLUNK_HEC_TOKEN=xxxxxxxxxxxxxxxxxxxxx
SC4S_DEST_SPLUNK_HEC_TLS_VERIFY=no
#SC4S_USE_REVERSE_DNS=yes
SC4S_LISTEN_FORTINET_UDP_PORT=514
SC4S_GLOBAL_ARCHIVE_MODE=compliance
SC4S_ARCHIVE_GLOBAL=yes
SC4S_SOURCE_STORE_RAWMSG=yes
SC4S_DEST_GLOBAL_ALTERNATES=d_hec_debug,d_archive,d_rawmsg
I have a Splunk 9.0.4 estate on Windows 2019 with the following:

Search head
2 x indexers
Cluster master/deployment server

I'm trying to automate all deployments of apps to forwarders and all configuration on indexers (transforms.conf/props.conf etc.). For the apps that go to the universal forwarders this has been straightforward: I simply add them to the deployment server and they push out. What I am not clear on is how to push configuration out to my indexers in a centralised, controlled manner. For example, say I have an app with a component that needs to be pushed to the forwarder to gather events, but also a props.conf modification to increase the TRUNCATE size. How can I do this centrally?

PS: Apologies if this is somewhat of a noob question! I'm a long-term Splunk tinkerer, but I only dip into it when my role necessitates it.
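For the indexer side, one approach (sketched under the assumption of a standard indexer cluster) is to distribute the props.conf change through the cluster manager's configuration bundle rather than the deployment server; the app name below is hypothetical:

```
$SPLUNK_HOME/etc/master-apps/            # "manager-apps" on newer versions
└── indexer_parsing_app/
    └── default/
        └── props.conf                   # e.g. the TRUNCATE override

# Then, on the cluster master:
splunk validate cluster-bundle
splunk apply cluster-bundle
```

The bundle is pushed to all peer nodes at once, which keeps the indexer configuration centralised in the same way the deployment server centralises forwarder apps.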
We process events received by a real-time alert by referencing a CSV lookup, and append the results to that CSV lookup via the alert action "Output results to lookup". The problem: when the next event arrives while an event is still being processed, the next event is processed before the in-flight event's result has been appended to the CSV lookup, so the processing results differ. Is it difficult to address this with Splunk's built-in features? I have confirmed that the Knowledge Manager manual states CSV lookups "do not provide locking for multi-user access", and that KV store lookups likewise are not locked.
On index=_internal I have to create two searches, one as a report and one connected to a dashboard, showing which indexes and sourcetypes have collected events, with the date of the first and last event per sourcetype. The searches must take into account the data already present in the lookup itself: we must not overwrite the lookup every time, which would defeat the very purpose of its creation, since the date of the first event is lost as soon as the searchable data ages beyond the retention of the indexes. We also have two lookup datasets: an old one we already have and a new one we need to create. Comparing old and new: if the old recorded date is older, it is kept; if the new date is older, it replaces the old one.

Below is the old lookup search:

index="_internal"
| stats count, earliest(_time) as first, latest(_time) as last by sourcetype, index, host
| eval first = strftime(first, "%Y-%m-%d %H:%M:%S"), last = strftime(last, "%Y-%m-%d %H:%M:%S")
| table host, index, sourcetype, first, last
| rename first as "first event", last as "last event"
| sort - "first event"
| outputlookup old.csv
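The merge rule described above (keep the oldest "first event" and the newest "last event" per host/index/sourcetype) can be pinned down outside SPL; the rows below are made up for illustration:

```python
# Each snapshot maps (host, index, sourcetype) -> (first_event, last_event);
# timestamps in "%Y-%m-%d %H:%M:%S" format compare correctly as strings.
old = {("hostA", "main", "syslog"): ("2023-01-01 00:00:00", "2023-06-01 00:00:00")}
new = {("hostA", "main", "syslog"): ("2023-05-01 00:00:00", "2023-07-01 00:00:00"),
       ("hostB", "main", "syslog"): ("2023-07-02 00:00:00", "2023-07-03 00:00:00")}

merged = {}
for key in old.keys() | new.keys():
    rows = [r for r in (old.get(key), new.get(key)) if r]
    # Oldest first-seen wins; newest last-seen wins
    merged[key] = (min(r[0] for r in rows), max(r[1] for r in rows))

print(merged[("hostA", "main", "syslog")])
```

In SPL the equivalent shape is typically appending the existing lookup (e.g. `| inputlookup append=true old.csv`) before a stats min on the first-event field and max on the last-event field, then writing back with outputlookup; the exact query depends on your field names, so treat this as a sketch of the logic only.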
splunk fsck repair --all-buckets-all-indexes — I need to know where to run this command on Linux.
Hi all. One of our users cannot upload files to Splunk; they get the error "User is not allowed to modify the job". The user is a power user but not an admin. Do we need to amend any settings for this to work? Thanks.
I receive the following error while trying to execute a simple makeresults command via a REST API call.

Used endpoint: https://localhost:8089/servicesNS/nobody/myapp/search/jobs
Search example: "| makeresults | eval name=\"denis\""
Error message: "Error in 'makeresults' command: This command must be the first command of a search."
"search": "search | makeresults | eval nombre=\"denis\""

I see that the API call changes my search, adding the word "search" before the search itself. How can I get rid of that?
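As a client-side sanity check: my understanding (worth verifying against the REST API reference) is that the jobs endpoint accepts a generating command as long as the posted search string itself begins with the pipe and the client does not prepend the word "search". A small sketch of building the form body by hand, so the leading "|" survives encoding:

```python
from urllib.parse import urlencode

def build_job_payload(spl: str) -> str:
    # Keep the leading "|" for generating commands like makeresults;
    # do not prepend the word "search" yourself.
    return urlencode({"search": spl, "output_mode": "json"})

body = build_job_payload('| makeresults | eval name="denis"')
print(body)
```

Posting this body to the endpoint (e.g. with requests.post and HTTP basic auth) sends the pipe through as %7C; if the "search " prefix still appears, it is being added by the client library or wrapper in use, not by this payload.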
Hello Splunkers, we have two instances of Splunk with ES (on-prem + cloud). How do we pull all the notables from both instances into a single place? I am going through the Mothership and ES Mothership apps on Splunkbase. A few clarifications:

1. How does ES Mothership depend on the Mothership app? Do we need to do the setup in the Mothership app, which then communicates/sends details to the ES Mothership app?
2. Where do we need to install these apps? A separate SH, the on-prem SH, or the cloud SH?
3. What other alternatives do we have? Can we try federated search for this? Will it pull ES notable details?

Thanks,
D