All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hi all, we are planning to update to version 9.0.1 and I was wondering whether the KV store migration from MMAPv1 to WiredTiger is obligatory. The first sentence of the Splunk documentation says "Splunk Enterprise versions 9.0 and higher require the WiredTiger storage engine and server version 4.2", but I was wondering if something will go wrong if we don't perform the KV store migration before the Splunk upgrade, or if we don't perform the migration at all.   Thanks!
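One thing worth checking before the upgrade (a minimal sketch, assuming CLI access on the search head): the kvstore-status CLI command reports which storage engine the KV store is currently using.

    $SPLUNK_HOME/bin/splunk show kvstore-status

If the output still reports mmapv1 as the storage engine, the migration to WiredTiger has not yet been performed on that instance.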
Hi All,  The application flow map lists the calls/min and errors/min. When we click on it, it lists the errors, but I would like to know where and how this is being captured.  For example:

Name | Count | Errors/min
Internal server error: 500 | 22 | 0
xxxxx | xx | xx

The Business Transactions tab lists the BT name and all the other details. It has a drill-down option via "view business transaction dashboard", but sometimes it does not show the same error counts inside.  Appreciate your inputs to understand this better.
Hi, I have the query below, and the alert should get triggered only when data is not available from any one of the hosts. I gave the time range as 24 hours to now, the alert condition as 0, and the cron expression */30 * * * *, but I am getting mail every 30 minutes, even when data is available.

index=advcf request=* host IN(abgc, efgh, jhty, hjyu, kjnb)
| eval event_ct=1
| append
    [| makeresults
     | eval host="abgc, efgh, jhty, hjyu, kjnb"
     | rex field=host mode=sed "s/\s+//g"
     | eval host=split(host,",")
     | mvexpand host
     | eval event_ct=0 ]
| stats sum(event_ct) AS event_ct BY host
| where event_ct=0
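A minimal sketch of one way to reshape this, assuming the intent is to fire only when a host has sent nothing in the last 24 hours. With this shape, each returned row is a silent host, so the alert trigger should be "number of results is greater than 0", not equal to 0; a trigger of "equal to 0" fires precisely when every host is healthy, which matches the symptom described.

    index=advcf request=* host IN(abgc, efgh, jhty, hjyu, kjnb) earliest=-24h
    | stats count AS event_ct BY host
    | append
        [| makeresults
         | eval host=split("abgc,efgh,jhty,hjyu,kjnb", ",")
         | mvexpand host
         | eval event_ct=0]
    | stats sum(event_ct) AS event_ct BY host
    | where event_ct=0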
We are looking into setting up a way to test Splunk dashboard screens; is there any tool that can do this? We are looking for functional testing and performance testing. The reason we are doing this is that the data is changing and we want to version different Splunk dashboards. So we are looking to set up testing on the dashboards, to see what breaks or not when the data changes.
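For the versioning half of this, one hedged approach is to pull each dashboard's Simple XML out of the REST API on a schedule and commit it to source control, so changes can be diffed between data revisions; a sketch using the data/ui/views endpoint:

    | rest /servicesNS/-/-/data/ui/views splunk_server=local
    | table title eai:acl.app eai:data

The eai:data column holds the dashboard XML, which can then be exported and checked in alongside the test fixtures.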
I have a request from one of our service managers about getting an inventory of all hosts logging to Splunk. Using tstats does get the results we need via

| tstats values(host) by host

drilling down per index:

| tstats values(host) as hosts where index=idxname by index

Exporting to a CSV file or emailing the results won't work for our current needs; he would like the exported CSV results to be stored on a network drive on a weekly basis, or possibly in some other format if that's an option. I'm not sure if this is possible with the report actions currently available, as I only see webhook, emailing results, etc. Wondering if there is a way to do this with an add-on alert action, or possibly another way?
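One hedged approach (a sketch, with a hypothetical lookup name): schedule the search weekly and write the results with outputlookup, then copy the generated CSV off the search head to the network share with a cron job or Windows scheduled task.

    | tstats latest(_time) AS lastSeen WHERE index=* BY host, index
    | convert ctime(lastSeen)
    | outputlookup host_inventory.csv

The CSV lands under $SPLUNK_HOME/etc/apps/<app>/lookups/host_inventory.csv on the search head, which an OS-level job can then pick up and push to the network drive on the same weekly cadence.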
I have been reviewing the countless other postings on subsearches but I can't pull them all together to figure out our issue.  This first search builds a list of carts whose contents we need to find:

index="name" "Authorization was not successful!" AND /placeorder
| rex field=_raw "/carts/(?<cart>.+)/placeorder"
| dedup cart
| table cart

This is where I run into issues. I need to take the table created by that search and find all of the items contained in those carts.  Here is the search for a single cart from that list:

index="name" "3322830131/processCheckout" AND "\"paymentProvider\":\"PayPal\""

My thought is that I need to cycle through the table from the subsearch, replacing the number in this search, then finally build a visualization that shows the contents of each cart using the most recent event in the second search.  Am I way off? This seems pretty easy but I can't figure it out. TYIA
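A minimal sketch of the usual subsearch pattern for this, assuming the cart number can also be extracted from the checkout events so it exists as a field in the outer search (the rex patterns are illustrative):

    index="name" "/processCheckout" "\"paymentProvider\":\"PayPal\""
    | rex field=_raw "(?<cart>\d+)/processCheckout"
    | search
        [ search index="name" "Authorization was not successful!" "/placeorder"
          | rex field=_raw "/carts/(?<cart>[^/]+)/placeorder"
          | dedup cart
          | fields cart ]
    | dedup cart
    | table cart _raw

The subsearch expands to cart="x" OR cart="y" ..., so the outer search keeps only checkout events for the failed carts, and the final dedup keeps the most recent event per cart since results arrive newest first.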
I received the following error from the Splunk team, which failed the cloud compatibility check. Any suggestions on how to resolve the errors? I have updated the Splunk SDK for Python packages to the latest version. Thanks
<span>This call to java.lang.Runtime.exec() contains a command injection flaw. The argument to the function is constructed using untrusted input. If an attacker is allowed to specify all or part of the command, it may be possible to execute commands on the server with the privileges of the executing process. The level of exposure depends on the effectiveness of input validation routines, if any. The first argument to exec() contains tainted data from the variables (new String\[...\]). The tainted data originated from an earlier call to AnnotationVirtualController.vc_annotation_entry.</span> <span>Validate all untrusted input to ensure that it conforms to the expected format, using centralized data validation routines when possible. When using blocklists, be sure that the sanitizing routine performs a sufficient number of iterations to remove all instances of disallowed characters. Most APIs that execute system commands also have a "safe" version of the method that takes an array of strings as input rather than a single string, which protects against some forms of command injection.</span> <span>References: <a href="https://cwe.mitre.org/data/definitions/78.html">CWE</a> <a href="https://owasp.org/www-community/attacks/Command_Injection">OWASP</a></span>

From this field's value I need to create 2 separate fields.

The first field, call it flaw, should contain the text of the first <span> block (from "This call to java.lang.Runtime.exec() contains a command injection flaw." through "The tainted data originated from an earlier call to AnnotationVirtualController.vc_annotation_entry.").

The second field, call it remediation, should contain the text of the second <span> block (from "Validate all untrusted input to ensure that it conforms to the expected format" through "which protects against some forms of command injection.").
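A hedged rex sketch that splits the value on the first two <span> blocks, assuming the raw value lives in a field called description (rename to match the actual field):

    ... | rex field=description "(?s)^<span>(?<flaw>.*?)</span>\s*<span>(?<remediation>.*?)</span>"

The (?s) flag lets . match across newlines, and the non-greedy .*? stops each capture at the first closing </span>, so the References block is left out of both fields.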
Hello, We've set up our Splunk search head to download snapshots from the ThreatStream API directly. While troubleshooting, we observed that it was downloading the snapshots from hxxps://ts-optic.s3.amazonaws.com/snapshots/... but then had issues processing them.

2022-11-03 02:01:47,394 18860 ERROR threatstream_app - threatstream_kvstore> Autologin succeeded, but there was an auth error on next request. Something is very wrong.
2022-11-03 02:01:47,443 18860 ERROR threatstream_app - threatstream_kvstore> Failed at add_kvs_batch - sz == 1, collection_name: ts_md5, data: [{'date_last': '2016-02-21T14:52:32.000Z', 'id': '0', '_key': '99929352'}]
2022-11-03 02:01:47,443 18860 ERROR threatstream_app - threatstream_kvstore> Autologin succeeded, but there was an auth error on next request. Something is very wrong.
2022-11-03 02:01:47,464 18860 ERROR threatstream_app - threatstream_kvstore> Failed at add_kvs_batch - sz == 1, collection_name: ts_md5, data: [{'date_last': '2016-02-21T14:52:37.000Z', 'id': '0', '_key': '99929603'}]
2022-11-03 02:01:47,464 18860 ERROR threatstream_app - threatstream_kvstore> Autologin succeeded, but there was an auth error on next request. Something is very wrong.
2022-11-03 02:01:48,677 18860 INFO threatstream_app - ioc_loader> 193571 items with id="0" saved to kvs: ts_md5 for deletion, time: 35505.908512592316
2022-11-03 02:01:48,678 18860 INFO threatstream_app - ioc_loader> 193571 items with id="0" saved to kvs: ts_md5 for deletion, time: 35505.908512592316
2022-11-03 02:01:49,059 18860 ERROR threatstream_app - ts_ioc_ingest> failed to download optic intelligence: Autologin succeeded, but there was an auth error on next request. Something is very wrong.
2022-11-03 02:01:49,059 18860 ERROR threatstream_app - ts_ioc_ingest> failed to download optic intelligence: Autologin succeeded, but there was an auth error on next request. Something is very wrong.
2022-11-03 02:01:49,933 18860 ERROR threatstream_app - ts_ioc_ingest> Traceback (most recent call last):
  File "/opt/splunk/etc/apps/threatstream/bin/splunklib/binding.py", line 290, in wrapper
    return request_fun(self, *args, **kwargs)
  File "/opt/splunk/etc/apps/threatstream/bin/splunklib/binding.py", line 71, in new_f
    val = f(*args, **kwargs)
  File "/opt/splunk/etc/apps/threatstream/bin/splunklib/binding.py", line 622, in delete
    response = self.http.delete(path, self._auth_headers, **query)
  File "/opt/splunk/etc/apps/threatstream/bin/splunklib/binding.py", line 1169, in delete
    return self.request(url, message)
  File "/opt/splunk/etc/apps/threatstream/bin/splunklib/binding.py", line 1255, in request
    raise HTTPError(response)
splunklib.binding.HTTPError: HTTP 401 Unauthorized -- call not properly authenticated

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/splunk/etc/apps/threatstream/bin/splunklib/binding.py", line 232, in _handle_auth_error
    yield
  File "/opt/splunk/etc/apps/threatstream/bin/splunklib/binding.py", line 301, in wrapper
    return request_fun(self, *args, **kwargs)
  File "/opt/splunk/etc/apps/threatstream/bin/splunklib/binding.py", line 71, in new_f
    val = f(*args, **kwargs)
  File "/opt/splunk/etc/apps/threatstream/bin/splunklib/binding.py", line 622, in delete
    response = self.http.delete(path, self._auth_headers, **query)
  File "/opt/splunk/etc/apps/threatstream/bin/splunklib/binding.py", line 1169, in delete
    return self.request(url, message)
  File "/opt/splunk/etc/apps/threatstream/bin/splunklib/binding.py", line 1255, in request
    raise HTTPError(response)
splunklib.binding.HTTPError: HTTP 401 Unauthorized -- call not properly authenticated

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/splunk/etc/apps/threatstream/bin/ts_ioc_ingest.py", line 284, in download_iocs
    TmDataManager(splunka=remote_splunk, logger=logger).process_data()
  File "/opt/splunk/etc/apps/threatstream/bin/ts/tm_data_manager.py", line 176, in process_data
    self._process_data()
  File "/opt/splunk/etc/apps/threatstream/bin/ts/tm_data_manager.py", line 245, in _process_data
    self.load_from_lookup_files()
  File "/opt/splunk/etc/apps/threatstream/bin/ts/tm_data_manager.py", line 508, in load_from_lookup_files
    iocs.load_iocs()
  File "/opt/splunk/etc/apps/threatstream/bin/ts/lookup_iocs.py", line 404, in load_iocs
    util.utils.remove_0_id_values(self.kvsm, kvs)
  File "/opt/splunk/etc/apps/threatstream/bin/util/utils.py", line 143, in remove_0_id_values
    remove_delete_id_values(kvsm, ioc_kvs_name, 'id', '0')
  File "/opt/splunk/etc/apps/threatstream/bin/util/utils.py", line 146, in remove_delete_id_values
    kvsm.delete_kvs(kvs, {id_name : delete_id_value})
  File "/opt/splunk/etc/apps/threatstream/bin/util/kvs_manager.py", line 286, in delete_kvs
    collection.data.delete(query=json.dumps(query_dict))
  File "/opt/splunk/etc/apps/threatstream/bin/splunklib/client.py", line 3678, in delete
    return self._delete('', **({'query': query}) if query else {})
  File "/opt/splunk/etc/apps/threatstream/bin/splunklib/client.py", line 3631, in _delete
    return self.service.delete(self.path + url, owner=self.owner, app=self.app, sharing=self.sharing, **kwargs)
  File "/opt/splunk/etc/apps/threatstream/bin/splunklib/binding.py", line 301, in wrapper
    return request_fun(self, *args, **kwargs)
  File "/opt/splunk/lib/python3.7/contextlib.py", line 130, in __exit__
    self.gen.throw(type, value, traceback)
  File "/opt/splunk/etc/apps/threatstream/bin/splunklib/binding.py", line 235, in _handle_auth_error
    raise AuthenticationError(msg, he)
splunklib.binding.AuthenticationError: Autologin succeeded, but there was an auth error on next request. Something is very wrong.

So I guess "Something is very wrong"? But what? Does anyone know a solution, or at least the cause of this?
Hello! So I just have a dashboard for practicing different searches, XML code, etc. (macOS), and I was trying to include a random static JPG picture via the source code at the beginning of my dashboard. This is what I have in the source code, based on similar questions here on Splunk Answers:

<html>
  /* this is where all my styling is for font, size, colors, alignment, etc. for a title, and I wanted to include a JPG right after */
  <img src="static/app/search/images/picture.jpg/">
</html>

I'm not sure exactly where to place my JPG, or what code I would need to get this to work. There are two different answers I found here on Splunk Answers:

splunk/etc/apps/search/appserver/static/images/picture.jpg
splunk/apps/search/static/images/picture.jpg

I tried both ways to no avail. (I created the folder 'images'.)
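For what it's worth, the commonly cited layout (hedged, not verified on this particular setup) puts the file under the app's appserver/static directory and references it with a leading slash and no trailing slash:

    $SPLUNK_HOME/etc/apps/search/appserver/static/images/picture.jpg

    <img src="/static/app/search/images/picture.jpg"/>

A Splunk restart (or forcing a refresh of static assets) may be needed before the newly added file is served.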
Hey Splunk community! I need to create a search query to find instances where the time between a "Cache set" log from my application and a "Cache miss" log is not equal to a certain value (the configured TTL), for any cache key. I've attempted starting with a particular key (sampleKey), but the end goal is to tabulate these results for all keys. Here's my attempt to calculate the time difference for sampleKey between the set and miss times:

index=authzds-e2e* "setting value into memcached" "key=sampleKey"
    [search index=authzds-e2e* "Cache status=miss" "key=sampleKey"
     | stats latest(_time) as missTime ]
| stats earliest(_time) as setTime
| eval timeDiff=setTime-missTime

My goal is to calculate the difference between consecutive set and miss events, key-wise (not earliest/latest as in the above query).
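A sketch of pairing consecutive set and miss events per key with streamstats (the rex and the classification assume the key appears as key=<value> in both event types; adjust to the real log format):

    index=authzds-e2e* ("setting value into memcached" OR "Cache status=miss")
    | rex field=_raw "key=(?<key>\S+)"
    | eval type=if(searchmatch("setting value into memcached"), "set", "miss")
    | sort 0 key _time
    | streamstats current=f window=1 last(_time) AS setTime last(type) AS prevType BY key
    | where type="miss" AND prevType="set"
    | eval timeDiff=_time-setTime
    | table key setTime _time timeDiff

With current=f and window=1, streamstats carries each event's immediate predecessor within the same key, so every row is a miss paired with the set directly before it, and timeDiff can be compared against the configured TTL.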
Hi Team, I am not able to upload a local log file to my local Splunk instance. I am getting the error below in splunkd.log:

OneShotWriter failed to insert data to parsingQueue. Timed out in 5000 milliseconds.

Can someone please help me with this? Thanks
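A common first check here (a hedged sketch, not a definitive diagnosis): a full parsingQueue usually means a queue further downstream is backed up, which can be confirmed from the internal metrics.

    index=_internal source=*metrics.log group=queue blocked=true
    | stats count BY name

Any queue names that appear (parsingqueue, aggqueue, typingqueue, indexqueue) point to where the pipeline is stalling.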
Hi Splunk Experts, I tried to create the search but couldn't get it working. I need a search that throws an alert if an interface on a Cisco switch goes down and doesn't come back to the up state within 5 minutes. The logs are shown in the screenshot below. I managed to write the query for only one scenario, i.e. I can get an alert when the switch status changed state to down, but I can't correlate the two message fields back to back. Required alert scenario: when the first log containing the message field with "changed state to down" appears, and within 5 minutes the next log containing the message field with "changed state to up" doesn't appear, I must get an alert.     Thanks in advance...
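One hedged way to shape this (the index, sourcetype, and rex are assumptions; adjust to the real events). Run the alert every 5 minutes over a trailing window and trigger when the number of results is greater than 0:

    index=network sourcetype=cisco* ("changed state to down" OR "changed state to up")
    | rex field=_raw "Interface (?<interface>\S+), changed state to (?<state>up|down)"
    | stats latest(eval(if(state="down",_time,null()))) AS lastDown
            latest(eval(if(state="up",_time,null()))) AS lastUp
            BY host interface
    | where isnotnull(lastDown) AND (isnull(lastUp) OR lastUp<lastDown)
          AND lastDown<relative_time(now(),"-5m")

Each surviving row is an interface whose most recent transition was to down, more than 5 minutes ago, with no up event since.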
Hi! I am using a line chart in my dashboard, and I'm trying to make the x-axis labels constant; for example, to set all labels for the last year, in intervals of months. In addition, I don't want every dot on the line to be represented with a label.  As you can see in this pic (from the documentation), if you follow the blue line (please ignore the yellow line), on the x-axis we are not getting a label for each "dot" (value).
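A hedged Simple XML sketch: the charting.axisLabelsX.majorUnit option controls tick spacing, and for a time axis it takes an ISO 8601 duration, so P1M should label once per month while the fixed earliest/latest pins the span to the last year (the query is a placeholder):

    <chart>
      <search>
        <query>... | timechart span=1d count</query>
        <earliest>-1y</earliest>
        <latest>now</latest>
      </search>
      <option name="charting.axisLabelsX.majorUnit">P1M</option>
    </chart>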
Hi Guys, We are migrating our Splunk authentication from LDAP to Okta SAML. We have about 40-odd SAML groups set up in Splunk, and each SAML group has a different role. Now, there are many users who are in multiple SAML groups. The question is: how will Splunk decide which role that user takes? I know that LDAP authentication gives precedence to the connection order of the LDAP strategies, meaning if a user is in strategy #6 and strategy #7, he will be assigned the role which is assigned to LDAP strategy #6. I don't see this option in SAML. Any help would be appreciated. Thanks, Neerav
Hi there, I would appreciate it if anyone could help me with this question. I am trying to pump a local file to Splunk using Fluent Bit. The Splunk instance is currently HTTPS and secured. I keep encountering an "unexpected EOF" error message, and I am not sure what I have done wrong in the fluent-bit.config file.     This is the screenshot of Splunk's general settings.   Below is the fluent-bit.config that I used with fluent-bit.exe:

[INPUT]
    Name tail
    Tag taglog
    Path C:\*.json

[OUTPUT]
    Name splunk
    Match *
    Host localhost
    Port 443
    Splunk_Token <The HTTP Event Collector token generated in Splunk Web>
    TLS On
    TLS.Verify On
    http_user <The username login to Splunk Web>
    http_passwd <The password used to login to Splunk Web>
    splunk_send_raw On

When I set TLS.Verify to Off, I get a 303 HTTP status code.
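A hedged guess at the cause: port 443 is Splunk Web (which would explain the 303 redirect to the login page), while the HTTP Event Collector listens on its own port, 8088 by default. A minimal sketch of the output section pointed at HEC with token auth only, since the http_user/http_passwd basic-auth options shouldn't be needed alongside a token:

    [OUTPUT]
        Name            splunk
        Match           *
        Host            localhost
        Port            8088
        Splunk_Token    <The HTTP Event Collector token generated in Splunk Web>
        TLS             On
        TLS.Verify      Off
        Splunk_Send_Raw On

TLS.Verify Off is only for testing against HEC's default self-signed certificate; turn it back on once a trusted certificate is in place.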
Hi Team, We have configured the custom email for database alerts. In the email alert I need to include the hostname of the database and its OS type, i.e. whether it's Windows or Linux. Can anyone help me with the variables that I need to include in the email template in order to fulfill the above requirement? Thanks, Eswari
Hi there, Can anyone please advise how to onboard VM logs and Bastion logs from Azure to Splunk? I have installed the Splunk Add-on for Microsoft Cloud Services, but I am only receiving metrics logs from the Bastion event hub and the VM event hub. Please let me know how to get VM logs and Bastion logs from Azure to Splunk. Thanks in advance.