All Topics

We have a dashboard built in our Splunk Cloud instance using Dashboard Studio, with multiple panels using different data sources. One panel has several calculated timestamps displayed in UTC. When I open that panel in search, the timestamps are displayed in the user's preferred timezone. All other panels in the dashboard display timestamps in the user's preferred timezone. Any ideas why this might be happening?
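One way to narrow this down (a sketch; the field name epoch_field is hypothetical) is to make the timezone explicit in the calculated value itself, so you can see which timezone each surface is actually applying. strftime formats an epoch value using the timezone of the user running the search, and %Z prints the timezone name:

```spl
| eval display_time = strftime(epoch_field, "%Y-%m-%d %H:%M:%S %Z")
```

If the panel shows UTC while Search shows the preferred timezone, a likely culprit is that the calculation converts the time to a string before the dashboard renders it, leaving no epoch value for the dashboard to localize.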
My Splunk forwarder inputs.conf looks like this:

[batch://C:\Splunk\MyApi\Local\Api\*.json]
index = myapi_local
move_policy = sinkhole
disabled = 0
source = myapi
sourcetype = Api

My log files are generated every second. Is that perhaps a bit excessive? What's the best practice for using the forwarder? File name examples:

MyAPI_2022-12-08 23-06-28.json
MyAPI_2022-12-08 23-06-29.json
...

Thanks!
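For comparison, a monitor input is the more common pattern when the files should stay on disk; batch with move_policy = sinkhole deletes each file after indexing, which only suits a one-file-per-second feed if you don't need the files afterwards. A sketch (the ignoreOlderThan value is an assumption, not from the original post):

```ini
[monitor://C:\Splunk\MyApi\Local\Api\*.json]
index = myapi_local
disabled = 0
source = myapi
sourcetype = Api
# skip files whose modification time is older than a day
ignoreOlderThan = 1d
```

Either way, tracking many tiny files adds overhead on the forwarder, so having the API append to fewer, larger files (for example one per minute or per hour) is generally gentler than writing one file per second.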
Hi, I need to use a number of regression models on some index data. This index data is in an app called "XY". However, so far I have only been able to use regression modelling from within the Splunk MLTK app. Is it possible to call such functionality from the Splunk MLTK app while inside another app which holds the data you wish to model? Otherwise, what alternatives do I have in this situation? Many thanks,
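If the MLTK app's knowledge objects are shared globally (Settings > All configurations > Permissions), its search commands can usually be run from other apps. A minimal sketch, assuming MLTK is installed and using hypothetical index and field names:

```spl
index=xy_data
| fit LinearRegression target_field from feature1 feature2 into my_regression_model
```

and later, to score new data:

```spl
index=xy_data
| apply my_regression_model
```

If the fit/apply commands aren't found from inside app "XY", check that MLTK and its dependency, the Python for Scientific Computing add-on, are shared at the global level.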
Hello, I've spent probably 8+ hours now trying to debug how to get SSL certificates working with Splunk Web and finally got it working, so I'm posting this here to hopefully help someone in the future. Using these links as a reference:

https://docs.splunk.com/Documentation/Splunk/9.0.2/Security/Turnonbasicencryptionusingweb.conf
https://docs.splunk.com/Documentation/Splunk/9.0.2/Security/HowtoprepareyoursignedcertificatesforSplunk

The hardest part was figuring out how to convert the certificates provided by certbot into a format that Splunk recognizes. The following steps ended up working:

1) Create /opt/splunk/etc/system/local/web.conf by copying /opt/splunk/etc/system/default/web.conf, and change the line "enableSplunkWebSSL = false" to "enableSplunkWebSSL = true".

2) Install and configure certbot to obtain certificates as needed. They'll be in /etc/letsencrypt/live/$my_domain/ instead of /opt/splunk/etc/auth/splunkweb/, and they're not in a format that Splunk can use.

3) The second link above gives some guidance on how to prepare the certbot certificates in the format Splunk needs, which is: server certificate, private key, CA certificate. To do this, I created the following certbot post-renewal hook script at /etc/letsencrypt/renewal-hooks/post/splunk.sh:

#!/bin/bash
# change this my_domain variable to match the domain you are using
my_domain=XXXX
src_path=/etc/letsencrypt/live/$my_domain
dst_path=/opt/splunk/etc/auth/splunkweb
# concatenate server cert, private key, and CA chain in the order Splunk expects
cat $src_path/cert.pem $src_path/privkey.pem $src_path/fullchain.pem > $dst_path/cert.pem
cat $src_path/privkey.pem > $dst_path/privkey.pem
chown splunk:splunk $dst_path/cert.pem $dst_path/privkey.pem
chmod 600 $dst_path/cert.pem $dst_path/privkey.pem
/opt/splunk/bin/splunk restart
#EOF

And make the script executable:

chmod +x /etc/letsencrypt/renewal-hooks/post/splunk.sh

4) Since you've already obtained the certificate with certbot, you can run the script directly:

/etc/letsencrypt/renewal-hooks/post/splunk.sh

The script should then run automatically whenever certbot renews your certificate.
Would someone know how to find out who is logged into a specific computer? Thanks in advance!
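Assuming Windows Security event logs are being collected (the index name, host value, and field names below are assumptions based on a typical Splunk Add-on for Microsoft Windows setup), successful logons are EventCode 4624:

```spl
index=wineventlog EventCode=4624 host="TARGET-PC"
| stats latest(_time) as last_logon by user
| convert ctime(last_logon)
```

Filtering on Logon_Type (2 = interactive, 10 = remote interactive) helps exclude service and network logons from the results.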
Hello! I recently installed security patches on the Linux server where my heavy forwarder is installed, but after updating the JRE path to the new one, the add-on does not work. I updated the JRE path, upgraded the Splunk DB Connect add-on to the latest version, and upgraded my heavy forwarder to version 9.0.2, and it still does not work. Does anyone know what else I can do? Thank you in advance.
What are the benefits of creating health rules for baseline and browser metrics?
Hello Splunkers, I'm looking for a Splunk search to list all indexes that were not used by users in the last 30 days. I've tried the below query against the audit logs, but it's not giving me accurate results. This query only returns a few indexes, not all the indexes that were used.

index=_audit sourcetype=audittrail action=search info=granted (NOT TERM(index=_*))
| where match(search, "index\s*(?:=|IN)")
| rex max_match=0 field=search "'search index=(?<used_index>\w+)'"
| stats count by used_index

I'd appreciate it if anyone could share some thoughts on this.
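One hedged alternative is to start from the full index list via REST and subtract the indexes seen in the audit trail, rather than trying to enumerate used indexes directly; the rex below is a loosened version of the one above and will still need tuning for your environment:

```spl
| rest /services/data/indexes
| stats count by title
| rename title as index
| search NOT [
    search index=_audit sourcetype=audittrail action=search info=granted earliest=-30d
    | rex max_match=0 field=search "index\s*(?:=|IN)\s*\"?(?<index>[\w-]+)"
    | mvexpand index
    | stats count by index
    | fields index ]
```

A known limitation of any audit-based approach: it only catches indexes named explicitly in the search string, so searches that hit default indexes, macros, or eventtypes are missed.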
Splunk Enterprise on-prem, 8.2.6, Linux. I understand Dashboard Studio, at least on 8.2.6, is very limited in the drilldown area, but there must be a way to pass the earliest and latest time from one Dashboard Studio dashboard to another. I have my first-level dashboard passing earliest/latest in the URL (I have tried with and without quotes):

https://10.178.1.121:8000/en-US/app/filetracker/ds_begin_clone/edit?earliest="2022-12-07T18:00:00.000"&latest="2022-12-07T18:05:00.000"

I cannot seem to figure out how to set the global time picker on the dashboard. Is this possible? Up to now I have been using Classic, because I can generate links in my alerts to take staff directly to the data they need to troubleshoot, but so far only into another Classic dashboard. I can provide a much better explanation of the problem in Dashboard Studio. How can I pass parameters from one Dashboard Studio dashboard and use them in another?
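A sketch of one thing to try (hedged: token-in-URL support in Dashboard Studio varies by version, and the token name below assumes the default time picker definition, where the input input_global_trp is bound to the token global_time): pass the values using the form.&lt;token&gt; URL convention rather than bare earliest/latest, and target the view URL rather than the /edit URL:

```text
https://10.178.1.121:8000/en-US/app/filetracker/ds_begin_clone?form.global_time.earliest=2022-12-07T18:00:00.000&form.global_time.latest=2022-12-07T18:05:00.000
```

If your dashboard's JSON uses a different token name for its time range input, substitute that name after "form.".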
I want to store the Splunk dashboard code in GitLab or Bitbucket so I do not lose the dashboard. Any idea if it's possible?
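Dashboard definitions are ordinary knowledge objects, so one approach is to pull them over the REST API and commit the result to your repository. A sketch, assuming REST access on management port 8089 and substituting your own host, app, and dashboard name (all placeholders here):

```shell
# fetch the dashboard definition (XML for Classic; Studio dashboards embed their JSON in the XML)
curl -k -u admin \
  "https://splunk.example.com:8089/servicesNS/-/search/data/ui/views/my_dashboard"
```

On Splunk Cloud the management port isn't directly reachable by default; there, copying the source from the dashboard's Edit > Source view into a file and committing that is the simpler route.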
Hi all, I have created a dashboard incorporating a few external domains, and I am receiving an error message like: "The dashboard is attempting to receive content from outside of Splunk. The content URLs are not in the dashboard's trusted domains list." Thanks.
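On Splunk Enterprise the allow-list lives in web.conf; a sketch (the stanza key is dashboards_trusted_domain.&lt;name&gt;, and the name and domain below are placeholders):

```ini
[settings]
dashboards_trusted_domain.my_external_site = https://example.com
```

On Splunk Cloud, where you can't edit web.conf yourself, the equivalent setting is exposed in the UI under server settings as a dashboards trusted domains list, depending on your version.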
Hi all, recently my customer asked me to integrate different JSON log sources (VPN concentrator, WAF, and load balancers) coming from a single Azure event hub. I onboarded it using the Splunk Add-on for Microsoft Cloud Services (https://splunkbase.splunk.com/app/3110) from the Inputs Data Manager (IDM) instance, and I selected the default sourcetype "mscs:azure:eventhub". At this point I need to split this sourcetype into three new ones, one for each log type (VPN concentrator, WAF, and load balancers), distinguishing them and creating custom field extractions and so on for the data models. I found a field "category" within the JSON logs which can be used as the splitting criterion. Have you any idea how to do that? Thanks in advance!
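A common pattern is an index-time sourcetype rewrite on the first full Splunk instance the data passes through (the IDM here). A sketch, where the category values in the REGEX lines and the new sourcetype names are placeholders you'd replace with the values actually observed in your JSON:

props.conf:

```ini
[mscs:azure:eventhub]
TRANSFORMS-split_by_category = set_st_vpn, set_st_waf, set_st_lb
```

transforms.conf:

```ini
[set_st_vpn]
REGEX = "category"\s*:\s*"VpnGatewayLog"
DEST_KEY = MetaData:Sourcetype
FORMAT = sourcetype::mscs:azure:eventhub:vpn

[set_st_waf]
REGEX = "category"\s*:\s*"ApplicationGatewayFirewallLog"
DEST_KEY = MetaData:Sourcetype
FORMAT = sourcetype::mscs:azure:eventhub:waf

[set_st_lb]
REGEX = "category"\s*:\s*"LoadBalancerAlertEvent"
DEST_KEY = MetaData:Sourcetype
FORMAT = sourcetype::mscs:azure:eventhub:lb
```

Field extractions and data-model mappings can then be attached to the three new sourcetypes. Note these props/transforms must be deployed where the data is parsed (the IDM), not on the search head.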
I am trying to create a custom metric on the database. I have selected the database, but when I add a query it gives me the error "ORA-00942: table or view does not exist". The query returns just one value when it is run directly against the database. Can you advise why this error is being thrown?
Hello Experts, I am trying to delete the fishbucket, but I want to delete it only for one index (index=syslog). Is there a command I can run that deletes it only for a particular index? Thanks in advance.
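For background: the fishbucket tracks ingestion state per monitored file, not per index, so there isn't a per-index delete. If the goal is to re-index particular files, btprobe can reset individual entries. A sketch (the paths are placeholders for your installation and source files):

```shell
# reset the fishbucket entry for one file so it will be re-read from the start
/opt/splunk/bin/splunk cmd btprobe -d /opt/splunk/var/lib/splunk/fishbucket/splunk_private_db \
  --file /var/log/syslog --reset
```

You'd run this once per file feeding the syslog index; deleting the whole fishbucket directory instead would cause every monitored file on the host to be re-indexed.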
Hello All, I recently started ingesting VPC flow logs from my AWS environment using the Data Manager app, and everything works fine in terms of getting the logs into Splunk. There is however one issue: when creating the VPC flow logs on AWS, we opted for a custom format to be able to glean additional fields like "pkt-srcaddr" and "pkt-dstaddr". As a result, Splunk does not correctly interpret the logs on the console. I believe that Splunk is reading the logs using the default log format detailed below:

Default format:
${version} ${account-id} ${interface-id} ${srcaddr} ${dstaddr} ${srcport} ${dstport} ${protocol} ${packets} ${bytes} ${start} ${end} ${action} ${log-status}

How do I get it to read the logs using the custom format detailed below?

Custom format:
${version} ${account-id} ${vpc-id} ${subnet-id} ${interface-id} ${instance-id} ${flow-direction} ${srcaddr} ${dstaddr} ${srcport} ${dstport} ${pkt-srcaddr} ${pkt-dstaddr} ${protocol} ${packets} ${bytes} ${start} ${end} ${action} ${log-status}
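Splunk's delimited-field extraction can be pointed at the custom field order. A sketch as search-time props/transforms (the sourcetype name is an assumption; check what Data Manager actually assigned, and note that any existing extraction for the default format may need to be overridden):

props.conf:

```ini
[aws:cloudwatchlogs:vpcflow]
REPORT-vpcflow_custom = extract_vpcflow_custom
```

transforms.conf:

```ini
[extract_vpcflow_custom]
DELIMS = " "
FIELDS = version, account_id, vpc_id, subnet_id, interface_id, instance_id, flow_direction, src_ip, dest_ip, src_port, dest_port, pkt_src_ip, pkt_dest_ip, protocol_code, packets, bytes, start_time, end_time, vpcflow_action, log_status
```

The FIELDS list must match the custom format's order exactly, one field name per ${...} token.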
I have a table with 3 columns: _time, type and action.

| makeresults count=10
| eval type = "typeA"
| eval action = if((random()%2) == 1, "open", "close")
| union
    [| makeresults count=10
     | eval type = "typeB"
     | eval action = if((random()%2) == 1, "open", "close")]

I need to create a column for each type that would identify the change in the action column and count the consecutive actions in ascending order, like this:

_time                typeA  typeB  typeA_count  typeB_count
2022-01-01 05:00:00  open   close  1            1
2022-01-01 05:00:01  open   open   2            1
2022-01-01 05:00:02  close  close  1            1
2022-01-01 05:00:03  open   open   1            1
2022-01-01 05:00:04  close  open   1            2
2022-01-01 05:00:05  open   close  1            1
2022-01-01 05:00:06  open   close  2            2
2022-01-01 05:00:07  open   close  3            3
2022-01-01 05:00:08  close  open   1            1
2022-01-01 05:00:09  open   close  1            1

Thanks
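A sketch of one way to compute run lengths per type and then pivot (hedged: this assumes exactly one event per type per _time, as in the example table, and may need tuning on real data):

```spl
| sort 0 type _time
| streamstats current=f last(action) as prev_action by type
| eval run_start = if(isnull(prev_action) OR action != prev_action, 1, 0)
| streamstats sum(run_start) as run_id by type
| streamstats count as run_count by type run_id
| eval {type} = action, {type}_count = run_count
| stats values(typeA) as typeA values(typeB) as typeB
        values(typeA_count) as typeA_count values(typeB_count) as typeB_count by _time
```

The first streamstats detects where action changes within each type, the second numbers each run, and the third counts the position within the run; the eval/stats pair then spreads the two types across columns using dynamic field names.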
Hi, I need to extract the value FISOBPIT10101 from lines like the one below.

message:PSUS7|8897|FISOBPIT10101|OWA|8897|8897|SignOnID|SPT|adding routing key in producer
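The value is the third pipe-delimited token, so a rex that skips the first two delimiters works; a sketch (the name extracted_value is arbitrary, and whether to use field=message or field=_raw depends on where the text actually lives in your events):

```spl
| rex field=_raw "message:[^|]+\|[^|]+\|(?<extracted_value>[^|]+)"
```

If the field already contains only the pipe-delimited payload without the "message:" prefix, drop the prefix and anchor at the start instead: "^[^|]+\|[^|]+\|(?&lt;extracted_value&gt;[^|]+)".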
Hi, I have a multivalue field with the values A B C, and I want it formatted as 'A','B','C'. I tried eval mvjoin, but the first and last values miss their quotes, like this: A','B','C. The SPL used is:

index=* sourcetype=* host=abc NAME IN ("*A*","*B*","*C*")
| stats values(NAME) as NAME by host
| eval NAME = mvjoin(NAME,"','")

Any help would be appreciated, thank you.
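mvjoin only inserts the delimiter between values, so the outer quotes have to be added separately; concatenating them around the join closes the gap:

```spl
index=* sourcetype=* host=abc NAME IN ("*A*","*B*","*C*")
| stats values(NAME) as NAME by host
| eval NAME = "'" . mvjoin(NAME, "','") . "'"
```

For the example values this yields 'A','B','C'.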
I am working on Splunk Cloud, I don't have access to the server, and I am using Dashboard Studio. This is my table code (I have also attached a screenshot of my table); I just want to know how I can add a tooltip to each column header of my table.

"viz_zfv78G8Y": {
  "type": "splunk.table",
  "title": "GP Metrics",
  "options": {
    "tableFormat": {
      "align": "> table |pick(alignment)"
    },
    "columnFormat": {
      "messegetype": {
        "data": "> table | seriesByName(\"messegetype\") | formatByType(messegetypeColumnFormatEditorConfig)"
      }
    }
  },
  "dataSources": {
    "primary": "ds_f9ztfdW3"
  }
}
Hi All, I am looking for a dashboard that can show the current active sessions and user details for F5 VPN. I checked a few apps, but those are old and not supported on Splunk Cloud. Does anyone know how we can achieve this requirement? Currently the logs are being pushed via syslog to Splunk Cloud. Any recommendations and help are highly appreciated.