All Topics

Hello! Using AppDynamics (SaaS Pro Edition), we would like to collect logs from Azure so we can build useful custom dashboards. The data we are interested in is currently collected by Log Analytics in Azure, but we would like to have some of that data in AppDynamics as well. All of the documentation on Log Analytics in AppDynamics mentions installing the Analytics Agent, but I don't think that applies to Azure App Services, for example: only the AppDynamics App Agent extension is available there, and afterwards there is no configuration to enable the Analytics Agent that you need to select while configuring Agent Scope/Source Rules. Is there a way to make use of Analytics Agents on Azure to fetch logs? Or is there another way to fetch the logs from Azure App Services that we can later see in Log Analytics or App Insights? I couldn't find an answer in the documentation. Thank you in advance! Marcin
I have one host that I want to exclude from all of the premade dashboards in the Splunk App for AWS Security Dashboards. Can someone tell me where I would enter this in the source code for each dashboard so that it always excludes this host?
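One common approach (a sketch only; the index, sourcetype, and host name below are illustrative, not the app's actual base search) is to edit each dashboard's Simple XML and append a host exclusion to its query:

```
<search>
  <query>index=aws sourcetype=aws:cloudtrail host!="badhost" ...</query>
</search>
```

Replace `badhost` with the host to exclude; the exclusion has to be added to every panel or base search that should ignore it.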
Dear Splunkers, is enabling maintenance mode on an indexer cluster necessary when rebuilding frozen buckets?
Hi, I have configured an application on the SaaS platform; the application agent is installed successfully and is sending business transaction data. But under Application Infrastructure Performance -> Tier -> Hardware Resources, the value for every metric is null ("no data to display" message). Please help: is there any change to be made to capture the host metrics data?
Hi, I'm using the Community Edition of SOAR. I created a label, then created a playbook and set that playbook to operate on that label. For automated runs, I used the Timer app to run the playbook on that label every morning. Now I get an error: when run by automation, the newly created label produces an error, but an older label (created two days earlier) works properly. And if I run the playbook manually on the newly created label, it also works properly. This is the error: 'id: 789, version: 32, pyversion: 3, scm id: 2' playbook cannot be run on 'test'. Kindly help me out with this error. Thanks in advance.
In the Splunk Fortinet FortiGate app, the Wireless and System dashboards are not working: neither dashboard shows any data. As far as I can check from the Fortinet app, all logs are forwarded, and the other Splunk Fortinet dashboards (Fortinet Network Security, Traffic, Unified Threat Management, VPN) work perfectly. Please help.
Just putting this here for others who come across this problem, since I got no results when I searched here. After upgrading to Splunk 9.0.1 and configuring an SSL cert for the kvstore, I got this error on one of my two instances. On Windows, the kvstore relies on the server cert being in the Windows local machine certificate store. At startup, Splunk converts the supplied PEM (with embedded cert and password-protected key) into a PFX, which it then imports into the store. The error referenced in the subject is preceded by:

ERROR MongodRunner [9060 KVStoreConfigurationThread] - Command cmd="{CMD.EXE /C ( "C:\Program Files\Splunk\bin\openssl.exe" pkcs12 -inkey "C:\Program Files\Splunk\etc\auth\mycerts\splunkd.pem" -in "C:\Program Files\Splunk\etc\auth\mycerts\splunkd.pem" -passin pass:xxxx -export -out "C:\Program Files\Splunk\etc\auth\mycerts\splunkd.pem.pfx" -passout pass:xxxx )}" failed: exited with code 1. unable to load private key\r\n10460:error:06065064:digital envelope routines:EVP_DecryptFinal_ex:bad decrypt:.\crypto\evp\evp_enc.c:590:\r\n10460:error:0906A065:PEM routines:PEM_do_header:bad decrypt:.\crypto\pem\pem_lib.c:476:\r\n

I tried that command and it did not work. I tried the openssl command by itself in PowerShell and it DID work. The problem turned out to be one of the special characters in the password I had set on my private key. I used OpenSSL to write out a new copy of the key with a different password containing no special characters, and lo and behold, it worked. Just be careful to replace the new password in all locations it might be used (e.g., twice in server.conf and in inputs.conf on an indexer). The command for changing the password on your key is as follows:

.\openssl.exe rsa -aes256 -in mykey.key -out mynewkey.key -passin pass:oldpassword -passout pass:newpassword

Just make sure you are aware of what the special character in your old password might be, and use PowerShell, not the command prompt.
Hi Team, I have a field whose values are strings in the format HH:MM:SS.3N, for example:

0:00:43.096
22:09:50.174
1:59:54.382
5:41:21.623
0:01:56.597

I want to convert the whole duration into minutes, and anything under a minute should be considered 1 minute.
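One possible SPL sketch (assuming the field is named `duration`, a hypothetical name, and always has the H:MM:SS.mmm shape): split on the colons, convert each part to minutes, and round up so anything under a minute becomes 1.

```
| eval parts = split(duration, ":")
| eval total_minutes = tonumber(mvindex(parts, 0)) * 60
    + tonumber(mvindex(parts, 1))
    + tonumber(mvindex(parts, 2)) / 60
| eval minutes = max(1, ceiling(total_minutes))
```

For example, 0:00:43.096 gives total_minutes of about 0.72, which `ceiling` rounds up to 1; the `max(1, ...)` guards the degenerate all-zero case.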
The app is unable to collect metric data (metric_name="Memory.Page_Reads/sec"). Can anyone help with the app script? The operating system is Linux.
In my attached picture, these many events should become one event per ID instead of so many. How can I break those events by ID? The ID is on the first line, like "DRCProvision-[1663729240506]".
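Event breaking is controlled by props.conf on the parsing tier (indexer or heavy forwarder). A sketch, assuming a hypothetical sourcetype name and that every event starts with the DRCProvision-[...] marker shown in the question:

```
# props.conf (sourcetype name is a placeholder)
[my:drc:sourcetype]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)(?=DRCProvision-\[\d+\])
```

The lookahead makes Splunk start a new event only at lines beginning with the ID marker, so the lines in between stay attached to their ID.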
Hi, I have set up an alert, and under Actions I have added 'Add to Triggered Alerts'. I would like to be able to use an API to retrieve the actual results of a specific triggered alert (for example, get the results of the alert triggered at 17:43). I am using alerts/fired_alerts/<alert_name>, but it just gives me the trigger history. Is it possible to retrieve the actual results, preferably in JSON?
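A sketch of one possible approach (assuming the triggering job's artifact has not yet expired): each fired_alerts entry carries the `sid` of the search that triggered, and `loadjob` can replay that job's results.

```
| rest /servicesNS/-/-/alerts/fired_alerts/<alert_name>
| table savedsearch_name, trigger_time, sid
```

Then take the `sid` for the 17:43 firing and run `| loadjob <that_sid>` to get the actual result rows, or fetch them as JSON over REST from the search jobs results endpoint for that sid with output_mode=json.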
Hello Team, I am running the query below to get the stats, but I would like the store numbers in serial (numeric) order. Can you help me with the query?

index=ABC env="XYZ" StoreNumber="*" | sort by StoreNumber | stats count by StoreNumber, country, Application

StoreNumber  country  count
1            US       22
100          US       7
100          US       9
100          US       2
1000         US       13
1000         US       10
1002         US       9
1002         US       32
1018         US       22
1018         US       1
104          US       3
104          US       6
1055         US       9
1055         US       28
1081         US       39
1081         US       38
1086         US       1
1086         US       6
1086         US       1
109          US       1
109          US       2
1094         US       3
1094         US       9
11           US       3
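The ordering shown (1, 100, 1000, 104, ...) is lexicographic, because StoreNumber is a string. Sorting numerically after the stats should give serial order; a sketch:

```
index=ABC env="XYZ" StoreNumber="*"
| stats count by StoreNumber, country, Application
| sort 0 num(StoreNumber)
```

Sorting before `stats` has no effect on the output order, since `stats` regroups the results; the `0` removes sort's default 10,000-row limit.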
Hi, a Splunk 9 Universal Forwarder is getting an "[app key value store migration collection data is not available]" error after an upgrade from Splunk 8.2.4. Splunk is not starting:

app key value store migration collection data is not available

Why is the UF looking for the KV store? Any suggestions to resolve it?
Hi all, I'm trying to create a "Fallback escalation rate" for a chatbot. This rate would be calculated by users that hit the fallback intent, and then ask for an agent anytime after that, within a given session.  For context, when a user says something, we use an intent classifier to try and match it to an intent. If we can't match the user input to an intent, it hits our fallback intent. And if a user asks for an agent, it hits the followup_live_agent intent. Each session contains multiple events, and each event represents one intent.  Today, we calculate "Escalation rate" by counting the sessions with at least one "followup_live_agent" intent. Here's the search query I created for that:      index=conversation sourcetype=cui-orchestration-log botId=123456 | eval AgentRequests=if(match(intent, "followup_live_agent"), 1, 0) | stats sum(AgentRequests) as Totaled by sessionId | eval Cohort=case(Totaled=0, "Cooperated", Totaled>=1, "Escalated") | stats count by Cohort | eventstats sum(count) as Total | eval Agent_Request_Rate = round(count*100/Total,2)."%" | fields - Total | where Cohort="Escalated"       I need to know how to calculate this same thing, but only after the fallback intent is hit. I figure I need to retain the timestamp and do some calculation using that. I'm not even sure how to get started on this, so if anyone could point me in the right direction, that would be really helpful. 
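One way to express "agent request at any time after the first fallback" is to order each session's events by time and carry the fallback count forward with streamstats. A sketch, assuming the fallback intent is literally named `fallback` (adjust the match if yours differs):

```
index=conversation sourcetype=cui-orchestration-log botId=123456
| sort 0 sessionId, _time
| eval isFallback=if(match(intent, "fallback"), 1, 0)
| eval isAgent=if(match(intent, "followup_live_agent"), 1, 0)
| streamstats sum(isFallback) as fallbacksSoFar by sessionId
| eval escalatedAfterFallback=if(isAgent=1 AND fallbacksSoFar>0, 1, 0)
| stats max(escalatedAfterFallback) as Escalated, max(isFallback) as HitFallback by sessionId
| where HitFallback=1
| stats count as Total, sum(Escalated) as EscalatedCount
| eval Fallback_Escalation_Rate = round(EscalatedCount*100/Total, 2)."%"
```

The `streamstats` running sum means an agent-request event only counts when at least one fallback occurred earlier in the same session, and the final rate is computed over sessions that hit fallback at all.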
Hello, I have installed the DB Connect add-on. After restarting and logging into the app, it keeps loading indefinitely until the error message appears. I have gone through all the threads related to this error message, but none of them have helped me solve the problem.

root@myhost:/usr# java -version
openjdk version "1.8.0_242"
OpenJDK Runtime Environment (build 1.8.0_242-8u242-b08-1~deb9u1-b08)
OpenJDK 64-Bit Server VM (build 25.242-b08, mixed mode)

index=_internal sourcetype=dbx*
Hello All, can someone help me with the steps to upgrade the Splunk Universal Forwarder on Linux machines? I appreciate your help. Thanks,
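For a tarball install, the usual pattern is roughly: stop the forwarder, extract the new release over the existing directory, and start it again. A sketch, assuming the default /opt/splunkforwarder path; `<version>` is a placeholder for the release you downloaded:

```
/opt/splunkforwarder/bin/splunk stop
tar -xzf splunkforwarder-<version>-Linux-x86_64.tgz -C /opt
/opt/splunkforwarder/bin/splunk start --accept-license
```

If the forwarder was installed from an RPM or DEB package instead, upgrade with the package manager (e.g., `rpm -U` on the new package) rather than the tarball, so the install method stays consistent.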
Let's say we have an alert with a few fields, like:

| search <INSERT_RANDOM_BASE_QUERY> | table src_ip, _time, dest_ip | rename _time as "Time", src_ip as "Source IP", dest_ip as "Destination IP"

And we want to suppress on "Source IP" and "Destination IP" being the same. Should our suppress fields look like:

alert.suppress.fields = "Source IP","Destination IP"

Or:

alert.suppress.fields = Source IP,Destination IP

?
Long story short, I had been indexing my own data for years and recently started forwarding upstream to another cluster. I don't need to index on my network anymore and just want my indexer to serve as a heavy forwarder, so I don't have to reconfigure 600+ endpoints. Is this feasible, or will I break lots of things? Thanks!
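On a heavy-forwarder role, local indexing can be turned off in outputs.conf while forwarding continues. A sketch, with hypothetical receiver addresses and group name:

```
# outputs.conf (server addresses are placeholders)
[indexAndForward]
index = false

[tcpout]
defaultGroup = upstream_cluster

[tcpout:upstream_cluster]
server = idx1.example.com:9997, idx2.example.com:9997
```

With `index = false`, the instance parses and forwards but no longer writes events to its own indexes; searches against it would then only find data already indexed before the change.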
Does anyone have steps for troubleshooting parse-time or index-time issues? The use case is a sourcetype override, or sending things to nullQueue to filter. The reason for asking is that I didn't see anything in the internal logs or search strings that was obvious to me. Any tips would help. Thanks in advance.
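A common starting point is to confirm which props and transforms actually take effect on the parsing tier with btool, since overrides and nullQueue routing only work where the data is first parsed. A sketch; `<sourcetype>` is a placeholder for the stanza being checked:

```
$SPLUNK_HOME/bin/splunk btool props list <sourcetype> --debug
$SPLUNK_HOME/bin/splunk btool transforms list --debug
```

The `--debug` flag shows which .conf file each setting comes from, which makes precedence problems (e.g., the override living on a forwarder that never parses) visible; pairing this with a search over `index=_internal` for the affected sourcetype can confirm whether events are reaching the indexer at all.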