All Topics

Experts, our Splunk dashboard was converted from XML to an HTML file. On the left-hand side of the page there are hyperlinks to various dashboards/views. When we click on those links, the corresponding dashboard/view loads on the right-hand side of the page in an iframe. Currently I am getting the message "Splunk has deprecated HTML dashboards. We recommend all HTML dashboards to be built in Dashboard Studio". Our customer uses Splunk Enterprise 8.2.6. To fix the above error, I tried to rebuild the dashboard from HTML back to XML. The dashboard loads properly with the hyperlinks on the left side of the page, but the dashboards/views are not loading on the right-hand side of the page in the iframe. Splunk seems to be automatically removing the iframe. Could you please help me fix this issue? Thanks, Ravikumar
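For reference, a minimal Simple XML sketch of an html panel embedding another view in an iframe; the app and view names here are illustrative, not from the original dashboard. Note that newer Splunk versions sanitize html panel contents and may strip iframe elements, which would match the behaviour described above:

```
<dashboard>
  <row>
    <panel>
      <html>
        <!-- illustrative target view; Splunk may remove this element during sanitization -->
        <iframe src="/app/search/my_target_view" width="100%" height="600"></iframe>
      </html>
    </panel>
  </row>
</dashboard>
```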
Hello Splunk Community! Regarding extracting new fields in a Splunk search: what is the lifespan of the newly created fields? Will they still be available after re-login, and are they available to all users? And can they be easily removed later? Thank you in advance!
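For context on lifespan: a field created inline with rex exists only for the duration of that one search, while an extraction saved in props.conf (or via the Field Extractor UI) persists across logins, can be shared with other users through its permissions, and can be deleted later like any other knowledge object. A minimal sketch, assuming a hypothetical sourcetype and field name:

```
# props.conf -- persistent extraction, shareable via permissions
[my_sourcetype]
EXTRACT-user = user=(?<user>\w+)
```

The same pattern run inline (`| rex "user=(?<user>\w+)"`) produces the field for that search only and is gone when the search ends.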
Hi geeks, I integrated TheHive and Cortex with Splunk ES to get alerts after a correlation search rule triggers. Per the attached Image-01, please help me fill in the correct values for "Data field name" and "Datatype field name". Also, do I have to specify the exact name as it appears in Cortex to identify the "Analyzers"?

Image-01:

Image-02:

Image-03:

Regards, Amir
Hi everyone, I have limited disk space on the /var/log path, so I am trying to manage Phantom log rotation (following this link: Configure the logging levels for Splunk SOAR (On-premises) daemons - Splunk Documentation), but I found a large file named "app_interface.log" that is not included in phantom_logrotate.conf. Does anyone have any suggestions on what kind of records are collected in this file, and what the best practice is to rotate it? Thank you
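If app_interface.log really is outside phantom_logrotate.conf, one option is a standard logrotate stanza of your own. This is only a sketch: the path, schedule, and retention are assumptions to adapt, and copytruncate is used on the assumption that the daemon keeps the file open and would not reopen a renamed file:

```
# /etc/logrotate.d/phantom_app_interface (illustrative)
/var/log/phantom/app_interface.log {
    weekly
    rotate 4
    compress
    missingok
    notifempty
    copytruncate
}
```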
Hi everyone, I want to fill in the delta where it is null in the middle of a time series, by taking the next non-null delta and dividing it by (count of nulls + 1). Here is the data:

time        id    value  delta
01/02/2022  123   12
02/02/2022  123   15     3
03/02/2022  123   20     5
04/02/2022  123
05/02/2022  123
06/02/2022  123
07/02/2022  123   60     40
08/02/2022  123   60     0
09/02/2022  123
10/02/2022  123
01/02/2022  145   20
02/02/2022  145   50     30
03/02/2022  145   70     20
04/02/2022  145   100    30
05/02/2022  145
06/02/2022  145
07/02/2022  145   190    90
08/02/2022  145
09/02/2022  145
10/02/2022  145
01/02/2022  987   50
02/02/2022  987   100    50
03/02/2022  987   160    60
04/02/2022  987   200    40
05/02/2022  987   230    30
06/02/2022  987   280    50
07/02/2022  987   360    80
08/02/2022  987   420    60
09/02/2022  987   500    80
10/02/2022  987   550    50

Here is the result after I pivot it:

time        123   145   987
01/02/2022  0     0     0
02/02/2022  3     30    50
03/02/2022  5     20    60
04/02/2022  10    30    40
05/02/2022  10    30    30
06/02/2022  10    30    50
07/02/2022  10    30    80
08/02/2022  0     0     60
09/02/2022        0     80
10/02/2022        0     50

As the table shows, there is one record on 08/02/2022 where delta is genuinely 0 (id 123). Where there is no data at all, the cell is null, which is fine. But for id=145, the delta from 08/02 to 10/02 should be empty, so that it does not affect avg(delta). So my question is: how do I distinguish the zero from the null in this case? Thanks!
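On the narrow question of telling a real 0 apart from a null after the pivot, one untested sketch using the field names above: tag the gap rows with a sentinel before pivoting, then restore the nulls afterwards, so avg(delta) ignores the gaps but still counts genuine zeros. The xyseries/foreach combination is an assumption about how the pivot is being done:

```
| eval delta=if(isnull(value), "GAP", delta)
| xyseries time id delta
| foreach * [ eval <<FIELD>>=if('<<FIELD>>'=="GAP", null(), '<<FIELD>>') ]
```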
Greetings!!

I'm getting warning alerts telling me that the Splunk forwarder is not active. As shown in the screenshot below, the forwarder is running (/opt/splunkforwarder/bin/splunk status), but in the Monitoring Console under Forwarder: Management it shows a missing status, as in the screenshot above. Even when I try to stop and restart the splunkforwarder service (/opt/splunkforwarder/bin/splunk stop), it cannot be stopped, as shown in the screenshot below. Kindly help me fix this error.

Another error while searching (I am running splunk_security_essentials version 3.0.0):

Error 1: Could not load lookup=LOOKUP-splunk_security_essentials

Error 2: What is the root cause of this error, and how do I fix it?

I also found these warnings and errors in splunkd.log:

05-22-2022 19:54:02.957 +0200 WARN SSLCommon - Received fatal SSL3 alert. ssl_state='SSLv3 read server certificate B', alert_description='certificate expired'.
05-22-2022 19:54:02.957 +0200 ERROR TcpOutputFd - Connection to host=x.x.x.17:9997 failed. sock_error = 0. SSL Error = error:14090086:SSL routines:ssl3_get_server_certificate:certificate verify failed - please check the output of the `openssl verify` command for the certificates involved; note that if certificate verification is enabled (requireClientCert or sslVerifyServerCert set to "true"), the CA certificate and the server certificate should not have the same Common Name.
05-22-2022 19:54:02.960 +0200 ERROR X509Verify - X509 certificate (O=SplunkUser,CN=SplunkServerDefaultCert) failed validation; error=10, reason="certificate has expired"
05-22-2022 19:54:02.960 +0200 WARN SSLCommon - Received fatal SSL3 alert. ssl_state='SSLv3 read server certificate B', alert_description='certificate expired'.
05-22-2022 19:54:02.960 +0200 ERROR TcpOutputFd - Connection to host=x.x.x.16:9997 failed. sock_error = 0. SSL Error = error:14090086:SSL routines:ssl3_get_server_certificate:certificate verify failed - please check the output of the `openssl verify` command for the certificates involved; note that if certificate verification is enabled (requireClientCert or sslVerifyServerCert set to "true"), the CA certificate and the server certificate should not have the same Common Name.
05-22-2022 19:54:02.964 +0200 ERROR X509Verify - X509 certificate (O=SplunkUser,CN=SplunkServerDefaultCert) failed validation; error=10, reason="certificate has expired"
05-22-2022 19:54:02.964 +0200 WARN SSLCommon - Received fatal SSL3 alert. ssl_state='SSLv3 read server certificate B', alert_description='certificate expired'.
05-22-2022 19:54:02.964 +0200 ERROR TcpOutputFd - Connection to host=x.x.x.14:9997 failed. sock_error = 0. SSL Error = error:14090086:SSL routines:ssl3_get_server_certificate:certificate verify failed - please check the output of the `openssl verify` command for the certificates involved; note that if certificate verification is enabled (requireClientCert or sslVerifyServerCert set to "true"), the CA certificate and the server certificate should not have the same Common Name.

Kindly help and guide me on how to fix this. Thank you in advance.
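Reading the splunkd.log excerpt above, it is explicit about the cause: the default certificate (CN=SplunkServerDefaultCert) used for forwarding to port 9997 has expired, so every TCP-output connection fails verification. A quick way to confirm the expiry date is to inspect the certificate with the openssl bundled with Splunk; this is a sketch, and the paths assume a default install (the relevant PEM may differ in your deployment):

```
$SPLUNK_HOME/bin/splunk cmd openssl x509 -enddate -noout \
    -in $SPLUNK_HOME/etc/auth/server.pem
```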
Hi All, I have installed the Splunk UF on Windows. I have one static log file (JSON) on the system that needs to be monitored. I have configured this in the inputs.conf file. I see only the System/Application and Security logs being sent to the indexer, whereas the static log file is not. I ran "splunk list inputstatus" and checked:

C:\Users\Administrator\Downloads\test\test.json
file position = 75256
file size = 75256
percent = 100.00
type = finished reading

So the file is being read properly. What could the issue be that I don't see the test.json logs on the Splunk side? I tried checking index=_internal on the indexer but was not able to figure out what is causing the issue, and I checked a few blogs on the Internet as well. Can anyone please help with this? Thanks in advance, a Splunk newbie
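For comparison, a minimal monitor stanza sketch; the path is from the post, while the index and sourcetype names are illustrative. Since inputstatus reports the file fully read, it is also worth checking that the target index actually exists on the indexer, and searching over All Time, in case the JSON timestamps place the events in the past:

```
# inputs.conf on the universal forwarder (illustrative index/sourcetype)
[monitor://C:\Users\Administrator\Downloads\test\test.json]
index = main
sourcetype = _json
disabled = 0
```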
After successfully installing the Splunk Cloud trial I was able to use it. The next day the system does not let me in with the proper credentials (sc_admin), showing the following message: Is this something expected with the trial version? Will it resolve by itself after some time?
Hi, I have created a React app and symlinked it to Splunk. It now appears as one of the installed apps but I am given no option to launch the app. What can I do? Thanks
Hi, I have a column timechart with numerical values, and I would like to append strings, or characters, after these values when they are displayed on the dashboard. I have tried appending the string to the results themselves, but it seems that timechart cannot plot non-numerical data. Any help or alternative ideas on how I can achieve this visually? Thanks.
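One workaround sketch (the field and unit are illustrative): keep the numeric field for the chart and add a string copy for a table view, since column charts can only plot numbers while a table will happily render the suffixed string:

```
... | timechart span=1d count
    | eval count_display=count." ms"
```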
I'm trying to play around with tags from our AWS environments (using the AWS add-on metadata input). The tags come in looking like this:

"TagList": [
    {"Key": "Project", "Value": "project1"},
    {"Key": "ProdState", "Value": "prod"},
    {"Key": "Product", "Value": "product1"},
    {"Key": "Team", "Value": "team1"},
    {"Key": "power_state", "Value": "-1:1800:1900:-1:1800:1900:-1:1800:1900:-1:1800:1900:-1:1800:1900:-1:-1:-1:-1:-1:-1:Australia/Brisbane"},
    {"Key": "CostCentre", "Value": "000000"},
    {"Key": "workload_type", "Value": "production"},
    {"Key": "Name", "Value": "name1"}
]

To make life easier for myself, I am trying to un-nest these tags; ultimately I want it to look something like this:

tags.Project = project1
tags.ProdState = prod
tags.Product = product1
...

I've tried a foreach like the one below, but it doesn't seem to get all my tags out; for example, it will only extract CostCentre, ProdState, Project and workload_type.

index=testing source="xxxxxxxxxxxx:ap-southeast-2:rds_instances"
| spath TagList{} output=tmp_taglist
| foreach tmp_taglist{}
    [ rex "..Key.:\s\"(?<this_key>[^\"]+)\".\s.Value..\s\"(?<this_value>[^\"]+)\"."
    | eval tags.{this_key} = this_value ]
| table DBInstanceArn tmp_taglist tags.*

Can anyone help me understand what I am doing wrong here?
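For comparison, one alternative sketch (untested against this data) that avoids foreach entirely: expand each TagList element into its own row with mvexpand, pull Key and Value out of each element with rex, and recombine the per-tag fields. The rex pattern and the stats recombination are assumptions to verify:

```
index=testing source="xxxxxxxxxxxx:ap-southeast-2:rds_instances"
| spath TagList{} output=tag
| mvexpand tag
| rex field=tag "\"Key\":\s*\"(?<tag_key>[^\"]+)\",\s*\"Value\":\s*\"(?<tag_value>[^\"]*)\""
| eval tag_key="tags.".tag_key
| eval {tag_key}=tag_value
| stats values(tags.*) as tags.* by DBInstanceArn
```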
How can I get Splunk data into my .NET Core application? I need to read the logs either via a database or via web API calls.
This should be pretty easy, but I'm not sure why events are still coming in. We have hosts set up to send to multiple Splunk stacks; one is security-only, so we want to drop the incoming perfmon data. I've created the following:

Transforms:

[setnull]
REGEX = (.)
DEST_KEY = queue
FORMAT = nullQueue

Props:

[Perfmon:ProcessorInformation]
TRANSFORMS-proc=setnull

[PerfmonMetrics:CPU]
TRANSFORMS-cpu=setnull

[PerfmonMetrics:LogicalDisk]
TRANSFORMS-ldisk=setnull

[PerfmonMetrics:Memory]
TRANSFORMS-mem=setnull

[PerfmonMetrics:Network]
TRANSFORMS-net=setnull

[PerfmonMetrics:PhysicalDisk]
TRANSFORMS-pdisk=setnull

[PerfmonMetrics:Process]
TRANSFORMS-process=setnull

[PerfmonMetrics:System]
TRANSFORMS-sys=setnull

However, these source types are still coming through! The config has been pushed out to the cluster from the CM, and I can see it applied on the indexers. Any obvious mistakes?

Thanks!
Hello, I have a source file with a very large event size, so I need to use TRUNCATE=1000000 in my props. Do you think there would be any issue for the Splunk indexer/UF in handling this large TRUNCATE value or event size? Are there any alternatives if there are issues? Any recommendation would be highly appreciated. Thank you!
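For reference, the setting lives in props.conf on the parsing tier; a sketch with an illustrative sourcetype name. A very large TRUNCATE mainly costs memory per event at parse time, and TRUNCATE = 0 disables truncation entirely, which trades the safety valve for never losing event tails:

```
# props.conf (illustrative sourcetype name)
[my_large_sourcetype]
TRUNCATE = 1000000
```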
Hi, I am trying to create a table, but how do I extract this information in my query? I tried double quotes (" ") but that just looks for the exact phrase. I want to list out, for example, the Subject: Account Name and then the Logon Information: Logon Type. Here is the event text:

Subject:
    Security ID: S-1
    Account Name: -
    Account Domain: -
    Logon ID: 0x0

Logon Information:
    Logon Type: 3
    Restricted Admin Mode: -
    Virtual Account: No
    Elevated Token: No

I hope it makes sense. Thank you
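Since these Windows event fields follow a "Label: value" pattern, rex with named capture groups is one sketch; the output field names here are my own choice, and the patterns assume the spacing shown above:

```
... | rex "Account Name:\s+(?<account_name>\S+)"
    | rex "Logon Type:\s+(?<logon_type>\d+)"
    | table account_name logon_type
```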
Hi, I am creating a React app and need to create a symbolic link with Splunk using yarn run link:app. Currently, I am getting the following permissions issue: Can you please help?
I have a dashboard like the following:

         May'22    Apr'22    Mar'22
KPI 1    random%   random%   random%
KPI 2    random%   random%   random%
KPI 3    random%   random%   random%
KPI 4    random%   random%   random%

The percentages for the KPIs are coming through fine, but the user wants to be able to download the underlying data, or at least see the numerator and denominator in the same dashboard on mouse hover or something similar. Any idea how this can be achieved?
Hi Team, for hands-on practice I registered for the Splunk Cloud trial, which gives me access to the Splunk Cloud platform for 14 days. As I did not receive an email (not in Spam either), I started looking at Settings > Instances. To my surprise, within the Instances page I saw my previous work email address against the allocated instance. I changed that email address long ago (at least 3 years back). I cross-verified my profile and it has my latest personal email address. Now the question: why is my free instance linked to my previous email address, and any idea how to change it? Thank you in advance.
I am using Dashboard Classic. I implemented a table chart and would like to modify the column widths of the table. Is there a way? For example, to narrow the gap between columns A and B:

A <-> B

A  B  C  D
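In a classic (Simple XML) dashboard, one common sketch is to give the table an id and inject CSS from an adjacent html panel; the panel id, child index, width, and search here are all illustrative:

```
<row>
  <panel>
    <html depends="$alwaysHide$">
      <style>
        #my_table table th:nth-child(1),
        #my_table table td:nth-child(1) { width: 80px !important; }
      </style>
    </html>
    <table id="my_table">
      <search><query>index=_internal | stats count by sourcetype</query></search>
      <option name="count">10</option>
    </table>
  </panel>
</row>
```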