All Topics

I have the search string below. Suppose the file "magic_new.log" has no events, and the requirement is to show that in the output:

index=magic source IN ("D:\\show\\magic.log", "D:\\show\\magic_new.log", "D:\\show\\magic_old.log") | stats count by source | where count=0

Current output: no results found.

Expected output:

source                     count
D:\show\magic_new.log      0

(Note: I have tried multiple solutions suggested in the community, but none worked.)

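A minimal sketch of one common workaround, reusing the same index and source list as above: stats only counts sources that actually have events, so append a zero-count placeholder row for every expected source, then keep the maximum count per source so real counts win over the placeholders. The backslashes inside the eval string are doubled so they resolve to the single backslashes stored in the source field; that escaping may need adjusting in your environment.

```
index=magic source IN ("D:\\show\\magic.log", "D:\\show\\magic_new.log", "D:\\show\\magic_old.log")
| stats count BY source
| append
    [| makeresults
     | eval source=split("D:\\show\\magic.log;D:\\show\\magic_new.log;D:\\show\\magic_old.log", ";")
     | mvexpand source
     | eval count=0
     | fields source count]
| stats max(count) AS count BY source
| where count=0
```

Dropping the final `| where count=0` would instead list all three sources with their counts, zero or not.
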
Hi,

In July 2021, Google published a new version of the Android Gradle Plugin (AGP), 7.0.0, together with a new Android Studio: https://developer.android.com/studio/releases/gradle-plugin#7-0-0

So far, the latest version of the AppDynamics plugin for Android is 21.6.0. When trying to build the project, we get the following error, probably caused by a new API in the latest AGP:

Some problems were found with the configuration of task ':app:appDynamicsProcessProguardMappingDebug' (type 'ProcessProguardMappingFileTask').
- Type 'com.appdynamics.android.gradle.ProcessProguardMappingFileTask' property 'applicationName' is missing an input or output annotation.

The error is the same even if Proguard is disabled (although for obvious reasons we are not going to disable Proguard anyway). The issue is reproducible on a sample project we created from a basic Android project, to make sure it is not caused by our project setup: https://github.com/silin/appdynamics-mapping-upload-issues/tree/latest_gradle_plugin

When we try to switch our project to the latest versions of Android Studio and AGP, AppDynamics becomes a blocker. The dev team is really frustrated because they cannot use the latest tools for development, and since this is not the first time AppDynamics has been a blocker, we keep having discussions about dropping AppDynamics as a monitoring tool. Could you please share some ideas about how to fix this, or say in which version of AppDynamics, and when, it can potentially be fixed?

Hello, I have a table that looks like this (screenshot) and I want it to look like this (screenshot), so that the values of the type field become the column headers. What should I do? Thanks.

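A minimal sketch of one way to do this, with hypothetical field names since the actual columns are only visible in the screenshots: here "item" stands for the field that should label the rows and "value" for the field that fills the cells, while "type" is the field whose values become the headers, as described in the question.

```
<your existing search>
| xyseries item type value
```

`| chart values(value) OVER item BY type` gives the same shape if there can be more than one row per item/type pair.
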
Hi, I am trying to test a simple chart option, "charting.legend.mode": "seriesCompare", via the ChartView component of the Splunk Web Framework. I want the legend to display and compare the point values of all 4 lines as I move along the chart. Is this a limitation of ChartView, or am I missing something here? Thanks for your time! I am only able to do this in Splunk Enterprise itself; in my app, the Highcharts legend is not able to display the point values being compared. It seems like the "standard" option is the only one available.

Getting the below error for one panel of a dashboard while exporting it as a PDF:

Splunkd daemon is not responding: ('Error connecting to https://[::1]:8089/services/search/jobs/xxxxxx_ _xxxxxx_c29ueV9nc2lydF9zb2M__RMD5d1f52a5d3044c8e9_1630297191.55342_74FD1776-A60D-44F0-9CC3-A343C2FFBFAC/results: The read operation timed out',)

Is it possible to transform a data set from:

Time    User    Number of Errors
9 pm    Josh    2
9 pm    Andy    1
10 pm   Josh    0
10 pm   Andy    1
11 pm   Josh    1
11 pm   Andy    3

to:

Time    User                      Number of Errors
9 pm    Josh                      2
9 pm    Andy                      1
9 pm    Total Number of Errors    3
10 pm   Josh                      0
10 pm   Andy                      1
10 pm   Total Number of Errors    1
11 pm   Josh                      1
11 pm   Andy                      3
11 pm   Total Number of Errors    4

I've tried to use:

<insert index here> | convert num("Number of Errors") as NumberofErrors | eval Total_Number_of_Errors = Josh + Andy | table Time User "Number of Errors"

However, it errors out when I try to run this query.

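A minimal sketch of one approach, assuming the events already carry Time, User, and "Number of Errors" fields and that the index name is a placeholder: appendpipe re-runs a stats over the rows already in the pipeline and appends the per-Time totals as extra rows, labelled with a synthetic User value. The field is renamed temporarily because of the spaces in its name.

```
index=your_index
| rename "Number of Errors" AS errors
| stats sum(errors) AS errors BY Time User
| appendpipe
    [ stats sum(errors) AS errors BY Time
      | eval User="Total Number of Errors" ]
| rename errors AS "Number of Errors"
| sort 0 Time User
```

If Time is really the event timestamp, replace `Time` with a bucketed `_time` (for example `bin _time span=1h`) so the totals line up with the hourly rows.
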
I would like to write a nested if in Splunk. What I want to achieve:

if buyer_from_France:
    do eval percentage_fruits
    if percentage_fruits > 10:
        do summation
        if summation > 20:
            total_price
            if total_price > $50:
                do (trigger bonus coupon)

My current code (that works):

| eventstats sum(buyers_fruits) AS total_buyers_fruits by location
| stats sum(fruits) as buyers_fruits by location buyers
| eval percentage_fruits=fruits_bought/fruits_sold
| table fruits_bought fruits_sold buyers
| where percentage_fruits > 10
| sort - percentage_fruits

How do I complete the syntax/expression for the 2nd (summation), 3rd (total price), and 4th (trigger) if-levels?

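A sketch of how nested ifs usually map onto SPL, with hypothetical field and index names (index=sales, fruits_bought, fruits_sold, price, and the "trigger bonus coupon" label are all placeholders): each nesting level becomes either another `| where` filter later in the pipeline, or a condition inside an `if()`/`case()` eval, because SPL has no block-structured if.

```
index=sales buyer_from_France="true"
| stats sum(fruits_bought) AS fruits_bought sum(fruits_sold) AS fruits_sold sum(price) AS total_price BY location buyers
| eval percentage_fruits = round(fruits_bought / fruits_sold * 100, 2)
| where percentage_fruits > 10
| eventstats sum(fruits_bought) AS summation BY location
| where summation > 20
| where total_price > 50
| eval action = "trigger bonus coupon"
```

The same logic can be collapsed into a single eval, for example `eval action=if(percentage_fruits>10 AND summation>20 AND total_price>50, "trigger bonus coupon", null())`, if you want to keep the non-matching rows instead of filtering them out.
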
I am using timewrap to compare data for a particular day of the week with the same day of the week over the last 4 weeks, i.e. comparing this Wednesday to the last 4 Wednesdays. When I view the graph, it shows the current week's Wednesday as the timescale on the x-axis, and when I hover over the chart for a previous Wednesday it still shows the current Wednesday plus "2 weeks before". Is this a Splunk limitation, or is there a way for the previous Wednesdays' data to show the exact dates of those previous 4 Wednesdays?

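One thing worth trying (a sketch, with the index and span as placeholders): the `series` argument of timewrap controls how the wrapped series are named, and `series=exact` together with a `time_format` labels each series with the actual date of that week instead of "N weeks before". Note this changes the legend and hover labels only; the x-axis will still show the current week's dates, since that is how timewrap aligns the series.

```
index=your_index earliest=-5w@w latest=now
| timechart span=1h count
| timewrap 1week series=exact time_format="%d %b %Y"
```
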
Hello Splunk community,

For this dataset:

Time       Agent   Number of calls taken
11:00 AM   John    1
11:00 AM   Kate    0
11:00 AM   Eric    1
10:00 AM   John    2
10:00 AM   Kate    1
10:00 AM   Eric    0
9:00 AM    John    0
9:00 AM    Kate    1
9:00 AM    Eric    1
8:00 AM    John    3
8:00 AM    Kate    1
8:00 AM    Eric    2
7:00 AM    John    3
7:00 AM    Kate    5
7:00 AM    Eric    2
6:00 AM    John    2
6:00 AM    Kate    3
6:00 AM    Eric    0

Is it possible to get a moving average for each agent, along with the moving average for the total number of calls in one specific hour, and to place all of this into one time chart?

This is the Splunk query I'm currently using:

| union
    [| search <insert index here> AGENT=* | bin _time span=1h | stats count BY _time | trendline wma2(count) AS AverageNumberoftotalcallsperhour | table _time AverageNumberoftotalcallsperhour ]
    [| search <insert index here> Agent=Kate | bin _time span=1h | stats count BY _time | trendline wma2(count) AS AvgKate | table _time AvgKate ]
    [| search <insert index here> Agent=John | bin _time span=1h | stats count BY _time | trendline wma2(count) AS AverageNumberOfCallsPerHourbyJohn | table _time AverageNumberOfCallsPerHourbyJohn ]
    [| search <insert index here> Agent=Eric | bin _time span=1h | stats count BY _time | trendline wma2(count) AS AvgEric | table _time AvgEric ]

However, when I run this query, the output isn't correct; each series ends up in a different set of rows:

_time      AverageNumberoftotalcallsperhour   AvgKate   AverageNumberOfCallsPerHourbyJohn   AvgEric
6:00 AM    2
7:00 PM    2
8:00 AM                                       3
9:00 AM                                       3
10:00 AM                                                4
11:00 AM                                                4
Noon                                                                                        5

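A sketch of a single-pipeline alternative, assuming one event per call with an Agent field and a placeholder index name: timechart produces one column per agent on a shared _time axis, addtotals adds an hourly total column, and trendline then smooths every column, so nothing needs to be stitched together with union.

```
index=your_calls_index Agent=*
| timechart span=1h count BY Agent
| addtotals fieldname=Total
| trendline wma2(John) AS AvgJohn wma2(Kate) AS AvgKate wma2(Eric) AS AvgEric wma2(Total) AS AvgTotalCallsPerHour
| table _time Avg*
```

Because all the series come out of one timechart, they share the same rows and the result can be dropped straight onto a line chart.
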
Hi, we are in the process of migrating all apps/configs from an older standalone instance (7.2.4.2) to a newer SHC (8.1.1). A datamodel was also migrated along with the app and appears to be working fine in terms of acceleration statistics. But when I try to access it using tstats, the format that worked previously returns nothing:

| tstats summariesonly=t count FROM datamodel="modelname.dataset" by dataset.field

But if I do not mention the dataset in the FROM clause, it works just fine:

| tstats summariesonly=t count FROM datamodel="modelname" by dataset.field

Could I have missed something during the migration? What could be causing the first command to not work?

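One variation worth trying (a sketch, not a confirmed fix, using the same literal "modelname" and "dataset" names as above): constrain the dataset with a `WHERE nodename` clause instead of appending it to the datamodel name in FROM, which is the other documented way of pointing tstats at a specific dataset.

```
| tstats summariesonly=t count FROM datamodel=modelname WHERE nodename=dataset BY dataset.field
```

If the dataset is a child of another dataset, nodename takes the full dotted path to it.
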
Hello, I'm trying to set up Splunk in a lab environment. I've got one Windows client from which I want to send logs over to my Splunk server via a UF. I am managing the endpoint's Splunk config via a deployment server. This works fine: the client checks in, my apps get pushed to it, all fine. For Windows logs, I'm using the Splunk TA for Windows (https://splunkbase.splunk.com/app/742/#/overview) with an inputs.conf as below:

[WinEventLog://Security]
disabled = 0
start_from = oldest
current_only = 0
renderXml = true
evt_resolve_ad_obj = 1
index = windows

[WinEventLog://System]
disabled = 0
renderXml = true
evt_resolve_ad_obj = 1
index = windows

[WinEventLog://Application]
disabled = 0
renderXml = true
evt_resolve_ad_obj = 1
index = windows

The app gets deployed correctly and I see the above inputs.conf in %SPLUNK_HOME%/etc/apps/Splunk_TA_windows/local/inputs.conf. However, in Splunk, I don't seem to be getting all the logs. In fact, I'm only getting event ID 6xxx logs, and very few of them (43 events per 15 minutes). I can't figure out why all the logs aren't coming in, but only a few irrelevant ones. Any help will be much appreciated. Thank you!

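A quick check that may help narrow this down (a sketch; it assumes the events really are landing in index=windows and that the TA's field extractions provide EventCode): break the incoming events down by channel and event ID to see exactly which logs are arriving and which are missing.

```
index=windows earliest=-24h
| stats count BY host, source, sourcetype, EventCode
| sort - count
```

If only the System or Application channel shows up here, the Security input is the one to investigate (for example, whether the UF's service account is allowed to read the Security event log).
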
Suppose there is a panel that displays a database symbol. At the bottom of that symbol, I want to display two values, each in its own single-value element; the values are dynamic.

<panel><single></single><single></single></panel>

I used styling for the single-value elements, but they display one above the other; I need them side by side.

Hi, I get exactly the same count for avg and peak. Is there any issue with my query?

index=a sourcetype=ab earliest=-30d latest=now
| bucket _time span=1mon
| stats count by _time
| eval date_month=strftime(_time, "%b")
| eval date_day=strftime(_time, "%a")
| stats avg(count) as AverageCountPerDay max(count) AS Peak_Per_Month by date_month, date_day

date_month   date_day   AverageCountPerDay   Peak_Per_Month
Aug          Sun        82037650             82037650
Jul          Thu        4621995              4621995

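A sketch of why this happens and one way around it, assuming the goal is a per-day average and a per-day peak within each month: with `span=1mon` there is only one bucket (one count) per month, so avg and max are computed over a single value and are necessarily equal. Binning by day first gives the final stats many values per month to aggregate.

```
index=a sourcetype=ab earliest=-30d latest=now
| bin _time span=1d
| stats count BY _time
| eval date_month=strftime(_time, "%b")
| stats avg(count) AS AverageCountPerDay max(count) AS PeakCountPerDay BY date_month
```

Keep date_day in the final BY clause only if you want separate averages per weekday rather than per month.
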
I have no idea what I need to do here (if anything), and the guy who has dealt with getting data in previously is on holiday for a while, so any advice is much appreciated. We upgraded our Palo Alto firewall to a newer version, which has moved the VPN logs from the system category to a separate one for GlobalProtect (more info here). When I noticed we weren't receiving the VPN logs anymore, we got the firewall guys to forward the new log category to us, and our Splunk guy assured me that we wouldn't need to do anything else. However, the logs are supposedly being forwarded to us now, but Splunk isn't showing them, at least not in the index we have for the Palo Alto logs. Is our Splunk guy wrong, and do we actually have to manually set up the new sourcetype? Or have the firewall guys messed up (harder to check due to language barriers and time differences)? I am pretty clueless about this, so apologies if this is a silly question.

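A low-risk first check (a sketch; the sourcetype prefix is an assumption based on the common "pan:*" naming used by the Palo Alto Networks add-on, and searching index=* can be slow on a busy environment): look for anything Palo Alto-shaped that arrived recently, to see whether the GlobalProtect events are landing under an unexpected index or sourcetype rather than not arriving at all.

```
index=* sourcetype=pan* earliest=-4h
| stats count BY index, sourcetype, host
| sort - count
```

If nothing GlobalProtect-related shows up at all, the forwarding from the firewall side is the more likely culprit; if it shows up under a generic sourcetype, the inputs/sourcetyping on the Splunk side needs updating.
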
Hey all, I am in the process of migrating from a Windows heavy forwarder to a Linux heavy forwarder for Splunk Cloud. Part of this exercise involves migrating the Splunk DB Connect app from the Windows HF box to the new Red Hat 8.4 HF box.

Quick details:
Splunk DB Connect 3.6.0
DB Connect app host: Red Hat Linux 8.4 GA, OpenJDK (Corretto) 11.0.2 LTS
MS SQL Server: Windows Server 2016 Standard (1607), MS SQL 2014 (12.0.4522.0)
Registry is modified to allow TLS 1.1 and 1.2 under SChannel (via the IISCrypto tool)
Registry is also modified to allow the strong SChannel .NET configuration. This was mentioned as necessary for SQL Server 2019, but I figured it might apply to 2014 as well (https://learn.mediasite.com/course/enabling-tls-1-2/lessons/sql-server-configuration/)

I basically duplicated the configuration from the original Windows server that ran the DB Connect app. I brought over the same connection information as well as the same identity information, and I've validated that the identity information is correct. I am getting the following error:

Database connection server.domain.com is invalid. The driver could not establish a secure connection to SQL Server by using Secure Sockets Layer (SSL) encryption. Error: "Certificates do not conform to algorithm constraints".

This seems to imply some sort of certificate negotiation error. I have browsed through the DB Connect documentation, but nothing in there seems to help. I noticed a few different keystores around the DB Connect app and tried messing with a few of them, but without luck. Currently, in the "keystore" folder, I have loaded a domain PKI-issued cert/key pair and the domain PKI CA chain. None of those seem to make any difference. My basic connection string looks like the following in the edit URL box:

jdbc:sqlserver://server.domain.com:1433;databaseName=Splunk;selectMethod=cursor;encrypt=true

I've tried various variations of this as well, like:

jdbc:sqlserver://server.domain.com:1433;databaseName=Splunk;selectMethod=cursor;encrypt=true;trustStore=/opt/splunk/etc/apps/splunk_app_db_connect/keystore/default.jks;trustStorePassword=password

I wasn't sure how to configure the JRE installation path, and I also wasn't too sure where it was located on the Red Hat 8.4 instance. I did some tracking and I think it's loaded here:

/usr/lib/jvm/java-11-openjdk-11.0.12.0.7-0.el8_4.x86_64/

I mainly did that because it appears JAVA_HOME wasn't set in the OS. I could have set it, but I figured pointing DB Connect directly at the folder would rule out any potential issues. I haven't had much luck. I loaded up Wireshark and confirmed I can see the inbound 1433 connection from the heavy forwarder. I do see the active connection being made, but because it's over 1433, Wireshark isn't showing any TLS negotiation; I am not sure if that's an issue or not. I am not sure where else to go from here. Does anyone have any thoughts?

I started exploring the new Dashboard Studio and created some dashboards. Cool, but new, so there is not much community wisdom and there are few examples, only the documentation. Is there a dedicated forum for the Studio? The label "studio" is not available here.

My question: is there a way to set a custom color in a table in the new Studio, similar to the custom color code in Simple XML?

<format type="color" field="appid_tag">
  <colorPalette type="expression">case(value="Banana","#a740a2")</colorPalette>
</format>

Next question: can I use conditional color in Studio, similar to this Simple XML?

<format type="color" field="tags">
  <colorPalette type="expression">case(match(lower(value), "`"),"#b6fcd5")</colorPalette>
</format>

Hi, a newbie to Splunk here. I have found the query for login info for users on a host:

index=os source=/var/log/secure host=myhost process=sshd

I want to trigger an alert when a user who has logged in before logs in to the host again after more than 90 days, i.e. the user should not have logged in to the host for more than 90 days. Could someone please help me write the query? Thank you.

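A minimal sketch of one approach, with two assumptions that depend on your sourcetype and field extractions: successful logins can be identified by the literal "Accepted" in the sshd message, and a user field is extracted. The idea is to sort each user's logins in time order, carry forward the previous login time with streamstats, and keep only logins whose gap from the previous one exceeds 90 days. The search window has to reach back far enough (here 180 days) to contain the previous login.

```
index=os source="/var/log/secure" host=myhost process=sshd "Accepted" earliest=-180d
| sort 0 user _time
| streamstats current=f window=1 last(_time) AS previous_login BY user
| eval gap_days = round((_time - previous_login) / 86400, 1)
| where gap_days > 90
| convert ctime(previous_login)
| table _time user previous_login gap_days
```

Scheduled, say, once a day over that window, this would alert on any user whose latest login follows a gap of more than 90 days; users with no earlier login in the window are dropped because previous_login is null.
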
Hi, I have obtained a table like this:

code   status                 count
1      27 aug 2021 success    45
1      27 aug 2021 failure    0

I want a format like this:

code   27 aug 2021 success    27 aug 2021 failure
1      45                     0

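A sketch of one way to pivot this, assuming the three fields are literally named code, status, and count as shown: xyseries turns one field into the row labels, another into the column headers, and the third into the cell values.

```
<your existing search that produces code, status, count>
| xyseries code status count
```

`| chart sum(count) OVER code BY status` produces the same shape if there can be more than one row per code/status pair.
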
I will have a POC with a customer. Is it possible to set up a single-site or multi-site cluster on a Splunk Enterprise trial license?