All Topics


Hello everyone. Can someone tell me how to monitor Kubernetes events in the AppDynamics Dashboard, and the steps to configure it?
Hello, As you can see below, I have enabled HEC on my Splunk Cloud trial instance and I have carefully read the https://docs.splunk.com/Documentation/SplunkCloud/9.0.2209/Data/UsetheHTTPEventCollector?#Send_data_to_HTTP_Event_Collector_on_Splunk_Cloud_Platform section. However, I can't resolve the URL of my HEC with nslookup; it should be of the form http-inputs.prd-X-XXXnn.splunkcloud.com (if I read the documentation correctly). Is there anything I've missed in getting my HEC properly configured?
Hello, I'm interested in minimizing the amount of noise generated by notables in one of my customer's environments, which has produced 3500 notables in the past 15 days. Is it necessary to review all 3500 notables to filter out common events?
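One way to scope this without reading each notable individually is to see which correlation searches generate the most notables and tune or throttle those first. A minimal sketch, assuming Enterprise Security's default notable index name, run over the same 15-day window:

index=notable earliest=-15d@d
| stats count by search_name
| sort - count

The handful of searches at the top of that list are usually responsible for most of the noise, and adjusting their thresholds or suppression typically cuts the volume far faster than per-notable review.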
What is the equivalent Splunk cron expression for the cron expression below?

0 0 0 ? * 7#1 *

An alert needs to be configured for the first Saturday of every month at 00:05 AM.
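Splunk's saved-search scheduler uses standard five-field cron (minute, hour, day of month, month, day of week) and has no equivalent of the 7#1 "first Saturday" token, so one common workaround is to schedule the alert for every Saturday at 00:05 with the cron 5 0 * * 6 and let the search itself discard every Saturday except the first of the month. A minimal sketch of that guard, appended to the alert's own SPL (the base search here is a placeholder):

your alert search
| where tonumber(strftime(now(), "%d")) <= 7

Since the first Saturday of any month always falls on day 1 through 7, the alert only returns results on that one Saturday.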
Hi, I use both Dashboard Studio dashboards and old XML dashboards on my Apple TV, and I want to display geographical data on the TV. Unfortunately, the new Dashboard Studio geo views do not work on the TV; only the old XML dashboards do, and so far I have only managed to display pie charts on the map. Is there an overview somewhere of which geo visualizations can be displayed on the Apple TV, and which charts work? Thanks, Michael
We have 8 Windows servers where the Splunk Universal Forwarder is installed; it forwards all the logs to the Splunk indexer servers, and from the Splunk search head we can monitor all the logs related to those servers. In some instances we have seen the Windows servers go into a hung state, but we don't have any alert mechanism to get notified when that happens. Kindly help us understand how to accomplish this particular monitoring through Splunk. Below is the Splunk architecture for our environment:
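A common way to approximate "the server is hung" is to alert when a host stops sending data altogether, since a hung machine usually stops forwarding. A minimal sketch, assuming the eight servers write into a wineventlog index (the index name and the 15-minute threshold are assumptions to adjust):

| tstats latest(_time) as last_seen where index=wineventlog by host
| eval minutes_silent = round((now() - last_seen) / 60, 0)
| where minutes_silent > 15

Scheduled every few minutes as an alert, this lists any host that has gone quiet. The same pattern works against index=_internal if you want to watch the forwarders themselves.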
Hi Team, I am facing an issue with a panel table in a Splunk XML dashboard. Below is the snippet of our panel table. We tried to change the background color of each column of the table to white, but we are unable to do it. We also tried the formatting option (the paint icon), but still could not change the background color. Could you please suggest what is needed in order to change the background color?
Hello everyone, I need your help with something, please. I need to remove the decimal value from these fields:
- total
- hier
My SPL query is:

| union [ search index="pasrau_statuts_count" libelle IN ("Envoi SNGI OK", "Envoi SNGI KO", "RECU") | lookup lk_etapes_pasrau_2020_rj libelle output evenement, ordre ]
| join type=outer libelle [ search index=pasrau_statuts_count earliest=-100d@d latest=@d libelle IN ("Envoi SNGI OK", "Envoi SNGI KO", "RECU") | eval hier=count | table libelle hier ]
| eval delta=case(hier < count, "+".(count-hier), hier > count, "-".(hier-count), hier=count, "0")
| eval libelle=ordre+libelle, total=ordre+"."+count, hier=ordre+"."+hier
| dedup libelle sortby lookup
| stats list(libelle) as libelle, list(total) as total list(hier) as hier list(delta) as "DeltaJ/J-1" by evenement
| sort libelle
| rex field=libelle mode=sed "s/^[0-9]+//g"
| rex field=total mode=sed "s/^[0-9]\.+//g"
| rex field=hier mode=sed "s/^[0-9]\.+//g"

Thank you so much
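If the intent of the last two rex lines is to strip the leading "ordre." prefix that the earlier eval adds (total=ordre+"."+count), the quantifier is on the wrong token: ^[0-9]\.+ matches a single digit followed by one or more dots. A possible correction, assuming ordre is purely numeric:

| rex field=total mode=sed "s/^[0-9]+\.//"
| rex field=hier mode=sed "s/^[0-9]+\.//"

If instead the goal is to drop a fractional part from numeric values, an eval such as total=floor(total) would do it.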
Hi Splunkers, I have been using Splunk for a while and have gone through many proposed solutions in this community, but none of them gets me what I want. This could be due to the high volume of events I have for each month, or I am doing something wrong. The challenge I face is that I have a field, let's call it "Pages", and I want to compare the last two months of customers visiting the top 10 most visited pages. I have used the query below, but the number of events for both months shows the same. (Note: the example below covers the previous two days instead of two months, to save time while searching.)

Main search...
| addinfo
| eval timephase1=if(_time>=relative_time(info_max_time, "-2d@d"), "last_month", null()), timephase2=if(_time>=relative_time(info_max_time, "-1d@d"), "this_month", null())
| stats count(timephase1) as time1 count(timephase2) as time2 by Pages
| sort -time1
| head 10

Any assistance will be appreciated!
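One likely reason the numbers look the same: the timephase1 window (-2d@d onward) contains everything that timephase2 counts, so the two counts overlap. A minimal sketch that puts each event into exactly one bucket instead (Main search... is the placeholder from above; swap the day offsets for -1mon@mon when running it over real months):

Main search... earliest=-2d@d latest=@d
| addinfo
| eval period=if(_time < relative_time(info_max_time, "-1d@d"), "previous", "current")
| stats count(eval(period="previous")) as previous count(eval(period="current")) as current by Pages
| sort - current
| head 10

Because the buckets are disjoint, the previous and current columns can be compared directly for the top 10 pages.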
Hello everyone, I've installed the Splunk Add-on for Jira Cloud v1.0.1 on Splunk Enterprise v8.1 and I need to configure a proxy to reach JIRA Cloud, but it doesn't work. In the Splunk UI I've configured the JIRA domain, and it shows "Failed to connect to validate domain". Using tcpdump to look at the network flow, I can see the traffic is not going through the internal proxy. I've added the proxy settings under $SPLUNK_HOME/etc/system/local/server.conf, and I can connect successfully with cURL. Do you know what might be wrong? Any suggestions? Thank you. Best Regards,
Folks, please clarify whether my understanding is correct. We can see the current system requirements for Linux kernel versions at the link below: https://docs.splunk.com/Documentation/Splunk/latest/Installation/Systemrequirements From this page, my understanding is that Splunk officially supports "ANY" Linux distribution that uses a supported kernel version. So if I use Debian 10 with kernel 5.4 or 4.x, I can use the UF in that environment. Am I right?
Hi All, we successfully upgraded to Splunk 9.0.4. However, we observed that when using the tstats command we get the message below; normal searches all give results as expected.

[indexer1,indexer2,indexer3,indexer4,indexer5] When used for 'tstats' searches, the 'WHERE' clause can contain only indexed fields. Ensure all fields in the 'WHERE' clause are indexed. Properly indexed fields should appear in fields.conf.

Any idea why we are getting this and how to resolve it?
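The warning usually means a search-time-only field ended up in the tstats where clause. With raw (non-datamodel) tstats, the where clause can reference only fields that exist at index time: the default indexed fields (index, sourcetype, source, host) plus any custom index-time fields declared in fields.conf. A minimal sketch of a query that satisfies that constraint (the index, sourcetype and host values are placeholders):

| tstats count where index=web sourcetype=access_combined host=web* by host, sourcetype

Filters on fields that are only extracted at search time need either a normal search or tstats against an accelerated data model (| tstats ... from datamodel=...).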
I'm currently using Splunk's Slack integration to try to send alerts, and I want to know how to send an entire table's worth of data in one message. From what it looks like, combining data into mv fields as described in This post and This post only displays the topmost row (the header) in Slack. Any assistance in getting this to work is appreciated, as I'd rather not spam the Slack channel that I'm integrating with.
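One workaround that tends to survive the Slack alert action's single-message format is to collapse the whole table into one string field and reference that field (for example $result.message$) in the Slack message. A minimal sketch with hypothetical field names host, status and count:

your alert search
| eval row = host . " | " . status . " | " . count
| stats list(row) as rows
| eval message = mvjoin(rows, " ; ")
| table message

The delimiter is a matter of taste; the key point is that the alert's first result row then carries the entire table in a single field, so only one Slack message is sent.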
I have a search query where the "# of Transactions Processed:" string sometimes contains one or more whitespace characters before the numeric value, and sometimes no whitespace. The query below works fine for this example message:

| search message="# of Transactions Processed:51"
| rex field=message "Processed:(?<ProcessedCount>\d+)"

But if the message has a space after the colon and before the number, then it doesn't match unless I add a space after the colon:

| search message="# of Transactions Processed: 51"
| rex field=message "Processed: (?<ProcessedCount>\d+)"

I've been struggling to find a good solution to this and was wondering if any of you Splunkers had figured this out previously. I could add another | rex with the second option, but I was hoping there was a more elegant and efficient way to accomplish it. Many thanks in advance!
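A single rex can cover both cases by making the whitespace optional with \s*; the search filter can likewise use a wildcard so both variants of the message are kept. A minimal sketch using the same message field:

| search message="# of Transactions Processed:*"
| rex field=message "Processed:\s*(?<ProcessedCount>\d+)"

\s* matches zero or more whitespace characters, so "Processed:51" and "Processed:  51" both yield ProcessedCount.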
Let's say I have the following data that I extracted from JSON into a field called myfield. If I were to print out the values of myfield in a table, for each event I would have an array of a variable number of key-value pairs.

myfield
{"K1":"V1","K2":"V2","K3":"V3",....."KN":"VN"}
{"A":"X"}
{"B":"Y","C":"Z"}

How do I extract only the values (and not the keys) as a new array for each one? The output would look like:

my_processed_field
["V1","V2","V3",....."VN"]
["X"]
["Y","Z"]

Help is much appreciated! Thanks!
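When the keys are unknown in advance, one approach that sidesteps spath entirely is to pull every quoted value out of the raw JSON string with a multi-match rex; the result is a multivalue field rather than a literal JSON array, which is usually what later SPL wants anyway. A minimal sketch against the myfield examples above:

| rex field=myfield max_match=0 "\"[^\"]+\"\s*:\s*\"(?<my_processed_field>[^\"]*)\""
| table myfield my_processed_field

For {"B":"Y","C":"Z"} this yields my_processed_field with the two values Y and Z. If a literal JSON array string is required, mvjoin() and a small eval can reassemble one.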
My lookup table is historical data for the search I am running. My search and my lookup table have a common field, ID. I am trying to match the ID from my search to the lookup table and display the rows from the lookup that do not match my search results.

Lookup table name: save.csv

My SPL:
base search | table _time field1 ID field2

_time       field1    ID      field2
02/23/23    DEMO1     1054    xyc
02/23/23    Demo2     1426    xyd

Below is my lookup table:

_time       field1    ID
02/23/23    DEMO1     1054
02/10/23    DEMO2     1426
02/05/23    DEMO3     8746
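One common pattern for "rows in the lookup that the search did not return" is to start from the lookup and exclude the IDs the base search produces via a subsearch. A minimal sketch, with <base search> standing in for the real query above:

| inputlookup save.csv
| search NOT [ search <base search> | dedup ID | fields ID ]

The subsearch returns its ID values as an OR'd filter, so NOT [...] keeps only lookup rows whose ID never appears in the search results (ID 8746 in the example). Subsearch result limits apply, which is worth keeping in mind for very large ID sets.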
My SPL:
base search | transaction ID | table date field1 field2 ID

My result:

Date         field1    field2    ID
02/20/23     CCC       2k        10
02/20/23     c2        4k        11
02/10/23     CC        2k        08
02/01/23     C         5k        01

But I only want to output the latest result, which is 02/20/23, assuming I don't know the date of the latest event in advance.
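Since the latest date isn't known up front, one option is to compute it from the data itself and keep only transactions from that same day. A minimal sketch on top of the search above (the day comparison snaps _time to midnight, so it keeps every transaction from the most recent day):

base search
| transaction ID
| eventstats max(_time) as latest_time
| where relative_time(_time, "@d") = relative_time(latest_time, "@d")
| table date field1 field2 ID

eventstats adds the overall maximum timestamp to every row without collapsing the results, so the where clause can compare each transaction's day against it.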
Hello, I'm working on IOCs, but unfortunately keeping them in a lookup table is already getting messy, so we have to index them now. I have a query running every hour to check whether our Threat Intel source has an updated IOC, or whether any IOC that has already been recorded is now active again based on the information collected from Threat Intel, and it then writes them into one of our indexes using the "collect" command.

Keeping them in the lookup table and deleting the old or duplicate IOCs based on their "last_updated_date" was quite easy. However, since we now have to write them to an index, we cannot avoid duplicate IOCs being written there. For example, an IOC may be active in the wild for a week or a month and the Threat Intel source keeps us posted whenever there's a new update, so if an IOC is updated 5 times in a week, we end up with 5 duplicates of that IOC in our index, each with a different "last_updated_date" reflecting when Threat Intel issued the update (sometimes within 24 hours of the previous one).

My plan is to have an SPL query that runs once or twice a day to delete the IOCs with old "last_updated_date" values. My question is: how can I run a search that shows only the old entries (excluding the IOC with the latest date from the result), so that I can then append the "delete" command at the end?

In the example below, "4/18/2023" is the latest:

Indicator    last_updated_date
abc123       4/18/2023
abc123       4/17/2023
abc123       4/16/2023
abc123       4/15/2023
abc123       4/14/2023

Requirement: a query that gives a result like the one below (the latest entry is not included, so it stays in the index and is not deleted when we add the "delete" command at the end):

Indicator    last_updated_date
abc123       4/17/2023
abc123       4/16/2023
abc123       4/15/2023
abc123       4/14/2023

The one below is an example of the current query I have:

index=threatintel indicator="0.tcp.ap.ngrok.io"
| stats latest(indicator) latest(last_updated_date) by indicator

But this shows only the latest entry, even though the indicator has been written to my index 12 times already; what I'm trying to do is list the other 11 entries and retain only the one with the latest "last_updated_date".

------------------------------------------------------------------------------

I have another example, but this one is already working as intended; my target here is to delete IOCs whose "last_updated_date" is more than 90 days old. You will see at the bottom that I have added the "delete" command, as this query shows me the events that are more than 90 days old, no longer active in the wild, and safe to delete from our index:

index=threatintel
| eval ninety_days_ago = relative_time(now(), "-90d@d")
| eval last_updated_date = strptime(last_updated_date,"%Y-%m-%d %H:%M:%S")
| where last_updated_date < ninety_days_ago
| eval last_updated_date = strftime(last_updated_date, "%Y-%m-%d %H:%M:%S")
| table indicator ioc_type last_updated_date
| delete

Hope someone can help.

Best regards
wvpony
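A possible shape for the "everything except the newest entry per indicator" search: compute the newest last_updated_date per indicator with eventstats, then keep only the rows that are older than it. A minimal sketch, assuming last_updated_date uses the same "%Y-%m-%d %H:%M:%S" format as the 90-day query above; as with any delete, the result set is worth reviewing before appending | delete:

index=threatintel
| eval updated_epoch = strptime(last_updated_date, "%Y-%m-%d %H:%M:%S")
| eventstats max(updated_epoch) as latest_epoch by indicator
| where updated_epoch < latest_epoch
| table indicator last_updated_date

Rows that carry the newest timestamp for their indicator fail the where test and are left untouched, which matches the requirement of keeping one copy of each IOC.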
The search that filters on the nested field "labels.errorCode" does not return the same results (query returning the wrong number of results). The search below returns the right results, but we would like to search based on the field labels.errorCode (query returning the right number of results).
My SPL is:
my base search | transaction ID | stats count values(Date) as Date values(field1) as field1 by ID

I get this result:

Date         field1    ID
02/20/23     CCC       10
02/10/23
02/05/23

02/10/23     CC        08
02/05/23

02/01/23     C         01

Is there any way in Splunk to search within the Date field? I am trying to display the results without Date 02/20/23. I tried search Date!="02/20/23" and where Date="02/20/23". Can anyone help; is this doable in Splunk?
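Because Date is a multivalue field after stats values(), an equality test behaves differently than on a single value; one way to drop every row whose Date list contains 02/20/23 is mvfind(), which returns NULL when no value matches. A minimal sketch on top of the search above:

my base search
| transaction ID
| stats count values(Date) as Date values(field1) as field1 by ID
| where isnull(mvfind(Date, "02/20/23"))

mvfind takes a regex, so the pattern could also be anchored or tightened if the date strings vary.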