All Topics

Hey, I can't get what I need from | timechart count span=1d alone. For the most recent 8 days, the search result is as follows:

_time count
2020/05/21 100
2020/05/22 120
2020/05/23 180
2020/05/24 200
2020/05/25 270
2020/05/26 380
2020/05/27 490
2020/05/28 680

Now I want to calculate the increase of each day's count compared with the previous day. The result should be as follows:

_time increase
2020/05/22 20
2020/05/23 60
2020/05/24 20
2020/05/25 70
2020/05/26 110
2020/05/27 110
2020/05/28 190

Then I want to show the increase with timechart. Is there a simple search statement to do this?
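One hedged sketch of an approach (the index name is a placeholder): the delta command subtracts the previous event's value of a field, so appending it after the timechart should yield the day-over-day increase.

```spl
index=your_index
| timechart count span=1d
| delta count as increase
| fields _time increase
```

The first day has no predecessor, so its increase comes out null, which matches the expected output above starting on 2020/05/22.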
I have an XML file in a logging statement, from which I extracted 3 instances of a value. These values are correctly displayed in a table, in separate columns. The XML file will have 2 or 3 instances of the value: ****

This is the query:

source="messaging-service.log" sourcetype="hidden" "createMessage MsgSource"
| xmlkv
| rex max_match=0 "\<purchCostReference\>(?P<segment>[^\<]+)"
| eval Segment1 = if(isnotnull(mvindex(segment, 0)), "FirstSegment", ""), Segment2 = if(isnotnull(mvindex(segment, 1)), "SecondSegment", ""), Segment3 = if(isnotnull(mvindex(segment, 2)), "ThirdSegment", "")
| table purchCostReference, eventType, Segment1, Segment2, Segment3

I tried using the case statement, but it only returns the first value, FirstSegment, in the table:

sourcetype...
| xmlkv
| rex max_match=0 "\<purchCostReference\>(?P<segment>[^\<]+)"
| eval Segments = case(isnotnull(mvindex(segment, 0)), "FirstSegment", isnotnull(mvindex(segment, 1)), "SecondSegment", isnotnull(mvindex(segment, 2)), "ThirdSegment")
| table purchCostReference, eventType, Segments
| eventstats list(Segments) as Segments by purchCostReference, eventType
| sort purchCostReference, eventType

I would like there to be one column, Segment, with FirstSegment, SecondSegment, and ThirdSegment listed in it. Is there any Splunk function that allows me to create a group called 'Segment' and add the variables FirstSegment, SecondSegment, and ThirdSegment to it?
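A possible sketch, untested against the data above: case() stops at the first condition that evaluates to true, which is why only FirstSegment comes back. mvappend(), which ignores null arguments, can instead build a single multivalue Segment column from the same conditions (the leading "..." stands for the base search and rex from the query above):

```spl
... | eval Segment = mvappend(
      if(isnotnull(mvindex(segment, 0)), "FirstSegment", null()),
      if(isnotnull(mvindex(segment, 1)), "SecondSegment", null()),
      if(isnotnull(mvindex(segment, 2)), "ThirdSegment", null()))
| table purchCostReference, eventType, Segment
```

Each row then carries one Segment field whose values are listed vertically in the table cell.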
Hi, everyone. A few months ago I changed a node property from the UI (appdynamics.agent.metricLimits), but when I tried to do that again now, it didn't work. I saw in the documentation https://docs.appdynamics.com/pages/viewpage.action?pageId=45490480 that this configuration has changed. Which one do I need to use, max-metrics-allowed or -Dappdynamics.agent.maxMetrics? Thanks for your time. Regards.

^ Edited by @Ryan.Paredez for readability
I am only interested in a certain index:

index=abc
| stats count by host
| stats sum(count) AS Total BY host
| where Total>0

This search is good for seeing how many logs are coming in for my hosts in that index, but the problem is that when a host stops sending, I get no alert for it. I tried changing it to | where Total>=0, but that just removed the host from my table when it hit zero. How can I adjust my query so that I can alert when a host hits 0 logs?
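A hedged sketch of a different angle: a host that sends nothing produces no events for stats to count, so it can never show up as zero. The metadata command reports the last time each host was seen per index, which sidesteps that problem (the 60-minute threshold is an arbitrary placeholder):

```spl
| metadata type=hosts index=abc
| eval minutesSinceLastEvent = round((now() - recentTime) / 60)
| where minutesSinceLastEvent > 60
| table host minutesSinceLastEvent
```

Alerting when this search returns any results then flags silent hosts, rather than trying to force a zero row out of stats.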
I know this is kind of repeating an existing question, "Has there been an update to the project?" Well, that was answered in Feb. 2017, and the latest commit on the GitHub page shows as one month later, in March 2017. A lot has happened in three years. My own experience thus far has been this: Ponydocs doesn't work with the latest version of MediaWiki (1.34.1 as of this writing), probably because:

- The Ponydocs installation instructions don't match the changes in how MediaWiki registers extensions, which is now done using the wfLoadExtension function in the LocalSettings.php file.
- While MediaWiki works with PHP 7, Ponydocs doesn't remotely come close. That rules out my Ubuntu 18.04 server...
- Ponydocs probably works with MediaWiki 1.24, as recommended in the installation instructions, except that my Ubuntu 16.04 server is running PHP 5.5, while Ponydocs recommends PHP 5.2 or 5.3, both of which have since reached end-of-life.

So, to reiterate the question: Is anybody working on this? Are there any plans for upgrades to bring this into the modern era? It looks like a fantastic piece of software, and just the solution I'd like to implement, but it would be nice to know where things are headed. I suppose I could spin up a separate server to run much older software...

Thanks, Mike
I am trying to better understand the encryption of data in flight when sending data up to AWS S3 and pulling it back down. The docs page https://docs.splunk.com/Documentation/Splunk/7.3.4/Indexer/SmartStoresecuritystrategies (the link might not show, as I do not have enough karma to post links; I'm talking about the SmartStore security strategies section of the Indexer documentation) discusses the different SSL settings that can or should be set for SmartStore.

We are looking to use KMS encryption at rest, and the example for it feels too thin for me to understand. How do we ensure that the data is encrypted in flight to AWS S3? I am assuming that the KMS encryption is not magically handling the in-flight encryption. Do we need to pull down a cert from AWS to use as the sslRootCAPath, or do we still need to generate our own (public vs. internal)? Is the only way to do this switching over to sse-c? Any help or clarification would be greatly appreciated!
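For reference, a sketch of the kind of indexes.conf volume stanza involved; the bucket path and key ID are placeholders, and the setting names are taken from the SmartStore documentation, so they should be verified against your Splunk version. The point is that at-rest encryption (sse-kms) and in-flight encryption (the TLS settings) are configured independently:

```
[volume:remote_store]
storageType = remote
path = s3://my-smartstore-bucket/indexes
# Server-side encryption at rest via KMS
remote.s3.encryption = sse-kms
remote.s3.kms.key_id = <your-kms-key-id>
# TLS for data in flight to the S3 endpoint
remote.s3.sslVerifyServerCert = true
remote.s3.sslRootCAPath = <path to CA bundle that can validate the S3 endpoint cert>
```

For public AWS S3 endpoints, the CA bundle would need to contain the public CA chain that signed Amazon's certificates, not a self-generated one.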
I'm currently seeing this error in Chrome whenever I try to save a view in Splunk. It occurs across multiple deployments, some Splunk 7 and some Splunk 8. Does anyone know what causes this?

2020-05-28 16:19:14,606 ERROR [5ed01cc296109d50c90] utility:58 - name=javascript, class=Splunk.Error, lineNumber=26024, message=Uncaught TypeError: Cannot read property 'status' of null, fileName=https://localhost:8000/en-US/static/@8F68C924E96E41B2D0294AE7995C1485D1FCAADD28E0AE1AE0C13C6BAE57A9EA/js/common.min.js
Is it possible to use SmartStore with a standalone Docker installation? I have been trying to set it up by specifying all my settings in the indexes.conf file. It works the first time, but when I destroy the Docker container and spin up a new one, it will not read from or write to the remote store.
Hi All, I have 4 search head cluster members for which I have to integrate SAML. Our AD team is asking for the reply URL information below. Do I need to give all 4 URLs? Also, do I need to configure SAML on all 4 search head UIs? Please share your thoughts.

SAML-based Sign-on Attributes Value - Reply URL (Assertion Consumer Service URL):
https://searchhead1.group.com/saml/acs
https://searchhead2.group.com/saml/acs
https://searchhead3.group.com/saml/acs
https://searchhead4.group.com/saml/acs
I have several similar apps. They share global searches and dashboards. Each has custom data in a lookup table, custom_data.csv. Is it possible to see all of the custom_data.csv files in a single search using an admin account? Normally I would generate a list of apps via a | rest call and then do a | map search. I cannot figure out how to cross apps with the lookup table command inputlookup.
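A hedged starting point: the REST endpoint for lookup table files can at least enumerate every app's copy of the file. Since inputlookup resolves in the current app's context, actually merging the contents may need a different mechanism (per-app saved searches, or reading the files from disk), but this shows where each copy lives:

```spl
| rest /servicesNS/-/-/data/lookup-table-files splunk_server=local
| search title="custom_data.csv"
| table eai:acl.app title eai:data
```

The eai:acl.app column identifies the owning app and eai:data the on-disk path of each copy.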
I set up a directory monitor with a whitelist through Splunk Web. Now I'm looking for the specifics, to find the crcSalt setting etc., but the input I created doesn't show up in $SPLUNK_HOME/etc/system/local/inputs.conf. Why not? Where is it?
I am having issues configuring Duo MFA with Splunk. Configuring per the Splunk docs via the UI yields:

Encountered the following error while trying to save: Current Duo configuration cannot be verified by the Duo server. Please check and re-enter it again ...

and configuring via authentication.conf per the Splunk docs yields:

Login failed due to incorrectly configured Multifactor authentication. Contact Splunk support for resolving this issue

I have verified per the Duo docs that my server time is correct by running date in bash, and I can confirm that the ikey, skey, and api_host are correct, because I am using these same parameters to actively ingest data from Duo on a separate Splunk server. Does anyone have any tips or experience, or is this a support ticket?
I've got a lookup table with counts by date. This table is updated each night, and I would like to search the date fields relative to the current date. Example:

5-26-2020 / 5-27-2020 / 5-28-2020
12 / 30 / 15
10 / 10 / 8
19 / 12 / 15

| inputlookup counts.csv
| eval today=strftime(_time,"%m-%d-%Y")
| stats sum(**today**)

I'm thinking of something akin to the INDIRECT function in Excel.
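stats cannot take a field name from another field's value, but foreach can iterate over the columns and compare each column name against today's date. A hedged sketch; note that rows from inputlookup carry no _time, hence now(), and that the %-m-%-d-%Y format (dropping leading zeros to match headers like 5-26-2020) is platform-dependent, so verify it on your system:

```spl
| inputlookup counts.csv
| eval today=strftime(now(), "%-m-%-d-%Y")
| foreach * [ eval todays_count = if("<<FIELD>>" == today, '<<FIELD>>', todays_count) ]
| stats sum(todays_count) AS total_today
```

The <<FIELD>> token is the column name as a string, while '<<FIELD>>' reads that column's value, which is what makes the INDIRECT-style lookup work.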
My Splunk instances are on a private network. Many of the items on the Help menu try to retrieve documents online. I'd like to prevent users from selecting options that won't work for them. I've already set docsCheckerBaseURL in web.conf to 0, but all that does is stop Splunk from checking the actual online documentation site; the menu items are still there.
I have information in the episodes shown in the ITSI Episode Review in the form of URLs that reference other systems or the like. I would like to display the information in these fields as clickable links, so that users can open documentation on the error in another browser tab, or jump to the source systems using a remote operations tool, for example.

How can hyperlinks defined in the correlation search be displayed in ITSI, in addition to the drill-down link? The configurable drill-down link in the correlation search is already in use for some other hyperlinks. Is there a way to display additional fields as hyperlinks?

Alternatively, we are thinking about implementing a custom episode action that could enable these hyperlinks. Does anyone have any experience or examples? We are using Splunk ITSI version 4.4.1.
Here is the part of the search I am working on; I'm trying to exclude a certain range of days. However, where Date != 20190401 removes only April 1st from the result. I need to exclude all the days of April.

index:
| where Date != "20190401"
| stats sum(claim_nums) as "Total Claims" by Date
| appendpipe
    [ stats avg("Total Claims") as ClaimAvg
    | eval ClaimAvg = round(ClaimAvg,2)
    | eval "Total Claims" = "TOTAL AVG CLAIMS ".ClaimAvg
    | fields - ClaimAvg]
| sort - Date

I tried adding multiple dates to the where command, but it doesn't work.
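A hedged sketch of one way to drop every day in April at once, matching on the year-month prefix instead of listing each date (this assumes Date is in the YYYYMMDD form shown above):

```spl
| where NOT like(tostring(Date), "201904%")
```

An equivalent regex form would be | where NOT match(tostring(Date), "^201904").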
I have set up an alert using webhooks, and it has not been firing. I set the notification to also show up in triggered alerts, to make sure that the alert was in fact firing. Looking through the logs via index=_* webhook action=webhook, I found some errors which I cannot figure out how to remediate:

event_message: action=webhook - Alert action script returned error code=3
event_message: action=webhook - Alert action script completed in duration=64 ms with exit code=3
event_message: action=webhook STDERR - Unexpected error: POST data should be bytes, an iterable of bytes, or a file object. It cannot be of type str.

I realize that the last one is a Python error, which I found some information on here: stackoverflow - Python 3 urllib produces TypeErr... I guess what I am wondering is whether this is something I might be doing wrong, or is something broken on the cloud platform?
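For what it's worth, that STDERR message is the standard Python 3 urllib complaint when a POST body is passed as a str instead of bytes. A minimal, Splunk-independent illustration (the URL and payload are made up, and nothing is actually sent):

```python
import urllib.parse
import urllib.request

payload = {"search_name": "my_alert", "result_count": 3}

# urlencode() returns a str; urllib.request requires the POST body as bytes,
# so the str must be encoded before being used as the data argument.
as_str = urllib.parse.urlencode(payload)
as_bytes = as_str.encode("utf-8")

# Building the Request performs no network I/O; passing as_str here instead
# would raise the exact TypeError quoted in the log above at send time.
req = urllib.request.Request("https://example.com/webhook", data=as_bytes)
print(type(req.data).__name__)  # bytes
```

If the failing script is Splunk Cloud's built-in webhook action rather than something you wrote, this is probably a support ticket, since you cannot patch their script yourself.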
Hi, I have a multi-select input, with user permissions, to change cell color (colorPalette) based on the values the user enters. Here is the creation of my multi-select input:

<input type="multiselect" token="chosenAdmin" searchWhenChanged="true">
  <label>field1</label>
  <fieldForLabel>user</fieldForLabel>
  <fieldForValue>user</fieldForValue>
  <search>
    <query>my query</query>
    <earliest>0</earliest>
    <latest></latest>
  </search>
  <delimiter> </delimiter>
</input>

This is the color palette I tried:

<format type="color" field="user">
  <colorPalette type="expression">if (match(value, $chosenAdmin$),"#DC4E41", true())</colorPalette>
</format>

The $chosenAdmin$ token can contain 1 or more users. This changes the color of all cells in my "user" column to red. Any idea how to solve this problem?
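A hedged sketch of one approach, untested: setting the multiselect delimiter to | makes the token expand to something match() can treat as a regex alternation (for example user1|user2), and quoting the token plus returning an actual color in the else branch keeps the expression well-formed:

```xml
<delimiter>|</delimiter>
...
<format type="color" field="user">
  <colorPalette type="expression">if (match(value, "$chosenAdmin$"), "#DC4E41", "#FFFFFF")</colorPalette>
</format>
```

The original expression's true() is not a color value, which may be part of why every cell falls through to red.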
I have a small system with one indexing server receiving information from forwarders on six other servers. Should the system be stopped before upgrading?
While upgrading from 7.3 to 8.0, the upgrade fails. Is there a log anywhere that records what has happened?