Activity Feed
- Posted Re: Version control management for Splunk Dashboards, Reports and Alerts on Dashboards & Visualizations. 09-18-2024 02:33 PM
- Posted Re: What is wrong with my sub-pipeline in appendpipe? on Splunk Search. 06-22-2023 10:15 AM
- Karma Re: What is wrong with my sub-pipeline in appendpipe? for yuanliu. 06-22-2023 10:02 AM
- Posted Re: What is wrong with my sub-pipeline in appendpipe? on Splunk Search. 06-21-2023 05:44 PM
- Karma What is wrong with my sub-pipeline in appendpipe? for yuanliu. 06-21-2023 05:26 PM
- Posted Re: What is wrong with my sub-pipeline in appendpipe? on Splunk Search. 06-20-2023 12:41 PM
- Karma Re: What is wrong with my sub-pipeline in appendpipe? for VatsalJagani. 06-20-2023 12:38 PM
- Posted What is wrong with my sub-pipeline in appendpipe? on Splunk Search. 06-10-2023 02:26 PM
- Karma Re: How to pass field values as macro arguments? for wbcem. 06-02-2023 10:44 AM
- Posted What's the meaning and mechanism of form.multiselect_lines (with the pattern form.<input_token>? on Splunk Search. 09-22-2022 10:05 AM
- Tagged What's the meaning and mechanism of form.multiselect_lines (with the pattern form.<input_token>? on Splunk Search. 09-22-2022 10:05 AM
- Posted Re: Having problem of not seeing the expected visualization of a query when launching or reloading a dashboard? on Getting Data In. 08-17-2022 08:27 AM
- Posted Having problem of not seeing the expected visualization of a query when launching or reloading a dashboard? on Getting Data In. 08-17-2022 08:04 AM
- Tagged Having problem of not seeing the expected visualization of a query when launching or reloading a dashboard? on Getting Data In. 08-17-2022 08:04 AM
- Karma Re: How to convert _time column to epoch time for jnudell_2. 07-28-2022 10:44 AM
- Karma How to convert _time column to epoch time for Becherer. 07-28-2022 10:44 AM
- Karma Re: How do I concatenate two fields into a string? for chris. 07-22-2022 04:27 PM
- Karma Re: How to implement alert that need to consider state of past alert? for gcusello. 07-15-2022 06:33 AM
- Posted Re: How to implement alert that need to consider state of past alert? on Splunk Search. 07-15-2022 06:09 AM
- Got Karma for How to implement alert that need to consider state of past alert?. 07-15-2022 01:24 AM
02-10-2025
09:09 AM
Where do I need to keep this JS? In the dashboard, or at some path?
01-16-2025
05:16 PM
1 Karma
FWIW, it's usually better to ask a new question than to pile onto a 4-year-old thread. To keep only the BAD events, try one of these:

index=my_index
| eval my_check=if(my_field>100,"NOK","OK")
| where my_check="NOK"
| table _time my_check

or

index=my_index
| where my_field>100
| table _time my_field
09-20-2024
08:17 AM
https://www.splunk.com/en_us/blog/platform/splunking-your-conf-files-how-to-audit-configuration-changes-like-a-boss.html?locale=en_us This will show you how to track conf file changes. Earlier questions asked about change control for dashboards, which are XML files, so it won't work for those.
06-24-2023
02:51 AM
Well, it's all complicated, and performance tuning searches can be a huge headache. As a rule of thumb it's usually better to avoid subsearches, and it's best to leave as much work to the indexers as you can (so you can benefit from parallelization). That's why I was pointing out that there should be a better way to do what you're trying to do than use map. But, having said that:

1) Sometimes a map-based solution can be good enough (I have such a solution at home - I wrote a map-based search way back when I was learning SPL, and since the subsearch is only run twice, it's not worth the time needed to rework the search).

2) Sometimes map can be the only reasonable solution - if the outer search results narrow down the subsearch parameters so much that a stats-based (or anything-else-based) approach would mean searching through too many raw events and digging through too much data.

So just get to know your job inspector and use it 🙂

One could argue about the readability argument, though, since what is "readable" for non-Splunk-experienced people is often something written with a completely different paradigm in mind, and simply not the way it's done in SPL - like the typical overuse of the "join" command by people coming to Splunk from relational database environments.
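To make the trade-off concrete, here is a minimal sketch with a hypothetical index and fields (my_index, host, and status=500 are assumptions, not from this thread). The map-based version re-runs the subsearch once per row of the outer search:

index=my_index status=500
| stats count by host
| map maxsearches=10 search="search index=my_index host=$host$ | stats earliest(_time) as first_error by host"

A single-pass rewrite leaves the work to the indexers and computes everything with one stats:

index=my_index status=500
| stats count as error_count earliest(_time) as first_error by host

The second form scans the events once and parallelizes across indexers; map only wins when the outer results narrow the subsearch parameters enough that the per-row searches touch far less data overall.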
10-18-2022
10:42 AM
Hello richgalloway, I can't thank this forum and your replies enough for the help provided. I will try the "done" tag. Thanks again, eholz1
09-22-2022
10:05 AM
I see an interesting Simple XML idiom below:

<input type="multiselect" token="multiselect_lines" searchWhenChanged="true">
<label>Lines</label>
<choice value="ACEKLMRSWY">All lines</choice>
<choice value="A">A Line</choice>
<choice value="C">C Line</choice>
<choice value="E">E Line</choice>
<choice value="K">K Line</choice>
<choice value="L">L Line</choice>
<choice value="M">M Line</choice>
<choice value="R">R Line</choice>
<choice value="S">S Line</choice>
<choice value="W">W Line</choice>
<choice value="Y">Y Line</choice>
<default>ACEKLMRSWY</default>
<prefix>regex Location="^[</prefix>
<suffix>]"</suffix>
<change>
<eval token="form.multiselect_lines">
case(
mvcount('form.multiselect_lines') == 2 AND mvindex('form.multiselect_lines', 0) == "ACEKLMRSWY", mvindex('form.multiselect_lines', 1),
mvfind('form.multiselect_lines', "ACEKLMRSWY") == mvcount('form.multiselect_lines') - 1, "ACEKLMRSWY",
true(), 'form.multiselect_lines')</eval>
</change>
</input>

It seems to update the appearance of the multiselect field "multiselect_lines": whenever the selections in the multiselect change, "form.multiselect_lines" is updated accordingly. I guess it is meant to fix a deficiency of multiselect in Splunk, where the "All" option does not disappear automatically when a subset is selected, and "All" does not come back automatically as the default when no subset is selected any more.

The above is my attempt to understand how the functionality is achieved. It works as hypothesized in a dashboard that I'm studying, but when I copied the mechanism into my dashboard, it had no effect on the behavior. So I wonder what the token with the pattern form.<multiselect_input_token> means, and what it takes to make the above mechanism work in automatically removing and adding "All" in appearance.

I know there is a JavaScript solution that modifies the list of multiselect options on the fly. But I don't have the admin privilege to add JavaScript for my dashboard, so a solution that doesn't require admin privilege would be handy.
- Tags:
- dashboard
08-17-2022
08:27 AM
I further confirmed that the problem can be reproduced reliably by reloading the web page of the dashboard. When there is already a working visualization, once I reload, the visualization is gone.
07-15-2022
06:20 AM
1 Karma
Hi @yshen, if you're not fluent in SPL, I suggest following the Splunk Search Tutorial (https://docs.splunk.com/Documentation/Splunk/latest/SearchTutorial/WelcometotheSearchTutorial) because searches are the base of everything in Splunk!

Anyway, the choice between a lookup and a summary index depends on some factors: if you have many cases, a summary index is more efficient; if time isn't relevant for the check, a lookup is the easier way.

Anyway, you have to take your alert and add at the end the command

| outputlookup your_alerts_Lookup.csv append=true

then when you run an alert you can search on the lookup. It's difficult to give you more help because it depends on the use cases. For this reason I hinted to learn SPL, so developing this solution will be very easy.

Ciao. Giuseppe
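A minimal sketch of that pattern, assuming a hypothetical alert search (my_index, my_field, and the thresholds are assumptions; the lookup name comes from the reply above). The alert search records each firing into the lookup:

index=my_index my_field>100
| stats latest(_time) as alert_time by host
| outputlookup your_alerts_Lookup.csv append=true

A later alert can then read back the state of past alerts and, for example, keep only hosts whose last recorded firing is older than an hour:

| inputlookup your_alerts_Lookup.csv
| stats max(alert_time) as last_alert by host
| where now() - last_alert > 3600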
07-14-2022
08:21 AM
Thanks for the perfect solution!
04-07-2022
11:21 AM
I also note that with the Splunk SDK (Python), using 'fields' at the end of the embedded query to select the returned fields does not work as I desired: all fields are returned. But 'table' results in only the listed fields being returned.
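For illustration, a minimal sketch of the two endings (my_index, my_sourcetype, and ENTRY stand in for the real query). One likely explanation is that fields keeps internal fields such as _raw and _time unless they are explicitly removed, and the SDK returns them, while table emits only the listed columns:

search index=my_index sourcetype=my_sourcetype | fields ENTRY
search index=my_index sourcetype=my_sourcetype | table ENTRY

With the first form, appending | fields - _* should also trim the internal fields.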
04-07-2022
11:10 AM
1 Karma
With a hint from https://splunk-usergroups.slack.com/team/UB5DA9L02, it turns out that since the sourcetype is only known in the context of my application ics_analytics, in the service definition with the SDK I must indicate the application context with the app= argument. Here is the corrected service definition:

service = client.connect(
    host= 'splunk.bart.gov',
    app='ics_analysis',
    port = '8089',
    username = 'userid',
    password = 'secrete',
)

Once the sourcetype is properly declared, the same code as above is able to retrieve the field value of ENTRY. Here is the link to the relevant documentation: https://docs.splunk.com/DocumentationStatic/PythonSDK/1.6.16/client.html#splunklib.client.Service

This post is a capture of a Slack discussion: https://splunk-usergroups.slack.com/archives/C04DC8JJ6/p1649351828984919?thread_ts=1649265592.685629&cid=C04DC8JJ6
02-13-2022
01:45 AM
1 Karma
Hi @yshen, good for you, see you next time! Ciao and happy splunking. Giuseppe P.S.: Karma Points are appreciated 😉
08-13-2021
11:12 AM
For the record, I ran into a case where I hit this error message. It was when I needed to replace an existing CSV file for a lookup. I forgot to delete the existing one, hoping the new file would override it. It turned out that Splunk just complained, without any clearer indication of my offense. A more concrete error diagnosis would have been more helpful.
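As an aside, a minimal sketch of replacing a lookup from SPL rather than by uploading a file (my_lookup.csv and its columns are assumptions): outputlookup without append=true overwrites the existing file in place.

| makeresults
| eval Location="xxx.165.48.20", max_temp=85
| table Location max_temp
| outputlookup my_lookup.csv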
06-24-2021
08:19 PM
I want to compute the change in temperature for each location in a given interval, say, 15 minutes or 30 minutes. I figure that streamstats might capture the temperature value at the beginning of such a time interval, using time_window to specify the interval length. However, the following example surprises me. The temperature readings for Pleasonton are collected every 15 minutes, thus the following query:

| makeresults
| eval _raw="time_ Location Temperature
2021-08-23T03:04:05.000-0700 Pleasonton 185
2021-08-23T03:04:20.000-0700 Pleasonton 86
2021-08-23T03:04:35.000-0700 Pleasonton 87
2021-08-23T03:04:50.000-0700 Pleasonton 89"
| multikv forceheader=1
| eval _time=strptime(time_,"%Y-%m-%dT%H:%M:%S.%3N%z")
| fields _time Location Temperature
| sort _time
| streamstats earliest(Temperature) as previous_temp earliest(_time) as previous_time by Location time_window=5m
| convert ctime(previous_time)

I'd expect the following, as within 5 minutes of an event there is no other event but the current one:

_time | Location | Temperature | _raw | previous_temp | previous_time
---|---|---|---|---|---
2021-08-23 03:04:05 | Pleasonton | 185 | 2021-08-23T03:04:05.000-0700 Pleasonton 185 | 185 | 08/23/2021 03:04:05.000000
2021-08-23 03:04:20 | Pleasonton | 86 | 2021-08-23T03:04:20.000-0700 Pleasonton 86 | 86 | 08/23/2021 03:04:20.000000
2021-08-23 03:04:35 | Pleasonton | 87 | 2021-08-23T03:04:35.000-0700 Pleasonton 87 | 87 | 08/23/2021 03:04:35.000000
2021-08-23 03:04:50 | Pleasonton | 89 | 2021-08-23T03:04:50.000-0700 Pleasonton 89 | 89 | 08/23/2021 03:04:50.000000

but this is actually what I get:

_time | Location | Temperature | _raw | previous_temp | previous_time
---|---|---|---|---|---
2021-08-23 03:04:05 | Pleasonton | 185 | 2021-08-23T03:04:05.000-0700 Pleasonton 185 | 185 | 08/23/2021 03:04:05.000000
2021-08-23 03:04:20 | Pleasonton | 86 | 2021-08-23T03:04:20.000-0700 Pleasonton 86 | 185 | 08/23/2021 03:04:05.000000
2021-08-23 03:04:35 | Pleasonton | 87 | 2021-08-23T03:04:35.000-0700 Pleasonton 87 | 185 | 08/23/2021 03:04:05.000000
2021-08-23 03:04:50 | Pleasonton | 89 | 2021-08-23T03:04:50.000-0700 Pleasonton 89 | 185 | 08/23/2021 03:04:05.000000

All rows take the earliest event's temperature, which is beyond 5 minutes from any subsequent events. How can I query to get the temperature at the beginning of the time period?
- Tags:
- search
- Labels:
- stats
04-23-2021
01:45 PM
I've understood the example of how to display an icon and how to make it disappear. Now I need to figure out how to implement the drill-down on the displayed icon. Any suggestion would be appreciated. The more I study it, the more I feel that this is a generic requirement: placing a single-value visualization on a schematic diagram against a feature value, at the position specific to that feature value. Therefore, I raised an "idea" here: https://ideas.splunk.com/ideas/EID-I-945 Please review, comment, and support it if you agree. Thanks!
- Tags:
- dashboard
03-20-2021
10:47 AM
@gcusello Thanks for the pointer. My problem is a real one that I need for some field troubleshooting. Maybe my framing of the problem could be simplified, but it's what I can do at the moment; once I know the solution, the problem statement might be simplified.

The extraction of the fields DEVICE, ATTRIBUTE, etc. has already been done in the respective sourcetype. Your suggestion of using transaction with startswith and endswith might work sometimes, but I eventually need to identify those event groups that have only subsets of the expected events. For example, a certain group may only have the event "SOR_RESTRICT_STATUS.STATE", so the transaction may need to start with "SOR_RESTRICT_STATUS.STATE" as well. Likewise, some other subset may only have SOR_A_STATUS.STATE or SOR_B_STATUS.STATE, so the transaction may also need to end with SOR_A_STATUS.STATE or SOR_B_STATUS.STATE. By this school of thought, the eventual transaction definition would be like:

transaction DEVICE startswith="SOR_A_STATUS.STATE OR SOR_B_STATUS.STATE OR SOR_RESTRICT_STATUS.STATE" endswith="SOR_A_STATUS.STATE OR SOR_B_STATUS.STATE OR SOR_RESTRICT_STATUS.STATE"

I have yet to try how it would end up with the same condition for startswith and endswith; I am concerned that it would lose the expected differentiation between the transactions. Maybe a question for the experts.
- Tags:
- search
02-23-2021
08:56 PM
It took me at least 5 hours of experimenting to make sure that I wasn't making a mistake. It's really frustrating and disappointing to hit such mysterious and arbitrary behavior without any hint of an error message! A great company can do better; I wish customers were not always treated as victims of hidden secrets! So, the moral of the story: a base search must explicitly specify, with the fields or table command, all the fields that will be used by the searches that reference it.
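A minimal Simple XML sketch of that moral (the index and field names are assumptions). The base search explicitly keeps every field its post-process searches will use:

<search id="base">
  <query>index=my_index sourcetype=access | fields _time host status</query>
</search>
<chart>
  <search base="base">
    <query>| stats count by status</query>
  </search>
</chart>

If status were missing from the fields list in the base search, the post-process stats would silently return nothing, which is exactly the mystery described above.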
01-15-2021
03:10 PM
Your question was very clear, and I just had to make it that way. I hope it is the same for others.
12-11-2020
06:00 AM
Can anyone advise on this? Regards, Altin
10-21-2020
02:03 PM
@yshen There are a number of ways to solve this, but it's still not clear what you mean by "sometimes" or "ever". Your example data is only 16 Sep; however, the basic solution to hanging on to the raw data is to use eventstats/where, like this:

| makeresults
| eval _raw="Temperature=82.4, Location=xxx.165.152.17, Time=Wed Sep 16 07:43:01 PDT 2020, Type=UPS;
Temperature=84.2, Location=xxx.165.152.48, Time=Wed Sep 16 07:43:01 PDT 2020, Type=UPS;
Temperature=82.4, Location=xxx.165.154.21, Time=Wed Sep 16 07:43:01 PDT 2020, Type=UPS;
Temperature=82.4, Location=xxx.165.162.22, Time=Wed Sep 16 07:43:01 PDT 2020, Type=UPS;
Temperature=77.0, Location=xxx.165.164.17, Time=Wed Sep 16 07:43:01 PDT 2020, Type=UPS;
Temperature=75.2, Location=xxx.165.170.17, Time=Wed Sep 16 07:43:01 PDT 2020, Type=UPS;
Temperature=77.0, Location=xxx.165.208.12, Time=Wed Sep 16 07:43:01 PDT 2020, Type=UPS;
Temperature=73.4, Location=xxx.165.48.20, Time=Wed Sep 16 07:43:01 PDT 2020, Type=UPS;
Temperature=75.3, Location=xxx.165.52.13, Time=Wed Sep 16 07:47:01 PDT 2020, Type=TempSensor;
Temperature=77.9, Location=xxx.165.52.14, Time=Wed Sep 16 07:47:01 PDT 2020, Type=TempSensor;
Temperature=76.3, Location=xxx.165.54.24, Time=Wed Sep 16 07:47:01 PDT 2020, Type=TempSensor;
Temperature=83.8, Location=xxx.165.48.20, Time=Wed Sep 16 07:47:01 PDT 2020, Type=TempSensor;
Temperature=73.8, Location=xxx.165.36.21, Time=Wed Sep 16 07:47:01 PDT 2020, Type=TempSensor"
| eval x=split(_raw,";")
| mvexpand x
| rename x as _raw
| extract
| fields - _raw
| eventstats max(Temperature) as mt by Location
| where mt>83

which will leave you 3 rows, and then you can do what you want with that. However, it's not clear if, say, the location "Temperature=76.3, Location=xxx.165.54.24", recording 76.3 on 16th September but having recorded 83.1 on 4th July 2002, should be in the 'hot locations' (assuming of course you have data going back that far). If so, then the solution would have to change, as it is unlikely to be practical to search that much data with eventstats. Instead you would be better off doing a daily search to find temps that day that exceeded your threshold, or even just the max temp for each location, and saving that max to a lookup file for the location. Then, when doing the 'find me raw data for hot locations' query, you would do the basic search for all data for your period, then look up the location from the lookup and make the check there, for example:

base_search
| lookup location_list.csv Location OUTPUT locationHistoricalMaxTemp
| eval maxTemp=max(locationHistoricalMaxTemp, Temperature)
| eventstats max(maxTemp) as mt by Location
| where mt>83

What this is doing is getting the currently saved historical max from your lookup based on location; then, assuming you update that lookup at the end of the day (so the current temp might be higher), maxTemp picks the higher of today's value or the historical one, and the eventstats/where comes into play to find the rows from hot locations. Hope this answers what you're trying to do.
10-12-2020
09:12 AM
@isoutamo Thanks for the license information!
10-02-2020
10:33 AM
I wish there were a way that is less impactful to the service than restarting the forwarder.
09-16-2020
11:03 PM
1 Karma
Hi @yshen Start here for creating and editing forms. An example of what I have done is this, where I can select what I want the chart to show (this might be just temperature in your case, so perhaps not so relevant), but I can select the field I want to filter on (location or type in your case). Then, depending on the filter, the choice in the third input is a multiselect populated with dedup'd values of the filter:

<panel>
<input type="dropdown" token="dimension">
<label>Select Dimension</label>
<choice value="Consumer">Consumer</choice>
---
<default>Consumer</default>
</input>
<input type="dropdown" token="filter">
<label>Select Filter</label>
<choice value="Service">Service</choice>
---
<default>Service</default>
</input>
<input type="multiselect" token="selector">
<label>Select one or more $filter$</label>
<choice value="All">All</choice>
<search>
<query>search data | dedup $filter$ | fields $filter$</query>
</search>
<fieldForLabel>$filter$</fieldForLabel>
<fieldForValue>$filter$</fieldForValue>
<prefix>(</prefix>
<valuePrefix>$filter$ ="</valuePrefix>
<valueSuffix>"</valueSuffix>
<delimiter> OR </delimiter>
<suffix>)</suffix>
<default>All</default>
<change>
<eval token="form.selector">case(mvcount('form.selector')=0,"All",mvcount('form.selector')>1 AND mvfind('form.selector',"All")>0,"All",mvcount('form.selector')>1 AND mvfind('form.selector',"All")=0,mvfilter('form.selector'!="All"),1==1,'form.selector')</eval>
<eval token="selector_choice">if(mvfind('form.selector',"All")=0,$filter$+"=*",$selector$)</eval>
</change>
</input>
<chart>
<title>Graph by $dimension$</title>
<search>
<query>search data $selector_choice$ | stats count by _time, $dimension$</query>
</search>
---
</chart>
</panel>
</row>
09-04-2020
10:27 AM
Hi @yshen , I understand your point now. Try the below with your log events. I am assuming your _time field has a format like 2020-08-23T03:04:05.000-0700 and that it represents the start time in each log event.

your base search....
| transaction Agent_Hostname alarm startswith="raised" endswith="cleared"
| eval end=_time+duration, start=_time
| eval end=strftime(end,"%Y-%m-%dT%H:%M:%S.%3N-0700"), start=strftime(start,"%d-%m-%Y %H:%M:%S.%3N-0700")
| table start, end, Agent_Hostname, alarm, duration

Try it and let me know.
08-21-2020
12:06 PM
@thambisetty I further studied your example, experimenting line by line. Here is my annotation of it:

index=snmptrapd sourcetype=trapParsed critical # filter the events that contain "critical"
| fields Agent_Hostname, alertStatus_1, status, temperatureVlaue # select the fields
| rename Agent_Hostname as Location # rename the field
| eventstats latest(_time) as latest_time by Location # compute latest(_time) and add latest_time to the events
| where latest_time=_time # select the events whose _time equals latest_time
| stats latest(*) as * by Location # what's the purpose? Seems redundant?
| convert ctime(latest_time) # convert the format of latest_time to be readable

It seems to me that the line stats latest(*) as * by Location basically gets, for each Location value, the latest event across all the fields selected above. Because of the statement above it, where latest_time=_time, for each value of Location there will only be events whose _time equals the latest_time for that Location; unless there are multiple events for the same Location with the same _time, there will usually be only one event per Location value. Even if there were multiple events for the same Location with the same _time equaling the latest_time, it seems stats latest(*) as * by Location would select the latest of the value combinations of all the selected fields for each Location. So it sounds like the purpose of this statement is to remove duplicate events for each Location value? Thanks again!
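If that reading is right, a shorter way to express "latest event per Location" is dedup, which keeps the first event it sees for each Location (a sketch assuming events arrive in the default reverse-time order, so "first seen" means "latest"):

index=snmptrapd sourcetype=trapParsed critical
| rename Agent_Hostname as Location
| dedup Location
| table _time Location alertStatus_1 status temperatureVlaue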