Activity Feed
- Karma Re: Custom Splunk Dashboard Display for rjthibod. 06-05-2020 12:49 AM
- Got Karma for Re: Only show logs where field value has a decimal place. 06-05-2020 12:49 AM
- Got Karma for Compare Minute by Minute Timechart "Today" vs Summary Index Timechart Average. 06-05-2020 12:49 AM
- Got Karma for Return only events where field value is in lookup table. 06-05-2020 12:49 AM
- Got Karma for Return only events where field value is in lookup table. 06-05-2020 12:49 AM
- Karma Re: Extract multiple values when field is in the same log twice for DalJeanis. 06-05-2020 12:48 AM
- Got Karma for Re: Extract multiple values when field is in the same log twice. 06-05-2020 12:48 AM
- Got Karma for Re: Dedup Lookup Table Field Results for Table. 06-05-2020 12:47 AM
- Got Karma for Re: Dedup Lookup Table Field Results for Table. 06-05-2020 12:47 AM
- Got Karma for Re: Dedup Lookup Table Field Results for Table. 06-05-2020 12:47 AM
- Karma Re: Automatic Field Extraction Using "Translatefix" App for Glenn. 06-05-2020 12:46 AM
- Got Karma for Re: Table for Daily Event Start/Finish Time. 06-05-2020 12:46 AM
- Got Karma for Lookup Table "vlookup" Function?. 06-05-2020 12:46 AM
- Got Karma for Re: Lookup Table "vlookup" Function?. 06-05-2020 12:46 AM
- Got Karma for Re: Lookup Table "vlookup" Function?. 06-05-2020 12:46 AM
- Posted Re: Combine the Sum of Two Fields on Splunk Search. 03-19-2020 10:40 AM
- Posted Combine the Sum of Two Fields on Splunk Search. 03-19-2020 10:21 AM
- Posted Transaction Command: Determine Outliers/Mismatches Only on Getting Data In. 01-13-2020 11:17 AM
- Posted Return only events where field value is in lookup table on Splunk Search. 05-23-2018 12:47 PM
- Tagged Return only events where field value is in lookup table on Splunk Search. 05-23-2018 12:47 PM
Topics I've Started
03-19-2020
10:40 AM
Hmm, this doesn't seem to do anything. I think it fails because without stats I don't get the numerical values to combine; stats is what makes totalx by y available. Before I run stats, y is just a text field=value pair that is non-numerical, if that makes sense.
03-19-2020
10:21 AM
Hi all,
For a search similar to the following:
index=myindex "Search Term" NOT field=value source="mylog.log" | eval totalx=aCount+bCount | stats sum(totalx) by y | sort -sum(totalx)
Splunk returns exactly the data I am looking for. A table of sum(totalx) by y (of which there are around 5 different values of y). I have a request to combine the sum(totalx) values for 2 of the 5 values and treat them as one value but leave the rest unchanged. What would be the best way to accomplish this?
For instance, right now my search returns a table similar to this:
y sum(totalx)
1 10
2 20
3 30
4 40
5 50
I am essentially trying to create an additional field, let's call it 45, which represents the sum of 4 and 5 at all times. So instead, the data being visualized is:
y sum(totalx)
1 10
2 20
3 30
45 90
Thanks!
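One way this could be approached (an untested sketch) is to remap the two values of y into a single bucket before the stats:
index=myindex "Search Term" NOT field=value source="mylog.log" | eval totalx=aCount+bCount | eval y=if(y=="4" OR y=="5", "45", y) | stats sum(totalx) by y | sort -sum(totalx)
Since eval runs per-event, rows for 4 and 5 share the "45" key and their sums combine automatically while the other values are untouched.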
01-13-2020
11:17 AM
I am using the transaction command in Splunk to group the events of an identical log file across two hosts. The field=value pairs across both hosts should be identical at all times, but from time to time issues can arise that cause the two hosts to fall out of sync. I'd like a search that identifies only transactions where the field=value pairs do not match exactly. What would be the best way to accomplish this?
For instance, using the search below groups the log files from multiple hosts into a single transaction by second.
"searchterm" source="mylog.log" | transaction field maxspan=1s
I want to only return events with the below pattern (mismatches)
2020-01-10 17:30:00,348 INFO field=true
2020-01-10 17:30:00,351 INFO field=false
But ignore events with this pattern (identical)
2020-01-10 17:30:00,348 INFO field=true
2020-01-10 17:30:00,351 INFO field=true
Or this pattern (identical)
2020-01-10 17:30:00,348 INFO field=false
2020-01-10 17:30:00,351 INFO field=false
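One possible filter (an untested sketch): after transaction merges the events, a mismatched field should end up multivalued, so a where clause on the value count could isolate the out-of-sync cases:
"searchterm" source="mylog.log" | transaction field maxspan=1s | where mvcount(field) > 1
This assumes transaction deduplicates identical values, leaving a single value when the hosts agree and two values when they differ.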
05-23-2018
12:47 PM
2 Karma
Hi all,
I am running a search that returns many events. Some of these events contain a field value that is also in a lookup table I have uploaded. What is the best way to format my search in such a way that it ONLY returns events where the field value in the event is present in the lookup table? Right now, the lookup itself works, but the search returns all events, whether it can look up the field value or not.
Thanks!
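One common pattern for this (a sketch; the lookup and field names are placeholders) is a subsearch over the lookup, which turns its rows into an OR of field=value terms that constrain the outer search:
index=myindex "my search terms" [| inputlookup mylookup.csv | fields myfield]
This should only return events whose myfield value appears in the lookup, assuming the lookup column is named the same as the event field (rename it inside the subsearch if not).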
05-15-2018
09:28 AM
I wish there was a facepalm emoji available so I could use it 🙂
I never even considered the option of using the token to specify the entire condition, I always thought it could only be used to specify a specific value. All good, this did the trick!
05-15-2018
08:42 AM
I have a dashboard where I am using tokens to filter the results of the individual panels. The use cases for the filters are:
Token=anything (*)
Token=specific_value
Token=anything BUT specific_value
I have the first two tested and working, but can't seem to figure out the best way to account for the 3rd scenario. I have been incorporating the token into my searches by using:
| fillnull value=NULL field (this ensures value will always be equal to something, even when not in an event) | search field=$token$
This works great for scenarios 1 and 2, but there is no way (I think?) to leverage field=$token$ when, in the last case, I want the opposite (!=). Is there a better way to use the token in my search so I can filter on all three scenarios: all values, a specific value, and everything except a specific value?
Thanks!
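One workaround sketch: have the input's choices supply the whole comparison rather than just the value, so the search becomes:
| fillnull value=NULL field | search $token$
with the dropdown choices set to field=*, field=specific_value, and field!=specific_value respectively (field names here are placeholders).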
05-15-2018
08:37 AM
Thank you, nice and easy! This did exactly what I was looking for.
05-15-2018
05:48 AM
Hi all,
What would be the best way for Splunk to handle repeating fields in a single event? For instance, one of my logs has a repeating field; for the sake of demonstration, let's call it field1. So the log event can have:
field1=123 field1=234
But when Splunk auto-extracts the field/value pair info, it only sees field1=123. What do I need to do to allow it to interpret both values for field1 in that single event? Preferably I'm looking for a way to do this in-line in the search, if possible.
Thanks!
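One in-line sketch (the extracted field name is a placeholder, and the pattern assumes numeric values): rex with max_match can pull every occurrence into a multivalue field:
... | rex max_match=0 "field1=(?<field1_mv>\d+)"
Here field1_mv would hold both 123 and 234 for that single event.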
02-13-2018
08:11 AM
Looks like this did the trick, updated \w+ to \S+
rex field= mode=sed "s/([a-z]*)\d+\S+/\1/g"
02-13-2018
08:00 AM
Hmm this causes the extraction to pull out more than is needed. Thanks for your help, you've put me on the right path so I can work on this a bit more to fine tune. Thanks again!
02-13-2018
07:44 AM
Thank you! This gets me very close to where I need to be. I think if a condition is added to this to recognize that the value "ends" with a comma it will work properly. Right now the extraction works correctly 99% of the time but in some cases also extracts some extra info at the end of the complete value. So the field/value pair I am extracting is:
field=valuemmddyyyyadditionaltext, nextfield=nextvalue
It pulls the value out correctly almost every time, but in a handful of cases it includes some additional characters from the "additionaltext" part. So I think if the regex could be set to ignore everything in the value beginning with the date code and ending with the comma, it will be exactly what I need (just the initial value and nothing else).
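One untested way to express that: change the trailing \w+ so the match stops at the comma:
rex field=myfield mode=sed "s/([a-z]*)\d+[^,]*/\1/g"
The [^,]* consumes everything up to (but not including) the comma, so only the leading value survives.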
02-13-2018
07:29 AM
The regex provided a little further down in this question gets me very close:
rex field=myfield mode=sed "s/([a-z]*)\d+\w+/\1/g"
This command gets me the value I want 99% of the time, and in a few cases a little more. The data in the "value" is typically 6-8 characters long (never more than 8) and alphanumeric. It is then followed by today's date and some additional alphanumeric values after that.
02-13-2018
06:55 AM
Hello everyone,
I am sure this is a relatively easy regex to build but I was hoping for some assistance, my regex experience is still pretty rocky 🙂
One of my log values contains more information than I need, but the first "section" of the value is what I really want to pull out. I can say that 100% of the time the value I want to extract is followed by today's date along with some more information. For instance:
Field=value02132018additionaltext
In the above example, it would only be the "value" that I care about and want to strip everything after it. Is this possible?
Thanks
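If the leading value never contains digits, one sketch is to strip from the 8-digit date onward:
... | rex field=Field mode=sed "s/([a-zA-Z]+)\d{8}\S*/\1/"
The capture group keeps the leading letters and everything from the date code on is discarded. (If the value itself can contain digits there is no unambiguous boundary, and a different anchor would be needed.)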
11-20-2017
10:51 AM
Thank you! This was exactly what I needed to do. Much appreciated.
11-20-2017
10:14 AM
What is the best way to use the Makemv command when my logs have no delimiter? For example:
field=abcd
Where a, b, c, and d are unique values. I'm looking to get the count of each in my logs, but I am wondering what the best way would be to delimit them. The values will always be a single letter and the "end" of the field/value pair will be a space. For example:
field1=value1 field=abcd field3=value3
Thanks!
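One sketch using makemv's tokenizer option, where the regex capture group splits out each single letter:
... | makemv tokenizer="([a-z])" field | mvexpand field | stats count by field
mvexpand turns each letter into its own result row so stats can count them individually.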
10-17-2017
10:04 AM
1 Karma
This is exactly what I needed. Thank you!
10-17-2017
09:24 AM
Hi all,
I'm trying to run a search that only finds specific events in a log which have field X equal to a number with a decimal place. A simple search of X>0 returns all log events with any number, which is a good start. Now I'm just looking to filter further so that only logs where field X has a decimal place are displayed. What would be the best way to accomplish this?
Thanks.
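One sketch, treating X as numeric: keep only events where X differs from its integer part:
... X>0 | where X != floor(X)
Note this would drop values like 5.0 that are written with a decimal point but are whole numbers; a regex on the text (e.g. | regex X="\.\d") would keep those instead.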
08-24-2017
10:37 AM
Thanks so much! Sometimes Splunk has so many options it's just a matter of looking in the right place to find all the customizations 🙂
Appreciate you pointing me in the right direction. This is just what I needed.
08-24-2017
09:56 AM
Are there any parameters that can be added to dashboard XML so that when it is opened, only the panels are displayed? Basically, I want to be able to load the dashboard up on a kiosk and have it just display the content (no dashboard title, no nav bar at the top, etc.). Right now, I log into the dashboard and manually scroll down. If possible, it would be great to just open the dashboard and require no further interaction.
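If memory serves, Simple XML dashboards accept URL parameters for this kind of chrome-hiding, along the lines of:
.../my_dashboard?hideSplunkBar=true&hideAppBar=true&hideFooter=true&hideEdit=true&hideTitle=true
The exact parameter names are worth verifying against the dashboard documentation for your Splunk version, but URL parameters are the usual kiosk-mode approach.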
08-17-2017
12:51 PM
What would be the best way to run a week to date search (timechart/bin) that "flattens" the individual days so I can get an average count per minute for the week? I don't care so much about the count per minute per day, but the average count each minute taking the entire week into account.
For instance, I want to take "timechart span=1m count" and run it week to date, but ignore the dates and focus only on times. The idea would be to have the avg(count) at 8:00, 8:01, 8:02, etc., and compare that to the "current" count today.
Ideally I'm looking to run a search for today (timechart span=1m count) and add the avg(count) per minute for the prior week, to give an idea of how today compares to historical data.
Thanks!
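One untested sketch: bucket by minute, then collapse the date away by keying on the time-of-day string:
"my search" earliest=-7d@d | bin _time span=1m | stats count by _time | eval minute=strftime(_time, "%H:%M") | stats avg(count) as avg_count by minute | sort minute
Each minute-of-day then carries the average of its seven daily counts, which can be lined up against today's per-minute counts.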
08-16-2017
10:27 AM
Thank you for your advice! Let me give this a shot and see how close it gets me to what I am looking for.
08-15-2017
05:08 AM
1 Karma
I currently have a timechart running every minute each day to show a given field value as it increases through the day. The data is being displayed as an area chart. If possible, I'd like to add an overlay to the chart that will show the "average" value each minute over a larger time period (yesterday, or last week, for instance). I already have the "historical" timechart data being saved to a summary index; I'm just wondering what the best way would be to incorporate it.
Right now, the search is relatively simple:
"my search text" earliest=@d+6h latest=@d+18h source="mylog.log" | timechart span=1m count
And I am running this same search, without the earliest and latest filters, and writing the results to a summary index. So it is just a matter of taking today's count by minute and comparing it to the summary index count by minute, to get a baseline of today vs. prior days and make it easier to see whether today is "normal".
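One untested sketch for combining them (index/source names are placeholders): key both result sets on minute-of-day and join:
"my search text" earliest=@d+6h latest=@d+18h source="mylog.log" | timechart span=1m count | eval minute=strftime(_time, "%H:%M") | join type=left minute [ search index=summary source="my_summary" | eval minute=strftime(_time, "%H:%M") | stats avg(count) as historical_avg by minute ] | fields _time count historical_avg
Charting count and historical_avg together (historical_avg as a chart overlay) then shows today against the baseline.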
08-09-2017
07:03 AM
Thanks, I made a slight tweak and it is working perfectly now!
source="mylog.log" | eval time=strftime(_time,"%I:%M %p") | stats count latest(time) as time | eval count=tostring(count, "commas") | eval today=count." "."messages processed as of"." ".time | table today
08-09-2017
06:24 AM
I've tried that, but any time I do, the table that used to display the text string returns nothing. Am I doing something in the wrong order?
source="mylog.log" | eval time=strftime(_time,"%I:%M %p") | stats count | eval count=tostring(count, "commas") | eval today=count." "."messages processed today as of"." ".time | table today
08-09-2017
05:50 AM
Hi all,
I'm currently working on a dashboard in Splunk where I am trying to take a count value and include it in a sentence to make it more presentable. As of now, I am able to get a count of events and create a variable that works great:
eval today=count." "."messages processed today."
I have this dashboard panel set to refresh every hour, so ideally I would like the text to say "xxx messages processed today as of (time most recent search completed)". I've tried creating variables for this, or using stats, but any time I include the time in my "today" variable, no results show up. Any thoughts?