Activity Feed
- Karma Re: Alert if result two following days for _d_. 06-05-2020 12:47 AM
- Posted Re: How to correlate login and logout events using the map command or a more efficient method? on Splunk Search. 01-03-2017 12:16 AM
- Posted Re: How to correlate login and logout events using the map command or a more efficient method? on Splunk Search. 01-02-2017 01:21 AM
- Posted Re: How to correlate login and logout events using the map command or a more efficient method? on Splunk Search. 12-20-2016 10:26 PM
- Posted How to correlate login and logout events using the map command or a more efficient method? on Splunk Search. 12-20-2016 05:47 AM
- Tagged How to correlate login and logout events using the map command or a more efficient method? on Splunk Search. 12-20-2016 05:47 AM
- Posted Re: Using the transaction command to group events being logged to two indexes, how do I only include events that are found in both indexes? on Splunk Search. 01-25-2016 02:28 AM
- Posted Re: Why are field values lost using the table command after the transaction command? on Splunk Search. 01-21-2016 10:57 PM
- Posted Using the transaction command to group events being logged to two indexes, how do I only include events that are found in both indexes? on Splunk Search. 01-21-2016 12:37 AM
- Tagged Using the transaction command to group events being logged to two indexes, how do I only include events that are found in both indexes? on Splunk Search. 01-21-2016 12:37 AM
- Posted Why are field values lost using the table command after the transaction command? on Splunk Search. 01-21-2016 12:04 AM
- Tagged Why are field values lost using the table command after the transaction command? on Splunk Search. 01-21-2016 12:04 AM
01-03-2017
12:16 AM
The search now works, and by raising maxsearches I get all the results.
However, the search performs very badly (as expected): automatic logout occurs very often, so the mapped search runs many times.
Is there any other way? Piping to search, using a subsearch, or maybe using the stats command?
I have tried a subsearch, but I can't make changing the time of the parent search work, and I guess a subsearch would be slow as well.
Thanks for any assistance!
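As a hedged sketch of the stats idea (untested against this data; the index, field names, and the 5-second window are taken from the searches in this thread): both event types could be pulled in one base search and correlated with stats, avoiding map entirely:

```spl
index=klpiprod ("Automatic logout logged" OR (CustomTai-2.0.4 LoginDetails))
| rex "user-ID=\[(?<nin>[0-9]{11})\]"
| rex "nationalIdentificationNumber=(?<loginnin>[0-9]{11})"
| eval nin=coalesce(nin, loginnin)
| eval type=if(searchmatch("Automatic logout logged"), "logout", "login")
| stats max(eval(if(type="login", _time, null()))) as logintime,
        max(eval(if(type="logout", _time, null()))) as logouttime by nin
| where isnotnull(logintime) AND isnotnull(logouttime) AND logouttime-logintime<=5
```

Since stats runs one pass over the events instead of one search per result, it should scale much better than map when logouts are frequent.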
01-02-2017
01:21 AM
I found out what was wrong with the search: a misplaced " character.
Here is a working search, if anyone is interested:
index=klpiprod Automatic logout logged
| rex "user-ID=\[(?<nin>[0-9]{11})\]" | dedup nin
| eval starttime=strftime(_time-5,"%m/%d/%Y:%H:%M:%S")
| eval endtime=strftime(_time,"%m/%d/%Y:%H:%M:%S")
| map search="search index=klpiprod CustomTai-2.0.4 LoginDetails nationalIdentificationNumber=$nin$ earliest=$starttime$ latest=$endtime$"
12-20-2016
10:26 PM
Thanks for your response!
I'm using map because I want the time and the nin from the first search to be used in the second search, with additional search parameters and a slightly shifted time window.
I extract the nin to use in the mapped search, where the field nationalIdentificationNumber is present.
First search: detect the logout; extract nin and time.
Mapped search: check if there was a login for the same user within a few seconds earlier.
The where clause will not work, since the nationalIdentificationNumber field is not present in the events returned by the first search.
12-20-2016
05:47 AM
Hi there!
I am trying to achieve the following:
Detect users that are unwillingly logged out of my web site. If the logout occurs two seconds or less after the login, I want this event to be returned.
I have tried the following command:
index=klpi OR index=klpiprod Automatic logout logged | rex("user-ID=\[(?<nin>[0-9]{11})\]") | eval starttime=strftime(_time-2,"%m/%d/%Y:%H:%M:%S") | eval endtime=strftime(_time + 1,"%m/%d/%Y:%H:%M:%S") | map search="search index=klpi OR index=klpiprod *login* CustomTai-2.0.4 LoginDetails nationalIdentificationNumber" nationalIdentificationNumber=$nin$ maxsearches=100
The search returns events that contain CustomTai, but it does not scope to only those with nationalIdentificationNumber=nin.
Any ideas what I'm doing wrong? Is there a faster way to achieve this? The map command is slow and ultimately I want to get some statistics over time.
Thanks for any assistance!
Tor Erik
01-25-2016
02:28 AM
Hi there!
I made it work using the coalesce command. That was actually causing the whole problem.
Using where isnotnull(...) strips away the unwanted transactions.
I would like to use stats, but would that work when not all the fields are present in all the events?
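If it helps, here is a minimal sketch of the stats idea (untested; field names taken from the searches in this thread): the values() function simply skips events where a field is absent, so fields that appear in only one index still come through:

```spl
index=* | rename correlationId as CID
| stats values(uri) as uri, values(serviceName) as serviceName,
        values(operationName) as operationName, values(host) as host by CID
| where isnotnull(uri)
```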
01-21-2016
10:57 PM
Thanks for the reply. The path field will not be present in all events.
The web tier logs the path, and the middleware logs the other fields.
Will I then be able to display the whole "transaction" in one row using stats?
01-21-2016
12:37 AM
Hi all!
I am using the transaction command to group events that are logged to two indexes; I have a common identifier.
However, I only want to include transactions that contain events from both indexes (events being initiated from the web). How can I achieve that?
I tried using the join command, but its output just confuses me.
I tried using where isnotnull(uri), because I know uri will always be set when the transaction consists of events from both indexes, but no results were returned.
index=* | rename correlationId as CID | transaction CID | where isnotnull(uri) | fields * | table uri, serviceName,operationName,host
Thanks for any assistance!
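One possible stats-based sketch for the both-indexes requirement (untested; assumes the events come from exactly two distinct indexes): count distinct index values per CID and keep only groups that span both:

```spl
index=* | rename correlationId as CID
| stats dc(index) as indexcount, values(uri) as uri, values(serviceName) as serviceName,
        values(operationName) as operationName, values(host) as host by CID
| where indexcount=2
| table uri, serviceName, operationName, host
```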
01-21-2016
12:04 AM
Hi all!
I am using the transaction command to group events based on an identifier occurring in separate indexes. It works nicely: the events show up with all the information, and in the field sidebar on the left of the Splunk window, all expected fields from both indexes have values.
However, when I try to display some of the fields using the table command, only the fields from one of the indexes have values. The others are empty.
Piping the output of the transaction command through the fields command fixes the problem, but why is this happening?
Doesn't work (path is blank):
index=* | rename correlationId as CID | transaction CID | table host, path
Works (Both fields have values):
index=* | rename correlationId as CID | transaction CID | fields * | table host,path
Thanks for any assistance!
11-18-2015
10:22 AM
Thanks for the answer. I think that by doing rename sessionIndex as search, I do not search for sessionIndex= in the outer search; I search for the values of sessionIndex from the inner search. These values are used in the outer search.
The source type in the outer search does not contain sessionIndex=, which is why I do the rename ... as search.
11-17-2015
10:45 PM
The normalized search only contains this:
litsearch NOT () | fields keepcolorder=t "*" "_bkt" "_cd" "_si" "host" "index" "linecount" "source" "sourcetype" "splunk_server" | remotetl nb=300 et=1447827180.000000 lt=1447829003.000000 remove=true max_count=1000 max_prefetch=100
11-17-2015
02:35 AM
Hi All!
I am trying to use the subsearch functionality to find a token which should be used in the main search. Pretty basic, I guess.
However, I don't think I quite understand the fundamentals. What happens with the time in the main search? What happens with the other properties? Do they influence the outer search?
If I don't return any values from the subsearch, the search results seem to be limited by the time of the hits in the subsearch. That might be expected behaviour. Search string:
index=klpi [search index=klpi 2345678910]
This returns many results over a period of time.
This search, however, returns only a few results, and only the ones that contain sessionIndex=:
index=klpi [search index=klpi sessionIndex=2345678910 ]
The subsearch seems to influence the main search, but what correlation is actually performed?
Here is the actual search I've been trying. I have tried both with and without the time modification.
index=klpi sourcetype="websphere:system:out" [search index=klpi sessionIndex sourcetype="fedag:debug" | rex("sessionIndex=(?<sessionIndex>.+)") | eval earliest=_time-1000 | eval latest=_time+1000 | format "(" "(" " " ")" "OR" ")" | rename sessionIndex as search]
Although I know there are hits in the source type of the outer search in the period given by the subsearch, I receive no results.
Thanks for any assistance, both with the basic understanding and with the last search.
Regards
Tor Erik
06-28-2015
11:02 PM
Actually, I wanted to narrow it to 1 second, but started with 60 to be sure not to miss any while adjusting the search.
06-14-2015
11:45 PM
That didn't work either. What I actually got working was this:
index=myindex NullPointerException "history-service" | eval starttime=strftime(_time-1,"%m/%d/%Y:%H:%M:%S") | eval endtime=strftime(_time + 1,"%m/%d/%Y:%H:%M:%S") | map search="search index=myindex history-service earliest=$starttime$ latest=$endtime$" | where isnotnull(UID) | dedup UID | table UID
Don't earliest and latest accept epoch time?
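For what it's worth, a variant of the same search using epoch times directly (untested): earliest and latest also accept epoch seconds, so the strftime formatting step could be dropped:

```spl
index=myindex NullPointerException "history-service"
| eval starttime=_time-1, endtime=_time+1
| map search="search index=myindex history-service earliest=$starttime$ latest=$endtime$"
| where isnotnull(UID) | dedup UID | table UID
```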
06-12-2015
03:07 AM
Still no results returned. By the way, I tried both the first and the second search separately (setting the time manually), and they both worked.
My search string:
index=klpi NullPointerException history-service | head 1 | map search="search index=klpi history-service earliest=$_time$-60 latest=$_time$+60"
06-11-2015
11:30 PM
Hi!
I have log statements containing error messages, but they lack context information (i.e. user ID). I want to use the event time from a search for the error to limit a search for the log statements containing the context information.
I am trying to perform a subsearch and return the time interval from it to be used in the parent search.
I have tried many different approaches suggested in these forums, but I can't get any of them to work as expected.
My time preset in the date picker is last 24 hours, so the subsearch is supposed to search in that range.
*This search doesn't limit the time in the parent search; results for all 24 hours:*
index=myindex value-to-search-for [search index=myindex "NullPointerException myapplication" | head 1 | eval earliest = _time - 60 | eval latest = _time + 60 | return earliest latest]
*This search doesn't return any values:*
index=myindex value-to-search-for [search index=myindex "NullPointerException myapplication" | head 1 | eval earliest = _time - 60 | eval latest = _time + 60 | fields earliest latest]
*Still no values:*
index=myindex value-to-search-for earliest=myearliest latest=mylatest [search index=myindex "NullPointerException myapplication" | head 1 | eval myearliest = _time - 60 | eval mylatest = _time + 60 | fields myearliest myearliest]
Giving new names. No result
index=myindex value-to-search-for earliest=myearliest latest=mylatest [search index=myindex "NullPointerException myapplication" | head 1 | eval myearliest = _time - 60 | eval mylatest = _time + 60 | fields myearliest myearliest]
Using return for new value. Gives invalid time
index=myindex value-to-search-for earliest=myearliest latest=mylatest [search index=myindex "NullPointerException myapplication" | head 1 | eval myearliest = _time - 60 | eval mylatest = _time + 60 | return myearliest myearliest]
Any ideas what I'm doing wrong?
Thanks for any assistance!
01-22-2015
11:16 PM
Maybe I'm a bit slow, or just new to Splunk semantics, but I don't understand what you mean. Out of the first stats come two numbers, yesterday and today. How would a new pipe to stats change that? Could you please modify the search with what you mean? By the way, many thanks for the assistance!
01-22-2015
09:47 AM
A bit easier than my attempt :) However, I want to trigger the alert only if there are hits both yesterday and today. How can I achieve that with this search?
01-22-2015
04:47 AM
Hi!
I want to create an alert that triggers if the number of results for a search is greater than 0 on two consecutive days.
I have tried using eval with two subsearches, but I can't make it work. Any ideas?
My search, which doesn't include a check on the number of values and returns the same value for hitsyesterday and hitstoday:
sourcetype="websphere:systemout" | eval hitcountyesterday=[search "Problem occured while storing credit card application" earliest="-48h" latest="-24h" | stats count As hitcountyesterday| rename hitcountyesterday as query] | eval hitcounttoday=[search "Problem occured while storing credit card application" earliest="-24h" | stats count As hitcounttoday| rename hitcounttoday as query]
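As a hedged alternative sketch without subsearches (untested; assumes day boundaries at midnight are acceptable, and keeps the message string exactly as logged): count hits per day over the last two whole days, keep only days with hits, and let the alert fire when the final result is non-empty:

```spl
sourcetype="websphere:systemout" "Problem occured while storing credit card application" earliest=-2d@d latest=@d
| bin _time span=1d
| stats count by _time
| where count>0
| stats count as days_with_hits
| where days_with_hits=2
```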
- Tags:
- alert
- eval
- subsearches