Hi,
What I understood from tags is that you can tag a field value. For example, I can tag clientip=1.1.1.1 as suspect. But what if I have dozens of addresses coming from a search?
Suppose I'm analysing an incident on an Apache server. I know that the attacker is from a country C, and he may have accessed the admin panel.
So my first search would be: host=apache | iplocation clientip | search Country=C
And my second search: host=apache uri="/admin/"
Now I want to tag all IP addresses that appear in the results of both search 1 and search 2 as suspects, then analyse each IP address thoroughly.
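One sketch of that intersection (untested against real data) is to use the admin-panel search as a subsearch: Splunk rewrites the subsearch result into an implicit clientip=... OR clientip=... filter on the outer search, so only addresses present in both result sets remain:

```
host=apache
| iplocation clientip
| search Country=C
    [ search host=apache uri="/admin/*" | dedup clientip | fields clientip ]
| dedup clientip
| table clientip
```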
My second question: is it possible to make an eval field permanent? For example, I define a success-rate field for an IP address with eval, as the count of successful status returns (200) divided by the count of requests.
Thank you
What I need is to mark IP addresses or events in a first search so I can use them later. I gave two examples of filters:
Further in my analysis I will use these two IP addresses.
I have two options: tag clientip, or create an eventtype.
The first one seems more logical, because I'm marking only one field: clientip. But the limitation here is that I can't tag them all automatically; I would have to tag each IP address manually, and I have a couple hundred of them coming from country X and another couple hundred that tried to access my administration page.
So the second one should work if I use a subsearch. Again, there is a limitation: an eventtype search string cannot be a search pipeline or contain a subsearch. This limitation may exist because of theoretical or practical problems that would occur otherwise.
While I was looking for an answer, I found this question about searching for some specific bad domains:
http://answers.splunk.com/answers/2457/inputlookup-against-a-list-of-bad-domains
The difference between that case and mine is that I have to build my bad-guys IP list from my data in Splunk, not from another source. But why not export my list of suspects?
So I export the first and second lists to two CSV files:
host=apache | iplocation clientip | search Country=X | fields clientip | dedup clientip | outputlookup suspectIP1.csv
host=apache uri="/admin/*" | dedup clientip | fields clientip | outputlookup suspectIP2.csv
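The two files could also be combined into a single suspects list (a sketch; suspectIP.csv is a hypothetical merged file name), keeping only the addresses present in both:

```
| inputlookup suspectIP1.csv
| join type=inner clientip
    [| inputlookup suspectIP2.csv ]
| outputlookup suspectIP.csv
```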
Then I used both lists, and of course I can call these two lists any time later:
host=apache [|inputlookup suspectIP2.csv | fields clientip] [|inputlookup suspectIP1.csv| fields clientip]
Now I have only 4 IP addresses instead of hundreds. Honestly, I wanted something less complicated, like:
host=apache tag=suspectIP1 tag=suspectIP2
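Something close to that tag-style filter can be sketched with the lookup command, assuming the two CSVs above are available as lookup table files: each lookup adds a marker field, and only events matching both lists survive:

```
host=apache
| lookup suspectIP1.csv clientip OUTPUT clientip AS inList1
| lookup suspectIP2.csv clientip OUTPUT clientip AS inList2
| where isnotnull(inList1) AND isnotnull(inList2)
```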
As for the second question, this is the answer I found:
host=apache | eval OK=if(status==200,1.0,0.0) | stats sum(OK) as sum, count by clientip | eval avg=(1-sum/count) | fields clientip avg | sort - avg | head 20
But I don't know how to add a new column or save it permanently (other than exporting it in CSV format).
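One common way to "save" such a computed column (a sketch; ip_success_rate.csv is a hypothetical file name) is to write the per-IP statistics to a lookup and then enrich later searches from it, optionally refreshing the file with a scheduled saved search:

```
host=apache
| eval OK=if(status==200,1.0,0.0)
| stats sum(OK) as ok_count, count by clientip
| eval success_rate=ok_count/count
| fields clientip success_rate
| outputlookup ip_success_rate.csv
```

Later searches can then pull the field back in with ... | lookup ip_success_rate.csv clientip OUTPUT success_rate.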
Hi jack_howard,
regarding the first question, you can create an event type and specify a tag for it. Regarding the second question, you can set up a field extraction to have the fields created either at search time or at index time.
hope this helps ...
cheers, MuS
Yes, but this way all Apache logs will be tagged as "TheBadGuys".
Try building an event type named apache like this to start, and see if this is what you want/need/expect:
host=apache
and set a tag like 'TheBadGuys'. If you now search for
eventtype=apache | iplocation clientip | search Country=C
the results will have a tag called tag::eventtype=TheBadGuys
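In configuration terms (a sketch; the stanza names just mirror the example above), the event type lives in eventtypes.conf and the tag in tags.conf:

```
# eventtypes.conf
[apache]
search = host=apache

# tags.conf
[eventtype=apache]
TheBadGuys = enabled
```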
Also, is it possible to automatically tag all the clientip results from this search (not individually)?
Thank you for your reply regarding the first question. I've tried to create an eventtype, but it was not possible:
In handler 'eventtypes': Error while parsing eventtype search: host=apache eventtype=admin | iplocation clientip | search Country=C. Message: Eventtype search string cannot be a search pipeline or contain a subsearch