Hi,
I want to import these feeds into Splunk and compare the domains against the domains in my firewall logs. Importutil is not working for me, and Getwatchlist only gives me 1 or 2 columns; I want all of the data in Splunk. Is there another way? I also tried Add Data -> Monitor Input -> Web-pages, but that did not work for me.
Thanks in advance for any help!
There is no need to get them indexed, but this command gets them in:
| getwatchlist https://ransomwaretracker.abuse.ch/feeds/csv/
| rename ip_address AS _raw
| rex "(?<Firstseen>[^\,]*)\,(?<Threat>[^\,]*)\,(?<Malware>[^\,]*)\,(?<Host>[^\,]*)\,(?<URL>[^\,]*)\,(?<Status>[^\,]*)\,(?<Registrar>[^\,]*)\,(?<IPaddresses>[^\,]*)\,(?<ASNs>[^\,]*)\,(?<Country>[^\,]*)"
| eval _time = strptime(Firstseen, "%Y-%m-%d %H:%M:%S")
If you really need them indexed, then create a summary index and use the collect command by adding this:
| collect index=myNotNecessarilyNecessarySummaryIndexNameHere
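If you go that route, note that the target index has to exist already (Settings -> Indexes, or indexes.conf); collect will not create it for you. A quick, rough check that events actually landed, using the placeholder index name from above:

index=myNotNecessarilyNecessarySummaryIndexNameHere
| head 10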
Thanks a lot, this is very cool - I did not know how to get all of the columns into Splunk using Getwatchlist. Now the imported feed looks like this:
Host IPaddresses Malware
"pmenboeqhyrpvomq.xx6jck.top" "202.7.59.40|202.7.59.40|5.1.75.177" "Cerber"
Now I want to alert in real time when one of my users hits a Host listed in the feed.
Here are the fields in my firewall log:
src_ip, dest_ip, url
1.52.11.11, 202.7.59.40,"pmenboeqhyrpvomq.xx6jck.top/favicon.ico"
The alert should include:
firewall.src_ip, firewall.dest_ip, firewall.url, feed.Host, feed.IPaddresses, feed.Malware
I was under the impression that I would probably have to index the data to combine information from both sources. Please let me know if there is another way to do it.
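You don't necessarily have to index the feed for this. Here is a rough, lookup-based sketch; the lookup file name ransomware_feed.csv is purely a placeholder, and it assumes your firewall data is in sourcetype pan:threat with a url field holding host/path and no scheme, as in your sample. First, append this to the parsing search from the earlier answer to save the columns you care about:

| table Host IPaddresses Malware
| outputlookup ransomware_feed.csv

Then correlate the firewall events against the lookup on the bare domain (everything before the first slash):

sourcetype=pan:threat
| eval domain = mvindex(split(url, "/"), 0)
| join type=inner domain
    [ | inputlookup ransomware_feed.csv
      | eval domain = Host
      | fields domain Host IPaddresses Malware ]
| table src_ip dest_ip url Host IPaddresses Malware

Saved as an alert that triggers when the number of results is greater than zero, this gives you the field list you want; a true real-time alert is possible too, but a frequently scheduled search is usually cheaper.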
Hi,
I need your help - my indexing idea did not quite work out, so I am doing lookups like you suggested.
| getwatchlist https://ransomwaretracker.abuse.ch/feeds/csv/ delimiter="," relevantFieldName=url relevantFieldCol=5 referenceCol=3 dateCol=1 categoryCol=6 ignoreFirstLine=true isbad=true | outputlookup ransomwaretracker.csv
The lookup now contains rows like this:
url = http://dillerator.chat.ru/09yhbvt4
category = online
reference = Locky
date = 2016-07-22 10:53:03
isbad = true
The trouble I am having is finding the firewall logs that match the url in the lookup. I ran the malware on my test VM, and I have the following log on the firewall:
x.x.x.x,195.161.119.85,0.0.0.0,0.0.0.0,URL_Global_Unknown_Continue-EXE,,,web-browsing,vsys1,Trust,Untrust,ethernet1/21,ethernet1/22,Panorama-Logging,2016/07/22 09:58:13,67140998,1,49296,80,0,0,0x8000,tcp,alert,"dillerator.chat.ru/09yhbvt4", -
The url in my firewall logs doesn't have the "http://" prefix in front, so how do I join or find the matches? The field name is url in both places. Here is what I tried:
sourcetype=pan:threat [ | inputlookup ransomwaretracker.csv | fields url]
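That subsearch expands into literal url="..." terms, so it only matches when the strings are identical. One rough way to tolerate the missing scheme and the differing path is to cut the lookup entries down to the bare domain inside the subsearch and match with a trailing wildcard - just a sketch:

sourcetype=pan:threat
    [ | inputlookup ransomwaretracker.csv
      | eval url = replace(url, "^https?://", "")
      | eval url = mvindex(split(url, "/"), 0) . "*"
      | fields url
      | format ]

Keep in mind that a large feed can bump into the subsearch result limit; in that case, extracting a domain field on the firewall side and comparing it against a domain column in the lookup scales better.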
You need to save it as a lookup and then use that lookup. Here is a blog post about how this all works:
Hi,
I actually ended up indexing it for ease of data extraction. I am still having trouble matching on the domain, since only part of the domain matches.
My firewall logs have the url as zuerich-gewerbe.ch/favicon.ico, while the ransomware index has the domain as http://zuerich-gewerbe.ch/mbv58gbv. I tried writing a join on these two fields but haven't had any success so far, since only "zuerich-gewerbe.ch" matches.
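Since only the registered domain matches, one option might be to normalize both sides down to that bare domain before joining. A rough sketch, assuming the feed sits in an index here called ransomware (just a placeholder) with the feed URL in a field called url - adjust both names to whatever your extraction actually produces:

sourcetype=pan:threat
| eval domain = mvindex(split(url, "/"), 0)
| join type=inner domain
    [ search index=ransomware
      | eval domain = mvindex(split(replace(url, "^https?://", ""), "/"), 0)
      | rename url AS feed_url
      | fields domain feed_url ]
| table src_ip dest_ip url feed_url domain

If join runs into its subsearch limits, the same normalization also works with a lookup (inputlookup/outputlookup), or by searching both sources in one query and grouping with stats by domain.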
You can schedule a search to update the lookup every week or every night or whatever you like.
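For example, here is a minimal savedsearches.conf sketch - the stanza name, cron schedule, and lookup file name are all placeholders - that re-runs the getwatchlist search every night at 02:30 and overwrites the lookup:

# savedsearches.conf - nightly refresh of the ransomware tracker lookup
[Update ransomwaretracker lookup]
enableSched = 1
cron_schedule = 30 2 * * *
dispatch.earliest_time = -1h
dispatch.latest_time = now
search = | getwatchlist https://ransomwaretracker.abuse.ch/feeds/csv/ delimiter="," relevantFieldName=url relevantFieldCol=5 referenceCol=3 dateCol=1 categoryCol=6 ignoreFirstLine=true isbad=true | outputlookup ransomwaretracker.csv

The same thing can be set up in Splunk Web by saving the search as a report and enabling a schedule on it.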
You did not list any feeds! In any case, the right way to do it is with Getwatchlist, then modify the data once it is in to fit what you need. Can you explain more about how that app is not working for you?
Oops, my bad - this is the feed that I wanted to send to Splunk: https://ransomwaretracker.abuse.ch/feeds/csv/