Splunk Search

Search for n number of instances

johnblakley
Explorer

I'm wondering if this is possible. I have a field from our ASA formatted like the following:

5/16/13 11:26:28.000 AM cisco_asa udp:514 May 16 11:26:28 10.125.100.54 May 16 2013 11:26:28 : %ASA-6-106100: access-list INSIDE denied tcp INSIDE/x.x.x.x(59168) -> OUTSIDE/x.x.x.x(80) hit-cnt 1 first hit [0x3532d8ca, 0x0]

What I'd like to do is report on the source address only if it hits 10 times within a certain number of minutes. Is that even possible with Splunk?

Thanks!
John


johnblakley
Explorer

Thank you for your response! The only issue is that the field I want to search on isn't a recognized field. I can search for that value as text, but it only exists in _raw. These are always going to be different source addresses within the raw data. For example, if I had hosts 192.168.1.50, 192.168.1.52, and 192.168.1.53, I may have Splunk data like:

192.168.1.50 (may have 10 hits in 10 minutes)
192.168.1.52 (may have 50 hits in 10 minutes)
192.168.1.53 (may have 22 hits in 10 minutes)

The problem that I see is that rex seems to be nothing more than a regex pattern. How can I keep track of 192.168.1.50's count and 192.168.1.53's count separately when the text in the original post is seen as a single string with no fields attached?

Also, I won't be able to create a props file to dump these because we're talking about thousands of addresses that could potentially be coming through the firewall that need to be reported on.

Thanks!
John


usethedata
Path Finder

Create a field extraction to parse the data, and then you can work with the extracted fields:

ASA-6-10610[02]: access-list (?P<fw_aclname>[^ ]+) (?P<fw_action>[^ ]+) (?P<protocol>[^ ]+) [^/]+/(?P<sourceip>[^\(]+)\((?P<sourceport>[^\)]+)\) [^/]+/(?P<destip>[^\(]+)\((?P<destport>[^\)]+)\)
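
If you don't want to set up a permanent extraction right away, you can also test the same pattern inline with the rex command and count per source address in time buckets. This is only a sketch; the index name, sourcetype, and the 10-minute window are assumptions you would adjust for your environment:

index=your_index sourcetype=cisco_asa "%ASA-6-106100"
| rex "ASA-6-10610[02]: access-list (?P<fw_aclname>[^ ]+) (?P<fw_action>[^ ]+) (?P<protocol>[^ ]+) [^/]+/(?P<sourceip>[^\(]+)\((?P<sourceport>[^\)]+)\) [^/]+/(?P<destip>[^\(]+)\((?P<destport>[^\)]+)\)"
| bin _time span=10m
| stats count by _time, sourceip
| where count >= 10

Because stats splits the count by sourceip, each address (192.168.1.50, 192.168.1.53, etc.) is tracked separately even though the original event is a single raw string.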

chris
Motivator

Revised Answer:

The short answer is: yes, Splunk is designed for such tasks.

The following steps are required to reach your goal:

  • Prepare the raw data (set the sourcetype, download and install the Cisco app, and either create a firewall index or modify the app)
  • Create the alert

Don't worry, neither step is that hard.

Since you have Cisco ASA events in Splunk, there is a technology add-on that you can download (it's free):

>> Get it

The add-on contains the necessary regexes and will extract a lot of fields for you by default.
For the add-on to work, the sourcetype of your logs has to be either "syslog" or "cisco:asa". From the sample you posted in the question, it appears that you are forwarding the events over udp:514 at some point. On the Splunk instance where you configured the UDP input, you can set the sourcetype to cisco:asa if the Cisco logs are the only logs sent to that port, or to syslog if many different syslog sources send to that port. If the UDP transmission is only an intermediary step and you have an inputs.conf with a monitor stanza somewhere, set the sourcetype there instead.
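
As a rough sketch (the port, the index name, and the file location are assumptions for your environment), the UDP input stanza in inputs.conf on the receiving instance could look like this:

[udp://514]
sourcetype = cisco:asa
# optionally send these events straight to a dedicated index
index = firewall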

The technology add-on will try to send all the events to the "firewall" index if your original sourcetype is syslog. You can either create that index or delete the following line in $SPLUNK_HOME/etc/apps/TA-cisco_asa/default/props.conf:

[syslog]
TRANSFORMS-force_sourcetype = force_sourcetype_for_cisco_asa
TRANSFORMS-force_index = force_index_for_cisco_asa   <- delete or comment out this line

[cisco:asa]
LOOKUP-vendor_action = cisco_asa_actions vendor_action OUTPUT action
LOOKUP-app_type = cisco_asa_apptype sourcetype OUTPUT app
...
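
If you prefer to create the index instead, a minimal indexes.conf stanza on the indexer could look like the following (a sketch only; the paths are the usual defaults and you should adjust retention settings to your needs):

[firewall]
homePath   = $SPLUNK_DB/firewall/db
coldPath   = $SPLUNK_DB/firewall/colddb
thawedPath = $SPLUNK_DB/firewall/thaweddb

You can also create the index through Settings > Indexes in Splunk Web, which writes an equivalent stanza for you.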

Then you can create the alert. This search will give you a list of source IPs that occurred more than 10 times:

index=firewall sourcetype=cisco:asa 106100 | stats count by src_ip | where count>10    
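
Since you want the threshold to apply within a certain number of minutes, you can either schedule the alert with a matching time range (for example, run it every 10 minutes over the last 10 minutes) or bucket the events explicitly. A sketch of the second option, where the 10-minute span is an assumption:

index=firewall sourcetype=cisco:asa 106100
| bin _time span=10m
| stats count by _time, src_ip
| where count > 10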

Then when you create the alert:

[Screenshot: the "create alert" dialog]

By choosing to trigger the alert when the number of results is greater than 0, you will be notified whenever the list that the search creates contains at least one row. There are different ways of doing this, but I think this is enough to get started.
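
If you would rather configure the alert in a .conf file instead of the UI, a savedsearches.conf stanza along these lines should be roughly equivalent (the stanza name, schedule, and email address are assumptions):

[ASA denied - 10 hits in 10 minutes]
search = index=firewall sourcetype=cisco:asa 106100 | stats count by src_ip | where count>10
enableSched = 1
cron_schedule = */10 * * * *
dispatch.earliest_time = -10m
dispatch.latest_time = now
counttype = number of events
relation = greater than
quantity = 0
action.email = 1
action.email.to = you@example.com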
