Splunk Enterprise Security

In Splunk Enterprise Security, how do you go about keeping historical data for a cron-scheduled alert that alerts only on new information?

chandlercr
New Member

I am pulling information from a search whose results I need to keep, while continuing to add to them over time.

For example, my search is finding machines that contain a certain file path (via execution logs at this point)

sourcetype=security_source FilePath="whatever_goes_here" | table HostName | dedup HostName | sort HostName

It's a very simple search, but it gets me a list of machines that have had executions along that path, which is what I need. I want to set up an alert that will let me know when new machines enter the bunch.

So if I had machines:

D1234
D1414
L1312

Those would show up on, let's say, Search 18. When Search 19 runs, a new machine (L8564) has had an execution along that path, so it gets added to the list. I only want to alert on the new machine being added to the table (L8564).

Here are a few caveats:

  • I cannot run this as a real-time search, as that would bog down our system. But if I run it as a scheduled (cron) search, it will alert on every machine in the list each time it runs. Instead, I want it to recognize, "Oh, this machine hasn't been seen here before," and only then trigger an alert.
  • I am using the dedup command to get, essentially, a values() of the HostNames, so that each host becomes its own row that I can alert on.
  • Another reason I am using dedup is that we have many different sorts of executions happening along that path, so we can have about 10 results in the table but about 4,000 results in the events. (I also cannot key on a specific file, since there isn't a common one that gets executed across the board, and the file's name can change.)

TLDR: Is there any way to run this search on a cron schedule, keep the historical data (all of the original table entries), and alert only on new entries? (Meaning: we had 6 machines before this search ran and 7 this time, so alert on the new entry.)
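(What's being asked for here is essentially a persisted set difference: each scheduled run compares its results against everything seen before and alerts only on the remainder. A minimal Python sketch of that logic, outside Splunk; the file name and the print-based "alert" are illustrative placeholders:)

```python
# Sketch of the desired cron-job behavior: keep a persistent set of
# previously seen hosts and alert only on hosts not yet in it.
# The seen_file name and the alert action are illustrative placeholders.

def check_for_new_hosts(current_hosts, seen_file="seen_hosts.txt"):
    """Return the hosts not seen on any previous run, and persist them."""
    try:
        with open(seen_file) as f:
            seen = set(line.strip() for line in f if line.strip())
    except FileNotFoundError:
        seen = set()  # first run: nothing seen yet

    new_hosts = sorted(set(current_hosts) - seen)
    if new_hosts:
        # In practice this would be the alert action (email, webhook, ...).
        print("ALERT: new hosts:", new_hosts)
        with open(seen_file, "a") as f:
            for host in new_hosts:
                f.write(host + "\n")
    return new_hosts
```

(With the example above: run 18 records D1234, D1414, and L1312; run 19 returns only L8564.)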


woodcock
Esteemed Legend

The general approach is to use a lookup file, like this:

sourcetype=security_source FilePath="whatever_goes_here"
| dedup HostName
| fields HostName
| inputlookup append=t MyHistoricalLookup.csv
| dedup HostName
| sort HostName
| outputlookup MyHistoricalLookup.csv
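(The pattern in that search is: append the lookup's stored hosts to this run's results, dedup, and write the merged list back. A small Python sketch of the same accumulate-and-rewrite cycle; the file name and plain-text format are illustrative, not how Splunk stores lookups:)

```python
# Sketch of the append + dedup + write-back pattern used above:
# merge this run's hosts into the stored list and rewrite the file.
# lookup_file is an illustrative stand-in for MyHistoricalLookup.csv.

def update_historical_list(current_hosts, lookup_file="MyHistoricalLookup.txt"):
    """Merge this run's hosts into the stored list, dedup, sort, write back."""
    try:
        with open(lookup_file) as f:
            stored = [line.strip() for line in f if line.strip()]
    except FileNotFoundError:
        stored = []
    merged = sorted(set(stored) | set(current_hosts))  # append + dedup + sort
    with open(lookup_file, "w") as f:
        f.write("\n".join(merged) + "\n")
    return merged
```

(Note this keeps the history but does not by itself restrict the alert to new entries; that still needs a comparison step, as in the other replies.)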

maraman_splunk
Splunk Employee

Hi,

First possible solution:
You could use a lookup with something like this:
index=xxxx sourcetype=security_source FilePath="whatever_goes_here" | fields HostName | dedup HostName | lookup hostname-seen-lookup.csv HostName OUTPUT HostName as HostName2 | search NOT HostName2=* | fields HostName | outputlookup append=t createinapp=t hostname-seen-lookup.csv

Note: I replaced table with fields, as table is a formatting command that should only appear at the end of a search.

Second possible solution:
index=xxx sourcetype=security_source FilePath="whatever_goes_here" | stats earliest(_time) as _time by HostName
then use the built-in throttling functionality in ES correlation searches, keyed on the HostName field, with a very long suppression duration (like years).
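(Per-field throttling amounts to suppressing repeat alerts for a given field value until a time window has elapsed; with a very long window, each host effectively alerts only the first time it is seen. A Python sketch of that suppression logic; the function and parameter names are illustrative, not Splunk APIs:)

```python
import time

# Sketch of per-field alert throttling: alert for a given host at most
# once per `window` seconds. A very large window effectively means
# "alert only the first time this host is ever seen".
# Names here are illustrative, not Splunk APIs.

def should_alert(host, last_alerted, window=10 * 365 * 24 * 3600, now=None):
    """Return True (and record the time) if this host's last alert is outside the window."""
    now = time.time() if now is None else now
    last = last_alerted.get(host)
    if last is None or now - last >= window:
        last_alerted[host] = now
        return True
    return False
```

(Each run would call should_alert once per HostName in the results; only first-seen or long-quiet hosts fire.)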

Note: try to use CIM fields when possible (host, dvc, src, dest, ...); then you could optimize further by leveraging accelerated data.
