Splunk Search

How to create a list to collect "well-known" processes?

raffaelecervino
Engager

Hi,

I'm working on a project: I've installed Splunk Enterprise (trial) on one server and the Universal Forwarder on three other Ubuntu servers that send me logs. On each forwarder there is a script that sends logs of every process running on the server.

I would like to create a dynamic list where process logs are added and tagged as "Well-Known Processes".
Then, when new process logs arrive at the indexer, they are compared against the list, and if a process is not recognized (it doesn't exist in the list), an alert is triggered.

I would like to do this to detect suspicious processes.

Thanks 

 
1 Solution

gcusello
SplunkTrust

Hi @raffaelecervino,

You have two choices:

  • schedule a search that lists all the "well-known" processes and stores them in a lookup to use for the following checks;
  • run a single long search each time that rebuilds the known-process list on the fly.

I prefer the first solution because it is quicker, even though it requires a little more setup work.

In a few words, you have to:

  • create a lookup called, e.g., processes.csv with at least one column called process (the same field name used in your searches);
  • schedule a search like the following, with a frequency that depends on how often you want to update your list (e.g., once a day or every hour):
index=your_index
| dedup process
| sort process
| table process
| outputlookup processes.csv append=true
  • schedule an alert like the following, set to trigger when there are results (number of results > 0); see also the alternative sketched after this list:
index=your_index NOT [ | inputlookup processes.csv | dedup process | fields process ]
| dedup process
| sort process
| table process
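
If the list of known processes grows large, the NOT subsearch can run into subsearch result limits. A possible alternative (only a sketch, assuming you also create a lookup definition named processes pointing at processes.csv under Settings > Lookups > Lookup definitions) is to use the lookup command instead of a subsearch:

index=your_index
| dedup process
| lookup processes process OUTPUT process AS known_process
| where isnull(known_process)
| table process

Here known_process is just a working field name for the match result: any process with no match in the lookup is returned as unknown and would trigger the alert.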

In this way you have a very quick search that you can also run at a high frequency, and, if you want, you can manually modify the lookup by adding or deleting processes (see the sketch below).
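
For manual maintenance, a minimal sketch (assuming the lookup keeps the single process column; old_process_name and new_process_name are just placeholders) to remove an entry:

| inputlookup processes.csv
| where process!="old_process_name"
| outputlookup processes.csv

and to add one:

| inputlookup processes.csv
| append [ | makeresults | eval process="new_process_name" | fields process ]
| dedup process
| sort process
| outputlookup processes.csv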

Ciao.

Giuseppe


raffaelecervino
Engager

Thanks,  it works perfectly!

Is there a way to avoid appending the same processes to the .csv file?

I run the search every day (for a while) to append new processes (to learn the main processes on each machine), and I would like to prevent duplicate processes in the .csv file.

Thanks a lot!


gcusello
SplunkTrust

Hi @raffaelecervino,

you could create another scheduled search that removes duplicates every day, something like this:

| inputlookup processes.csv
| dedup process
| sort process
| table process
| outputlookup processes.csv

or modify the scheduled search that populates the lookup:

index=your_index
| fields process
| append [ | inputlookup processes.csv | fields process ]
| dedup process
| sort process
| table process
| outputlookup processes.csv
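
Note that this version rewrites the whole deduplicated list, so outputlookup is used here without append=true. A slightly shorter variant (again only a sketch, with the same index and field assumptions) uses the append option of inputlookup instead of a subsearch:

index=your_index
| fields process
| inputlookup append=true processes.csv
| dedup process
| sort process
| table process
| outputlookup processes.csv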

Ciao.

Giuseppe

gcusello
SplunkTrust

Hi @raffaelecervino ,

Good for you, see you next time!

Ciao and happy splunking

Giuseppe

P.S.: Karma Points are appreciated 😉
