Splunk Search

How to create a list to collect "well-known" processes?

raffaelecervino
Engager

Hi,

I'm working on a project: I've installed a Splunk Enterprise trial on one server and the Universal Forwarder on three other servers (running Ubuntu) that send me logs. On each forwarder there is a script that sends logs for every process running on that server.

I would like to create a dynamic list to which process logs are added and tagged as "Well-Known Processes".
After that, when new process logs reach the indexer, they should be compared against the dynamic list, and if a process is not recognized (it doesn't exist in the list) an alert should be triggered.

I would like to do this to spot suspicious processes.

Thanks 

 

gcusello
SplunkTrust

Hi @raffaelecervino,

you have two choices:

  • schedule a search that lists all the "well-known" processes and stores them in a lookup to use for the following checks;
  • run one long search each time, recomputing the known-process list on every execution (a sketch of this follows the list).
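
For reference, the second choice would look something like the sketch below: the subsearch re-reads the whole history on every run, which is why it is slower (the index name and time ranges are only examples):

index=your_index earliest=-15m
    NOT [ search index=your_index earliest=-30d latest=-15m | dedup process | fields process ]
| dedup process
| sort process
| table process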

I prefer the first solution because it's quicker, even though it requires a little more setup work.

In a few words, you have to:

  • create a lookup called e.g. processes.csv with at least one column called process (the same field name you search on);
  • schedule a search like the following, with a frequency that depends on how often you want to update your list (e.g. once a day or every hour):
index=your_index
| dedup process
| sort process
| table process
| outputlookup processes.csv append=true
  • schedule an alert like the following, set to trigger when the number of results is greater than zero (results>0):
index=your_index NOT [ | inputlookup processes.csv | dedup process | fields process ]
| dedup process
| sort process
| table process

This way you have a very quick search that you can run even at a high frequency and, if you want, you can also manually modify the lookup to add or delete processes, as in the sketch below.
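
For example, a minimal sketch for removing an entry directly from the search bar (the process name here is just a placeholder):

| inputlookup processes.csv
| search NOT process="process_to_remove"
| outputlookup processes.csv

The same pattern works for adding rows, or you can edit the CSV with the Lookup Editor app if it is installed.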

Ciao.

Giuseppe

raffaelecervino
Engager

Thanks,  it works perfectly!

Is there a way to avoid appending the same processes to the .csv file?

I run the search every day (for a while) to append new processes (to learn which processes are the usual ones on each machine), and I would like to prevent duplicate processes in the .csv file.

Thanks a lot!


gcusello
SplunkTrust

Hi @raffaelecervino,

you could create another scheduled search that removes the duplicates every day, something like this:

| inputlookup processes.csv
| dedup process
| sort process
| table process
| outputlookup processes.csv

or modify the scheduled search that populates the lookup so it merges in the existing entries, deduplicates, and rewrites the whole file (note that outputlookup without append=true overwrites the lookup):

index=your_index
| fields process
| append [ | inputlookup processes.csv | fields process ]
| dedup process
| sort process
| table process
| outputlookup processes.csv
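
If you want to verify that the lookup no longer contains duplicates after either variant, a quick check (just a sketch, using the same lookup and field names) is:

| inputlookup processes.csv
| stats count by process
| where count > 1

It should return no rows when every process appears only once.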

Ciao.

Giuseppe

gcusello
SplunkTrust

Hi @raffaelecervino,

Good for you, see you next time!

Ciao and happy splunking

Giuseppe

P.S.: Karma Points are appreciated 😉
