Getting Data In

Get version, date, and definition date from ClamAV

albinortiz
Engager

I need to get today's date, the AV definition date, and the version from ClamAV (the Linux antivirus). If you run ./clamav.sh -V you get this info. However, I want to be able to get that into Splunk. What I've been trying to do is capture this info through a script (inside a cron job) and output the data to a txt file. So far I have been successful. Here's where I am failing:

The only common directory that I can use is the "/home/security" directory. However, when I put the file in that directory, I can't read it. I checked permissions and everything is as it should be. I believe I have to do something with the source or sourcetype, but I don't really understand that.
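For reference, the collector script is roughly along these lines (a sketch only; it assumes the underlying command is clamscan, and the output path and timestamp format are illustrative, not my exact setup):

    #!/bin/bash
    # Append one line with today's date plus the ClamAV version and definition date.
    # The output path and command name below are illustrative assumptions.
    OUTFILE=/home/security/clamav_status.txt

    # "clamscan -V" prints something like:
    #   ClamAV 0.103.8/26863/Wed Mar 29 07:24:12 2023
    # i.e. version / definition-db version / definition date.
    CLAM_INFO=$(clamscan -V)

    echo "$(date '+%Y-%m-%d %H:%M:%S') ${CLAM_INFO}" >> "${OUTFILE}"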

Some help will be greatly appreciated.

1 Solution

richgalloway
SplunkTrust

Another option is to use a scripted input. Anything output by a scripted input is indexed so you don't need an intermediate file. You can probably use your existing script, without the part that writes the results to disk. The script will have to reside in an app's bin directory. Splunk can even use cron strings to schedule when the script runs.

To create the scripted input, select Settings -> Data inputs, then click "Add new" on the Scripts line. Fill in the form: select "Cron schedule" from the "Input interval" dropdown ("Run on Cron Schedule" as the Interval), fill in the Cron Expression, and click Next. Choose the sourcetype of the data and the index that will receive the events. Click Review, then Save, and you're done.
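If you'd rather configure it by hand, the equivalent stanza in the app's default/inputs.conf would look something like this (the app-relative script path, cron schedule, sourcetype, and index below are only examples):

    [script://./bin/clamav_status.sh]
    # interval accepts a number of seconds or a cron string; this runs daily at 06:00
    interval = 0 6 * * *
    sourcetype = clamav:status
    index = linux
    disabled = 0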

---
If this reply helps you, Karma would be appreciated.

albinortiz
Engager

I like this idea! However, how does this work for all computers? Splunk is going to run that script locally, right?

richgalloway
SplunkTrust

Yes, it runs locally. To run it on all computers, set up the script as an input to your Universal Forwarders (similar to Splunk_TA_nix).

---
If this reply helps you, Karma would be appreciated.

albinortiz
Engager

So I did it and I am seeing results, but only from one box. I selected the index which contains all my Linux boxes. Do I need to copy the script to every box? Or does the Forwarder run that script locally on those boxes?

Here's what I did:

Add data > Select Forwarder (selected my nix forwarders) > Selected the Script, script path, and interval > select Input settings (selected generic_single_line) > Finish

I selected single line since the output is a single line of characters.

richgalloway
SplunkTrust

The script should be copied to every box from which you want ClamAV data.

If you have more than a few forwarders, you should be using a deployment server (DS) to manage them. Create an app on your DS, put your script in the app's bin directory, and define a [script:...] stanza in the app's default/inputs.conf file. Tell the DS to ship the app to all of your forwarders and you're done.
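For example, the app on the DS might be laid out like this (names are examples only), with the same [script://...] stanza from earlier in default/inputs.conf, and a serverclass.conf entry (or the Forwarder Management UI) mapping it to your Linux forwarders:

    $SPLUNK_HOME/etc/deployment-apps/clamav_status/
        bin/clamav_status.sh
        default/inputs.conf

    # serverclass.conf on the deployment server
    [serverClass:linux_clamav]
    whitelist.0 = linux*

    [serverClass:linux_clamav:app:clamav_status]
    restartSplunkd = true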

---
If this reply helps you, Karma would be appreciated.

albinortiz
Engager

Thanks for all the information!
