
automating lookup?

yehias90
New Member

Hello,

First thanks a lot for your efforts, and for such a great app.

I'm wondering if I can automate IP, domain, and file hash lookups and then append the results to my index, in order to parse the additional appended fields from Splunk.

Thank you.


tomaszdziwok
Path Finder

Hi yehias90,

If I understand your intent correctly, you wish to augment your events with information brought back from VirusTotal?

You may automate the "cache building" - the creation of lookups that will contain the data relevant to your environment. This can be done on the "Set-Up" page of the TA, and is further documented here.

After enabling the "Cache Auto Update" sections, you will be able to use lookups in any of your searches to quickly reference VirusTotal data. This can be achieved easily by running a search like this:

index=email_attachments attachment_hash=*
| fields attachment_hash, from
| lookup virustotal_hash_cache vt_hashes AS attachment_hash OUTPUT vt_classification, vt_query_time

This will provide you with the "vt_classification" and "vt_query_time" for your data.

From here, you can build any scheduled searches, reports, or even alerts that you need.
If you are looking to save the results of these searches to an index for auditability, I would suggest using the "collect" command.
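As a minimal sketch of that approach (assuming a summary index named "vt_audit" has already been created - substitute your own index name), you could schedule something like:

index=email_attachments attachment_hash=*
| lookup virustotal_hash_cache vt_hashes AS attachment_hash OUTPUT vt_classification, vt_query_time
| where vt_classification>0
| collect index=vt_audit

Run on a schedule, this would append any hashes the VirusTotal cache flags as malicious to the "vt_audit" index, where the appended fields can then be searched like any other indexed event.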

Hopefully this helps answer your question.

Best Regards,
Tomasz


yehias90
New Member

Hello tomaszdziwok,

Thanks a lot for your reply,

The search query you posted will help me check the hash, but I found another way using VT with a dashboard. For the "collect" command, is there any example or tutorial I can follow, just to understand the how-to?


tomaszdziwok
Path Finder

Hi yehias90,

You can use other (similar) searches to look at other VT types. This documentation (https://gitlab.com/adarma_public_projects/splunk/TA-VirusTotal#caching-support) lists all the lookups you can use.

So for example, if I wanted to search the URLs lookup, I could use something like:

index=proxy_logs url=*
| fields url, from
| lookup virustotal_url_cache vt_urls AS url OUTPUT vt_classification, vt_query_time

Splunk's "collect" command can be used to store the results of a search in an index of your choosing. So if I ran:

index=proxy_logs url=*
| fields url, from
| lookup virustotal_url_cache vt_urls AS url OUTPUT vt_classification, vt_query_time
| where vt_classification>0
| collect index=bad_url_audit

This search would find all "suspicious" URLs that have been requested in the "proxy_logs" and then the collect command would write those events into the "bad_url_audit" index.

More examples of the collect command can be found on the official Splunk Docs: https://docs.splunk.com/Documentation/Splunk/7.3.1/SearchReference/Collect
