Splunk Search

Is there a way to make use of a list of IP addresses for servers which have sensitive information in a Splunk search?

wrangler2x
Motivator

Our campus is putting together a database of systems with sensitive or restricted information on them. I'd like to export the IP addresses from this system in a list format (just IP addresses) and find a way of using these in a search of indexed data.

For example, search Palo Alto logs in Splunk and have the search come back with results if the log entry has severity=high and the IP address also exists in this database.

So the question is, how to do that, both in terms of where the information goes on the system, and in terms of using it in a search.

My initial thought was to export it as a list and then somehow use the lookup command, but from the docs it looks like lookup tables are expected to be CSV rows.

Any good ideas about how to do this?

elliotproebstel
Champion

Step One: Create and upload lookup file
For your purposes, you can turn that list into a CSV by ensuring you have one IP address per line and adding a header row at the top of the file containing the name you'd like to give that field - something like ip, probably. Save the file with a .csv extension, and then use the Splunk UI to add it as a Lookup table file: Settings > Lookups > Lookup table files > Add new. Upload the file into the app in which you plan to use it, and remember the name you give it (the name should end in .csv). Let's say you named it campus_ips.csv so I can reference it later.
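To make that concrete, the finished campus_ips.csv would look something like this (the addresses here are made up):

```csv
ip
10.10.1.15
10.10.2.101
192.168.40.7
```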

Step Two: Use lookup in search
If you want to use the list of IP addresses as a search filter across your Palo Alto logs and retain only events from those IPs whose severity=high, then this should work:

index="something palo alto" sourcetype="something palo alto" severity=high
[| inputlookup campus_ips.csv 
 | fields ip
 | format ]
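When this runs, the sub-search executes first, and the format command rewrites its results into a filter that is appended to the outer search - conceptually something like this (hypothetical addresses):

```
( ip=10.10.1.15 OR ip=10.10.2.101 OR ip=192.168.40.7 )
```

Each event therefore needs a field named ip for the filter to match anything.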

wrangler2x
Motivator

I was going to say this does not work, but then I realized that the filter coming back from the sub-search is going to be ip=xxx.xxx.xxx.y OR ip=xxx.xxx.xxx.z, so if the events have no field called ip, it fails. I added a rename to the sub-search to rename ip as dest_ip, and that did work.
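For reference, the sub-search with that rename added looks like this (assuming, as in my case, the Palo Alto events carry the address in a dest_ip field):

```
index="something palo alto" sourcetype="something palo alto" severity=high
    [| inputlookup campus_ips.csv
     | rename ip AS dest_ip
     | fields dest_ip
     | format ]
```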

So, here is the next thing... can I update the .csv file without restarting Splunk and have the change reflected in searches, or am I going to have to schedule a restart or reload every time I need to update the file?

damiensurat
Contributor

You do not have to restart Splunk when you upload a lookup. I’m curious to understand why you are restarting Splunk.

wrangler2x
Motivator

I did not restart Splunk when I created the .csv table. I used Settings > Lookups > Lookup table files > Add new as you suggested. But I didn't know what was involved in updating it. It must be reloaded each time it is accessed, then, so I can just rewrite the file. That's why I asked.

On this comment about pulling back information by leveraging Python in Splunk, is there any documentation on this, or an existing example somewhere?

elliotproebstel
Champion

The most common ways people incorporate python into Splunk are by writing custom search commands or external lookups. Do you have a use case? If you describe it, I'm happy to help you figure out the best way to approach it.

wrangler2x
Motivator

The current loose plan is to have the person who wrote the software that inputs to and displays from the database also write a utility that runs on that system any time an IP address is added, changed, or deleted, and exports the IPs to a file.

From this point, we could have that utility scp the file to a drop-off point (say /tmp) on the search head. Then we need something on the search head (perhaps a cron job running as the splunk user) to update the .csv lookup file in /opt/splunk/etc/search/lookups.
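As a sketch of that cron step, a small shell helper could validate the dropped-off file and then swap it into place atomically, so a search never reads a half-written lookup. All paths and names here are assumptions to adjust for your site (the default lookups directory for the search app is usually $SPLUNK_HOME/etc/apps/search/lookups), and no Splunk restart is needed since the lookup file is read fresh at search time:

```shell
# Hypothetical update helper for the search head - a sketch, not a drop-in.
# Paths and names (drop-off point, app, lookup name) are assumptions.

install_lookup() {
    src="$1"   # file dropped off by the remote export utility (e.g. via scp)
    dst="$2"   # live lookup file under the app's lookups directory

    # Refuse an empty file or one missing the expected header row.
    [ -s "$src" ] || { echo "refused: $src missing or empty" >&2; return 1; }
    [ "$(head -n 1 "$src")" = "ip" ] || { echo "refused: $src lacks 'ip' header" >&2; return 1; }

    # Copy next to the destination, then rename: mv within one filesystem
    # is atomic, so a running search never sees a partially written file.
    tmp="$dst.tmp.$$"
    cp "$src" "$tmp" && mv "$tmp" "$dst"
}

# Example cron entry (every 10 minutes, as the splunk user):
#   */10 * * * * /opt/splunk/bin/update_campus_ips.sh
# where the script calls:
#   install_lookup /tmp/campus_ips.csv /opt/splunk/etc/apps/search/lookups/campus_ips.csv
```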

starcher
SplunkTrust

If the lookup has more than a couple hundred entries, do not use the sub-search method; use the lookup pattern instead. The same applies if you are doing a NOT exclusion.

index="something palo alto" sourcetype="something palo alto" severity=high | lookup campus_ips.csv ip AS dest_ip OUTPUTNEW dest_ip AS isFound | where isnotnull(isFound)

wrangler2x
Motivator

This works dandy if I change 'OUTPUTNEW dest_ip AS isFound' to 'OUTPUTNEW ip AS isFound'.

Using dest_ip there I get this:

Error in 'lookup' command: Could not find all of the specified destination fields in the lookup table.

I'd much rather use this than a sub-search. That's a great suggestion -- thanks!
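For anyone following along, the full corrected search is:

```
index="something palo alto" sourcetype="something palo alto" severity=high
| lookup campus_ips.csv ip AS dest_ip OUTPUTNEW ip AS isFound
| where isnotnull(isFound)
```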

elliotproebstel
Champion

Yeah, great point, @starcher. As is so often true with Splunk, the "correct" approach will change based on data size/types, and you'll be best off testing a few different routes and seeing which one fits your environment best.

damiensurat
Contributor

One last note: you can ingest data from SQL into Splunk via the DB Connect app. Or perhaps write a custom search command that connects to the database and pulls back information without indexing the data into Splunk. This can be done by leveraging Python in Splunk.
