Splunk Search

How can I set up a case insensitive lookup instead of tags?

lguinn2
Legend

I have a lot of variation in my hostnames - some are upper case, some are lower case. I want my users to be able to easily search based on hosts. In addition, I have been using tags to label my hosts based on their purpose, ownership, etc. With a large number of hosts, this is getting hard to manage.

Can I use a lookup instead of tags? The lookup can't be case-sensitive, since the case of my host field varies.

1 Solution

lguinn2
Legend

Here is one way to accomplish this:

First, create a lookup CSV file for the hosts. In the best example that I have seen, a script runs every night and collects host info from the Configuration Management database. Here is the header line for the CSV file:

host,hostname,fqdn,ip,dept,application

host is the name as it appears in Splunk.
hostname is the "common name" that users know and use.
fqdn and ip are the server's fully qualified domain name and IP address.
dept is the department that "owns" the server.
application is the system that this server belongs to, such as "AP", "Ordering", etc.

The script writes the CSV file to the appropriate directory: /opt/splunk/etc/apps/myapp/lookups
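
Once the file is in place, a quick sanity check from the search bar confirms that Splunk can read it. This is just a sketch; it assumes the file is named hosts.csv (as below) and that you are searching from within the same app or the file has been shared appropriately:

| inputlookup hosts.csv | head 5
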
In the same app, here are the props.conf and transforms.conf settings (assuming that the file name is hosts.csv):

props.conf

[yoursourcetypehere]
LOOKUP-lhosts = host_lookup host OUTPUT hostname fqdn ip dept application

transforms.conf

[host_lookup]
filename = hosts.csv
case_sensitive_match = false
min_matches = 1
max_matches = 1
default_match = no entry for host
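
Before relying on the automatic lookup, you can test the lookup definition by applying it manually with the lookup command. This is just a sketch; yoursourcetypehere is the same placeholder used in props.conf above:

sourcetype=yoursourcetypehere | lookup host_lookup host OUTPUT hostname dept application | stats count by host hostname dept application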

This lookup allows users to do searches such as application=Ordering and see all events related to a set of servers. I like it better than tags because I can set up a variety of ways to search from just one CSV file.
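
For example, a user search might look something like this (again, the sourcetype is a placeholder, and "Ordering" is just one of the application values in the CSV):

sourcetype=yoursourcetypehere application=Ordering | stats count by hostname dept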

BTW, this search will give you a list of hosts that appear in Splunk but have no entry in the lookup. The lookup is applied explicitly here, because the automatic lookup above only runs against events of that sourcetype:

| metadata type=hosts index=* | lookup host_lookup host OUTPUT hostname
| fields host hostname lastTime totalCount | fieldformat lastTime=strftime(lastTime,"%x %X")
| where hostname=="no entry for host"
