I am importing many tables of data into Splunk, each with 500 to 10,000 events, and I need to use them to enrich events in scheduled searches. At the moment I import these tables with a modular input and dump them into an index, then join my saved search results with the latest data from that index. The tables are re-imported once a day to pick up any changes (they are usually mostly unchanged).
index=my_events
| join type=left common_field
[ search index=imported_data source=src earliest=-24h
    | stats latest(*) as * by common_field ]
I know join is bad for performance, so I was wondering whether importing the data into a KV Store and setting up an automatic lookup on the index I want to enrich would be a better solution. In that case I would overwrite the KV Store once a day with the new data. Other solutions are welcome; these are just the ones I came up with. Thanks.
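For clarity, this is roughly what I have in mind (the collection, lookup, and output field names below are just placeholders). A scheduled search would refresh the KV Store once a day, since outputlookup replaces the collection contents by default:

index=imported_data source=src earliest=-24h
| stats latest(*) as * by common_field
| outputlookup my_enrichment_lookup

backed by something like:

collections.conf
[my_enrichment_collection]

transforms.conf
[my_enrichment_lookup]
external_type = kvstore
collection = my_enrichment_collection
fields_list = _key, common_field, field1, field2

props.conf
[my_events_sourcetype]
LOOKUP-enrich_events = my_enrichment_lookup common_field OUTPUT field1 field2

With that in place, the scheduled searches over index=my_events would get field1 and field2 added automatically, without the join.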