I'm fishing for ideas, or input from anybody who has previous experience with this.
Essentially, we have two tables of (mostly) fixed data which we would like to 'teach' Splunk (for want of a better term).
To put it in context: we have throughput files that report a transaction ID and a transaction time, and Splunk grabs these fields no problem. Elsewhere, in some flat tables, we have transaction names (each relating to an ID) and a time threshold for each transaction.
Is there any way we can bring this data into the mix? If Splunk knew the threshold for each transaction and could compare it to the actual times (our main concern), and if it could line up the arbitrary transaction IDs with the meaningful names, it would make analysis of the logs infinitely more useful.
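To make it concrete, here's a minimal sketch (in Python, just to illustrate the behaviour we're after, not actual Splunk config) of what we'd like Splunk to do: join each event's transaction ID against a flat table of names and thresholds, and flag anything over threshold. All IDs, names, and numbers below are made up for illustration.

```python
# Hypothetical flat table: transaction ID -> (friendly name, threshold in ms).
lookup = {
    "TX001": ("CreateOrder", 500),
    "TX002": ("CancelOrder", 300),
}

# Throughput events as Splunk already extracts them: (transaction ID, actual time in ms).
events = [("TX001", 620), ("TX002", 250)]

# Enrich each event with the name and threshold, then flag slow ones.
results = []
for tx_id, actual in events:
    name, threshold = lookup[tx_id]
    status = "SLOW" if actual > threshold else "OK"
    results.append((name, actual, threshold, status))

for name, actual, threshold, status in results:
    print(f"{name}: {actual}ms (threshold {threshold}ms) -> {status}")
```

In Splunk terms, my (possibly naive) understanding is that this kind of enrichment is what lookup tables are for; I'd just like confirmation or a pointer to the right feature.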
I'm a bit of a Splunk noob (actually, a lot of one), so sorry if there is precedent for this or some glaringly obvious answer. Really just looking for any sort of starting point.
Thanks in advance for any advice you can give. I can elaborate further if need be.