TrackMe App: Is there a way to monitor a lookup or kvstore?

rbolande
Explorer

In my testing, I am very impressed by the TrackMe app.  It is very full featured and very mature.  Thank you for your efforts in delivering it to the Splunk Community!

One need we have in our environment is monitoring the contents of lookups.  I would love to be able to do this within TrackMe as well...is there a trick to maybe getting this to work today?  I played around with 'inputlookup' with 'raw' and 'from' search types in elastic data sources, but I am not really seeing a way to implement.
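
For what it's worth, the kind of search I was experimenting with (the lookup name here is just a placeholder) was roughly:

| inputlookup my_lookup.csv
| stats count as record_count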

Does anyone have any ideas on how/if TrackMe might be able to monitor lookups as well?

Thanks,
REID


guilmxm
SplunkTrust

Hi @rbolande 

Thank you very much 😉
I am very glad you enjoy it! (well, c'mon, truly TrackMe is stunning lol)

I am interested in this use case; it was one of the things I had in mind behind the Elastic sources feature.

I believe the from approach with Elastic sources would indeed meet the requirements. I haven't had much of a chance to document it or its prerequisites yet, but I believe something like this would be a good start:

[screenshot: Elastic source definition for a lookup]
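
As a rough sketch (the lookup name is only an example), the search definition for such an Elastic source could be as simple as:

| from lookup:my_assets_lookup

i.e. the from command reading the lookup directly, so that its rows come back like events for TrackMe to track.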


In this example, the lookup has an _time field. Some concepts, such as ingestion-level metrics, are of course not relevant for a lookup, but if the lookup has an _time field then the "event lag" can be monitored naturally:

[screenshot: lag metrics for the lookup tracked as an Elastic source]
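
To illustrate the idea outside of TrackMe, the same lag can be computed by hand with something like (again, an example lookup name):

| from lookup:my_assets_lookup
| stats max(_time) as last_update
| eval lag_in_seconds = now() - last_update

which is essentially what the "event lag" represents: the delta between now and the most recent _time.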


With a few evolutions, I could design TrackMe to fully address this use case. If you want to tell me more about yours, feel free to do so; otherwise I will start looking at it.

It is a very good use case. I very often see customers unaware that their lookups have not been updated; this is definitely a job for TrackMe.

Guilhem

rbolande
Explorer

Thanks for the quick response @guilmxm, and sorry for my delay in responding.

We have discussed many different use cases around monitoring lookups.  Simply adding an _time field to some of our lookups may cover a lot of them, since it would let us monitor the volume of change and the number of records in a lookup.  The only other needs are very specialized use cases for validating the data actually in a specific lookup.  This is stuff like:

1) Make sure that our 'holidays' lookup contains > 6 records for every year (a quick sketch of this check follows after this list)

2) Validate the data within the lookup fields to make sure someone didn't mess something up during a manual edit.
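
For use case 1, the kind of ad hoc check I have in mind would be something like this (assuming the lookup is named holidays.csv and has a date field in YYYY-MM-DD format, which may not match our actual schema):

| inputlookup holidays.csv
| eval year = strftime(strptime(date, "%Y-%m-%d"), "%Y")
| stats count as holidays_per_year by year
| where holidays_per_year <= 6

Any rows returned would point to a year failing the "more than 6 records" rule (a year with no records at all would need a separate check).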

I'll have to play around with your suggestion to see if we find any other use cases that don't work well.

Thank you!

REID


guilmxm
SplunkTrust

Hi @rbolande 

Thanks, and no problem at all.
I am fully convinced by the use case and TrackMe is perfectly in line for the job 😉

Lookup tracking will be fully supported in the upcoming 1.2.28 version:

https://trackme.readthedocs.io/en/testing/userguide.html#elastic-source-example-3-tracking-lookups-u...

https://trackme.readthedocs.io/en/testing/userguide.html#elastic-source-example-3-creation

This allows exactly what you mentioned: monitoring that the lookup was last updated within a given time period, as well as tracking the number of records via the outlier detection.

In addition, TrackMe 1.2.28 will also allow tracking lookups remotely, which means monitoring lookups against remote Splunk search head(s) using the rest command, rather than only the lookups available on the search head(s) hosting the TrackMe app.
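
Purely as an illustration of the approach (the search head name and lookup file below are placeholders, and the exact searches TrackMe will use may differ), pulling lookup metadata from a remote search head with the rest command could look like:

| rest /services/data/lookup-table-files splunk_server=remote_sh01
| search title="holidays.csv"
| table splunk_server title eai:acl.app updated

assuming the remote search head is declared as a search peer, with the updated field showing when the lookup entry was last modified.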

Development and qualification are in progress and the release will be published shortly.

Let me know if you have any questions.

Guilhem


guilmxm
SplunkTrust

@rbolande 

TrackMe 1.2.28 is now out on Splunkbase. This version officially supports monitoring lookups, CSV or KVstore based, locally or remotely via REST:

https://trackme.readthedocs.io/en/latest/userguide.html#elastic-source-example-4-creation
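
For a KVstore based lookup, a minimal sketch of the Elastic source search (the lookup definition name is hypothetical) would be along the same lines:

| inputlookup my_kvstore_lookup
| stats count as report_records, max(_time) as _time

assuming the collection carries an _time field as discussed earlier in this thread.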

If you have any questions or issues, let us know 😉

Guilhem
