Splunk Search

Unable to access Metrics Indexes

mark_wymer
Path Finder

Hi all,
I'm using the (excellent) TrackMe app, which uses a metrics index. The index has been created on an indexer cluster and I've verified that it is actually there ( /opt/splunk/bin/splunk list index -datatype metric ). However, when I try to search the index using

|  mcollect split=t index="trackme_metrics"

I get the following: "Error in 'mcollect' command: Must specify a valid metric index"

This is the first and only metrics index on our cluster, so I cannot verify that other metrics indexes work OK. Also, the only suggested resolution to this seems to be that I should put the metrics index on our search head cluster - but that makes no sense to me!

Am I doing something wrong or is there some setting that I need to configure before I can use a metrics index?
Many thanks, Mark.


gjanders
SplunkTrust

I suspect you may be confused here: you are running | mcollect, which collects data into a metrics index.

 

Perhaps you want | mstats or | mcatalog instead.
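
For example, to see which metric names exist in the index and then chart one of them, something like this should work (the aggregation and span below are just placeholders):

|  mcatalog values(metric_name) WHERE index="trackme_metrics"

|  mstats avg(_value) WHERE index="trackme_metrics" AND metric_name="*" span=5m BY metric_name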

 

If you are new to metrics then I'd suggest starting in the Analytics Workspace (Splunk 8) or the Metrics Workspace (Splunk 7.x?), depending on your Splunk version.


mark_wymer
Path Finder

Hi, thanks for your response. I cut down the actual SPL for brevity, but I do understand your point. As I'm new to metrics indexes, any pointers are always useful.

I have, however, resolved my issue (posted separately).

 


mark_wymer
Path Finder

The solution that I found, and as mentioned by others in different scenarios, is that the metrics index has to be defined on the search heads as well as on the indexer cluster. This is unlike an event index, which does not necessarily have to be defined on the search heads.

Once I added an indexes.conf for that index on the SH, everything worked fine.
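
For anyone hitting the same error, a minimal indexes.conf stanza for the search heads could look something like this (the paths are illustrative; match whatever you use on your indexers):

[trackme_metrics]
datatype = metric
homePath = $SPLUNK_DB/trackme_metrics/db
coldPath = $SPLUNK_DB/trackme_metrics/colddb
thawedPath = $SPLUNK_DB/trackme_metrics/thaweddb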


guilmxm
Influencer

@mark_wymer 

In truth, every index you declare on the indexers should be declared in the same way on all the search heads that access those indexers. This is a configuration good practice for several reasons, such as index name auto-completion, or the use case you encountered.

As a deployment configuration good practice, we recommend you use the Professional Services base config apps:

Base Apps: https://drive.google.com/open?id=107qWrfsv17j5bLxc21ymTagjtHG0AobF

Cluster Apps: https://drive.google.com/open?id=10aVQXjbgQC99b9InTvncrLFWUrXci3gz

In your deployment, on the search heads you want:

- The same volumes defined as you have in your indexer cluster
- Which allows you to push the exact same copy of the indexes.conf that you deploy to the indexers (see the sketch below)
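
As a rough sketch of the volume part (the volume name and path here are illustrative, not TrackMe's actual configuration), the shared indexes.conf would define something like:

[volume:primary]
path = /opt/splunk/var/lib/splunk

and the index stanzas then reference it, e.g. homePath = volume:primary/trackme_metrics/db (note that thawedPath cannot reference a volume and must stay a literal path).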

My best advice is to look at what Splunkenizer does:
https://github.com/splunkenizer/Splunkenizer

(you can spawn a virtual environment and analyse it, for instance)
Splunkenizer generates a 100% good-practice-compliant Splunk environment (and is, incidentally, a wonderful thing!)

Last but not least, TrackMe itself includes a default/indexes.conf. For some years now this has not been recommended (from the app publication point of view); however, it is required for TrackMe because various reports use the collect and mcollect commands, which would cause AppInspect to fail if the indexes were not part of the app, plus more serious issues like the one you had. So technically you should have had the indexes defined on the search head too.
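
(For reference, mcollect writes metric data points rather than reading them; a minimal sketch with a made-up metric name would be:

|  makeresults
|  eval metric_name="trackme.example.metric", _value=1
|  mcollect index="trackme_metrics"

while reading the data back is done with mstats or mcatalog, as mentioned above.)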

Guilhem


gjanders
SplunkTrust

Oh right. Newer versions of TrackMe include the indexes.conf for this reason.
