In the metrics getting started documentation ( http://docs.splunk.com/Documentation/Splunk/7.0.0/Metrics/GetStarted ) it says "Summary indexing does not work with metrics."
When I read the rest of the documentation I don't see any specific reason I couldn't craft my own data to fit the metrics format.
If I massage an event into having all the correct fields ( http://docs.splunk.com/Documentation/Splunk/7.0.0/Metrics/GetStarted#Metrics_data_format ) could I save that event to a metrics store?
I am looking to leverage the speed increase in the metric store with data I already process and save into summaries.
My only concern is that the sourcetype would be stash, so I may need custom stash input parsing to make it "fit".
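For reference, this is roughly the shape the data would need (a hedged sketch — the index, field names, and metric name here are hypothetical, not from the docs):

```
index=radius_summary sourcetype=stash
| eval metric_name="radius.acct.session_count", _value=session_count
| fields _time metric_name _value host nas_id
```

Per the linked format page, a metric event needs `metric_name` and a numeric `_value`, with any remaining fields treated as dimensions.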
You could also just create an accelerated data model from your existing events, add all the needed fields, and use
tstats to get the data back from the data model. Benefits: no need to munge the data, and it's also lightning fast 😉
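Something like this (a sketch only — the data model and field names are made up for illustration):

```
| tstats avg(Radius.session_count) AS avg_sessions
    from datamodel=Radius_Accounting
    where nodename=Radius
    by _time span=5m
```

Because tstats runs against the accelerated summaries rather than raw events, it gives you much of the same speed benefit you'd be after with the metric store.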
We tried accelerated data models for this specific use case, but it killed our indexers (multi-million-user RADIUS accounting stats).
I'll probably revisit it, though, as a couple of big Splunk versions have come out since then, so perhaps performance has improved.
I do agree that accelerated data models make keeping those statistics up to date a much simpler process, though!
Yes — if you have the correct fields, you can push this into a metrics index.
An accelerated data model (as @MuS says) is a better option in my view, though. Otherwise you're left writing the props or searches required to make the data fit into a metrics index.
Here's a bit more on that approach: http://docs.splunk.com/Documentation/Splunk/7.0.0/Metrics/GetMetricsInOther
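If I recall the approach in that doc correctly, you can monitor a CSV file of metric data points using the pretrained `metrics_csv` sourcetype — something like this in inputs.conf (the path and index name here are just placeholders):

```
[monitor:///opt/exports/radius_metrics.csv]
sourcetype = metrics_csv
index = radius_metrics
```

The CSV would need columns along the lines of `metric_timestamp,metric_name,_value` plus your dimension fields — check the linked page for the exact expected header, since I'm going from memory here.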
Looks like the only way is an outputcsv file.
collect adds too many extra fields, and it also writes the file into the spool directory, where it will be picked up by the default stash input stanzas.
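So the pipeline would be: shape the summary results into the metrics format yourself and write them out with outputcsv, then re-ingest that file. A rough sketch (the source search, metric name, and dimension fields are all hypothetical):

```
index=radius sourcetype=radius_acct
| stats avg(session_duration) AS _value by _time, host
| eval metric_name="radius.session.duration", metric_timestamp=_time
| table metric_timestamp metric_name _value host
| outputcsv radius_metrics
```

Note that outputcsv writes under $SPLUNK_HOME/var/run/splunk/csv, so you'd still need a monitor input pointing at that file to get it into the metrics index.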