
Best ways to report on large volumes of data?

mlevsh
Builder

Hi,

For ServiceNow dashboards, we need to report on large volumes of data (one year of data, to be precise).
Is a summary index the best way to prevent dashboard slowness?

To populate all the dashboard panels ("Active tickets", "Active tickets aged more than 60 days", "Not updated tickets", etc.), we would need to search across a full year of data.

Thank you in advance


woodcock
Esteemed Legend

A summary index is perfect for this use case, but so is an accelerated data model. You should be able to prototype both in the same day if you know your data and outcomes well, so play around and enjoy the exercise.

mlevsh
Builder

@woodcock, a delayed thank you!


woodcock
Esteemed Legend

Which way did you go and why?


bandit
Motivator

Data models with tstats would be my first choice, as they can be updated/rebuilt when you have maintenance windows or changes to the fields you are reporting on. Keep in mind that data models can only retain data as long as the source index does.
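For example, a minimal sketch of a panel search against an accelerated data model (the data model name ServiceNow_Tickets, its root dataset Tickets, and the field names below are assumptions for illustration, not real objects in your environment):

    | tstats summariesonly=true count
        from datamodel=ServiceNow_Tickets.Tickets
        where Tickets.state="Active"
        by Tickets.state

Setting summariesonly=true restricts the search to the accelerated summaries, which is what keeps a panel fast over a year of data.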

Summary indexes can live indefinitely; however, they may need manual intervention to backfill data gaps caused by maintenance windows or failures.
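As a rough sketch, a scheduled search could roll tickets up once a day into a summary index, and the dashboard panels would then search the much smaller summary (the index name servicenow_summary and the source fields are assumptions):

    Scheduled search, run daily over the previous day:
        index=servicenow sourcetype=snow:incident
        | stats count AS ticket_count BY state
        | collect index=servicenow_summary

    Dashboard panel search:
        index=servicenow_summary
        | stats sum(ticket_count) AS tickets BY state

Gaps from missed runs can usually be backfilled with the fill_summary_index.py script that ships with Splunk.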

Metrics (mcollect/mstats) is the newest and fastest method for historical metrics, but it lacks some of the automation of a data model.
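A minimal sketch, assuming a metrics index named snow_metrics has already been created and that incident events carry a priority field (both assumptions):

    Scheduled search converting event counts into metric data points:
        index=servicenow sourcetype=snow:incident state="Active"
        | stats count AS _value BY priority
        | eval metric_name="tickets.active"
        | mcollect index=snow_metrics

    Dashboard panel search:
        | mstats sum(_value) WHERE metric_name="tickets.active" AND index=snow_metrics span=1d BY priority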

Lastly, there are lookup tables/CSVs and the KV store, which sometimes work well for edge cases where you have a frequently changing data cache; however, they won't work with the default time picker.
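As a minimal sketch, a scheduled search could maintain a CSV lookup of the currently active tickets, and panels would read straight from it (the lookup and field names are assumptions):

    Scheduled search refreshing the cache:
        index=servicenow sourcetype=snow:incident state="Active"
        | stats latest(priority) AS priority latest(state) AS state BY ticket_number
        | outputlookup active_tickets.csv

    Dashboard panel search:
        | inputlookup active_tickets.csv
        | stats count BY priority

As noted above, inputlookup ignores the dashboard's time picker, so any time filtering has to be done explicitly against a timestamp field stored in the lookup.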

mlevsh
Builder

@rob_jordan, a delayed thank you!
