Knowledge Management

Best ways to report on large volumes of data?

mlevsh
Builder

Hi,

For ServiceNow dashboards we need to report on large volumes of data (one year of data, to be precise).
Is a summary index the best way to prevent the dashboards from becoming slow?

To populate all of the dashboard panels - "Active tickets", "Active tickets aged more than 60 days", "Not updated tickets", etc. - we would need to search a full year of data.

Thank you in advance


woodcock
Esteemed Legend

A summary index is perfect for this use case, but so is an accelerated data model. You should be able to prototype both in the same day if you know your data and outcomes well, so play around and enjoy the exercise.
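
For illustration, a rough sketch of the summary-index route; the index names (servicenow, snow_ticket_summary), sourcetype (snow:incident), and field names (number, state, opened_at) are placeholders for whatever your ServiceNow data actually contains.

A scheduled search (run, say, once a day) snapshots the current state of each ticket into a summary index:

index=servicenow sourcetype=snow:incident
| stats latest(state) AS state latest(opened_at) AS opened_at BY number
| collect index=snow_ticket_summary

A panel such as "Active Tickets aged more than 60 days" then reads the small summary index instead of a year of raw events:

index=snow_ticket_summary state="Active"
| dedup number
| where now() - strptime(opened_at, "%Y-%m-%d %H:%M:%S") > 60*86400
| stats count AS aged_tickets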

mlevsh
Builder

@woodcock, a delayed thank you!

woodcock
Esteemed Legend

Which way did you go and why?

bandit
Motivator

A data model queried with tstats would be my first choice, since data model accelerations can be rebuilt after maintenance or when the fields you report on change. Keep in mind that a data model can only retain data for as long as the source index does.
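
For example, assuming an accelerated data model named ServiceNow_Tickets with a root dataset Incident (both names made up here), an "Active tickets" panel could run entirely against the acceleration summaries:

| tstats summariesonly=true dc(Incident.number) AS active_tickets FROM datamodel=ServiceNow_Tickets.Incident WHERE Incident.state="Active"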

Summary indexes can retain data indefinitely; however, they may need manual intervention to backfill gaps caused by maintenance windows or failures.
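
If a gap does appear, the fill_summary_index.py script that ships with Splunk can re-run the summary-populating saved search over the missing window; the app, saved search name, and credentials below are placeholders:

splunk cmd python fill_summary_index.py -app search -name "si_snow_daily_tickets" -et -30d@d -lt now -j 4 -dedup true -auth admin:changeme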

Metrics (mcollect/mstats) is the newest and fastest method for historical metrics, but it lacks some of the automation that a data model provides.
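
As a rough sketch (the metrics index snow_metrics and metric name snow.active_tickets are assumptions), a scheduled search could roll a daily active-ticket count into a metric:

index=servicenow sourcetype=snow:incident state="Active"
| timechart span=1d dc(number) AS _value
| eval metric_name="snow.active_tickets"
| mcollect index=snow_metrics

and a panel could then read it back very quickly with mstats:

| mstats avg("snow.active_tickets") AS active_tickets WHERE index=snow_metrics span=1d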

Lastly, there are lookup tables/CSVs and the KV store, which sometimes work well for edge cases where you have a small, frequently changing dataset; however, they won't work with the default time picker.
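
A minimal sketch of the lookup approach (the lookup file name and field names are again placeholders): a scheduled search caches the currently active tickets, and the panel reads the cache. Because the lookup rows carry no _time, the panel ignores the time range picker, which is the caveat above.

index=servicenow sourcetype=snow:incident
| stats latest(state) AS state latest(opened_at) AS opened_at BY number
| where state="Active"
| outputlookup snow_active_tickets.csv

| inputlookup snow_active_tickets.csv
| where now() - strptime(opened_at, "%Y-%m-%d %H:%M:%S") > 60*86400
| stats count AS aged_tickets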

mlevsh
Builder

@rob_jordan, a delayed thank you!
