
Large Report Export to CSV, Slow Dashboard Panels Using Accelerated Report Reference

mbasharat
Builder

I have a large report that returns anywhere between 4GB and 6GB of data in a nice tabular format. The report has everything I need. This report is actually for host vulnerabilities. Each host is affected by several vulnerabilities, so my report returns about a million rows.

However, if I try to export it to CSV, it errors out because it is too large. If I go to /dispatch//results.csv.gz, I am able to copy it to my local machine, BUT that CSV does not open in Excel because it is too large or something else is not right. When I try to use this report's reference in my dashboard panels, they are very slow but do return results; I just need to wait longer.

Is there a way I can make the above work?
Is there a way I can speed up the report reference in the dashboard and have the dashboard panels load much more quickly?
Is there a way I can create an accelerated data model from this report, or will an accelerated data model not help since my report is already accelerated?
Can I accelerate a dashboard panel?

1 Solution

tom_frotscher
Builder

Hi,

if you want to export it to your local machine and you cannot open it in Excel or Notepad: this is why we have tools like Splunk in the first place; it is too much data for those tools. However, you can still use the outputcsv command with the option singlefile=false (your search | outputcsv singlefile=false results). Instead of one big CSV, you should get multiple smaller files, and maybe Excel can handle those.

In case you want to use the results of your report in splunk:
There are multiple options to gain speedups. You could make your report a scheduled report (if you have not already) and reuse the results of the report in a dashboard. You can also accelerate your report or use summary indexing. If you want, you can also create a data model for your report data and use it to speed up searches.
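For example, if the report is already a scheduled report, a dashboard panel can load its most recent results with the loadjob command instead of re-running the whole search. A minimal sketch, assuming the report is owned by admin in the search app and is named "My Vulnerability Report" (these names, and the severity field in the filter, are only placeholders):

| loadjob savedsearch="admin:search:My Vulnerability Report"
| search severity="critical"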

In your case I would start by looking into scheduled reports, report acceleration and summary indexing.
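Just as a rough sketch of what the scheduling and acceleration side looks like in savedsearches.conf (the stanza name, search, cron schedule and time ranges below are only placeholders, and the same settings can be made from the UI):

[My Vulnerability Report]
search = index=vulns sourcetype=scan_results | stats values(signature) AS vulnerabilities BY host
# run every night at 02:00 over the last 24 hours
enableSched = 1
cron_schedule = 0 2 * * *
dispatch.earliest_time = -24h
dispatch.latest_time = now
# auto_summarize = 1 turns on report acceleration for this report
auto_summarize = 1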

Greetings

Tom



woodcock
Esteemed Legend

It could be that multivalued fields are borking things. Try adding this to the end of your SPL:

foreach * [nomv "<<FIELD>>"]

mbasharat
Builder

Hi Woodcock,

I am using stats dc(fieldA)....values(FieldB) ..... BY FieldC. None of the fields have multiple values.

mbasharat
Builder

Hi Tom,

I will export the CSV in chunks using the method you provided above and see how that goes. TY!!


mbasharat
Builder

Hi Tom,

My reports are all scheduled and accelerated.

Can a data model be created from a report?


tom_frotscher
Builder

Hi,

you cannot create a data model from a query, and I don't think it would be much faster than the accelerated report anyway. But I do not think you want to show a million rows on your dashboard; that is not very helpful. Is your dashboard instead showing aggregated values, something like the number of unique vulnerabilities, the number of clients, ...?

If so, you should calculate those aggregated values, store them in a summary index or a lookup, and use them as the basis for your dashboards.
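Just as a minimal sketch (the index, lookup and field names are placeholders): a scheduled search could pre-compute the aggregates and write them out, for example

index=vulns sourcetype=scan_results
| stats dc(signature) AS unique_vulns dc(host) AS affected_hosts BY severity
| collect index=vuln_summary

or, instead of collect, end with | outputlookup vuln_dashboard_summary.csv to store them in a lookup. The dashboard panels then only have to search index=vuln_summary (or | inputlookup vuln_dashboard_summary.csv), which should come back in seconds. Note that the vuln_summary index would have to exist before collect can write to it.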

Greetings

Tom


mbasharat
Builder

Hi Tom,

The report is for stakeholders to download for their own needs, e.g. grabbing the list of vulnerabilities with relevant info as well as the affected hosts, so they can use the data from the CSV for scripting to apply patches, etc. The dashboards are not fed from this report. I am using a different query for the dashboard panels, with a base search and post-processing, so that only actionable information is displayed.
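For reference, the base/post-process split looks roughly like this (field names are placeholders): the base search returns a small aggregated result set, e.g.

index=vulns sourcetype=scan_results | stats count BY host severity

and each panel then post-processes it, e.g.

| stats sum(count) AS total BY severity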

Thanks,

M
