Splunk Search

How to reduce the time of a search?

francesco1g
Engager

Hi, I have a search that scans millions of events and is extremely slow; is there a way to speed it up?

 

This is the query:

 

index=audit
| table db_name 
| dedup db_name
| outputlookup audit.csv

 

 

1 Solution

ashvinpandey
Contributor

@francesco1g, try the search below:

index=audit db_name=*
| fields db_name
| dedup db_name
| table db_name
| outputlookup audit.csv
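On very large indexes, a stats-based variant may be faster still, since stats only has to track the distinct values while dedup keeps whole events around; a minimal sketch that produces the same list:

index=audit db_name=*
| stats count by db_name
| fields db_name
| outputlookup audit.csv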

Also, if this reply helps you, an upvote would be appreciated.

gcusello
SplunkTrust

Hi @francesco1g,

the only approach when you have to search millions of events or more and still get fast results is acceleration:

https://docs.splunk.com/Documentation/Splunk/8.2.2/Report/Acceleratereports

https://docs.splunk.com/Documentation/Splunk/8.2.1/Knowledge/Aboutsummaryindexing

https://docs.splunk.com/Documentation/Splunk/8.2.2/Knowledge/Acceleratedatamodels

In your situation, the easiest way is to schedule your search to run e.g. every day or every hour, or in another time period with fewer events; then write the results to a summary index and run your search on the summary index.
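For example, as a minimal sketch, assuming a summary index named audit_summary has been created (the name is only an example): the scheduled search writes the distinct values with collect,

index=audit db_name=*
| stats count by db_name
| collect index=audit_summary

and the lookup is then rebuilt from the much smaller summary index:

index=audit_summary
| stats count by db_name
| fields db_name
| outputlookup audit.csv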

Or create an accelerated data model containing only the few fields you need.
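As a sketch only, assuming a hypothetical accelerated data model named Audit_DB whose root dataset exposes a db_name field, the search would become a tstats query over the acceleration summaries:

| tstats summariesonly=true count from datamodel=Audit_DB by Audit_DB.db_name
| rename Audit_DB.db_name AS db_name
| fields db_name
| outputlookup audit.csv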

Ciao.

Giuseppe
