I have a search that is taking way too long, and I am looking for ideas on how to improve it, be it tstats/data models/fields....
Here's my search (Active Directory data):
index=AD | regex host="(?i)\w+ads$" | regex EventCode="^(462[45]|4634|4648|4661|4696|4723|476[189]|477[0126]|563[23]|5140)$" | stats count
I stopped it when it had been running for 710 seconds, and didn't appear to be even 50% complete.
Give this a try
index=AD host=*ads [| gentimes start=-1 | eval EventCode="4624 4625 4634 4648 4661 4696 4723 4761 4768 4769 4770 4771 4772 4776 5632 5633 5140" | table EventCode | makemv EventCode | mvexpand EventCode ] | stats count
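For reference, a subsearch that returns rows of field=value pairs is expanded at search time into an OR of literal terms, so the search above is roughly equivalent to writing the filter out by hand:

```
index=AD host=*ads
    ( EventCode=4624 OR EventCode=4625 OR EventCode=4634 OR EventCode=4648
      OR EventCode=4661 OR EventCode=4696 OR EventCode=4723 OR EventCode=4761
      OR EventCode=4768 OR EventCode=4769 OR EventCode=4770 OR EventCode=4771
      OR EventCode=4772 OR EventCode=4776 OR EventCode=5632 OR EventCode=5633
      OR EventCode=5140 )
| stats count
```

The win is that literal terms can be applied before events are fully parsed, whereas the regex command forces Splunk to retrieve and test every event in the index over the time range.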
Add the relevant sourcetypes or sources so the other data is ignored. Add strings that must exist, such as
EventCode=* or even
( EventCode>4000 EventCode<6000 ). The more you can reduce the payload early in the search, the better it will perform.
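Putting those tips together, a tightened-up version of the original search might look like the sketch below (the sourcetype is a placeholder, substitute whatever your AD data actually uses; all of the target EventCodes fall between 4000 and 6000, so the numeric bounds are safe):

```
index=AD sourcetype=<your_AD_sourcetype> host=*ads
    ( EventCode>4000 EventCode<6000 )
| regex host="(?i)\w+ads$"
| regex EventCode="^(462[45]|4634|4648|4661|4696|4723|476[189]|477[0126]|563[23]|5140)$"
| stats count
```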
Use the Job Inspector to see where time is being spent. Perhaps there's a field extraction being performed that you don't need.
Is this running in fast mode?
You might even try (not sure if it will help) doing the stats first to reduce the result set:
index=AD source=<blah> sourcetype=<blahst> ads ( EventCode>4000 EventCode<6000 ) | stats count by host, EventCode | regex host="(?i)\w+ads$" | regex EventCode="^(462[45]|4634|4648|4661|4696|4723|476[189]|477[0126]|563[23]|5140)$" | stats sum(count) AS count
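Since you mentioned tstats: if EventCode is an index-time field in your environment (or the data is accelerated in a data model), tstats can do the counting straight from the index files without touching raw events, which is usually far faster. A sketch, assuming EventCode is indexed — if it is only a search-time extraction, this will return nothing:

```
| tstats count where index=AD host=*ads by host, EventCode
| regex host="(?i)\w+ads$"
| regex EventCode="^(462[45]|4634|4648|4661|4696|4723|476[189]|477[0126]|563[23]|5140)$"
| stats sum(count) AS count
```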
Good page for some of these ideas: http://docs.splunk.com/Documentation/Splunk/latest/Search/Writebettersearches