All Apps and Add-ons

Where is the bottleneck in Splunk Hadoop?



I'm running the search below:

index=windows sourcetype=windows (username=123456  event_id=4624) OR ("Account Name:  123456" event_id=4634)
| bin _time as mday span=1d
| eval logon=if(event_id==4624,_time,null())
| eval logoff=if(event_id==4634,_time,null())
| stats min(logon) as logon, max(logoff) as logoff, count(logon) as clogon, count(logoff) as clogoff  by mday
| where clogon>1 OR clogoff> 1
| convert ctime(logon) ctime(mday) ctime(logoff) 
| fields - clogon clogoff
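One tweak worth trying (an assumption on my part, not verified on this cluster): add an explicit `fields` call right after the base search, since everything downstream only uses `_time` and `event_id`. If the Hadoop-side job honors it, less data should be shipped back to the search head:

```
index=windows sourcetype=windows (username=123456  event_id=4624) OR ("Account Name:  123456" event_id=4634)
| fields _time event_id
| bin _time as mday span=1d
| eval logon=if(event_id==4624,_time,null())
| eval logoff=if(event_id==4634,_time,null())
| stats min(logon) as logon, max(logoff) as logoff, count(logon) as clogon, count(logoff) as clogoff by mday
| where clogon>1 OR clogoff>1
| convert ctime(logon) ctime(mday) ctime(logoff)
| fields - clogon clogoff
```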

The cutover is 14 days, and since I'm looking for results in January, February, and early March, all of the data comes from Hadoop.
The search takes more than 2-3 hours, trawling roughly 15*10^9 events for one month.
The same search over the last 13 days takes less than a minute in Splunk.

I have looked at the search, application, and container logs, but did not see anything obvious.

My understanding is that the MapReduce job in Hadoop only uses the timestamp, and all the filtering happens in Splunk. Is this correct?
Where is the bottleneck that makes this search take so much longer than searching exclusively in Splunk?

Thanks in advance for any pointers.


Ultra Champion

The biggest factor I have seen is the Hadoop queue. Try to get an amplified one; it makes all the difference.




Could you elaborate on that?


Splunk Employee

This document can help debug these issues:
Are you running in Verbose mode?
Are you able to access the Hadoop logs to examine the performance of Hadoop itself?
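If YARN is reachable, one way to look at this is to pull the aggregated logs for the MapReduce application the search launched. A sketch using the standard YARN CLI (the application ID below is a placeholder, not from this thread):

```
# List finished applications to find the ID of the job the search launched
yarn application -list -appStates FINISHED

# Fetch the aggregated container logs for that application (ID is a placeholder)
yarn logs -applicationId application_1520000000000_0001
```

The per-container logs show map/reduce task timings, which can indicate whether the time is spent in the tasks themselves or waiting in the queue.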




Thanks for the response and the link.

It is not that I am seeing errors when the search runs; the issue is the time it takes to complete.
It could be an expectations issue, since searches using Hadoop do take longer, but I was not expecting the difference I observed (for the same time interval and a similar number of events).
I was also expecting some wiggle room for tuning and improving performance, which I haven't found yet.

I do have access to yarn, and have checked the application logs. All complete successfully...
Should I be checking anything in particular?

