Splunk Search

Working with large data sets: seem to be hitting limits

I have what I think is a routine problem, but I don't know how to solve it. I have a log file with mixed content; some lines contain XML, and I'm trying to gather statistics from them. The issue is that I keep hitting limits within Splunk: either a 10,000-event or a 50,000-event limit, depending on how I write the query. There has to be a way to do this or Splunk would be useless as a BI tool, so I'm hoping someone can let me in on the secret for large data sets.

The lines look like this (the XML is well formed):

2010-09-12 03:14:55,550 INFO  [QELogger] A9 XML: <A9_Request_1 initialQueryTime="2010-09-12T03:14Z"><SearchParms startNum="1" numResults="14"><Sort type="Relevance"><Relevance/></Sort></SearchParms><Filter hd="ALL" mode="TextQuery"><TextQuery type="Keyword">GREEK</TextQuery><Linear forward="14400"/><VOD/></Filter><STBInfo><MAC>123456789ABC</MAC><Controller_ID>2583</Controller_ID><ChannelMapID>2201</ChannelMapID><StreamingServerID>19501</StreamingServerID></STBInfo><Debug docVersion="1.0" searchAgent="hostname.ofserver.net">TVGI C</Debug></A9_Request_1>

So what we do is a search like this:

host="srchqenmana*" "<A9_Request" AND NOT ("FFFFFFFFFFFF" OR "000013ED3AEB" OR "Agent.007") | xmlkv | stats count(MAC)

which pulls only the XML lines, filters out some known monitoring MAC addresses, and parses the XML. We then pipe that to various stat counts, such as | stats count(MAC) or the like. Sometimes we count unique MACs or other fields. But it never returns a full data set; typically it stops at 50,000 events. There's no warning or alert that the data set is incomplete, it just finishes normally.
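For example, the unique-MAC variant is the same pipeline with dc() (Splunk's distinct-count aggregator) in place of count():

host="srchqenmana*" "<A9_Request" AND NOT ("FFFFFFFFFFFF" OR "000013ED3AEB" OR "Agent.007") | xmlkv | stats dc(MAC)

It hits the same 50,000-event ceiling.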

So there must be something I should be doing differently to get all the data. Can someone point me in the right direction? Our traffic levels are now such that we can't get 24 hours' worth of counts anymore; even 'last four hours' tops out at 50,000 at some points.

Any help would be appreciated.

Rich

1 Solution

Splunk Employee

This is just a bug with the specification of the xmlkv command in commands.conf. It is implemented in python, in $SPLUNK_HOME/etc/apps/search/bin/xmlkv.py and its properties are described in $SPLUNK_HOME/etc/apps/search/default/commands.conf.

It is erroneously not declared as "streaming", which is the declaration that allows python commands to go past 50k events. You can work around this by adding the following to $SPLUNK_HOME/etc/apps/search/local/commands.conf:

[xmlkv]
streaming = true

An alternate (and probably higher-performing) solution would be to use a regex-based extraction for fields like this, say in props.conf:

[<sourcetype>]
EXTRACT-mac = <MAC>(?<MAC>[^<]+)</MAC>
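With an extraction like that in place (assuming your events carry the sourcetype configured above), the MAC field is available at search time and xmlkv can be dropped from the pipeline entirely, e.g.:

host="srchqenmana*" "<A9_Request" AND NOT ("FFFFFFFFFFFF" OR "000013ED3AEB" OR "Agent.007") | stats count(MAC)

You would add one EXTRACT-* line per field you actually need, rather than parsing the whole XML document for every event.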


The streaming = true fix seems to have done it. At least we're getting further than we were before.
