Splunk Search

Extraction of an indexed field in summary indexes

bojanz
Communicator

I have a field that is extracted at index time (rather than at search time).
The following configuration is used:

props.conf:

[mysourcetype]
TRANSFORMS-bla = indexex

transforms.conf:

[indexex]
REGEX = ^(\d+)
FORMAT = MYFIELD::$1
WRITE_META = true

fields.conf:

[MYFIELD]
INDEXED = true

This all works fine in normal indexes. However, I'm creating a summary index that contains the same field, like this:

index=myindex *searchsomething* | table something MYFIELD

This is also indexed fine; however, I cannot search on MYFIELD directly:

index=summaryindex MYFIELD=something

yields 0 results, whereas this works:

index=summaryindex | search MYFIELD

I tried adding an indexed-field configuration for the stash sourcetype, but it didn't help.
Is this an issue caused by the field being indexed (with fields.conf applied globally)?
Do I have to change the field name? Searching the summary index works for other fields; it fails only for this one.

Thanks.


Ayn
Legend

Have a look at the following blog post, which probably describes your problem and its solution.

http://blogs.splunk.com/2011/10/07/cannot-search-based-on-an-extracted-field/
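As I understand that post, the mechanics are roughly this: because fields.conf declares MYFIELD with INDEXED = true globally, a search term like MYFIELD=something is resolved as a lookup for the indexed term MYFIELD::something. In the summary index the field only exists as text in _raw and is extracted at search time, so the indexed-term lookup finds nothing. If that's the cause, forcing the filter to run at search time should work around it (untested sketch):

index=summaryindex | search MYFIELD=something

Your observation that "index=summaryindex | search MYFIELD" returns results would fit this explanation, since everything after the first pipe operates on search-time extracted fields.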


Ayn
Legend

Yeah, since you put INDEXED = true, I'm pretty sure there's no good workaround other than renaming the field in the summary index.

I'm still not sure I understand why you really need it as an indexed field. Did you actually try extracting it at search time and compare performance? My guess is that performance would be just as good, and possibly even better, with a search-time extracted field.
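The rename could look something like this when populating the summary index (MYFIELD_SUMMARY is just a hypothetical name, pick whatever fits):

index=myindex *searchsomething* | rename MYFIELD as MYFIELD_SUMMARY | table something MYFIELD_SUMMARY

Since MYFIELD_SUMMARY has no fields.conf entry marking it as indexed, a search such as index=summaryindex MYFIELD_SUMMARY=something should then behave like any ordinary search-time extracted field.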


bojanz
Communicator

What I showed is the original log. In the summary index it looks like this:

2012-07-11 summaryindexstuff BLA=11, SOMETHING=12, MYFIELD=10203

I needed to index it separately because it's a subtoken and I have a huge volume of logs. The logs are ugly and not delimited by spaces (or anything else); they just contain concatenated numbers and letters.

That first part works great and it's fast. But I guess I made a mistake by not renaming this one field in the summary index. In the example above, searching works fine for the other fields (i.e. BLA and SOMETHING); it just doesn't work for MYFIELD.


Ayn
Legend

No, I mean: what does the event in the summary index look like? Or is that it?

Also, not that it takes us closer to a solution, but why are you making this an indexed field rather than extracting it at search time? It's very rarely a good idea.


bojanz
Communicator

I extract it. For example, I have a log such as this:

2012-07-11 01020032300404

where this is actually a couple of variables, so I extract them:

REGEX = ^[^\s]+\s(.)(....)
FORMAT = SOMEFIELD::$1 MYFIELD::$2
WRITE_META = true

I used index-time extraction here, and I create the same field that I use later in the summary. I guess this is the problem; I could rename the field, but since I already have tons of summary-indexed data, I was wondering if there is a way to get Splunk to recognize it.

I presume Splunk tries to find that same field as an indexed field and fails.
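For comparison, a search-time version of that same extraction would look roughly like this in props.conf (a sketch, untested):

[mysourcetype]
EXTRACT-myfields = ^[^\s]+\s(?<SOMEFIELD>.)(?<MYFIELD>....)

with the MYFIELD stanza removed from fields.conf. The trade-off is the subtoken issue mentioned above: on the original index, a bare MYFIELD=value search would then have to find value as a token in the raw event, which fails when the value is only a fragment of a longer concatenated string.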


Ayn
Legend

Wait, if you're summary indexing, could you show what the summary indexing events look like? I imagine they should have MYFIELD=somevalue in there, no?


bojanz
Communicator

Unfortunately this doesn't help. I have MYFIELD marked as indexed in fields.conf (since I have to extract it separately; it's a subtoken).

So I cannot set INDEXED = false for MYFIELD there, as fields.conf appears to be applied globally 😕
