Is there a way to have INGEST_EVAL process the multivalue field and return the same JSON value as the EVAL lookup?

buzzard192
Explorer

I have a field with the system's IP in it and am trying to add additional fields during ingest. It works if the IP field is a single value, but not if it is a multivalue field. I can successfully add the fields at search time regardless of whether the field is single- or multivalued.

As an example, the field name is systemIP.  The CSV lookup file is:

cidr,location,region
192.168.1.0/24,Site-A,East
10.10.10.0/24,Site-B,East

transforms.conf:

[IPRange]
INGEST_EVAL = JSON=lookup("IPRangeLookup", json_object("cidr", systemIP), json_array("location", "region"))

[IPRangeLookup]
batch_index_query = 1
case_sensitive_match = 1
filename=systemIPLookup.csv
match_type = CIDR(cidr)
max_matches = 1

props.conf:

[(?::){0}host::*]
TRANSFORMS = IPRange

 

For the INGEST_EVAL:

If the system only has one IP address (192.168.1.10), then JSON gets set to:
{"location":"Site-A","region":"East"}

If it has two IP addresses, one in each CIDR range, JSON only gets set to the match for the first IP in the multivalue field.

 

For search time EVAL:

If I search:
index="*" host="host-with-two-IPs" | eval JSONzzz=lookup("IPRangeLookup", json_object("cidr", systemIP), json_array("location", "region"))

Then, for a system with one IP address, JSONzzz gets set to:
{"location":"Site-A","region":"East"}

If it has two IP addresses, then JSONzzz gets set to:
{"location":["Site-A","Site-B"],"region":["East","East"]}

 

The lookup is the same in both cases, but INGEST_EVAL only ever processes the first value in the field. Is there a way to have INGEST_EVAL process the multivalue field and return the same JSON value as the search-time eval lookup?

1 Solution

buzzard192
Explorer

@VatsalJagani - thanks for your response. I was able to figure it out. Not sure if it is a newer feature, but INGEST_EVAL can handle multivalue fields; by default it only processes the first value. To have it process all values, you need to prefix the field name with mv:, so in my example above I needed to use $mv:systemIP$.

 

From the transforms.conf specification section on INGEST_EVAL:

When reading from an index-time field that occurs multiple times inside the
  _meta key, normally the first value is used. You can override this by
  prefixing the name with "mv:" which returns all of the values into a
  "multival" object. For example, if _meta contains the keys "v::a v::b" then
  'mvjoin(v,",")' returns "a" while 'mvjoin($mv:v$,",")' returns "a,b".
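
Applied to the example above, only the field reference in transforms.conf needs to change (a sketch of the fix described here, not independently tested):

[IPRange]
# The mv: prefix makes the eval see every value of systemIP, not just the first
INGEST_EVAL = JSON=lookup("IPRangeLookup", json_object("cidr", $mv:systemIP$), json_array("location", "region"))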

 



VatsalJagani
SplunkTrust

Great!!!


VatsalJagani
SplunkTrust

@buzzard192 - INGEST_EVAL as such doesn't have any configuration that can do that.

Try doing it as a search-time field extraction by moving your eval from the search into props.conf, so it applies automatically to all the data you need (see the sketch below).

Or you could manipulate the data at index time to create additional fields like ip1, ip2, and ip3 and then apply the lookup once for each field. (Not sure exactly how to achieve this; you would need to test it out.)
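
For the first option, a calculated field in props.conf might look something like this (an untested sketch that reuses the host stanza and lookup from the question; EVAL- fields are evaluated at search time):

props.conf:

[(?::){0}host::*]
# Search-time calculated field using the same eval lookup() expression as the ad hoc search
EVAL-JSON = lookup("IPRangeLookup", json_object("cidr", systemIP), json_array("location", "region"))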

 

I hope this helps!!! Kindly upvote if it does!!!
