Splunk Search

About usage of {} in eval

yutaka1005
Builder

I recently saw the manual of eval, and I found the following description.

To specify a field name with multiple words, you can either concatenate the words, or use single quotation marks when you specify the name. For example, to specify the field name Account ID you can specify AccountID or 'Account ID'.

To specify a field name with special characters, such as a period, use single quotation marks. For example, to specify the field name Last.Name use 'Last.Name'.

You can use the value of another field as the name of the destination field by using curly brackets, { }. For example, if you have an event with the following fields, aName=counter and aValue=1234. Use | eval {aName}=aValue to return counter=1234.

So "{ }" lets me use the value of a field, as is, as the name of a new field,
but I cannot think of a good way to use it.

Is there any use example using this eval?

1 Solution

DalJeanis
Legend

There are a lot of ways that it turns out to be useful.

First, it allows you to "build" the name of the field that you want to use. This is very helpful when you are trying to take information from one format and move it into another.
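For example, here is a minimal, self-contained sketch (the field names status, count, and fname are made up for illustration) that builds a destination field name from another field's value and then assigns to it with { }:

| makeresults
| eval status="404", count=7
| eval fname="errors_" . status
| eval {fname}=count

The last eval creates a field literally named errors_404 with the value 7, which is handy when you are reshaping data into a column-per-category layout.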

Second, let's suppose that you wanted to find out the names of all the fields that had the value "c", and you wanted the count of how many of each field there were.

The untable command looks basically like this:

| untable KeyFieldName FieldToPutFieldNameIn FieldToPutFieldValueIn

What that does is take each event and turn it into multiple events, each of which documents ONE of the fields on the original event.

So if one event had...

myfield=1 field1=c field2=b field3=c field4=d

then | untable myfield name value would turn it into this...

myfield    name       value
   1      field1         c
   1      field2         b
   1      field3         c
   1      field4         d

... and this would eliminate all the rows whose value was not c and create two records for that one event, one with a field named field1 and one with a field named field3....

 |  untable myfield name value 
 |  where value="c"
 | eval {name} = value

... and if you run that into a stats command, you get a count for each field that had the value "c", under the name of that field.

| stats count(*) as *

That's a made up example, but it shows you how powerful that little feature can be.
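
To make the made-up example runnable end to end, here is a self-contained version you can paste into a search bar; makeresults and the literal field values are stand-ins for real events, and the fields command just drops _time so untable only sees the fields we care about:

| makeresults
| eval myfield=1, field1="c", field2="b", field3="c", field4="d"
| fields - _time
| untable myfield name value
| where value="c"
| eval {name}=value
| stats count(*) as *

Note that the output also carries counts for myfield, name, and value, since stats count(*) counts every surviving field; add another fields command before the stats if you only want the field1 and field3 counts.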


yutaka1005
Builder

To DalJeanis

Thank you for the answer.
I was able to understand dynamic eval in more detail.

It certainly seems difficult to count, for each field, how many events have a certain value in that field without using dynamic eval, unless I know in advance which fields can hold that value.

DalJeanis
Legend

@yutaka1005 - yes, there are techniques for working with that kind of structure. stats and eventstats, untable and xyseries, foreach and map are all useful in certain situations like that.

It's best to post the specific use case on a question by itself, and solicit ideas. Sometimes the solution is very easy in retrospect, but until you've seen the relevant structures and patterns, there's no way to figure it out.


sbarr0
Explorer

Try this addon: https://splunkbase.splunk.com/app/4597/

Similar to pointers in C, this allows you to use one field as a pointer to another field and fetch that field's value.

| makeresults 
| eval Field1=1
| eval Field2=2
| eval Field3=3
| eval Field4=4
| eval pointer_field="Field4"
| pointerset newField pointer="pointer_field"
Field1  Field2  Field3  Field4  pointer_field  newField
1       2       3       4       Field4         4
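
For reference, a similar effect is possible without the add-on by using the foreach command that DalJeanis mentioned above; this is just a sketch of the pattern (it assumes the candidate fields share the Field* prefix, and newField mirrors the add-on's output field), not a drop-in replacement for pointerset:

| makeresults
| eval Field1=1, Field2=2, Field3=3, Field4=4
| eval pointer_field="Field4"
| foreach Field* [ eval newField=if("<<FIELD>>"==pointer_field, '<<FIELD>>', newField) ]

Inside the subsearch, "<<FIELD>>" expands to each field's name as a string and '<<FIELD>>' to its value, so newField ends up holding the value of whichever field pointer_field names.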