Splunk Search

## Weird behavior with the pow()-function

Path Finder

Basically, the pow() function seems to behave unexpectedly for any exponent above 22.
This is the example function and its output:

| eval value = pow(10,22)

This returns 10000000000000000000000.000000, which is what I want.

Next,

| eval value = pow(10,23)

Returns 99999999999999991611392.000000, which is just plain wrong. Any ideas?
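For context (not from the thread): Splunk's eval arithmetic uses IEEE-754 double-precision floats, which have a 53-bit significand, so integers with more than about 15–16 significant digits may not be exactly representable. The Python sketch below reproduces the same limit, since Python floats are also IEEE-754 doubles:

```python
# Python ints are arbitrary precision, so 10**23 is exact here.
exact = 10 ** 23

# Converting to a double snaps to the nearest representable value,
# which is exactly the number the Splunk search returned.
as_double = float(exact)

print(2 ** 53)            # 9007199254740992 -- the exact-integer limit for doubles
print(float(10 ** 22))    # 1e+22 is still exactly representable
print(int(as_double))     # 99999999999999991611392 -- nearest double to 10**23
```

This is why the behavior flips between pow(10,22) and pow(10,23): 10^22 = 2^22 · 5^22 and 5^22 still fits in 53 bits, while 5^23 does not.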

====================================================================

Some more info on why I want to do this, in case anyone has a more elegant solution: my data contains a binary string, say 10001000. I need to join this against a lookup file containing binary masks, so I have to split

10001000 into 10000000 and 1000. The way I do this now is:

| eval binary = ltrim(binary,"1")
| eval binary = ltrim(binary,"0")

| eval binary = ltrim(binary,"1")
| eval binary = ltrim(binary,"0")

Which seems to do the trick; however, when the binary string exceeds 23 characters, Splunk mangles it. I also don't really have an alternative solution to this problem.
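A purely string-based split sidesteps the numeric limit entirely, since no conversion to a float ever happens. As a sketch (Python rather than SPL, and `split_masks` is a hypothetical helper, not something from this thread):

```python
def split_masks(binary: str) -> list[str]:
    """Split a binary string into its one-bit masks, e.g.
    '10001000' -> ['10000000', '1000'], using only string ops."""
    masks = []
    s = binary.lstrip("0")                    # drop any leading zeros
    while s:
        # the leading '1' plus the remaining positions as zeros
        masks.append("1" + "0" * (len(s) - 1))
        s = s[1:].lstrip("0")                 # move past this one bit
    return masks
```

Peeling off exactly one character per round also avoids the problem with consecutive 1s (e.g. `split_masks("11000")` yields both `"10000"` and `"1000"`), and string length is unbounded, so 23+ character inputs are fine.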

1 Solution
Path Finder

The whole query is below. With a span that returns fewer than 10 events it's still quite fast, but going over 20 events it just keeps hanging at 'Finalizing Job'.

Another thing I just noticed goes wrong here: for a binary value such as 11000, the ltrim part fails. However, I think it can be fixed by adding a substr(X,Y,Z).
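The 11000 failure can be reproduced outside Splunk. As a sketch (assuming SPL's ltrim strips a set of characters from the left, as Python's str.lstrip does; none of this code is from the thread):

```python
s = "11000"

# The paired-ltrim approach: both leading 1s are removed in one go,
# so the first mask ('10000') is never produced.
step1 = s.lstrip("1")      # "000" -- strips BOTH leading 1s, not just one
step2 = step1.lstrip("0")  # ""   -- nothing left for the second mask

# A substr-style fix: peel off exactly one leading bit per round.
first_mask = s[0] + "0" * (len(s) - 1)  # "10000"
rest = s[1:].lstrip("0")                # "1000"
```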

I guess if this doesn't work out, I'll have the lookup files changed to an easier-to-use format.

index=abc binary!=0* earliest=-60m
| eval len1=len(binary)
| eval binary = ltrim(binary,"1")
| eval binary = ltrim(binary,"0")
| eval len2=len(binary)
| eval binary = ltrim(binary,"1")
| eval binary = ltrim(binary,"0")
| eval len3=len(binary)
| eval binary = ltrim(binary,"1")
| eval binary = ltrim(binary,"0")
| eval len4=len(binary)
| eval binary = ltrim(binary,"1")
| eval binary = ltrim(binary,"0")
| eval len5=len(binary)
| eval binary = ltrim(binary,"1")
| eval binary = ltrim(binary,"0")
| eval len6=len(binary)
| eval binary = ltrim(binary,"1")
| eval binary = ltrim(binary,"0")
| streamstats count
| dedup count
Super Champion

That mvexpand is growing the number of events exponentially.
You should also try to filter down to the fields you need as early as possible in your search.

Would the following maybe work for you instead?

index=abc binary!=0* earliest=-60m
| fields binary, _time

| eval len1=len(binary)
| eval binary = if(len1 == 0, "0", ltrim(ltrim(binary,"1"),"0"))

| eval len2=len(binary)
| eval binary = if(len2 == 0, "0", ltrim(ltrim(binary,"1"),"0"))

| eval len3=len(binary)
| eval binary = if(len3 == 0, "0", ltrim(ltrim(binary,"1"),"0"))

| eval len4=len(binary)
| eval binary = if(len4 == 0, "0", ltrim(ltrim(binary,"1"),"0"))

| eval len5=len(binary)
| eval binary = if(len5 == 0, "0", ltrim(ltrim(binary,"1"),"0"))

| eval len6=len(binary)
| eval binary = if(len6 == 0, "0", ltrim(ltrim(binary,"1"),"0"))

| streamstats count

| stats by count, _time

Path Finder

That indeed made it a bit faster; I'll leave it at that since it's working now.

I made a request to the people providing the lookup tables to incorporate an extra field with the length of the masks so I can join on that. That saves two-thirds of your suggested query 🙂

Anyway thanks a lot for your help, much appreciated!

Super Champion

I think the problem is that you are trying to work with huge numbers that Splunk doesn't support internally.
Why don't you try working with plain strings? I understand you ultimately just want to be able to use a lookup.

For example, I've written the following for mask1 (you can apply the same logic for 2):

| stats count
| eval binary = "1111111111111111111111111111111111111111111111111111"

| eval binary = ltrim(binary,"1")
| eval binary = ltrim(binary,"0")

Output:
100000000000000000000000000000000000000000000000000

Let me know if that helps.

Path Finder

Good point about leaving it as a string. In my case I don't get a single binary string but a large list that I want to join multiple masks onto. So to prevent the last "| stats" from joining everything together, I did the following:

...
| streamstats count
...
...
...

(I need a total of 6 masks to cover the largest binary string I've found so far.)

This, however, is impossibly slow.

Super Champion

If you post your whole query we might be able to help with the optimisation (if there's any possible one :D)
