Getting Data In

Transform .tsv input, taking each log entry and distributing it across the days in that month

Engager

I'm not sure if it's possible. I know I can limit, and I know I can apply some regex to the input. But has anyone done, or seen where someone took, a cumulative sum and spread it evenly across the 28/30/31 days in a month?

Say you have the value 4123.902992 in a log file. It arrives on the 1st of every month (it's a default spend). Note that you will also be adding smaller daily spend entries, but you have sunk costs, which is the number referenced.

I'm just not sure if Splunk has something like this, versus me breaking it out before I give it to Splunk. That's possible, but before doing so I wanted to see if I could get creative with Splunk and reuse this for future events of this type.
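For concreteness, what I'm after is turning the single monthly value into a per-day figure, e.g. 4123.902992 spread evenly over a 31-day month is roughly 133.03 per day. A throwaway sketch of that arithmetic in SPL (field names are made up):

    |makeresults
    |eval monthly_spend=4123.902992, days_in_month=31
    |eval per_day=round(monthly_spend/days_in_month,2)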


Splunk Employee

Hi @tmblue,

Did you have a chance to check out renjith.nair's answer? If it worked, please resolve this post by approving it! If your problem is still not solved, keep us updated so that someone else can help.


Engager

Yep, sorry, working on it now. Trying to tweak it to see if I can get it to work.


SplunkTrust

@tmblue,
Try:

|makeresults |eval sum=4123.902992
|fields _time,sum
|eval earliest=relative_time(_time,"-1mon@mon")| eval latest=relative_time(_time,"@mon")|eval diff=round((latest-earliest)/86400,0)
|eval days=strftime(mvrange(earliest,latest,"1d"),"%Y-%m-%d")|mvexpand days
|eval perday=sum/diff|table days,perday

The first line should be changed to your own search yielding the event time (_time) and the sum value field. This should give you the distribution across the days of the previous month, i.e. if you are executing the search in Jan, you get the distribution across Dec.

Change earliest and latest to the below (3rd line) if you are looking for the current month, i.e. you are executing the search in Jan and you want the distribution across Jan.

|eval earliest=relative_time(_time,"@mon")| eval latest=relative_time(_time,"+1mon@mon")

Engager

Sorry, still playing with this. As written I get something like 40K results, and it should be just 28/30/31 depending on the month. Still working and tweaking to see if it's something I'm doing.


Engager

I'm not getting it; as written above, I'm getting an error. It's the first value after index=xxx |

| makeresults |eval sum=4123.902992
|fields TimePeriod,Cost
|eval earliest=relative_time(_time,"@mon")| eval latest=relative_time(_time,"+1mon@mon")|eval diff=round((latest-earliest)/86400,0)
|eval days=strftime(mvrange(earliest,latest,"1d"),"%Y-%m-%d")|mvexpand days
|eval perday=sum/diff|table days,perday

(_time would work fine in the fields line as well; _time and the TimePeriod field are identical.)

Error in 'makeresults' command: This command must be the first command of a search.

Was trying to tweak it (I've never used makeresults), but not getting anywhere.


SplunkTrust
SplunkTrust

@tmblue, when you are using index=x, you need to remove makeresults. It was there just to create a dummy row.
So for you it will be
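    index="your index" "select the date to reflect the month you are looking for"
    |stats first(_time) as _time,first(Cost) as Cost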

The above should give you a single row with a _time and Cost value based on the time range you selected.

Once you verify the above, add the rest of the search (without the makeresults line) to it, i.e.

    index="your index" "select the date to reflect the month you are looking for" 
    |stats first(_time) as _time,first(Cost) as Cost
    |eval earliest=relative_time(_time,"@mon")| eval latest=relative_time(_time,"+1mon@mon")|eval diff=round((latest-earliest)/86400,0)
    |eval days=strftime(mvrange(earliest,latest,"1d"),"%Y-%m-%d")|mvexpand days
    |eval perday=Cost/diff|table days,perday

Engager

Oops, okay, testing now. Since makeresults was an actual command, I thought it was required. I did try without it before, and that is when I got the 10-40K results 🙂 I'm testing again... thanks!


Engager

Yes, bingo, thank you! This works great as part of a search. But I was wondering if there is something similar in a transform, so the data gets spread out when it's loaded; I'd run a different process on the 1st of the month vs. the rest of the month to break this out.


SplunkTrust

I am not sure how useful it is to put this in configuration, but you could add this to your props.conf:

[your sourcetype name]
EVAL-PerDayValue = Cost/round((relative_time(_time,"+1mon@mon")-relative_time(_time,"@mon"))/86400,0)

This will create a PerDayValue field in every event that has a Cost field.
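To sanity-check the expression before committing it to props.conf, you could run the same eval in a search first (a sketch; substitute your own index name):

    index="your index" Cost=*
    |eval PerDayValue=Cost/round((relative_time(_time,"+1mon@mon")-relative_time(_time,"@mon"))/86400,0)
    |table _time,Cost,PerDayValue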
