Splunk Search

How to auto-fill missing dates in a time range and fill nulls with the previous value?

srive326
Explorer

Hello everyone,

I need help with a query.

I have a table with the following fields:

_time USERNUMBER WEIGHT
2020-04-02 07:17:12.397 545 245.2400
2020-04-02 07:17:12.400 547 100.0011
2020-04-03 07:15:37.956 545 260.2400
2020-04-06 07:16:41.763 545 245.2400
2020-04-09 07:15:38.864 545 612.5600
2020-04-10 07:15:59.845 545 245.2400
2020-04-10 07:17:12.401 547 100.0011
2020-04-13 07:16:32.169 545 245.0258
2020-04-15 07:15:37.027 545 245.2400
2020-04-15 07:15:37.027 547 120.2400
2020-04-16 07:15:44.469 545 156.2400
2020-04-17 07:15:46.726 545 245.0215
2020-04-20 07:16:45.495 545 142.2400
2020-04-22 07:15:54.629 545 245.0214
2020-04-22 07:15:54.629 547 106.0521
2020-04-23 07:15:43.608 545 965.2850
2020-04-25 07:15:50.403 545 245.2100
2020-04-27 07:16:48.186 545 335.2400
2020-04-28 07:15:52.713 545 245.5596
2020-04-29 07:15:54.472 545 520.2400
2020-05-01 07:15:52.179 545 245.0018

I want to show, for each USERNUMBER, the weight on each day of the month of April. The problem is that the
weight is only updated when there has been a change. How can I auto-fill the missing dates in April, and also
populate the WEIGHT for those missing dates with the WEIGHT from the previous day for that USERNUMBER?
For example, if I'm only looking at USERNUMBER 545, the rows from 04/02 to 04/09 would look like:

2020-04-02 07:17:12.397 545 245.2400
2020-04-03 07:15:37.956 545 260.2400
2020-04-04 00:00:00.000 545 260.2400
2020-04-05 00:00:00.000 545 260.2400
2020-04-06 07:16:41.763 545 245.2400
2020-04-07 00:00:00.000 545 245.2400
2020-04-08 00:00:00.000 545 245.2400
2020-04-09 07:15:38.864 545 612.5600

Sidenote: the hour/time component of the date is not important to me.
I hope this makes sense. Thank you in advance for your help.


to4kawa
Ultra Champion
| makeresults
| eval _raw="_time,USERNUMBER,WEIGHT
2020-04-02 07:17:12.397,545,245.2400
2020-04-02 07:17:12.400,547,100.0011
2020-04-03 07:15:37.956,545,260.2400
2020-04-06 07:16:41.763,545,245.2400
2020-04-09 07:15:38.864,545,612.5600
2020-04-10 07:15:59.845,545,245.2400
2020-04-10 07:17:12.401,547,100.0011
2020-04-13 07:16:32.169,545,245.0258
2020-04-15 07:15:37.027,545,245.2400
2020-04-15 07:15:37.027,547,120.2400
2020-04-16 07:15:44.469,545,156.2400
2020-04-17 07:15:46.726,545,245.0215
2020-04-20 07:16:45.495,545,142.2400
2020-04-22 07:15:54.629,545,245.0214
2020-04-22 07:15:54.629,547,106.0521
2020-04-23 07:15:43.608,545,965.2850
2020-04-25 07:15:50.403,545,245.2100
2020-04-27 07:16:48.186,545,335.2400
2020-04-28 07:15:52.713,545,245.5596
2020-04-29 07:15:54.472,545,520.2400
2020-05-01 07:15:52.179,545,245.0018"
| multikv forceheader=1
| eval _time=strptime(time,"%F %T.%3Q")
| table _time USERNUMBER WEIGHT
| rename COMMENT as "this is sample"
| timechart cont=f span=1d list(WEIGHT) by USERNUMBER
| makecontinuous _time span=1d
| rename COMMENT as "Timerange created. check at this"
| filldown
| rename COMMENT as "null days are filled with the previous value in each user column"

I don't know what you want to display. How about this?


srive326
Explorer

Good morning,

Thank you for your help with this query. The goal in getting the data into that format is that, for each USERNUMBER, I then need to add a conditional statement checking whether the weight on a given day is greater than 500.000, and then count how many USERs had a weight over 500 and how many didn't on each day in April.

I used your code and it worked in filling in the missing dates, and in filling those dates with the previous weights (thank you). However, I have over 80,000 unique USERNUMBERs, which means I should get that many columns (Splunk truncates them to show just 10). I tried transposing the result so I could instead have a table like this:

USERNUMBER 04/02/2020 04/03/2020
545 245.2400 260.2400

but I still got only 10 USERNUMBERs/rows.
Given these details, what would be the best way to display the data so I can add the conditional statements? Thank you again for your help.
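For reference, here is roughly what I was planning to add after the filldown (untested; I'm assuming untable is the right way to get back to one row per user per day before counting):

| untable _time USERNUMBER WEIGHT
| eval over500=if(WEIGHT>500.000, 1, 0)
| stats sum(over500) as users_over_500 sum(eval(1-over500)) as users_at_or_under_500 by _time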


to4kawa
Ultra Champion

This goal was missing from your question, and the method it needs is completely different.


srive326
Explorer

Ok, I will write a new question with all those details. I left them out because I thought I would be able to handle the second part of the query on my own.


richgalloway
SplunkTrust

10 is the default limit of timechart. Try timechart limit=0 ..., but I suspect 80,000 values may be too many to render.
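That is, the first line of the search becomes (assuming the rest of to4kawa's search stays the same):

| timechart limit=0 cont=f span=1d list(WEIGHT) by USERNUMBER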

---
If this reply helps you, Karma would be appreciated.

srive326
Explorer

Hello,

Thank you for your input. I changed the limit to 0 and this helped return more USERNUMBERS.

I modified what to4kawa shared so it now looks like this:

| timechart limit=0 cont=f span=1d list(WEIGHT) by USERNUMBER
| makecontinuous _time span=1d
| rename COMMENT as "Timerange created. check at this"
| filldown
| eval _time=strftime(_time,"%Y-%d-%m %H:%M")
| transpose header_field=_time

This returns the USERNUMBER counts I expected (about 80k), but it only shows the first 5 dates as columns. Is there something else I need to edit to show the other dates?

I did get this error after it finished running: "Events might not be returned in sub-second order due to search memory limits. See search.log for more information. Increase the value of the following limits.conf setting: [search] max_rawsize_perchunk."

Also, if I don't transpose, the search doesn't finish and I get a few different errors.


richgalloway
SplunkTrust

Check out the filldown command.
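Also, transpose only transposes the first 5 rows by default, which would explain seeing only 5 date columns. April has 30 days, so give transpose an explicit row count, for example (untested against your data):

| transpose 30 header_field=_time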
