Splunk Search

simple table to count monthly and yearly

New Member

Hi all,

I have some events like:

date          product   count_soled_today
2019-01-06    bike      15
2019-01-11    bike      5
2019-01-12    scooter   2
2019-03-16    bike      3
2019-03-17    bike      5
2019-04-03    scooter   3

and would like to create the following table

month   product   count_month   count_year
01      bike      20            20
01      scooter   2             2
02      bike      0             20
02      scooter   0             2
03      bike      8             28
04      bike      0             28
04      scooter   3             5
...

to count the number sold:

  • no. of each product sold per month
  • no. of each product sold in the current year

Can someone give me a hint on how to count over the two different time periods (month and year) in one search? I tried chart, subsearch, stats, eventstats, append... so far unfortunately without success.
Many thanks in advance.

1 Solution

Ultra Champion
| makeresults
| eval _raw="date            product        count_soled_today    
 2019-01-06        bike                15
 2019-01-11        bike                 5
 2019-01-12        scooter              2
 2019-03-16        bike                 3
 2019-03-17        bike                 5
 2019-04-03        scooter              3"
| multikv forceheader=1
| table date, product, count_soled_today
| eval date=strptime(date,"%Y-%m-%d")
| fieldformat date=strftime(date,"%Y-%m-%d")
`comment("this is sample data")`
| eval year=strftime(date,"%Y")
| eval month=strftime(date,"%b")
| stats sum(count_soled_today) as count by year, month, product
| eventstats sum(count) as year_total by year,product

Hi, how about this?
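If the output should exactly match the column names from the original question, the search above could be finished off with something like the following. This is just a sketch: it assumes the fields produced by the search above, and note that the %b format gives month names like Jan, while %m in the earlier eval would give 01-style numeric months as shown in the question.

| rename count as count_month, year_total as count_year
| table month, product, count_month, count_year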



New Member

Thank you so much for your quick answer. That looks very promising.

Unfortunately, I made a mistake in the description of my events. In fact, I don't yet have the amount of products sold within the events.

the events look more like this:

date          product
2019-01-06    bike
2019-01-06    bike
2019-01-06    bike
...
2019-01-11    bike
2019-01-11    bike
...
2019-01-12    scooter
2019-01-12    scooter
...

I guess I'll have to calculate the sum I named in my first event description (count_soled_today) first, right? Can you help me with this? Thank you in advance.


Ultra Champion
| makeresults count=2 
| streamstats count 
| eval _time = if (count==2,relative_time(_time,"-2y@d"), relative_time(_time,"@d")) 
| makecontinuous span=1d
| eval counter=random() % 5 + 1
| eval tmp="ababababababababab"
| eval product=substr(tmp,1,counter)
| rex field=product mode=sed "s/b/scooter,/g"
| rex field=product mode=sed "s/a/bike,/g"
| eval product=rtrim(product,",")
| makemv delim="," product
| mvexpand product
| table _time product
`comment("this is sample data")`
| eval month=strftime(_time,"%m/%Y")
| eval day=strftime(_time,"%y/%m/%d")
| eval year=strftime(_time,"%Y")
| stats count by day, month, year, product
| eventstats sum(count) as monthly_count by month product
| eventstats sum(count) as yearly_count by year product

I wish you had said that from the beginning.
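Applied to real events instead of the generated sample data, the same counting pattern could look like the sketch below, where your_search_here is a placeholder for the actual base search and each event is assumed to represent a single sale:

your_search_here
| eval year=strftime(_time,"%Y"), month=strftime(_time,"%m")
| stats count as count_month by year, month, product
| eventstats sum(count_month) as count_year by year, product
| table month, product, count_month, count_year

Note that count_year here is the full-year total per product, as in the search above, not a running total.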


New Member

Many thanks. And again, sorry 🙂
