Reporting room usage over a time range

hgran
Explorer

Hello,

We have some logs generated by video conference systems that we use to create utilization and quality reports. Now we would like to create a report showing room usage by time of day. The log data has the following fields:

Room Name, Meeting Start Time, and Duration

So let's assume that room A3 has a meeting at 4:00 PM for 90 minutes. I would like to create a report that looks like this:

Time   Room A3 Active
15:00  No
15:15  No
15:30  No
15:45  No
16:00  Yes
16:15  Yes
16:30  Yes
16:45  Yes
17:00  Yes
17:15  Yes
17:30  Yes
17:45  No
18:00  No
18:15  No
18:30  No

I know we can do this in Excel, but I would like to move this to Splunk and I can't figure it out. Anyone have any ideas?


jonuwz
Influencer

Tricky. I've had to do this myself, and would love to see a better solution.

The concurrency command only tells you how many events were occurring at the time of an event, so you have to generate your own 'events' (using gentimes) if you want continuous output. Since concurrency has no 'by' clause, we have to create the booking table for each room using 'map', then combine the outputs using 'chart'.

......
| dedup room 
| addinfo
| eval info_min_time=strftime(info_min_time,"%m/%d/%y") 
| eval info_max_time=strftime(info_max_time+86400,"%m/%d/%y") 
| map search="search room=\"$room$\" 
              | append [ | gentimes start=$info_min_time$ end=$info_max_time$ increment=15m
                         | eval _time=starttime 
                         | eval duration=0 
                         | fields _time duration
                       ] 
              | eval duration=duration*60 
              | eval room=\"$room$\" 
              | concurrency duration=duration start=_time 
              | eval used=if(concurrency>1,\"Yes\",\"No\") 
              | stats max(room) as room max(used) as used by _time" 
| eval Time=strftime(_time,"%H:%M") 
| chart limit=0 first(used) as used over Time by room
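For comparison, here is a rough alternative sketch (my own suggestion, not from this thread, and untested) that avoids one map subsearch per room by expanding each meeting into its 15-minute buckets with mvrange and mvexpand, assuming the duration field is in minutes and the room field is called room:

......
| eval start=_time 
| eval offsets=mvrange(0, duration*60, 900) 
| mvexpand offsets 
| eval _time=start+offsets 
| timechart span=15m count by room

Since timechart fills empty buckets with 0 across the search window, a count of 0 plays the role of 'No' and anything greater plays 'Yes', and the whole thing runs as a single search over all rooms.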



hgran
Explorer

Thanks, that works!


hgran
Explorer

Room_Name  Type         Start_time      Duration
K1         CTS-3000     4/2/2012 5:00   60
P1         H.323 HD 3M  4/2/2012 5:00   60
SC1        CTS-1300     4/2/2012 16:01  46

Thanks

Henry


lguinn2
Legend

Great example of the results that you want. Can you also show a few lines of the log file that you are putting into Splunk?
