Another approach is needed. What are you actually trying to achieve (ideally described from the raw events, rather than assuming timechart is the place to start)?
The events contain ifHCInOctets counters for switch ports, from which I calculate the bandwidth.
{"timestamp": "2021-05-01T00:40:02", "device": "switch1", "port_id": "1/0/g2", "port_alias": "LONDON1_Gi1_0_2", "port_type": "network", "ifHCInOctets": "386349938202882"}
By specifying a single switch port (port_id=1/0/g2), I am able to fit its time series to a model, but this approach is not scalable.
index=main sourcetype=_json device=switch1 port_alias=LONDON* port_id="1/0/g2"
| streamstats window=1 global=f current=f last(ifHCInOctets) as last_in by port_id
| eval in_change = last_in - ifHCInOctets
| where in_change>=0
| eval inMbps=in_change*8/1000/1000
| eval c_time=strftime(_time,"%m/%d/%y %H:%M:%S")
| timechart span=15m per_second(inMbps) by port_id
| eval date_minutebin=strftime(_time, "%M")
| eval date_hour=strftime(_time, "%H")
| eval date_wday=strftime(_time, "%A")
| fit DensityFunction "1/0/g2" by "date_minutebin,date_hour,date_wday" into switch1_1_0_g2 threshold=0.05 dist=norm
I would like to fit multiple switch ports (port_id=1/0/g*), each into its own separate model, in one search.
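One route I have been sketching, though I am not sure it is the right MLTK pattern: keep the results in long format instead of pivoting each port into its own timechart column, and let DensityFunction split on port_id as well, so a separate distribution is fitted per port within one model. The field name inMbps_per_sec and the model name switch1_ports below are placeholders I made up.

index=main sourcetype=_json device=switch1 port_alias=LONDON* port_id="1/0/g*"
| streamstats window=1 global=f current=f last(ifHCInOctets) as last_in by port_id
| eval in_change = last_in - ifHCInOctets
| where in_change>=0
| eval inMbps=in_change*8/1000/1000
| timechart span=15m limit=0 useother=f per_second(inMbps) by port_id
| untable _time port_id inMbps_per_sec
| eval date_minutebin=strftime(_time, "%M")
| eval date_hour=strftime(_time, "%H")
| eval date_wday=strftime(_time, "%A")
| fit DensityFunction inMbps_per_sec by "port_id,date_minutebin,date_hour,date_wday" into switch1_ports threshold=0.05 dist=norm

That gives one model artifact rather than one per port, but as I understand DensityFunction each by-group (port + time bucket) still gets its own density. I am unsure whether the group-cardinality limits in mlspl.conf become a problem with many ports.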
Afterwards, I would like to apply each switch port's model to its new respective time series data (using the same approach?).
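If the single split-by model above works, I assume applying it would reuse the same preparation pipeline (counter delta, timechart/untable, date_* fields) over the new time range, with only the last command changing from fit ... into to:

| apply switch1_ports threshold=0.05

which, if I read the MLTK docs correctly, should flag each row with an IsOutlier(inMbps_per_sec) column per port. Again, switch1_ports is just my placeholder model name.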