Is there a bug in the df script that produces the wrong byte size for filesystems greater than 1 TB? I'm running a search similar to this:
index=linux sourcetype=df | dedup host | multikv | search host="xxxxx" AND data4 | chart eval(sum(Size)) as x
The number that gets returned is 2.2.
data4 is a 2.2 TB filesystem, and I was expecting to see the actual byte count of approximately 200000000.
This throws off my numbers when I try to calculate the total storage across my systems.
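
For context, this is the kind of workaround I've been considering, assuming the Size field from the df sourcetype is reported in 1 KB blocks (that assumption may be wrong, which is part of my question):

index=linux sourcetype=df | dedup host | multikv | search host="xxxxx" AND data4
| eval SizeBytes=Size*1024
| chart sum(SizeBytes) as total_bytes

If Size is actually being extracted as a human-readable value like "2.2T" rather than a number of blocks, then the eval above would not work either, so I'd like to understand what the field really contains.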