We use Splunk to index beacons that our application sends in. Many of the fields are optional, and we'd like to calculate the average number of fields we are sent per event. I haven't been able to work this out; this is what I have so far (I've also tried variants of it):
index=Beacon | fieldsummary | stats values(field), count(field), avg(count(field))
Thanks
The highest value in the count column of the fieldsummary results should be the same as the number of events returned by the query (since every event has an index field). As such, the following query should give you the mean number of fields per event:
index=Beacon | fieldsummary | stats sum(count) as total_fields max(count) as total_events | eval avg_fields=total_fields/total_events
You can modify the set of fields being counted by placing | search NOT field=foo between fieldsummary and stats to exclude fields (such as timestamp fields) that are common to all events; fieldsummary puts each field name in the field column of its results, so that is the column to filter on. Just make sure you leave at least one metadata field in place so stats can still work out the total number of events.
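For example, assuming Splunk's automatic date_* fields are the ones you want to leave out (the exact field names will vary with your data, so treat these as placeholders), the full search would look something like:
index=Beacon | fieldsummary | search NOT field=date_* | stats sum(count) as total_fields max(count) as total_events | eval avg_fields=total_fields/total_events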
Thanks, this worked.
Does this work for you?
index=Beacon | eval fieldlist=split(_raw," ") | eval fieldcount=mvcount(fieldlist) | table fieldlist fieldcount | stats avg(fieldcount)
Thanks, it was almost right, but we extract a URL into several different fields and this counted the URL as a single field, whereas we wanted each extracted part counted separately.
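For anyone hitting the same issue, a rough sketch of counting the extracted fields individually (assuming the URL parts are already extracted as separate search-time fields) would be something along these lines; fieldcount is just an illustrative name, the -1 is because the fieldcount counter itself gets picked up by the foreach wildcard, and the earlier caveat about timestamp fields common to all events still applies:
index=Beacon | eval fieldcount=0 | foreach * [ eval fieldcount=fieldcount + if(isnotnull('<<FIELD>>'), 1, 0) ] | eval fieldcount=fieldcount-1 | stats avg(fieldcount) as avg_fields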