I have a search and I would like to normalize a data point so that I can use it effectively in conjunction with other data points to determine performance impact. In particular, I have a search that is essentially a stats count AS hits ... BY requestUri. I need to know min(hits) and max(hits) in order to determine the normalized value, which I imagine would require preprocessing. Is this possible? See https://en.wikipedia.org/wiki/Feature_scaling if you're wondering what I'm trying to accomplish.

Many times this is desirable because the span of datapoint values is too broad to read on a chart. If that is your motivation, have you tried changing your Y-axis to a "log" scale? If you do need the normalized value, you can pre-process with eventstats like this:
... | stats count AS hits ... BY requestUri | eventstats min(hits) AS minHits max(hits) AS maxHits | eval hitsPrime=(hits-minHits)/(maxHits-minHits)
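For anyone wanting to sanity-check the math outside Splunk: the eventstats/eval pipeline above performs standard min-max feature scaling, x' = (x - min) / (max - min), mapping each hits value into [0, 1]. A minimal Python sketch of the same computation (the function name and sample values are illustrative, not from the thread):

```python
def min_max_scale(values):
    """Min-max (feature) scaling: map each value into [0, 1].

    Mirrors: eval hitsPrime=(hits-minHits)/(maxHits-minHits)
    """
    lo, hi = min(values), max(values)
    if hi == lo:
        # All values equal: avoid division by zero.
        # (The SPL eval would return null here instead.)
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

# Hypothetical per-requestUri hit counts:
hits = [3, 10, 7]
print(min_max_scale(hits))  # [0.0, 1.0, 0.5714285714285714]
```

Note the degenerate case: if every requestUri has the same hit count, maxHits-minHits is zero and the SPL eval yields null, so you may want a guard (e.g. an if() in the eval) if that can occur in your data.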
This is exactly what I needed. Thank you!
