Monitoring Splunk

Different number of events per UF with different searches

sscholz
Explorer

Hello guys,

I'm creating a dashboard which shows some statistics about the UFs in our environment.

While looking for a good way to count the events delivered per index, I noticed something I can't explain at the moment. Hopefully you can shed some light on it. 😉

Here is my understanding:

# The number of events indexed on the indexer for this forwarder

| tstats count as eventcount where index=* OR index=_* host=APP01 earliest=-60m@m latest=now by index, sourcetype
| stats sum(eventcount) as eventcount by index

index       eventcount
_internal   11608
win         1337
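
(Just as a sanity check, and only a sketch using the same host and time range as above: the tstats count can also be cross-checked against a plain event search, which is slower but counts the same indexed events.)

(index=* OR index=_*) host=APP01 earliest=-60m@m latest=now
| stats count as eventcount by index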

 

# The number of events forwarded by the forwarder

index=_internal component=Metrics host=APP01 series=* NOT series IN (main) group=per_index_thruput
| stats sum(ev) AS eventcount by series

series      eventcount
_internal   1243
win         2876
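
(For context, and just a sketch rather than part of the dashboard: the same per_index_thruput metrics also carry a kb field, so event count and volume per series can be read in one pass.)

index=_internal component=Metrics host=APP01 series=* NOT series IN (main) group=per_index_thruput
| stats sum(ev) AS eventcount sum(kb) AS kb by series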

 

But the two searches deliver different values for the same time range (60 min).

Does anyone have an idea why this is happening?

Thanks.

BR, Tom

1 Solution

codebuilder
Influencer

In your first query you are looking at all events, for all internal and non-internal indexes.

In your second query you are looking only at the _internal index, further restricted to the Metrics component and the per_index_thruput group.

That is why you are seeing different results. Essentially, you are not comparing apples to apples, so to speak.
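
A rough sketch of a more apples-to-apples check (the index name win and host APP01 are taken from your example; the rest is just an illustration): restrict both searches to the same index, host, and time range and compare the two counts. Even then the numbers may not line up exactly, since metrics.log is a periodic summary and, by default, only logs the busiest series in each interval.

| tstats count AS indexed_events where index=win host=APP01 earliest=-60m@m latest=now

index=_internal component=Metrics host=APP01 group=per_index_thruput series=win earliest=-60m@m latest=now
| stats sum(ev) AS forwarded_events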

----
An upvote would be appreciated and Accept Solution if it helps!


sscholz
Explorer

Thank you for the clarification.

It seems that I had apples in front of my eyes. 😞

...

 

Greetings.
