Backend calls tracking query

saireddy
Loves-to-Learn Lots

Hi All,

I am hitting an API service that makes 7 to 8 backend calls, and my current query gives me the response times for all of those backend calls.

Problem statement

My API makes 7 to 8 backend calls, and the same backends are also called by other APIs.

On a single hit, I see that a message ID is generated and is the same across all the backend calls. But under heavy load, say 3000 transactions in one hour, how do I construct a query that takes the message ID from the parent call and follows it through the backend calls?


rnowitzki
Builder

Hi @saireddy,

Is the tracking id the same in all backends, or would you have to resolve it first with another ID?

If you know the tracking ID and it's unique across all backends, you could just work with transaction or even stats, e.g. something like the sketch below.
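A minimal sketch, assuming the shared field is called message_id and the events sit in a hypothetical index=api_logs (adjust the names to your data):

index=api_logs message_id=*
| transaction message_id
| table message_id duration eventcount

Or, usually much cheaper at higher volumes, the stats version:

index=api_logs message_id=*
| stats min(_time) AS start_time max(_time) AS end_time values(backend) AS backends count BY message_id
| eval duration_s=end_time-start_time

transaction computes duration and eventcount for you, but stats scales far better, so prefer it unless you need transaction-specific features.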

Can you share some example events? Maybe change some wording/IDs/IPs to anonymize them.

BR
Ralph

--
Karma and/or Solution tagging appreciated.

saireddy
Loves-to-Learn Lots

Hi @rnowitzki,

Say the parent API generates a message ID (e.g. 88e59799-986f-4099-838e-2128ca333) for each API hit; every hit gets its own unique message ID from the parent, which gives me the count of parent transactions. I am tracking the parent counts by API name/URL, but the backends are called by other APIs as well, so I want only the backend calls made by my parent API.

If I track a single message ID, like the example I shared, I can see all the backend calls. But with a load of 2000 or 3000 transactions, how do I filter down to just my parent's calls?
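One hedged sketch of the subsearch approach, using hypothetical field names (api_name for the service name, message_id as the shared correlation field, backend_name and response_time on the backend events): collect the message IDs generated by the parent API in a subsearch, then filter the backend events to those IDs.

index=api_logs sourcetype=backend_calls
    [ search index=api_logs api_name="MyParentAPI"
      | fields message_id ]
| stats avg(response_time) AS avg_response count BY backend_name

The subsearch expands to a message_id=... OR message_id=... filter on the outer search. Subsearches are capped by default at 10,000 results and 60 seconds of runtime, which should be fine for ~3000 transactions an hour; at much higher volumes, a single stats correlation over both the parent and backend events BY message_id avoids the limit.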
