Activity Feed
- Posted Re: Using a summary index and collect command to pull large data by month on Splunk Search. 08-10-2023 06:29 AM
- Posted How to use a summary index and collect command to pull large data by month? on Splunk Search. 08-09-2023 12:29 PM
- Posted Re: How to remove null values then add fields? on Splunk Search. 06-17-2023 01:54 PM
- Karma Re: How to remove null values then add fields? for yuanliu. 06-17-2023 01:54 PM
- Posted Re: How to remove null values then add fields? on Splunk Search. 06-17-2023 07:14 AM
- Posted How to remove null values then add fields? on Splunk Search. 06-08-2023 06:45 AM
- Posted Re: mvindex with a conditional on Splunk Search. 02-08-2023 08:53 AM
- Posted Re: mvindex with a conditional on Splunk Search. 02-08-2023 05:50 AM
- Posted Re: mvindex with a conditional on Splunk Search. 02-08-2023 05:49 AM
- Posted How do I write this search with a mvindex with a conditional? on Splunk Search. 02-07-2023 01:18 PM
- Posted Re: subtracting two timestamps per logEventType on Splunk Search. 01-24-2023 09:47 AM
- Posted Re: subtracting two timestamps per logEventType on Splunk Search. 01-23-2023 05:33 AM
- Posted Re: subtracting two timestamps per logEventType on Splunk Search. 01-23-2023 05:30 AM
- Posted How to subtract two timestamps per logEventType? on Splunk Search. 01-22-2023 07:17 PM
- Posted Re: How to achieve field extraction txt delimiting by space? on Splunk Search. 12-23-2022 11:53 AM
- Posted Re: How to achieve field extraction txt delimiting by space? on Splunk Search. 12-22-2022 10:59 AM
- Posted How to achieve field extraction txt delimiting by space? on Splunk Search. 12-22-2022 08:58 AM
- Karma Re: Split Group by Values for yuanliu. 11-06-2022 02:58 PM
- Posted Re: Split Group by Values on Splunk Search. 11-06-2022 02:48 PM
- Posted Re: Split Group by Values on Splunk Search. 11-06-2022 01:28 PM
08-10-2023
06:29 AM
Thank you. Apologies, I do not want it by apiName; I can remove that line. I want stats for each API call. Yes, there is only a single request (response_time/processing_time) per session id. Currently the query is only looking at 2xx responses, so I will definitely consider another query for errors. Do you know if the mechanics of the collect command and summary index are properly set up?
08-09-2023
12:29 PM
Hi all. I’m kind of new to Splunk. I have data by day - this is the response time for each API call by day. I want to run that automatically every day, collecting it into a summary index. (I cannot run this by month since it is too much data.) Then, every month, I want to use the summary index to calculate the 95th percentile, average, and standard deviation of all the response times by each API call. The summary index will allow me to do that faster, although I am not sure of the mechanics of how to use it.
For instance, do I need to readd my filters for the monthly pull?
Does the below so far look correct to pull in all information (events)?
So, I want to understand if I am doing this correctly. I have the below SPL by day:
index=virt [other search parameters]
| rename msg.sessionId as sessionId
| rename msg.apiName as apiName
| rename msg.processingTime as processingTime
| rename msg.responseCode as responseCode
| eval session_id=coalesce(a_session_id, sessionId)
| fields …
| stats values(a_api_responsetime) as responsetime, values(processingTime) as BackRT by session_id
| eval PlatformProcessingTime=(responsetime - BackRT)
| where PlatformProcessingTime>0
| collect index=virt_summary
Then I have the below SPL by month:
index=virt_summary | bucket _time span=1mon | stats count as Events, avg(PlatformProcessingTime), stdev(PlatformProcessingTime), perc95(PlatformProcessingTime) by _time
Any assistance is much appreciated! Let me know if you need more clarification. The results are what I have attached, and it looks like it is not working properly. I tested the results by day.
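One mechanical detail that may explain the odd results (this is an assumption on my part, not something confirmed in the thread): `stats ... by session_id` drops `_time`, so the events that `collect` writes into `virt_summary` do not carry the original event time, and the monthly `bucket _time span=1mon` then bins on whatever time `collect` assigned. A minimal sketch of a fix is to carry `_time` through the stats:

```
index=virt [other search parameters]
| rename msg.sessionId as sessionId, msg.processingTime as processingTime
| eval session_id=coalesce(a_session_id, sessionId)
| stats earliest(_time) as _time, values(a_api_responsetime) as responsetime, values(processingTime) as BackRT by session_id
| eval PlatformProcessingTime=(responsetime - BackRT)
| where PlatformProcessingTime>0
| collect index=virt_summary
```

With `_time` preserved per session, the monthly search can bucket and aggregate as written. The daily filters also should not need to be re-applied in the monthly pull, since only already-filtered results were collected.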
Labels:
- stats
06-17-2023
01:54 PM
Ah, ok, I see. I missed that part. I was unfamiliar with eventstats. Thank you very much. That worked!!
06-17-2023
07:14 AM
Thank you Taruchit and yuanliu!! I will take those into consideration for removing null values. It seems combining the two logs with eval coalesce as well as removing null values caused issues for the query, so for now I have decided to ignore that and will create two separate queries later. I have decided to go a different, simpler route this time. I now have two processing times/fields to work with: processingTime and a_api_responsetime. The math to get the PlatformProcessingTime is PlatformProcessingTime = a_api_responsetime - processingTime. My largest issue that I cannot seem to solve is calculating the processing time by session_id (or one API call), then taking the 95th percentile of the PlatformProcessingTime by _time. The PlatformProcessingTime has to be calculated by session_id. But then, how do I display in the Splunk stats the 95th percentile of PlatformProcessingTime by time? Any assistance is appreciated please! What I have so far:
index=* (sourcetype="*" OR sourcetype="*" OR sourcetype="*") ("Response" OR "IS2JS")
| eval session_id= coalesce(a_session_id, sessionId)
| bucket _time span=1h
| fields processingTime, apiName, a_log_type, a_api_responsetime, a_api_name, responsetime, IS2JSRT, session_id, a_session_id, sessionId, PlatformProcessingTime
| stats max(eval(a_api_responsetime)) as responsetime, max(eval(processingTime)) as IS2JSRT by session_id
| eval PlatformProcessingTime = (responsetime - IS2JSRT)
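A sketch of one way to get the hourly 95th percentile (this assumes one PlatformProcessingTime per session, and bins each session at its event hour):

```
index=* (sourcetype="*" OR sourcetype="*" OR sourcetype="*") ("Response" OR "IS2JS")
| eval session_id=coalesce(a_session_id, sessionId)
| bucket _time span=1h
| stats max(a_api_responsetime) as responsetime, max(processingTime) as IS2JSRT by _time, session_id
| eval PlatformProcessingTime=(responsetime - IS2JSRT)
| stats perc95(PlatformProcessingTime) as p95_PlatformProcessingTime by _time
```

Adding `_time` to the first `by` clause keeps the hour bucket attached to each session, so the second `stats` can roll the per-session values up by hour.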
06-08-2023
06:45 AM
Hi all, would love help with this one.
I currently have a query where I have 4 different processing times by sessionId. I want the ability to remove/delete any sessionId from the results that has a blank/null value. If any one of the four processing times has a blank or null value, remove that sessionId from the stats.
After that, I would like the ability to add those four processing times into one processing time by _time and take the perc95.
Any assistance is appreciated. Let me know if more clarification is needed. Thank you!!
index= [...]
| bucket _time span=1h
| eval apiIdentifier=coalesce('msg.apiIdentifier', apiIdentifier)
| eval apiName=coalesce('msg.apiName', apiName)
| eval apiVersion=coalesce('msg.apiVersion', apiVersion)
| eval clientRequestId=coalesce('msg.clientRequestId', clientRequestId)
| eval companyId=coalesce('msg.companyId', companyId)
| eval contentType=coalesce('msg.contentType', contentType)
| eval datacenter=coalesce('msg.datacenter', datacenter)
| eval entityId=coalesce('msg.entityId', entityId)
| eval logType=coalesce('msg.logType', logType)
| eval processingTime=coalesce('msg.processingTime', processingTime)
| eval responseCode=coalesce('msg.responseCode', responseCode)
| eval serverId=coalesce('msg.serverId', serverId)
| eval sessionId=coalesce('msg.sessionId', sessionId)
| eval timestamp=coalesce('msg.timestamp', timestamp)
| eval totalResponseTime=coalesce('msg.totalResponseTime', totalResponseTime)
| eval session_id=coalesce(a_session_id, sessionId)
| eval AM2JSRT=if(a_log_type=="Response" AND isnum(a_req_process_time), a_req_process_time, 0), JS2ISRT=if(logType=="JS2IS", processingTime, 0), JS2AMRT=if(logType=="JS2AM", processingTime, 0), AM2DPRT=if(a_log_type=="Response" AND isnum(a_res_process_time), a_res_process_time, 0)
| stats sum(AM2JSRT) as AM2JSRespTime, sum(JS2ISRT) as JS2ISRespTime, sum(JS2AMRT) as JS2AMRespTime, sum(AM2DPRT) as AM2DPRespTime by sessionId
| eval gw_processingTime=(AM2JSRespTime+JS2ISRespTime+JS2AMRespTime+AM2DPRespTime)
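Since the query above maps missing legs to 0, one hedged way to drop any sessionId that is missing one of the four times is to filter after the `stats` (this assumes real processing times are always greater than zero), then bin and take the percentile. A sketch of the tail of the search, keeping the eval pipeline above unchanged:

```
...
| stats sum(AM2JSRT) as AM2JSRespTime, sum(JS2ISRT) as JS2ISRespTime, sum(JS2AMRT) as JS2AMRespTime, sum(AM2DPRT) as AM2DPRespTime, min(_time) as _time by sessionId
| where AM2JSRespTime>0 AND JS2ISRespTime>0 AND JS2AMRespTime>0 AND AM2DPRespTime>0
| eval gw_processingTime=AM2JSRespTime+JS2ISRespTime+JS2AMRespTime+AM2DPRespTime
| bucket _time span=1h
| stats perc95(gw_processingTime) as p95_gw_processingTime by _time
```

Keeping `min(_time) as _time` in the first `stats` preserves a timestamp per session so the hourly bucket and final percentile can still group by `_time`.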
02-08-2023
08:53 AM
Not a problem. It looks like I may have achieved it by modifying your solution. I had issues in the past with regex, so I was hoping to use this. I am not sure what null() does in the "status_index" evals below, but it seems to work by not including frivolous information (when the API does not match, the index is null, so mvindex returns nothing and coalesce skips it). | eval temp=split(_raw," ")
| eval status_index1 = if(match(API,"/services/protected/v1/developers"), 6, null())
| eval status_index2 = if(match(API,"/services/public/v1/signup"), 6, null())
| eval status_index3 = if(match(API,"/wcaapi/userReg/wgt/apps"), 10, null())
| eval http_status1 = mvindex(temp, status_index1)
| eval http_status2 = mvindex(temp, status_index2)
| eval http_status3 = mvindex(temp, status_index3)
| eval http_status = coalesce(http_status1, http_status2, http_status3)
| search (
"/services/public/v1/signup" OR
"/services/protected/v1/developers" OR
"/services/public/v1/captcha" OR
"/wcaapi/userReg/wgt/apps"
)
| search NOT "Mozilla"
| eval API = if(match(API,"/services/public/v1/signup"), "DEVP1: Signup", API)
| eval API = if(match(API,"/services/protected/v1/developers"), "DEVP1: Developers", API)
| eval API = if(match(API,"/services/public/v1/captcha"), "DEVP1: Captcha", API)
| eval API = if(match(API,"/wcaapi/userReg/wgt/apps"), "User Registration Enhanced Login", API)
| fields API, http_status, wf_env
| convert timeformat="%Y-%m" ctime(_time) AS Date
| stats count(http_status) as Total_Calls, count(eval(http_status>=500)) as Server_Error by Date, API, wf_env
| eval SuccessRate=round((1-(Server_Error/Total_Calls)) * 100,2)
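As an aside, the three `status_index` evals plus the `coalesce` can be collapsed into a single `case()`; this is a stylistic sketch and the behavior should be identical, since `case()` returns NULL when no clause matches:

```
| eval temp=split(_raw," ")
| eval status_index=case(
    match(API,"/services/protected/v1/developers"), 6,
    match(API,"/services/public/v1/signup"), 6,
    match(API,"/wcaapi/userReg/wgt/apps"), 10)
| eval http_status=mvindex(temp, status_index)
```

One `case()` also makes it easier to add the next API path later: each new mapping is a single match/index pair.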
02-08-2023
05:50 AM
What I am attempting to do is below:
| eval temp=split(_raw," ")
| eval API=mvindex(temp,4,8)
```| eval http_status=mvindex(temp,6,10)```
| eval status_index = if(match(API,"/services/protected/v1/developers"), 4, 6)
| eval status_index1 = if(match(API,"/services/public/v1/signup"), 4, 6)
| eval status_index2 = if(match(API,"/wcaapi/userReg/wgt/apps"), 8, 10)
| eval http_status=mvindex(temp, status_index)
| search (
"/services/public/v1/signup" OR
"/services/protected/v1/developers" OR
"/services/public/v1/captcha" OR
"/wcaapi/userReg/wgt/apps"
)
| eval API = if(match(API,"/services/public/v1/signup"), "DEVP1: Signup", API)
| eval API = if(match(API,"/services/protected/v1/developers"), "DEVP1: Developers", API)
| eval API = if(match(API,"/services/public/v1/captcha"), "DEVP1: Captcha", API)
| eval API = if(match(API,"/wcaapi/userReg/wgt/apps"), "User Registration Enhanced Login", API)
02-08-2023
05:49 AM
Thank you!! However, if I wanted to make more than one "status_index" for each API and then combine all status_indexes into one field called "http_status", how would I do that?
02-07-2023
01:18 PM
Hello,
I have the below SPL with the two mvindex functions.
mvindex position 6 in the array is supposed to apply HTTP statuses for /developers.
mvindex position 10 in the array is supposed to apply HTTP statuses for /apps.
Currently positions 6 and 10 are crossing events, applying to both APIs. Is there any way I can have each mvindex apply to only one API?
(index=wf_pvsi_virt OR index=wf_pvsi_tmps) (sourcetype="wf:wca:access:txt" OR sourcetype="wf:devp1:access:txt") wf_env=PROD
| eval temp=split(_raw," ")
| eval API=mvindex(temp,4,8)
| eval http_status=mvindex(temp,6,10)
| search (
"/services/protected/v1/developers" OR
"/wcaapi/userReg/wgt/apps"
)
| search NOT "Mozilla"
| eval API = if(match(API,"/services/protected/v1/developers"), "DEVP1: Developers", API)
| eval API = if(match(API,"/wcaapi/userReg/wgt/apps"), "User Registration Enhanced Login", API)
Labels:
- eval
01-24-2023
09:47 AM
Awesome! Excellent insights!! This solution worked out great. I will take a look at failures as well. Thank you very much for this!!
01-23-2023
05:33 AM
As a follow-up, each "transaction" or "call" has one RequestID, and each RequestID has two timestamps, one Request and one Response. Something like the below? Any assistance is appreciated.

Date | ClientPathURI | Number of calls | 95th percentile of Duration |
---|---|---|---|
01-23-2023
05:30 AM
Thank you, that is a huge help. Question, if I had multiple calls, how do I get the SPL to subtract timestamp by RequestID? I don't need the RequestID in the stats, but want the SPL to capture the difference in timestamps per call. And then take the 95th percentile of that call per day?
01-22-2023
07:17 PM
Hello, apologies if this was stated previously. I have multiple calls - each RequestID with a RequestReceive and a ResponseTransmit. I am trying to find the difference between the two timestamps below (the ResponseTransmit timestamp minus the RequestReceive timestamp), then put that into a stats command grouped by clientPathURI along with the difference between the timestamps.
Any assistance is much appreciated!
Event 1:
{ RequestID: b74fab20-9a7b-11ed-bd70-c503548afa99, clientPathURI: signup, level: Info, logEventType: ResponseTransmit, timestamp: 2023-01-22T12:43:57.547-05:00 }
Event 2:
{ RequestID: b74fab20-9a7b-11ed-bd70-c503548afa99, clientPathURI: signup, level: Info, logEventType: RequestReceive, timestamp: 2023-01-22T12:43:57.496-05:00 }
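A minimal sketch of the pairing, grouping by RequestID so each RequestReceive lines up with its ResponseTransmit (the `strptime` format string is an assumption based on the sample timestamps and may need adjusting):

```
index=... (logEventType="RequestReceive" OR logEventType="ResponseTransmit")
| eval t=strptime(timestamp, "%Y-%m-%dT%H:%M:%S.%Q%:z")
| stats earliest(t) as start, latest(t) as end, values(clientPathURI) as clientPathURI, min(_time) as _time by RequestID
| eval Duration=end-start
| bucket _time span=1d
| stats count as "Number of calls", perc95(Duration) as "95th percentile of Duration" by _time, clientPathURI
```

The first `stats` collapses each call to a single row with its request and response times; the second produces one row per day per clientPathURI.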
12-23-2022
11:53 AM
That definitely works for the time being. Thank you very much!
12-22-2022
10:59 AM
Thank you. I can look into that. Is there a short-term solution I can do in the interim?
12-22-2022
08:58 AM
Hello,
I am trying to extract the 201 text highlighted in red below as one separate field from two separate events. How may I do this? I attempted the field extraction feature in Splunk but had no luck. Any assistance is appreciated!
Event 1:
106.51.86.25 [22/Dec/2022:07:48:10 -0500] POST /services/public/v1/signup HTTP/1.1 201 5 539
Event 2:
23.197.194.86 - - [22/Dec/2022:07:48:09 -0500] "POST /services/public/v1/signup HTTP/1.1" 201 -
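Since the two log formats differ (one quotes the request line, one does not), a single `rex` keyed on the `HTTP/1.1` token can pull the status from both. This is a sketch; it assumes the status is always the first 3-digit number after the protocol token:

```
| rex field=_raw "HTTP/1\.1\"?\s+(?<http_status>\d{3})"
```

For Event 1 this matches `HTTP/1.1 201`, and for Event 2 it matches `HTTP/1.1" 201` (the optional `\"?` consumes the closing quote), so `http_status` becomes 201 in both cases.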
Labels:
- field extraction
- regex
- rex
11-06-2022
02:48 PM
Apologies! You are correct. I had a typo. This works perfectly! Thank you very much yuanliu!!
11-06-2022
12:16 PM
Hello, I am looking to separate by date and API_Name. I am looking for something like the below. I need a count and 95thPercentileRespTime(ms) specifically for accountstatements-v1 and a count and 95thPercentileRespTime(ms) specifically for Realtime_Image_Access_Service_V2, organized by date.

date | count | 95thPercentileRespTime(ms) | API_Name |
---|---|---|---|
2022-11-05 | x | x | accountstatements-v1 |
2022-11-06 | x | x | Realtime_Image_Access_Service_V2 |
2022-11-06 | x | x | accountstatements-v1 |
11-06-2022
11:38 AM
Hello, I am very new to Splunk. I am wondering how to split these two values into separate rows. The "API_Name" values are grouped but I need them separated by date. Any assistance is appreciated! SPL:
index=...
| fields source, timestamp, a_timestamp, transaction_id, a_session_id, a_api_name, api_name, API_ID
| convert timeformat="%Y-%m-%d" ctime(_time) AS date
| eval sessionID=coalesce(a_session_id, transaction_id)
| stats values(date) as date dc(source) as cnt values(timestamp) as start_time values(a_timestamp) as end_time values(api_name) as API_Name by sessionID | where cnt>1
| eval start=strptime(start_time, "%F %T.%Q")
| eval end=strptime(end_time, "%FT%T.%Q")
| eval "duration(ms)"=abs((end-start)*1000)
| stats count, perc95("duration(ms)") as "95thPercentileRespTime(ms)", values(API_Name) as API_Name by date
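Splitting the grouped rows mostly comes down to moving `API_Name` from a `values()` into the final `by` clause. A sketch of the tail of the search, with field names containing parentheses quoted so eval and stats do not parse them as function calls (an assumption about what caused the earlier trouble):

```
| eval "duration(ms)"=abs((end-start)*1000)
| stats count, perc95("duration(ms)") as "95thPercentileRespTime(ms)" by date, API_Name
```

With `by date, API_Name`, each date/API combination becomes its own row, which matches the desired table above.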
10-17-2022
12:47 PM
1 Karma
Actually, I figured it out. Thank you very much!!
10-17-2022
07:59 AM
This looks great! One thing to note: as another option, is there any way I can order stats by a bucket of time (e.g. "| bucket timestamp span=1h@h"), taking the perc95 per bucket? Thank you!!
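For the bucketed option asked about here, a short sketch (the `duration` field name is illustrative and assumes a numeric duration computed earlier in the search):

```
| bucket _time span=1h@h
| stats perc95(duration) as p95_duration by _time
```

`bucket` (an alias of `bin`) floors each event's `_time` to the hour boundary, so the `stats ... by _time` produces one p95 row per hour.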
10-16-2022
04:29 PM
I have two events where, in order to get a response time, I need to subtract the two timestamps. However, this needs to be grouped by "a_session_id" / "transaction_id". The two events I need are circled in red in the attached screenshot; I need those two out of the three events, and every "a_session_id" has these three logs. source="/apps/logs/event-aggregator/gateway_aggregator_events.log" is always after source="/logs/apigee/edge-message-processor/messagelogging/gateway-prod/production/Common-Log-V1/14/log_message/gateway.json"
Please let me know if you need more information. Such as snippets on the SPL. Any assistance is much appreciated!
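Without the SPL snippets, a hedged sketch of the subtraction would be the following; it assumes that once the unwanted third event is filtered out, the two circled events are simply the earliest and latest per session:

```
(source="/apps/logs/event-aggregator/gateway_aggregator_events.log" OR source="/logs/apigee/edge-message-processor/messagelogging/gateway-prod/production/Common-Log-V1/14/log_message/gateway.json")
| eval sessionID=coalesce(a_session_id, transaction_id)
| stats earliest(_time) as start, latest(_time) as end by sessionID
| eval response_time=end-start
```

Since the aggregator event is stated to always come after the gateway.json event, `latest - earliest` per sessionID gives the elapsed time between the two.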
Labels:
- eval
- field extraction
- stats