Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


I'm trying to get a percentage of a field, based on a condition (filtered by search), by another field, e.g. the percentage of 404 errors by application. So I need to get the total number of requests for each application, filter to keep only 404 errors, then count by application. At least that's the logic I used.

<unrelated part to collect proper events>
| eventstats count as total by applicationId
| search error=404
| stats count as error_404 by applicationId
| eval errorRate=((error_404/total)*100)."%"
| table applicationId, errorRate

This returns a list of applications, but no values for errorRate. Individually, I'm able to get a result for this:

| stats count as total by applicationId

And also for this:

| search error=404 | stats count as error_404 by applicationId

But something about having them together in the flow I have doesn't work. I also tried the following, which didn't work either. In this instance I get values for applicationId and total, so I guess there's something wrong with how I'm getting the error_404 values.

| stats count as total by applicationId
| appendcols [search error=404 | stats count as error_404 by applicationId]
| eval errorRate=((error_404/total)*100)."%"
| table applicationId, error_404, total, errorRate
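A likely cause is that the second `stats` keeps only its own output fields, so the `total` field created by `eventstats` is discarded before the `eval` runs. One single-pass sketch that sidesteps this, reusing the field names from the question (adjust if `error` is not a plain numeric field in your data):

```
<unrelated part to collect proper events>
| stats count as total, count(eval(error==404)) as error_404 by applicationId
| eval errorRate=round((error_404/total)*100, 2)."%"
| table applicationId, error_404, total, errorRate
```

Because both counts are computed in the same `stats` call, `total` and `error_404` land on the same row per applicationId and the division has both operands available.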
Hi, if you make a curl request to Splunk, the client in web_access.log is 127.0.0.1 and the user is '-'. Can we somehow correct the client field to know who actually made the request?
@ITWhisperer, I tried but no luck. It is displaying the count but not displaying the stats.
When field names have special characters in them, they often need single quotes around them (double quotes if they are on the left of the assignment). Try this:

| eval "BackendResponse.content.reasonCode OR ConsumerResponse.content.reasonCode" = coalesce('BackendResponse.content.reasonCode', 'ConsumerResponse.content.reasonCode')
| stats count by 'BackendResponse.content.reasonCode OR ConsumerResponse.content.reasonCode' StatusCode
Hi, I am unable to find the upload asset option inside Edit properties in manage app. Even though I have the admin role, I am unable to upload an asset. Does it require any capabilities to upload an asset to Splunk Cloud?
Hi, I am uploading a .tgz file with a JS script, PNG, and CSS inside the /appserver/static folder of my app. After uploading and installing the app in Splunk Cloud, I am unable to use the script. Any ideas on this?
Hi @inventsekar. 1. Yes. 2. Where do I find $SPLUNK_HOME\var\log\splunk\first_install.log? 3. Windows 11, and the Splunk version is 9.2.2.
@yuanliu this is the query which I am using to filter the data:

index="apigee" (ProxyPath="/xyz" OR ProxyPath="/abc") AND StatusCode=200
| eval "BackendResponse.content.reasonCode OR ConsumerResponse.content.reasonCode" = coalesce(BackendResponse.content.reasonCode, ConsumerResponse.content.reasonCode)
| stats count by "BackendResponse.content.reasonCode OR ConsumerResponse.content.reasonCode" StatusCode

It is showing the event count, but it is not generating the results. Highlighted the same.
This is the query which I am using to filter the data:

index="apigee" (ProxyPath="/xyz" OR ProxyPath="/abc") AND StatusCode=200
| eval "BackendResponse.content.reasonCode OR ConsumerResponse.content.reasonCode" = coalesce(BackendResponse.content.reasonCode, ConsumerResponse.content.reasonCode)
| stats count by "BackendResponse.content.reasonCode OR ConsumerResponse.content.reasonCode" StatusCode
It would help to know what curl command you tried and what error it returned. AIUI, alerts must be deleted individually.  There is no method in the UI for selecting multiple alerts for deletion.
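For reference, deleting a single alert via REST targets the saved/searches endpoint, since alerts are saved searches. A sketch of what such a call typically looks like; the host, credentials, owner, app, and alert name below are all placeholders:

```
curl -k -u admin:changeme --request DELETE \
  "https://localhost:8089/servicesNS/admin/search/saved/searches/My%20Alert"
```

Note the alert name must be URL-encoded, and the owner/app path segments must match the namespace the alert was saved in.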
Hi @darshm, if you're sure that there's only one date and time in your events, you could let Splunk choose the timestamp, but my hint is the same as @ITWhisperer's: different formats should have different sourcetypes, possibly with similar names (e.g. for Fortinet there are fortigate_events, fortigate_logs, fortigate_utm, etc.). Ciao. Giuseppe
The short answer is that the different log formats should be in different sourcetypes.
Hi folks, I have a use case where I have different types of events in a single sourcetype, and I want to apply different timestamp extractions to each. I am using TIME_PREFIX and MAX_TIMESTAMP_LOOKAHEAD to extract the timestamp from event #1; however, the same rules won't work for event #2. Is there a way to extract the timestamp values from both events in a single sourcetype?

Event #1 (timestamp should be extracted as Oct  9 23:57:37.887):
Oct 10 05:27:48 192.168.100.1 593155: *Oct  9 23:57:37.887: blah blah blah

Event #2 (timestamp should be extracted as Feb 13 11:27:46):
Feb 13 11:27:46 100.80.8.22 %abc-INFO-000: blah blah blah

TIME_PREFIX = \s[^\s]+\s\d{1,3}.\d{1,3}.\d{1,3}.\d{1,3}\s[^\s]+:\s|\s[^\s]+\s\d{1,3}.\d{1,3}.\d{1,3}.\d{1,3}\s
MAX_TIMESTAMP_LOOKAHEAD = 30
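One common way to give each format its own timestamp rules is an index-time sourcetype override. A minimal sketch, assuming the first format can be recognized by its *Mon DD HH:MM:SS.mmm marker; the stanza and sourcetype names here are hypothetical:

```
# props.conf
[cisco_mixed]
TRANSFORMS-set_st = route_ios_format

# transforms.conf
[route_ios_format]
REGEX = \*\w{3}\s+\d{1,2}\s\d{2}:\d{2}:\d{2}\.\d{3}
DEST_KEY = MetaData:Sourcetype
FORMAT = sourcetype::cisco_ios
```

Events matching the REGEX are rewritten to the cisco_ios sourcetype, and each resulting sourcetype can then carry its own TIME_PREFIX and MAX_TIMESTAMP_LOOKAHEAD settings.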
Hi @jvamplew, I'm not sure, but it should run:

<your_search>
| bin span=1s _time
| stats avg(host_usage) by host useother=true
| addtotals
| timechart span=1s avg(host_usage) by host limit=7 useother=true

Ciao. Giuseppe
Hi @baiden... Good questions will get better answers! 1) The user got admin rights, is that correct? 2) Any details in $SPLUNK_HOME\var\log\splunk\first_install.log? 3) The Splunk version and Windows OS version, please.
Below is my two ROW event- message: [{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00253","TOTAL":0,"PROCESSED":0,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":1,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":5},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00314","TOTAL":0,"PROCESSED":0,"REMAINING":0,"ERROR":6,"FAILED":6,"SKIPPED":0,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":5},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00314","TOTAL":0,"PROCESSED":7295,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":0,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":5},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00052Med","TOTAL":0,"PROCESSED":273,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":0,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":5},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00011H","TOTAL":0,"PROCESSED":23,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":0,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":5},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00303","TOTAL":0,"PROCESSED":8,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":0,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":5},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00355","TOTAL":0,"PROCESSED":0,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":22,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":5},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_01015","TOTAL":0,"PROCESSED":3,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":0,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":5},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00011H","TOTAL":0,"PROCESSED":0,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":2,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":5},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00314","TOTAL":0,"PROCESSED":0,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":38,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":5},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00355","TOTAL":0,"PROCESSED":44,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":0,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":5},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00364","TOTAL":0,"PROCESSED":0,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":6,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":5},{"TARGETSYSTEM":"
CPW","ARUNAME":"CPW_00302","TOTAL":0,"PROCESSED":0,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":2,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":5},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00364","TOTAL":0,"PROCESSED":7177,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":0,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":5},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00302","TOTAL":0,"PROCESSED":116,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":0,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":5},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00607Bundle","TOTAL":0,"PROCESSED":37,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":0,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":5},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00086","TOTAL":0,"PROCESSED":215,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":0,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":5},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00061","TOTAL":0,"PROCESSED":4,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":0,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":5},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00607Bundle","TOTAL":0,"PROCESSED":0,"REMAINING":0,"ERROR":14,"FAILED":14,"SKIPPED":0,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":5},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00030","TOTAL":0,"PROCESSED":21,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":0,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":5},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00075Med","TOTAL":0,"PROCESSED":546,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":0,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":5},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00030","TOTAL":0,"PROCESSED":0,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":801,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":5},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00022AdjPro","TOTAL":0,"PROCESSED":150,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":0,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":5},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00473H","TOTAL":0,"PROCESSED":69,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":0,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":5},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00075Med
","TOTAL":0,"PROCESSED":0,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":542,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":5},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00607Bundle","TOTAL":0,"PROCESSED":0,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":2,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":5},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00022AdjPro","TOTAL":0,"PROCESSED":0,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":335,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":5},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00473H","TOTAL":0,"PROCESSED":0,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":10,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":5},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00304","TOTAL":0,"PROCESSED":0,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":12,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":5},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00304","TOTAL":0,"PROCESSED":637,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":0,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":5},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00396","TOTAL":0,"PROCESSED":2,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":0,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":5},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00079MEDICA","TOTAL":0,"PROCESSED":0,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":88,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":5},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00086","TOTAL":0,"PROCESSED":0,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":1,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":5},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00079MEDICA","TOTAL":0,"PROCESSED":24,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":0,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":5},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00304","TOTAL":0,"PROCESSED":0,"REMAINING":0,"ERROR":1,"FAILED":0,"SKIPPED":0,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":5},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00022AdjPro","TOTAL":0,"PROCESSED":0,"REMAINING":0,"ERROR":1,"FAILED":0,"SKIPPED":0,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":5}] message: 
[{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00253","TOTAL":0,"PROCESSED":0,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":1,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":4},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00797H","TOTAL":0,"PROCESSED":0,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":2,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":4},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00365","TOTAL":0,"PROCESSED":0,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":511,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":4},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00365","TOTAL":0,"PROCESSED":210,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":0,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":4},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00410","TOTAL":0,"PROCESSED":8,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":0,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":4},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00396","TOTAL":0,"PROCESSED":0,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":1,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":4},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00410","TOTAL":0,"PROCESSED":0,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":21,"PROCESSING":0,"DATE":"7/9/2024","DAYHOUR":4}]
I am using below query and its throwing [nxg-splunk-idx503,nxg-splunk-idx504,nxg-splunk-idx506] Field 'collection' does not exist in the data. Same query is working fine for other events. Its only failing for the hour 5 and 4   index = ***** host=**** source=***| spath | eval message="{\"message\":".message."}" | spath input=message message{} output=collection | mvexpand collection | spath input=collection |eval totalCount = SKIPPED + PROCESSED|chart sum(SKIPPED) as SKIPPED,sum(PROCESSED) as Processed sum(totalCount) as TotalClaims by DAYHOUR DATE   Below is my two ROW event- Event 1 {"id":"0","severity":"Information","message":"[{\"TARGETSYSTEM\" "CPW\",\"ARUNAME\" "CPW_00253\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":1,\"PROCESSING\":0,\"DATE\" "7/9/2024\",\"DAYHOUR\":5},{\"TARGETSYSTEM\" "CPW\",\"ARUNAME\" "CPW_00314\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":6,\"FAILED\":6,\"SKIPPED\":0,\"PROCESSING\":0,\"DATE\" "7/9/2024\",\"DAYHOUR\":5},{\"TARGETSYSTEM\" "CPW\",\"ARUNAME\" "CPW_00314\",\"TOTAL\":0,\"PROCESSED\":7295,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":0,\"DATE\" "7/9/2024\",\"DAYHOUR\":5},{\"TARGETSYSTEM\" "CPW\",\"ARUNAME\" "CPW_00303\",\"TOTAL\":0,\"PROCESSED\":8,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":0,\"DATE\" "7/9/2024\",\"DAYHOUR\":5},{\"TARGETSYSTEM\" "CPW\",\"ARUNAME\" "CPW_00355\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":22,\"PROCESSING\":0,\"DATE\" "7/9/2024\",\"DAYHOUR\":5},{\"TARGETSYSTEM\" "CPW\",\"ARUNAME\" "CPW_01015\",\"TOTAL\":0,\"PROCESSED\":3,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":0,\"DATE\" "7/9/2024\",\"DAYHOUR\":5}{\"TARGETSYSTEM\" "CPW\",\"ARUNAME\" "CPW_00302\",\"TOTAL\":0,\"PROCESSED\":116,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":0,\"DATE\" "7/9/2024\",\"DAYHOUR\":5}]"} Event 2 
{"id":"0","severity":"Information","message":"[{\"TARGETSYSTEM\" "CPW\",\"ARUNAME\" "CPW_00253\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":1,\"PROCESSING\":0,\"DATE\" "7/9/2024\",\"DAYHOUR\":4},{\"TARGETSYSTEM\" "CPW\",\"ARUNAME\" "CPW_00797H\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":2,\"PROCESSING\":0,\"DATE\" "7/9/2024\",\"DAYHOUR\":4},{\"TARGETSYSTEM\" "CPW\",\"ARUNAME\" "CPW_00365\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":511,\"PROCESSING\":0,\"DATE\" "7/9/2024\",\"DAYHOUR\":4},{\"TARGETSYSTEM\" "CPW\",\"ARUNAME\" "CPW_00365\",\"TOTAL\":0,\"PROCESSED\":210,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":0,\"DATE\" "7/9/2024\",\"DAYHOUR\":4},{\"TARGETSYSTEM\" "CPW\",\"ARUNAME\" "CPW_00410\",\"TOTAL\":0,\"PROCESSED\":8,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":0,\"DATE\" "7/9/2024\",\"DAYHOUR\":4},{\"TARGETSYSTEM\" "CPW\",\"ARUNAME\" "CPW_00410\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":21,\"PROCESSING\":0,\"DATE\" "7/9/2024\",\"DAYHOUR\":4}]"} You know its scanned 8 events and its matched with only 5 . Now sure why those events which is generated in hour 4 and 5 are not matching 
Thanks Giuseppe. Unfortunately that is the problem, I actually have 30 values. I want to display the total for all, but don't necessarily want to chart them all, as this many series over a number of charts tends to slow down the dashboard. I was hoping that by using Other, it would sum the other values into that column, thereby allowing me to display an accurate total while not displaying all the values.  Is there a way to do this? I'm thinking it may only work by appending a subsearch for the total and overlaying it on the original chart, but I was trying to avoid adding another search for every panel that displays this data. 
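One sketch of a single-search approach, relying on timechart's OTHER column actually carrying the sum of the series that fall outside the limit:

```
<your_search>
| timechart span=1h avg(host_usage) by host limit=7 useother=true
| addtotals fieldname=Total
```

Since addtotals sums every numeric column in each row, including OTHER, the Total column reflects all 30 series even though only 7 plus OTHER are drawn; Total can then be shown as a chart overlay without a second search. (Note this totals the per-host averages, not a global average, so check it matches the statistic you want to display.)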
I have enough space and everything, but it still says there is an error. I have installed it three times, but I still can't get it to run.
Hi @srinivasmanikan , could you share a sample of your logs in text format? Ciao. Giuseppe