All Topics

Hi, is it possible to create a single health rule schedule for the timeline below?

Mon 5 PM - Tues 9 AM
Tues 5 PM - Wed 9 AM
Wed 5 PM - Thurs 9 AM
Thurs 5 PM - Fri 9 AM
Fri 5 PM - Mon 9 AM

Basically, I need an alert schedule for out-of-business hours; the business-hours schedule is Mon - Fri, 9 AM - 5 PM. Is it possible to create the above-mentioned schedule in a single schedule window?
Hi everyone, I'm a newbie to Splunk. I installed Splunk Enterprise on a server which is connected to AD. On another machine, I have installed the Universal Forwarder. I have an admin account for AD, and with that I installed the forwarder on the other machine. I want to monitor all the other logs from that machine. When I try to collect the logs, it says "Unable to get wmi classes from host 'xxxxx'. This host may not be reachable or WMI may be misconfigured". I have followed the steps under "Configure Active Directory for running Splunk software as a domain user" on the "Prepare your Windows network to run Splunk Enterprise as a network or domain user" page. Is there something I am missing? Also, I'm not sure how to collect the remote logs.
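For reference, a minimal sketch of how remote Windows event log collection over WMI is typically configured in wmi.conf on the Splunk Enterprise instance; the host name WIN-HOST01 is a placeholder. Since a Universal Forwarder is already installed on the remote machine, a local WinEventLog input on that forwarder is often the simpler alternative.

# wmi.conf on the Splunk Enterprise instance (WIN-HOST01 is a hypothetical host name)
[WMI:RemoteEventLogs]
server = WIN-HOST01
interval = 10
event_log_file = Application, System, Security
disabled = 0

# or, inputs.conf on the Universal Forwarder itself
[WinEventLog://Security]
disabled = 0

[WinEventLog://System]
disabled = 0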
Hello, I am trying to get a regex to work in ingest actions to match a list of event codes from Windows Security logs. The following regex matches sample text on regex101.com:

^(EventCode=(1102|4616|4624|4625|4634|46484657|4697|4698|4699|4700|4701|4702|4719|4720|4722|4723|4725|4728|4732|4735|4737|4738|4740|4755|4756|4767|4772|4777|4782|4946|4947|4950|4954|4964|5025|5031|5152|5153|5155|5157|5447))$

But it doesn't find any matches when used in ingest actions. Given the event codes listed above, can someone assist me with finding the correct regex that will work inside of ingest actions? Thanks!
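Two hedged observations on the pattern above: 4648 and 4657 appear to have lost their separating pipe (they read as 46484657), and ingest actions rules are typically matched against the whole raw event, where the ^ and $ anchors around a single EventCode=... line are unlikely to match multi-line Windows events. A non-anchored sketch, assuming PCRE-style syntax is accepted:

EventCode=(1102|4616|4624|4625|4634|4648|4657|4697|4698|4699|4700|4701|4702|4719|4720|4722|4723|4725|4728|4732|4735|4737|4738|4740|4755|4756|4767|4772|4777|4782|4946|4947|4950|4954|4964|5025|5031|5152|5153|5155|5157|5447)(?!\d)

The trailing (?!\d) guards against a longer value (e.g. 46241) matching on its first four digits.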
Hi. I have a dataset and one of the index columns is "X". I need to check whether or not this "X" feature is normally distributed by plotting a histogram. I tried doing this, but it doesn't plot the actual values. Can you please help?
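A minimal sketch of a histogram in SPL, assuming X is numeric; the span value is a placeholder to tune for the data's range. Rendered as a column chart, this approximates the distribution:

<your base search>
| bin X span=10
| stats count by X
| sort X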
My aim: the query below gives me the count of successes and failures by b_key, c_key. I want to get the distinct count of b_key for which a failure occurred. In the example below it would be 2.

| eval Complete = case(key_a="complete", "Complete")
| eval Init = case(key_a="init", "Init")
| stats count(Init) as Init, count(Complete) as Complete by b_key, c_key
| eval Fcount = if((Init != Complete), 1, 0)
| eval Scount = if((Init = Complete), 1, 0)
| stats sum(Fcount) as FailureCount, sum(Scount) as SuccessCount
| eval total = (FailureCount + SuccessCount)
| eval Success% = round(SuccessCount/total*100, 2)
| eval Failure% = round(FailureCount/total*100, 2)
| table FailureCount, SuccessCount, Success%, Failure%
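One possible way to get the distinct count of b_key values that had at least one failing c_key is to keep the per-(b_key, c_key) failure flag and add a dc(eval(...)) to the final stats. A sketch built on the query above:

| eval Complete = case(key_a="complete", "Complete")
| eval Init = case(key_a="init", "Init")
| stats count(Init) as Init, count(Complete) as Complete by b_key, c_key
| eval Fcount = if((Init != Complete), 1, 0)
| eval Scount = if((Init = Complete), 1, 0)
| stats sum(Fcount) as FailureCount, sum(Scount) as SuccessCount, dc(eval(if(Fcount=1, b_key, null()))) as FailedBKeyCount
| eval total = (FailureCount + SuccessCount)
| eval Success% = round(SuccessCount/total*100, 2)
| eval Failure% = round(FailureCount/total*100, 2)
| table FailureCount, SuccessCount, FailedBKeyCount, Success%, Failure%

With the example data described, FailedBKeyCount would come out as 2.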
I am struggling to figure out how to get the visualization that I want, if it is even possible. Timechart works great for this purpose but only with one BY clause (aggregated on one value), so if I have understood it properly, I should use the stats command, which supports multiple aggregations. The end goal is to have one graph showing the following:

Y-axis: count of the events
X-axis: time
Graph lines: one graph line shows the count for a unique combination of responseCode and Location, OR possibly using Trellis (probably better) split by Location, so that each Location is a separate graph with one graph line showing the count for the responseCode.

The search as it is now:

<<SEARCH>>
| bin _time as time span=15m
| stats count by _time, body.records.properties.responseCode, body.records.location

If using Trellis split by Location, this results in two graphs, one per Location, where each has one graph line for Count (no matter the response code) and one more graph line for the response code itself (i.e. response code 200 becomes a line at 200 on the Y-axis). But I want a single graph line showing the count per unique responseCode, and the legend should display the responseCode (i.e. 200). Any ideas?
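One way to keep a single graph is to combine the two split fields into one series field so that timechart only needs one BY clause. A sketch, assuming the field names from the search above:

<<SEARCH>>
| eval series='body.records.location' . ":" . 'body.records.properties.responseCode'
| timechart span=15m count by series

Each location/responseCode combination then becomes its own line, with the combined value shown in the legend.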
I have a dashboard showing website user-journey data by reading various elements from a log message. Now the structure of the logs has changed in such a way that I will have to change my queries to get the same data elements. Say the logs changed on 1st February and I want to use the same dashboard to see data before and after the change. So my question is: how do I use two queries on the same data source, applying the first query before a hardcoded time (e.g. 2023-02-01 00:00:00) and the other after this time, and join the records together to generate my stats? BTW, I also have a global date-time picker which dictates how far back in time I perform the search.
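A sketch of one approach: a single search over the picker's range with a hardcoded cutoff and a conditional eval. The index, sourcetype and the old_field/new_field names below are placeholders standing in for the real before/after extraction logic:

index=my_index sourcetype=my_sourcetype
| eval cutoff=strptime("2023-02-01 00:00:00", "%Y-%m-%d %H:%M:%S")
| eval journey_step=if(_time < cutoff, old_field, new_field)
| stats count by journey_step

Because it is one search, the global date-time picker still bounds the whole thing, and only the derivation of each data element switches at the cutoff.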
I've made an app and put it in "$SPLUNK_HOME\etc\apps\app_name\local", where I have the outputs.conf file. Since there is no outputs.conf file in "$SPLUNK_HOME\etc\system\local" I get an error message in the log stating: "LightWeightForwarder/UniversalForwarder not configured. Please configure outputs.conf". If I move the outputs.conf file from my app to "$SPLUNK_HOME\etc\system\local" it works. I already have an old setup that I inherited where this is working. It seems like the file in my app is not read for some reason. I've checked that the user has read access to the files in my app. Unfortunately I don't have documentation from the old setup, so I can't see how this was implemented. Is someone able to point me in the right direction? I've tried searching for this issue but couldn't find anything related to it. Thanks in advance.
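Two things that may help narrow this down: btool shows which outputs.conf copies Splunk actually merges and which stanza wins, and a minimal tcpout stanza is all the app's local/outputs.conf should need (the group name, indexer host and port below are placeholders):

$SPLUNK_HOME/bin/splunk btool outputs list --debug
$SPLUNK_HOME/bin/splunk btool check

# etc/apps/app_name/local/outputs.conf
[tcpout]
defaultGroup = primary_indexers

[tcpout:primary_indexers]
server = indexer.example.com:9997

If btool does not list the app's file at all, it is also worth confirming the app itself is enabled (app.conf [install] state = enabled).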
Above is the title of my dashboard; I need to add the present date along with the title. For the above, we need to add the event information, marking Success as green, Running as blue, Error as red, and Wait as yellow, like below. Along with the above, we also need to show the total event details on the left side of the snippet.
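A sketch of how this might look in Simple XML, assuming a table panel with a STATUS field; the token name today, the search query and the field names are made up, and since the top-level <label> may not render tokens, the date is shown in the panel title here instead:

<form version="1.1">
  <label>My Dashboard</label>
  <init>
    <eval token="today">strftime(now(), "%d %b %Y")</eval>
  </init>
  <row>
    <panel>
      <table>
        <title>Event Status - $today$</title>
        <search>
          <query>index=my_index | stats count by EVENT_NAME, STATUS</query>
        </search>
        <format type="color" field="STATUS">
          <colorPalette type="map">{"Success":#53A051,"Running":#006D9C,"Error":#DC4E41,"Wait":#F8BE34}</colorPalette>
        </format>
      </table>
    </panel>
  </row>
</form>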
Hi Splunk friends, I'm using Windows data for this example. I want to collect, over a time range of the last 7 days, the number of hosts in my Windows index with a span of 1d. The result I am expecting is that for each day I can see in a timechart whether the total number of hosts increases or decreases. To do that I am using this search:

index=<windows index> Computer=XYZ*
| dedup Computer
| timechart count(Computer) as count span=1d

The problem I am having is that the search never ends, so it only shows a flat line and a peak on the last day. I have around 1,000 hosts. Is there a way to collect this data in a more efficient way? Thanks in advance.
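The dedup keeps only one event per Computer across the whole 7-day window (the most recent one), which is why everything clusters on the last day. A sketch of two lighter alternatives: dc() inside timechart counts distinct hosts per day, and tstats does the same from index metadata if the host field can stand in for Computer:

index=<windows index> Computer=XYZ*
| timechart span=1d dc(Computer) as count

| tstats dc(host) as count where index=<windows index> host=XYZ* by _time span=1d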
Hi, I am trying to implement Splunk on Bamboo Data Center, but it seems like the file that was present on the Splunk download page is not supported. Is there any other way I can get information about Splunk integration with Bamboo Data Center? Regards, Neha
Hello, please help if someone has done this before. I need a custom variable to split the variable output for my applications (xyzabc.com [P], abcxyz.com [G] & xyz123.com [S]). The output should be:

Application=[P]
Application=[S]
Application=[G]

Please refer to the payload below, which I am using:

[
  {
    "labels": {
      "Message": "A health rule violation occurred for the application ${latestEvent.application.name}",
      "Time_of_Occurrence": "${action.triggerTime}",
      "source": "AppD",
      "Application_Name": "${latestEvent.application.name}",
      "Event_Name": "${latestEvent.displayName}",
      "Event_Message": "${latestEvent.eventMessage}"
    },
    "annotations": {
      "type": "image",
      "src": "${latestEvent.severityImage.deepLink}",
      "alt": "${latestEvent.severity}",
      "type_link": "link",
      "href": "${latestEvent.deepLink}",
      "text": "View this transaction in AppDynamics",
      "Event_Message": "${latestEvent.eventMessage}"
    }
  }
]
Hello everyone, I have a dashboard with a token called datacenter, which has 3 options in a dropdown:

Dublin = "*dbl_dc_01*"
Singapore = "*sing_dc_01*"
Both = "*" (this is incorrect for my requirement, I know)

Currently, when Dublin is selected for $datacenter$, I plot the line chart using the search below:

(index=my_index) openshift_namespace=my-ns sourcetype=openshift_logs openshift_cluster="*dbl_dc_01*"
| search "message.logType"=CLIENT_REQ
| search "message.url"="$servicename$"
| stats dc("message.tracers.ek-correlation-id{}") by _time
| timechart span=1h count as "Dublin_Hits"

When Singapore is selected for $datacenter$:

(index=my_index) openshift_namespace=my-ns sourcetype=openshift_logs openshift_cluster="*sing_dc_01*"
| search "message.logType"=CLIENT_REQ
| search "message.url"="$servicename$"
| stats dc("message.tracers.ek-correlation-id{}") by _time
| timechart span=1h count as "Singapore_Hits"

When Both is selected, I need the two lines to be plotted on the same chart. From an independent search, I am able to achieve this using 2 searches with append:

(index=my_index) openshift_namespace=my-ns sourcetype=openshift_logs openshift_cluster="*dbl_dc_01*"
| search "message.logType"=CLIENT_REQ
| search "message.url"="$servicename$"
| stats dc("message.tracers.ek-correlation-id{}") by _time
| timechart span=1h count as "Dublin_Hits"
| append
    [ search (index=my_index) openshift_namespace=my-ns sourcetype=openshift_logs openshift_cluster="*sing_dc_01*"
      | search "message.logType"=CLIENT_REQ
      | search "message.url"="$servicename$"
      | stats dc("message.tracers.ek-correlation-id{}") by _time
      | timechart span=1h count as "Singapore_Hits"]

How do we get this plotted on the same dashboard panel when Both is selected from the dropdown?

Note: the $servicename$ value is generated dynamically based on the data centre location.
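A sketch of a single search that covers all three dropdown values by deriving the series name from the cluster, assuming the Both option keeps the value "*" and that $servicename$ resolves to a value valid for both data centres:

(index=my_index) openshift_namespace=my-ns sourcetype=openshift_logs openshift_cluster="$datacenter$"
| search "message.logType"=CLIENT_REQ
| search "message.url"="$servicename$"
| eval DC=case(like(openshift_cluster, "%dbl_dc_01%"), "Dublin_Hits",
               like(openshift_cluster, "%sing_dc_01%"), "Singapore_Hits")
| timechart span=1h count by DC

With Dublin or Singapore selected only one series survives the case(), and with Both selected both lines appear on the same chart. If the distinct correlation-id logic is needed, dc("message.tracers.ek-correlation-id{}") can replace count in the timechart.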
Hi all, my dashboard panel, which calls a report search, is showing "Search did not return any events." When I click on the magnifying glass icon and run the search manually, it displays the results without any issues. Please advise what could be wrong in the XML form. I am ensuring to use <form> </form>:

<form version="1.1">
  <label>SLA Metrics</label>
  <fieldset autoRun="true" submitButton="false">
    <input type="time" token="field1">
      <label></label>
      <default>
        <earliest>-24h@h</earliest>
        <latest>now</latest>
      </default>
    </input>
  </fieldset>
  <row>
    <panel>
      <event>
        <title>MTTA - Mean Time to Acknowledge</title>
        <search ref="MHE - Mean Time to Acknowledge">
          <earliest>$field1.earliest$</earliest>
          <latest>$field1.latest$</latest>
        </search>
        <option name="list.drilldown">none</option>
      </event>
    </panel>
  </row>
</form>

I have referenced https://community.splunk.com/t5/Splunk-Search/Using-time-range-picker-does-not-work-in-dashboard-where-report/m-p/148254 and, as far as I can tell, my XML code is in line with the solution in that post. Please assist.
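One possibility to check: if the referenced report ends in a transforming command (stats, timechart, etc.), the <event> element reports "Search did not return any events" even though the report returns results, because it only renders raw events. A sketch of the same panel using a <table> element instead:

<panel>
  <table>
    <title>MTTA - Mean Time to Acknowledge</title>
    <search ref="MHE - Mean Time to Acknowledge">
      <earliest>$field1.earliest$</earliest>
      <latest>$field1.latest$</latest>
    </search>
    <option name="drilldown">none</option>
  </table>
</panel>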
Hello all, I have a data set in Splunk from which I can extract X, Y and Z values that I need to plot into a 3D height or terrain map. I have been searching for a bit, but so far I have been unable to find a solution (which may also mean that I am not a good searcher...). Can anyone point me in the right direction on how I can get a nice 3D graph like that in my dashboard? (I am currently on Splunk 9.0.3, but I keep my version up to date.) Thanks in advance.
Hi, I have a field (event_details) that contains a JSON array.

Record 1: {"event_details":[{"product_id":"P002","price":19.99,"payment_method":"Paypal"}]}
Record 2: {"event_details":[{"product_id":"P001","price":9.99,"payment_method":"Credit Card"},{"product_id":"P002","price":10,"payment_method":"Credit Card"}]}

Query:

source="sample_Logs.csv" host="si-i-01ab4b9a34d1f49ec.prd-p-gfp5t.splunkcloud.com" sourcetype="csv"
| tojson auto(*)
| spath "event_details{}.product_id"
| search "event_details{}.product_id"=P002

When using the above query I got both records in the response. But I need only those records whose JSON array contains product_id "P002" and no other product_id. In this case record 1 contains only product_id P002, so I need only that record in the response. How do I form the query for it? I really appreciate any help you can provide.

Update: I have explained my query properly in the comment below: https://community.splunk.com/t5/Splunk-Search/Nested-field-Json-array-searching/m-p/629097/highlight/true#M218519
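One possible way, building on the search above: extract the product_id list as a multivalue field and keep only events where every value is P002.

source="sample_Logs.csv" host="si-i-01ab4b9a34d1f49ec.prd-p-gfp5t.splunkcloud.com" sourcetype="csv"
| tojson auto(*)
| spath output=pid path="event_details{}.product_id"
| where mvcount(pid) = mvcount(mvfilter(match(pid, "^P002$")))

Record 2 is dropped because its pid list also contains P001, so the two mvcount values differ.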
Can anyone share how to get data from a SharePoint Online list into Splunk Enterprise? I have to get user custom action details from a SharePoint application into Splunk Enterprise. Please give me the code and samples too, if available.
Does anyone know why the time range picker on the right side (set to Yesterday, Jan 30) does not affect the _time field in my query result? How do I link them?
I have a SaaS trial [redacted] and I have been receiving a 500 internal error for some time. How can I fix this?

^ Post edited by @Ryan.Paredez to remove Controller URL. Please do not share Controller URLs on Community posts, for security and privacy reasons.
Hi all - I'm attempting to write a query using earliest/latest based on a date field in the event, not _time. I've tried a dozen things, and no matter what I try the earliest/latest fields are not showing what I expect. I'm using 'my_report_date' as the desired earliest/latest field. When I run the following search, the earliest should be 11/1/22, but it shows as 11/2 (these events were sent to a summary index prior to the events of 11/1). The rest of the query finds the number of days between the first and last events. How do I refine this search to use 'my_report_date' instead of _time?

index=summary
| stats earliest(my_report_date) AS FirstFound, latest(my_report_date) AS LastFound by my_asset
| convert mktime(FirstFound) AS FirstFoundEpoch timeformat="%Y-%m-%d"
| convert mktime(LastFound) AS LastFoundEpoch timeformat="%Y-%m-%d"
| eval daysdiff=round((LastFoundEpoch-FirstFoundEpoch)/86400,0)
| stats count by my_asset, FirstFound, LastFound, daysdiff
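The earliest() and latest() stats functions pick values by _time rather than by the field's own value, which is why the summary-index timestamps get in the way. A sketch that keys the calculation on my_report_date itself, assuming the field is in %Y-%m-%d format as the convert calls suggest:

index=summary
| eval report_epoch=strptime(my_report_date, "%Y-%m-%d")
| stats min(report_epoch) as FirstFoundEpoch, max(report_epoch) as LastFoundEpoch by my_asset
| eval FirstFound=strftime(FirstFoundEpoch, "%Y-%m-%d"), LastFound=strftime(LastFoundEpoch, "%Y-%m-%d")
| eval daysdiff=round((LastFoundEpoch - FirstFoundEpoch) / 86400, 0)
| table my_asset, FirstFound, LastFound, daysdiff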