All Topics

Hi everyone, I am running a search that counts the events sent from different servers by hour, in order to alert when a server is down (i.e. sends nothing). The raw data looks like this:

    _time                count
    2022-09-27T10:17:48  1
    2022-09-27T09:57:19  1
    2022-09-27T09:56:28  1
    2022-09-27T09:56:26  1

I search with span=1h and get a table like this (there are several servers, but I show one as an example):

    _time             Server  Count
    27/09/2022 12:00  A       0
    27/09/2022 11:00  A       0
    27/09/2022 10:00  A       1
    27/09/2022 09:00  A       3
    27/09/2022 08:00  A       9
    27/09/2022 07:00  A       10

It works, but not well for the current hour. Imagine it is 12:05 now: when I run the search and filter by count = 0, I get two rows. But if the first event from server A then arrives at, say, 12:30, the filter should only have returned the 11:00 row. What I want is to take count = 0 into account only once the full hour has passed, so the alert fires only if the server sends nothing during the whole hour (12:00 - 12:59). Currently I do something like this:

    | where count = 0 AND _time != relative_time(now(), "-1h")

Do you have a better solution? I hope I have made it clear. Thanks for your help!
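One possible approach (a sketch, using the Server and count field names from the question) is to drop the bucket for the hour still in progress by comparing each bucket's start time with the start of the current hour:

```
| bin _time span=1h
| stats count by _time Server
| where count = 0 AND _time < relative_time(now(), "@h")
```

`relative_time(now(), "@h")` snaps to the top of the current hour, so the incomplete bucket is excluded and the 12:00 bucket only alerts once it is 13:00 or later.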
Hi, I am trying to run a Python script on my universal forwarder, which sends data to a Splunk Cloud instance. I have added the script path in inputs.conf, but no events show up in my index. Checking the splunkd logs, I see the error "The system cannot find the file specified". What could be the problem?
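That Windows error usually means the path in the inputs.conf stanza does not resolve, or that no interpreter is available for the script; note that universal forwarders do not bundle Python, so the host needs its own. A minimal scripted-input sketch (the file name and stanza below are hypothetical):

```python
#!/usr/bin/env python3
# Hypothetical scripted input: Splunk runs the script on an interval and
# indexes whatever it writes to stdout, one event per line.
import sys
import time


def emit_event(message):
    # Print one timestamped line; Splunk treats each stdout line as an event.
    sys.stdout.write("%s message=%s\n" % (time.strftime("%Y-%m-%dT%H:%M:%S"), message))
    sys.stdout.flush()


if __name__ == "__main__":
    emit_event("heartbeat")
```

A matching stanza might be [script://$SPLUNK_HOME\etc\apps\my_app\bin\heartbeat.py] with interval and index set; double-check that the file actually exists at exactly the path the stanza names on the forwarder.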
I want to create a bar chart from the logs where the key is the stats-count field name and the value is the summed count. My query:

    search1
    | eval has_error = if(match(_raw, "WARNING"), 1, 0)
    | stats sum(has_error) as field1
    | join instance
        [search2
         | eval has_error = if(match(_raw, "WARNING"), 1, 0)
         | stats sum(has_error) as field2
         | join instance
             [search3
              | eval has_error = if(match(_raw, "WARNING"), 1, 0)
              | stats sum(has_error) as field3
              | join instance
                  [search4
                   | eval has_error = if(match(_raw, "WARNING"), 1, 0)
                   | stats sum(has_error) as field4]]]
    | stats sum(field1), sum(field2), sum(field3), sum(field4)

Current result:

    field1  field2  field3  field4
    30      44      122     6

Expected result:

    Field   Count
    field1  30
    field2  44
    field3  122
    field4  6
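One way to turn that single row into Field/Count pairs is the transpose command (a sketch, appended after the sums are computed):

```
| stats sum(field1) as field1 sum(field2) as field2 sum(field3) as field3 sum(field4) as field4
| transpose
| rename column as Field "row 1" as Count
```

transpose names its output columns `column` and `row 1` by default, hence the final rename.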
Can I convert a playbook-type input to automation in Splunk SOAR (5.3.4)? Thanks for helping.
Hi all, I have a Splunk query like this:

    (index=Prod sourcetype=ProdApp (host=Prod01 OR Prod02) source="/prodlib/SPLID" "Response" ERR-12120)
    | rex "^(?:[^\[\n]*\[){6}(?P<u>\w+)"
    | rex field=_raw "(?<my_json>\{.*)"
    | spath input=my_json output=customerName path=response.login.customerName
    | spath input=my_json output=responseCode path=response.responseHeader.responseContext.responseCode
    | dedup customerName
    | table customerName, responseCode
    | append
        [search index=Prod sourcetype=ProdApp (host=Prod01 OR Prod02) source="/prodlib/SPLID" "Request"
         | rex "^(?:[^\[\n]*\[){6}(?P<u>\w+)"
         | rex field=_raw "(?<my_json>\{.*)"
         | spath input=my_json output=userId path=data.userId
         | dedup userId
         | table userId]

I am trying to join both sources, Request and Response, with the result shown in the attached picture. My question is: how do I show the 5 user IDs (the blue line)? When I join the two sources, the user IDs shown are not related to the correct customer names (the black line).
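Since both searches extract the same bracketed field u, one approach (a sketch, assuming u is a shared transaction or session identifier present in both Request and Response events) is to search both event types at once and correlate with stats instead of append/join:

```
index=Prod sourcetype=ProdApp (host=Prod01 OR Prod02) source="/prodlib/SPLID" (("Response" ERR-12120) OR "Request")
| rex "^(?:[^\[\n]*\[){6}(?P<u>\w+)"
| rex field=_raw "(?<my_json>\{.*)"
| spath input=my_json output=customerName path=response.login.customerName
| spath input=my_json output=responseCode path=response.responseHeader.responseContext.responseCode
| spath input=my_json output=userId path=data.userId
| stats values(customerName) as customerName values(responseCode) as responseCode values(userId) as userId by u
```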
Hi team, we have performance logs in Splunk, but the retention policy is only 3 months, so I can't see a full year of metric trends in Splunk. Is there any way Splunk can feed data into MongoDB, so that I can connect Power BI to MongoDB and do the analysis there? Thanks, Cherie
For the type of data I am trying to extract, event sampling really speeds up the query. This works fine when executing SPL queries interactively, but I have not been able to figure out how to do it in a dashboard. I found some older posts where "rand" was used, but apparently that did not speed up the query. Is it possible to specify event sampling directly in a search query, or in the dashboard in some way?
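In Simple XML there is a sampleRatio element under search that should apply event sampling to a dashboard panel (a sketch; the query is a placeholder, and a ratio of 100 means roughly 1 event in 100 is sampled):

```
<search>
  <query>index=my_index | stats count by host</query>
  <earliest>-24h</earliest>
  <latest>now</latest>
  <sampleRatio>100</sampleRatio>
</search>
```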
Hi, I have multiple panels that need to run timecharts like these:

    something | table _time,A,B | search A="1" | timechart count by B
    something | table _time,A,B | search A="2" | timechart count by B
    something | table _time,A,B | search A="3" | timechart count by B

I want to optimize my dashboard for performance by using a base search, so I tried this:

    <search id="base">
      <query>something | table _time,A,B</query>
    </search>
    ....
    <panel>
      <chart>
        <search base="base">
          <query>search A="1" | timechart count by B</query>
        </search>
      </chart>
    </panel>
    ...
    <panel>
      <chart>
        <search base="base">
          <query>search A="2" | timechart count by B</query>
        </search>
      </chart>
    </panel>
    ...
    <panel>
      <chart>
        <search base="base">
          <query>search A="3" | timechart count by B</query>
        </search>
      </chart>
    </panel>

It works great over short ranges (24h), but over wider ranges (30 days) I lose events because of the base-search event limit (probably the default of 500,000). Is there a way I can still use a base search for this? I'm using Splunk Enterprise version 8.1.3.
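The event limit applies to base searches that return raw events; one common workaround (a sketch using the field names from the question) is to pre-aggregate in the base search so it returns statistics rows instead:

```
<search id="base">
  <query>something | bin _time span=1h | stats count by _time A B</query>
</search>
...
<search base="base">
  <query>search A="1" | timechart span=1h sum(count) by B</query>
</search>
```

Pick a span fine enough for your panels; each post-process timechart then sums the pre-bucketed counts.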
I am using the query below and visualizing it in a line chart. A date field appears on the line chart, and I want to remove it through XML without removing the time field. Can someone guide me? (I was able to remove it in the query using a field-formatting command, but that was not very helpful, as I could then no longer see the visualization.) Also, I was able to remove the hover through this article: Solved: How to disable mouse hover on bar chart in XML - Splunk Community. But not the date. This one is very close to what I want to do, but did not solve my case on the line chart: Solved: How to delete the date category on a visualization... - Splunk Community.

Query for reference:

    index=xyz sourcetype=abc earliest=-60m@m latest=@m
    | eval ReportKey="Today"
    | append
        [search index=xyz sourcetype=abc earliest=-60m@m-1w latest=@m-1w
         | eval ReportKey="LastWeek"
         | eval _time=relative_time(_time, "+1w")]
    | append
        [search index=xyz sourcetype=abc earliest=-60m@m-2w latest=@m-2w
         | eval ReportKey="TwoWeeksBefore"
         | eval _time=relative_time(_time, "+2w")]
    | append
        [search index=xyz sourcetype=abc earliest=-60m@m-3w latest=@m-3w
         | eval ReportKey="ThreeWeeksBefore"
         | eval _time=relative_time(_time, "+3w")]
    | timechart span=1m count(index) as Volume by ReportKey
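One workaround (a sketch; note it turns the x-axis into a category axis rather than a true time axis) is to reformat _time as an hour-minute string after the timechart, so only the time of day is left to label:

```
| timechart span=1m count(index) as Volume by ReportKey
| eval _time=strftime(_time, "%H:%M")
```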
For example, the SUBMIT_DATE field should be split into date and time, and then certain periods of the day mapped to a value (A/B/C). Can this be achieved?

    SUBMIT_DATE="2021-03-09 14:30:48.0"  ==>  split into "2021-03-09" and "14:30:48.0"

    00:00:00 - 08:00:00 = A
    08:00:00 - 16:00:00 = B
    16:00:00 - 24:00:00 = C
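A sketch of one way to do this with split and case (the field names other than SUBMIT_DATE are made up):

```
| eval date  = mvindex(split(SUBMIT_DATE, " "), 0)
| eval clock = mvindex(split(SUBMIT_DATE, " "), 1)
| eval hour  = tonumber(mvindex(split(clock, ":"), 0))
| eval period = case(hour < 8, "A", hour < 16, "B", true(), "C")
```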
Hi team, I have several dashboards whose base searches reference saved reports, for example:

    <search id="baseSearch" ref="Report"></search>

However, I do not get the option to add a time token on the dashboard. Is there any way to provide a time token so that end users can expand or reduce the time window on the dashboard while using the ref="Report" saved-search method?
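Simple XML lets you override a referenced report's time range by nesting earliest/latest elements inside the search (a sketch; time_tok is a hypothetical time-picker token name):

```
<input type="time" token="time_tok">
  <label>Time range</label>
  <default>
    <earliest>-24h@h</earliest>
    <latest>now</latest>
  </default>
</input>
...
<search id="baseSearch" ref="Report">
  <earliest>$time_tok.earliest$</earliest>
  <latest>$time_tok.latest$</latest>
</search>
```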
Hey all, I am trying to extract dynamic field names from JSON:

    {"period":{"start":"2023-04-17","end":"2023-05-14"},"check-ins":{"203":{"avail":5,"price":5},"204":{"avail":5,"price":5},"205":{"avail":5,"price":5},"206":{"avail":5,"price":5},"207":{"avail":5,"price":5},"208":{"avail":5,"price":5},"209":{"avail":5,"price":5},"210":{"avail":5,"price":5},"211":{"avail":5,"price":5},"212":{"avail":5,"price":5},"213":{"avail":5,"price":5},"214":{"avail":5,"price":5},"215":{"avail":5,"price":5},"216":{"avail":5,"price":5},"217":{"avail":5,"price":5},"218":{"avail":5,"price":5},"219":{"avail":19,"price":5},"220":{"avail":19,"price":5},"221":{"avail":19,"price":5},"222":{"avail":19,"price":5},"223":{"avail":19,"price":5},"224":{"avail":19,"price":5},"225":{"avail":19,"price":5},"226":{"avail":19,"price":5},"227":{"avail":19,"price":5},"228":{"avail":19,"price":5},"229":{"avail":20,"price":5},"230":{"avail":20,"price":5}}}

I need to extract the keys 203, 204, 205, ... up to 230 from the data above; each extracted value should then be added to the period.start field, and at the end I need the date value after the addition. Thanks in advance.
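A sketch using the JSON eval functions available in recent Splunk versions (json_extract and json_keys require 8.1 or later; I am also assuming each key is meant to be added to period.start as a number of days):

```
| eval start = json_extract(_raw, "period.start")
| eval keys  = json_keys(json_extract(_raw, "check-ins"))
| eval key   = split(trim(keys, "[]"), ",")
| mvexpand key
| eval key      = tonumber(trim(key, "\""))
| eval new_date = strftime(strptime(start, "%Y-%m-%d") + key * 86400, "%Y-%m-%d")
```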
When I looked into the splunkd log, I found the below error message:

    UserWarning: The soupsieve package is not installed. CSS selectors cannot be used

Can someone please assist with this?
I recently upgraded our Splunk Enterprise to version 9.0.1, but I am having problems with the KV store. Any ideas on how to deal with this, please?
I'm trying to use the Splunk 9 addition to foreach iteration with ITEM, but it always returns "Failed to parse templatized search for field 'i'" on my server, which runs 9.0.1.

    | makeresults
    | eval i = mvrange(0,3)
    | foreach i [eval showme = <<ITEM>>]

I previously used <<ITEM>> on a laptop running Splunk 9 and did not get this error.
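If I recall correctly, <<ITEM>> is only valid when foreach runs in multivalue mode, so the fix may be as simple as adding mode=multivalue (a sketch):

```
| makeresults
| eval i = mvrange(0,3)
| foreach mode=multivalue i
    [eval showme = mvappend(showme, <<ITEM>>)]
```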
I would like to create a pie chart showing, as percentages, how many certificates the IDs have completed. For example:

    ID    No_of_Courses_completed
    0112  3
    0113  1
    0114  2
    0115  3
    0116  0

Likewise, I have thousands of IDs. From the total set of IDs, I need to show in a pie chart that, for example, 30% of them completed 2 courses, 40% completed 3 courses, 15% completed 1 course, and 15% did not attend any course. Kindly help me achieve this.
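A sketch: counting IDs per completion bucket is enough, since the pie visualization renders each slice as a share of the total (the explicit percent column is just for reference):

```
| stats count as ids by No_of_Courses_completed
| eventstats sum(ids) as total
| eval percent = round(ids / total * 100, 1)
| fields No_of_Courses_completed ids percent
```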
I support a Splunk app for our company, and we recently made some changes to the dashboards we ship with it, most notably updating the dashboard version and removing a script that was not Python 3 compatible. However, when customers updated the app, dashboards that their users had made small changes to did not get updated, because the changed copies live in the local directory rather than default; they therefore still showed issues with the old script and the old dashboard version. Is there a way for us, as the developers of the app, to merge or force specific changes to a dashboard even if the customer has customized it? Perhaps a warning that dashboard X is not being updated because a local version exists? Thanks, Paul
I am running the query |tstats count latest(_time) where index=abcd by host. My requirement is to create an alert when the count is 0, i.e. when there are no events in the index. My problem is that when there are no events, I do not get a count field of 0 at all.
In ES I am reviewing results from the "Concurrent Login Attempts Detected" correlation search, which is as follows:

    | datamodel "Identity_Management" High_Critical_Identities search
    | rename All_Identities.identity as "user"
    | fields user
    | eval cs_key='user'
    | join type=inner cs_key
        [| tstats `summariesonly` count from datamodel=Authentication by _time,Authentication.app,Authentication.src,Authentication.user span=1s
         | `drop_dm_object_name("Authentication")`
         | eventstats dc(src) as src_count by app,user
         | search src_count>1
         | sort 0 + _time
         | streamstats current=t window=2 earliest(_time) as previous_time,earliest(src) as previous_src by app,user
         | where (src!=previous_src)
         | eval time_diff=abs(_time-previous_time)
         | where time_diff<300
         | eval cs_key='user']

The issue is that I am seeing false positives for users whose previous src is, say, "abc-xyz-01" while the current src is "abc-xyz-02": systems with similar names (servers in clusters/pairs). Would it be possible to use a regex for the wordlist in the Fuzzy Search for Splunk app and then filter out similar matches with a lower ratio?
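Before reaching for fuzzy matching, a simpler sketch is to normalize the host names by stripping a trailing numeric suffix and comparing the base names (this assumes clustered nodes differ only in a trailing number):

```
| eval src_base          = replace(src, "[-_]?\d+$", "")
| eval previous_src_base = replace(previous_src, "[-_]?\d+$", "")
| where src != previous_src AND src_base != previous_src_base
```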