All Topics



I need to determine whether "error1" occurrences in server logs have increased by more than 15% since the last release. Releases happen every Wednesday (weekly), so each release cycle runs from one Wednesday to the following Tuesday: Release 1 cycle from the 6th to the 12th, Release 2 cycle from the 13th to the 19th, and so on through the month. For example: if the Splunk query runs on the 14th, it should count "error1" in the server logs from the 6th to the 12th as "count1", count "error1" from the 13th to the 14th as "count2", then calculate the percentage increase of "count2" over "count1". If the query runs on the 20th, "count1" is the count from the 13th to the 19th and "count2" is the count on the 20th. If the query runs on the 18th, "count1" is the count from the 6th to the 12th and "count2" is the count from the 13th to the 18th. Finally, report whether "error1" has increased by more than 15% since the last release. Please help!
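One possible approach (a sketch, not a tested solution): Splunk time modifiers always snap backward, so "@w3" resolves to the most recent Wednesday at midnight. Assuming the events live in an index named server_logs (a made-up name) and that each cycle starts on Wednesday, something like this splits the events into the previous full cycle ("count1") and the current partial cycle ("count2"):

```spl
index=server_logs "error1" earliest=-7d@w3
| eval release_start = relative_time(now(), "@w3")
| eval period = if(_time >= release_start, "count2", "count1")
| stats count(eval(period="count1")) AS count1, count(eval(period="count2")) AS count2
| eval pct_increase = round((count2 - count1) / count1 * 100, 2)
| eval over_15_pct = if(pct_increase > 15, "yes", "no")
```

Note that "count1" covers a full seven days while "count2" covers a partial cycle, which matches the examples above; if a rate-normalized comparison is wanted instead, divide each count by the number of days in its window.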
Hi All, I've been working on a search that will give me the Account_Name of anyone who has failed to log in 6-10 times concurrently. I'm running into an issue where I see accurate results when specifying 1-2 Account_Names in the query, but when I open it up to any Account_Name I get no results back. I'm dealing with a very large sourcetype and I'm wondering if there might be some kind of streamstats limit being reached that prevents results? I'm not too familiar with the streamstats command, but I believe I'm using it correctly, judging by the refined search and its accurate results. My ultimate goal is to incorporate the concurrent count into the last query listed below. Any advice/help would be appreciated.

Search returning results (accurate):

| from datamodel:Authentication
| search sourcetype="graylogwindows:Security" (EventCode=4624 OR EventCode=4625) (Account_Name=User1 OR Account_Name=User2) NOT (Account_Name="*$" OR Account_Name="HealthMailbox*")
| eval LoginAttemptResult=if((action="failure" OR EventCode=4625), "FAILED", "SUCCESSFUL")
| streamstats count(eval(LoginAttemptResult="FAILED")) AS ConcurrentFailed BY Account_Name reset_before=signature_id=4624 reset_after=signature_id=4624
| where ConcurrentFailed >= 6 AND ConcurrentFailed <= 10
| sort Account_Name _time
| table Workstation_Name, _time, signature, signature_id, Account_Name, ConcurrentFailed

Search returning nothing:

| from datamodel:Authentication
| search sourcetype="graylogwindows:Security" (EventCode=4624 OR EventCode=4625) NOT (Account_Name="*$" OR Account_Name="HealthMailbox*")
| eval LoginAttemptResult=if((action="failure" OR EventCode=4625), "FAILED", "SUCCESSFUL")
| streamstats count(eval(LoginAttemptResult="FAILED")) AS ConcurrentFailed BY Account_Name reset_before=signature_id=4624 reset_after=signature_id=4624
| where ConcurrentFailed >= 6 AND ConcurrentFailed <= 10
| table Workstation_Name, _time, signature, signature_id, Account_Name, ConcurrentFailed
| sort Account_Name _time

Ultimate goal to get functioning (everything works other than the streamstats "Failure_Count"):

| from datamodel:Authentication
| search sourcetype="graylogwindows:Security" (EventCode=4624 OR EventCode=4625) NOT (Account_Name="*$" OR Account_Name="HealthMailbox*")
| reverse
| streamstats count(EventCode) as "Failure_Count" BY Account_Name reset_after=EventCode="4624" reset_before=EventCode="4624"
| stats count(Keywords) as Attempts, count(eval(match(Keywords,"Audit Failure"))) as Failed, count(eval(match(Keywords,"Audit Success"))) as Success, values(Failure_Count) as "Failure_Count_Values" by Account_Name
| where Attempts>=1 AND Success>=1 AND Failed>=6 AND Failure_Count_Values > 6....
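For reference, streamstats is bounded by a window limit in limits.conf; if that limit were the cause you would normally expect truncated rather than empty results, but it may still be worth checking when a search runs over a very large sourcetype. A sketch of the relevant stanza (the default shown is from recent Splunk versions and should be verified against your installation's limits.conf spec):

```ini
# limits.conf on the search head
[search]
# Maximum window size for the streamstats command (default: 10000 events)
max_stream_window = 10000
```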
Hi Team, I am using the rtrim command to trim some values, but it's not working for all of them. As per the screenshot below, I need to trim everything from "_-D" onward in all values; it works for the first one, but not for the remaining values. Can anyone please provide a solution, or suggest a regex?
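A likely explanation for behavior like this: rtrim strips a set of trailing characters, not a substring, so it can appear to work on some values and not others. A sed-style rex is usually the simplest way to drop everything from a marker onward (a sketch; your_field is a placeholder for the real field name):

```spl
| rex field=your_field mode=sed "s/_-D.*$//"
```

This deletes the first occurrence of "_-D" and everything after it in each value of the field.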
Good morning. Is there a way to validate the current time of the Splunk servers? Let me explain: in the coming days there will be a time change, so the servers should update their time automatically, but I have seen over time that not all servers are correctly patched. For example, a universal forwarder sends certain data whose sourcetype was configured with current_time, which would cause events to arrive either late or early. Currently I have this query to validate the time of the servers, but I do not know if it is correct:

| metadata type=hosts index=_internal
| search host=splunk*
| eval recent_time = now() - recentTime
| eval r_time = strftime(recentTime, "%m/%d/%Y %H:%M:%S")
| table host r_time

Any information is appreciated. Regards
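A related way to spot hosts with skewed clocks (a sketch, assuming the forwarders write to _internal) is to compare event time with index time, since a large gap between _time and _indextime often indicates a clock or timezone problem on the sending host:

```spl
index=_internal earliest=-15m
| eval lag_seconds = _indextime - _time
| stats avg(lag_seconds) AS avg_lag, max(lag_seconds) AS max_lag by host
| where abs(avg_lag) > 60
```

The 60-second threshold is arbitrary; tune it to what is normal for your environment.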
Hi guys, I want to use an external lookup to query an HTTP API. I can use curl to get the response:

curl 'http://hq.sinajs.cn/list=sh601006'

How do I get the response with a search command? Something like:

| stats count by num
| eval num='sh601006'
| lookup test list AS num OUTPUT response AS res

I have tried this script, but it doesn't work: https://community.splunk.com/t5/Splunk-Search/Example-of-doing-an-external-lookup-using-HTTP-GET-or-POST/td-p/23350
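For context, an external lookup is wired up in transforms.conf, where external_cmd points at a script in the app's bin directory; the script reads a CSV of input rows on stdin and writes the same CSV back with the output fields filled in. A sketch (http_lookup.py is a hypothetical script name):

```ini
# transforms.conf
[test]
external_cmd = http_lookup.py list response
fields_list = list, response
```

With a definition like this, the search-side call is | lookup test list AS num OUTPUT response AS res, matching the example above.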
Hello guys, could you let me know how to properly restore frozen buckets from clustered indexers to a non-clustered instance (VM)? Thanks for your help.
Hello, I am storing data (JSON/CSV) in an S3 bucket in AWS and I want to send this data to Splunk. The data is updated every 5 minutes, so I want to update or create a new data log in Splunk every 5 minutes. I am now trying the Splunk Add-on for AWS app, but I don't know whether it will help send the data inside the S3 bucket or not. Can anyone tell me the right method or way to do it? Thank you!
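The Splunk Add-on for AWS does support pulling objects from S3; for a small bucket, a generic S3 input polled every 5 minutes is the simplest option (for high volume, the add-on's SQS-based S3 input is generally recommended instead). A rough inputs.conf sketch; the stanza and parameter names should be checked against your add-on version, and my_s3_input, my_aws_account, and my-bucket are placeholders:

```ini
[aws_s3://my_s3_input]
aws_account = my_aws_account
bucket_name = my-bucket
polling_interval = 300
sourcetype = aws:s3
index = main
```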
Hi, we are not receiving Windows event logs. Below is the stanza added in the inputs.conf file, but we are not receiving the data. We are using a Universal Forwarder to forward data from the servers to Splunk Cloud. Is it due to a permission issue on the servers? The forwarder is executed as admin on the servers. Any suggestions would be much appreciated.

[WinEventLog://Application]
disabled = 0
start_from = oldest
current_only = 0
checkpointInterval = 5
index = index_base-heilite
renderXml = false
Hi, I need to change the time control background color from white to grey using a CSS file. How can I do that? Please help.
Dear Team, we are generating a Temporary ID based on the Parameter that crosses beyond the Park Average. I have attached a scenario for a single Parameter; an alert is generated each day for a particular event. The code we are using:

index="alert_id" ID="KirvereGKA42SlipRingTemperatureHigh"
| dedup New_ID, Turbine, Alert_ID, Unique_ID, CreatedDate
| eval Today=strftime(relative_time(now(), "@d"), "%Y-%m-%d")
| eval Date_1 = strptime(CreatedDate, "%Y-%m-%d")
| eval Date_2 = strptime(Today, "%Y-%m-%d")
| eval Duration = round((Date_2 - Date_1)/86400)
| streamstats window=2 range(Date_1) as NDate by ID
| eval Dur_Str = round(NDate/86400)
| eval Combine = if(Dur_Str<=1, "Open", "Closed")
| eval OpenDate = if(Combine="Open", CreatedDate, "")
| eval ClosedDate = if(Combine="Closed", CreatedDate, "")
| streamstats count as S_No by ID
| eval Close = case(Combine="Closed", S_No)
| eventstats values(eval(if(S_No >= Close, "Closed", "Open"))) as Status by S_No
| table S_No, Alert_ID, WindFarm, Turbine, Category, WindFarm, Parameter, WTG_value, farm_avg, CreatedDate, Duration, Dur_Str

The output of the code is shown in the table below.
S_No Alert_ID WindFarm Turbine Category Parameter WTG_value farm_avg CreatedDate Duration Dur_Str
1 TE-43065 Kirvere GKA42 High SlipRingTemperature 45.99 33.1 9/4/2020 0 0
2 TE-42243 Kirvere GKA42 High SlipRingTemperature 46.02 32.73 9/3/2020 1 1
3 TE-41336 Kirvere GKA42 High SlipRingTemperature 46.02 32.64 9/2/2020 2 1
4 TE-39260 Kirvere GKA42 High SlipRingTemperature 46.1 31.66 8/31/2020 4 2
5 TE-38213 Kirvere GKA42 High SlipRingTemperature 46.01 32.72 8/30/2020 5 1
6 TE-37103 Kirvere GKA42 High SlipRingTemperature 45.97 33.91 8/29/2020 6 1
7 TE-36017 Kirvere GKA42 High SlipRingTemperature 45.96 34.55 8/28/2020 7 1
8 TE-34988 Kirvere GKA42 High SlipRingTemperature 45.94 34.81 8/27/2020 8 1
9 TE-33969 Kirvere GKA42 High SlipRingTemperature 45.94 34.91 8/26/2020 9 1
10 TE-33042 Kirvere GKA42 High SlipRingTemperature 45.95 34.66 8/25/2020 10 1
11 TE-32112 Kirvere GKA42 High SlipRingTemperature 45.97 34 8/24/2020 11 1
12 TE-31177 Kirvere GKA42 High SlipRingTemperature 45.99 33.21 8/23/2020 12 1
13 TE-30189 Kirvere GKA42 High SlipRingTemperature 46.01 32.74 8/22/2020 13 1
14 TE-29007 Kirvere GKA42 High SlipRingTemperature 46 32.65 8/21/2020 14 1
15 TE-27658 Kirvere GKA42 High SlipRingTemperature 45.98 33.25 8/20/2020 15 1
16 TE-26334 Kirvere GKA42 High SlipRingTemperature 45.97 33.93 8/19/2020 16 1
17 TE-25039 Kirvere GKA42 High SlipRingTemperature 45.97 33.93 8/19/2020 16 0
18 TE-23723 Kirvere GKA42 High SlipRingTemperature 45.98 34.18 8/18/2020 17 1
19 TE-22307 Kirvere GKA42 High SlipRingTemperature 45.97 34.28 8/17/2020 18 1
20 TE-21016 Kirvere GKA42 High SlipRingTemperature 45.98 34.04 8/16/2020 19 1
21 TE-19738 Kirvere GKA42 High SlipRingTemperature 46 33.6 8/15/2020 20 1
22 TE-18365 Kirvere GKA42 High SlipRingTemperature 46 33.08 8/14/2020 21 1
23 TE-17108 Kirvere GKA42 High SlipRingTemperature 45.97 33.41 8/13/2020 22 1
24 TE-15941 Kirvere GKA42 High SlipRingTemperature 45.98 33.36 8/12/2020 23 1
25 TE-14800 Kirvere GKA42 High SlipRingTemperature 45.97 33.8 8/11/2020 24 1
26 TE-13633 Kirvere GKA42 High SlipRingTemperature 45.96 34.48 8/10/2020 25 1
27 TE-12587 Kirvere GKA42 High SlipRingTemperature 46.01 34.63 8/9/2020 26 1
28 TE-11483 Kirvere GKA42 High SlipRingTemperature 46.03 34.22 8/8/2020 27 1
29 TE-10278 Kirvere GKA42 High SlipRingTemperature 46.03 34.21 8/7/2020 28 1
30 TE-9178 Kirvere GKA42 High SlipRingTemperature 46.02 34.04 8/6/2020 29 1
31 TE-8156 Kirvere GKA42 High SlipRingTemperature 46.04 33.51 8/5/2020 30 1
32 TE-7065 Kirvere GKA42 High SlipRingTemperature 45.93 34.11 8/4/2020 31 1

From the above table we need to find the duration of the particular events; for that I have derived the day difference from today's date and the day difference between rows. What we want: the event occurred from 20/07/2020 (Open) to 26/07/2020 (Closed) and did not happen again until 03/08/2020. After that, the same event happened from 04/08/2020 (Open) to 31/08/2020 (Closed), and it started again on 02/09/2020 (Open) and is open to date. We need the output like the table below:

Alert_ID WindFarm Turbine Category Parameter WTG_value farm_avg CreatedDate No.ofDaysOpen OpenDate ClosedDate

Please help me with this: how do we derive the Open and Closed status of the particular events based on the date? Thanks in advance.
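One way to derive Open/Closed episodes from data like this (a sketch, assuming an ID field to group by and CreatedDate in %m/%d/%Y format) is to sort by date and start a new episode whenever the gap between consecutive rows exceeds one day:

```spl
| eval created_epoch = strptime(CreatedDate, "%m/%d/%Y")
| sort 0 created_epoch
| streamstats current=f last(created_epoch) AS prev_epoch by ID
| eval gap_days = round((created_epoch - prev_epoch) / 86400)
| eval new_episode = if(isnull(gap_days) OR gap_days > 1, 1, 0)
| streamstats sum(new_episode) AS episode_id by ID
| stats min(created_epoch) AS open_epoch, max(created_epoch) AS close_epoch, count AS No_of_Days_Open by ID, episode_id
| eval OpenDate = strftime(open_epoch, "%m/%d/%Y"), ClosedDate = strftime(close_epoch, "%m/%d/%Y")
```

An episode whose ClosedDate equals today could then be treated as still Open rather than Closed.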
Hello, I'm trying to skip one line while indexing the whole file. This is the line I'm trying to skip:

Trace Opening D:/nlog-all-2020-09-04.log with allowFileSharedWriting=False

It changes the timestamp, as you can see from the date in the file name within the line. What is the easiest way to achieve this, please?
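The standard mechanism for dropping individual events at index time is a nullQueue transform, configured on the indexer (or heavy forwarder) against the input's sourcetype. A sketch, where your_sourcetype is a placeholder:

```ini
# props.conf
[your_sourcetype]
TRANSFORMS-drop_trace = drop_trace_line

# transforms.conf
[drop_trace_line]
REGEX = ^Trace Opening .* with allowFileSharedWriting=
DEST_KEY = queue
FORMAT = nullQueue
```

Events matching the REGEX are routed to the null queue and never indexed, so they also cannot interfere with timestamp extraction.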
I've got a vulnerability scan showing that SSLv3 is enabled on port 8090 on our Splunk 7.1.1 indexer. In my server.conf file we don't have these lines:

[sslConfig]
sslVersions = *,-ssl2,-ssl3
cipherSuite = TLSv1.2:!eNULL:!aNULL

Is it OK to add them manually to disable SSLv3 on port 443 (TCP)?
Hi, I've been trying to install this app: https://splunkbase.splunk.com/app/1293/ which uses this TA: https://splunkbase.splunk.com/app/3418/ But unfortunately it uses UI components which break the web interface. Does anyone have NetApp devices and have a workaround to onboard this data into Splunk 8.0?
Hi All! Has anyone used "Splunk for Excel Export" in Splunk 8? It worked when I was using Splunk 6.5, but after upgrading to Splunk 8 it stopped working and I get a "page not found" error. Any suggestions? Thank you in advance!
Hi, may I know the correct SPL to show a scatter plot chart with time on the x-axis and a number on the y-axis? I read here that this feature is currently not supported and is listed as an enhancement request: https://community.splunk.com/t5/Splunk-Search/How-to-present-the-date-and-value-not-in-epoch-format-and-is/td-p/379953 I tried this query:

.. search | table _time value

However, the scatter plot shows the time on the x-axis as the numbers 0, 5, 10, ..., and there is only one point at the zero value. May I know if the enhancement request is ready, or is there any workaround for this problem? Thanks.
Hi everyone, I am new to Splunk but I have knowledge of SQL. I have a problem with my search query: can I use the data from the first search in the next search within the same query? Example:

1) index=abc sourcetype="abc:scanning" status="new_scanning" | table serial_id, action

serial_id action
6577 drive
8872 swim

2) index=abc sourcetype="abc:viewing" status="new_viewing" | table serial_id, person_in_charge, total_person

serial_id person_in_charge total_person
6577 Tom 45
8872 Giga 23

May I know the best way to get this kind of output?

serial_id person_in_charge total_person action
6577 Tom 45 drive
8872 Giga 23 swim

My idea is to use join or append, but I do not know how to reference "serial_id" in the second search. Thanks in advance.
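A common pattern for this (a sketch, assuming serial_id uniquely identifies each record in both sourcetypes) avoids join entirely: search both sourcetypes at once and merge the fields with stats:

```spl
index=abc (sourcetype="abc:scanning" status="new_scanning") OR (sourcetype="abc:viewing" status="new_viewing")
| stats values(action) AS action, values(person_in_charge) AS person_in_charge, values(total_person) AS total_person by serial_id
```

This is usually faster and more robust than join, which carries its own subsearch result limits.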
Why do we encounter "Does not meet the recommended minimum system" only for ESSH03, even though all of the systems have a similar configuration?
How do we come to a conclusion about which Data Model should be applied to a specific use case? For raw data like: id: 8766899, timestamp: 2020-08-31, host name: cdoqbice2929, status="rebooted", message="system has rebooted". Can someone help with this?
Hi, I'm bringing SRX data into Splunk, but the fields aren't getting extracted by the Juniper Add-on. Can the Juniper Add-on parse SRX logs? And if so, what could be the issue? The logs are coming in, but they're not searchable since there are no field extractions.
Dear All, I have a question about setting up a blacklist IP use case. I created a blacklist.csv which stores over 500,000 records in a format like:

BlacklistIP
x.x.x.x
abc.com
y.y.y.y
bcd.com

I use the following search:

index=test dst_ip=* OR src_ip=* [ | inputlookup blacklist.csv | fields BlacklistIP | rename BlacklistIP as query ]

However, I discovered that Splunk limits the subsearch to 10,000 results. If 1.1.1.1 is in row 1000 and the src_ip/dst_ip is 1.1.1.1, it appears in the search result. If 3.3.3.3 is in row 30000, even if the src_ip/dst_ip is 3.3.3.3, it does not appear in the search result. If 4.4.4.4 is in row 50000, even if the src_ip/dst_ip is 4.4.4.4, it does not appear either. After I changed the subsearch limit in limits.conf:

maxout = 1000000
maxtime = 240
ttl = 600

the result contains 3.3.3.3, but 4.4.4.4 still does not appear. Also, the search takes a long time, maybe around 5 to 6 minutes. Here is the hardware spec: Splunk Enterprise Server 8.0.4, Linux, 7.64 GB physical memory, 8 CPU cores, mode: standalone. Is there any suggestion for me? Thank you for your help!
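One workaround (a sketch) is to sidestep the subsearch entirely: since blacklist.csv is already a lookup file, it can be applied with the lookup command, which is not subject to the subsearch result cap:

```spl
index=test dst_ip=* OR src_ip=*
| lookup blacklist.csv BlacklistIP AS src_ip OUTPUT BlacklistIP AS src_match
| lookup blacklist.csv BlacklistIP AS dst_ip OUTPUT BlacklistIP AS dst_match
| where isnotnull(src_match) OR isnotnull(dst_match)
```

This assumes the CSV is uploaded as a lookup table file shared with the search's app; for 500,000 rows it may also be worth reviewing the lookup memory settings (e.g. max_memtable_bytes) in limits.conf.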