All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hello, we have a few URLs being monitored by a Splunk alert (query pasted below for reference) using the "Website Monitoring" add-on. We have observed, however, that a few URLs randomly return a non-200 HTTP status code that resolves itself in the next iteration. We have therefore been asked to implement logic whereby an alert is raised only if a URL fails twice consecutively.

Query:

index=urlperf sourcetype="web_ping" [| inputlookup URL_Title.csv] | stats latest(response_code) as response_code latest(_time) as _time by url | where response_code>=300 | eval Status="Down",Timestamp=strftime(_time,"%d/%m/%Y %H:%M:%S") | rename response_code as "HTTP Response Code" url as URL | table Timestamp,URL,"HTTP Response Code", Status | dedup URL

As an example: given that the monitored URL is "http://mywebsite.com" with a frequency of 5 minutes, the stakeholders want an alert to be raised only for case 2 and NOT for case 1. Could someone please help with how we could accomplish this through a Splunk alert?

Case 1: 08:00 hrs url=http://mywebsite.com response_code=404 timed_out=False
        08:05 hrs url=http://mywebsite.com response_code=200 timed_out=False
Case 2: 08:00 hrs url=http://mywebsite.com response_code=504 timed_out=False
        08:05 hrs url=http://mywebsite.com response_code=401 timed_out=False

Thank you in advance!
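A sketch of one possible approach (assuming web_ping emits one event per URL on each 5-minute run): look back over the last two runs per URL with streamstats and require both response codes to be non-200 before alerting.

```spl
index=urlperf sourcetype="web_ping" earliest=-11m [| inputlookup URL_Title.csv]
| sort 0 _time
| streamstats window=2 count as samples min(response_code) as worst_code by url
| stats latest(samples) as samples latest(worst_code) as worst_code latest(_time) as _time by url
| where samples=2 AND worst_code>=300
| eval Status="Down", Timestamp=strftime(_time,"%d/%m/%Y %H:%M:%S")
| rename worst_code as "HTTP Response Code", url as URL
| table Timestamp, URL, "HTTP Response Code", Status
```

Since min(response_code) over the last two pings is only >=300 when both pings failed, case 1 (404 then 200) does not fire while case 2 (504 then 401) does. The earliest=-11m window (two runs plus slack) and the 5-minute cadence are assumptions to adjust to the actual schedule.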
I am trying to set up the Deep Learning Toolkit on Splunk Cloud using Azure Kubernetes Service. I am able to connect to the containers and launch a Jupyter notebook; however, when I try to execute an example model, the following error message is received: Error in 'fit' command: Error while initializing algorithm "MLTKContainer": Failed to load algorithm "mltkc.MLTKContainer". The algorithm used in the example is present in the app/models folder in the Jupyter notebook. Any thoughts on what might be wrong here?
We have an AWS EMR cluster where we need to check the job waiting time over time, so we need to create a chart with time on the x-axis and job waiting time on the y-axis. The raw data looks like this:

{"tsu":16381197,"app":"log.prod","hst":"ip-100-**-***-***.us","lvl":"INFO","ctr":"spark-***","kns":"prod","cid":"******","pod":"data-driver","env":"prod","cna":"prod-1","msg":"21/11/28 17:15:50 INFO GenerationExecutor$: sent metrics: data.job_waiting_time: 124"}

We need data.job_waiting_time (here 124) on the y-axis and time on the x-axis; I think we could also use the log time, 21/11/28 17:15:50.
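A sketch of one way to chart this (assuming the JSON extracts the message into a field named msg; the index name is a placeholder): pull the waiting time out with rex and plot it with timechart.

```spl
index=your_emr_index "data.job_waiting_time"
| rex field=msg "data\.job_waiting_time: (?<job_waiting_time>\d+)"
| timechart span=5m avg(job_waiting_time) as job_waiting_time
```

If the embedded log time should drive the x-axis instead of the event _time, it can be parsed the same way: `| rex field=msg "^(?<log_time>\d+/\d+/\d+ \d+:\d+:\d+)" | eval _time=strptime(log_time, "%y/%m/%d %H:%M:%S")` before the timechart.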
Hello everyone, I am posting this question because I didn't find any solution. I have a trial version of Splunk Enterprise, and I have already added forwarders on the servers I want to monitor. I am trying to install the Splunk Add-on for Unix and Linux on these forwarders to be able to monitor their CPU, RAM and disk usage. The problem is that these machines run Ubuntu 20.04 without a graphical interface, and no option is available to download the add-on's .tgz file directly via the command line, so I am unable to download this file on my forwarders. Any ideas?

PS: If no link/command is available to do so, is there another way to import the RAM, CPU and disk data from these forwarders?

Thank you in advance!
I have disabled a few of the correlation searches and would like to delete them from the "Top Notable Events" panel on the ES Security Posture page. There are some recommendations on this, but the answers are quite old. Is there any good way to achieve this with minimal impact? As I understand it, I would have to modify the KV store lookup for it.
Hi, I have index data as below, and I have a KV store per account with additional info.

Example scenario (account numbers and corresponding KV stores):

Index data:
AccountID   ResourceID
Account1    Resource1.1
Account1    Resource1.2
Account2    Resource2.1
Account2    Resource2.2

KV stores:

Account1_Collection:
ResourceID   IP
Resource1.1  1.1.0.0
Resource1.2  1.1.1.1

Account2_Collection:
ResourceID   IP
Resource2.1  2.2.0.0
Resource2.2  2.2.1.1

Required output:
AccountID   ResourceID   IP
Account1    Resource1.1  1.1.0.0
Account1    Resource1.2  1.1.1.1
Account2    Resource2.1  2.2.0.0
Account2    Resource2.2  2.2.1.1

I used the approach mentioned in the answer here: Solved: How to use a variable to determine which CSV looku... - Splunk Community

... | eval keyA=if(fieldX="value1", fieldX, null()) | lookup lookupA keyA | eval keyB=if(fieldX="value2", fieldX, null()) | lookup lookupB keyB | eval keyC=if(fieldX="value3", fieldX, null()) | lookup lookupC keyC

but this approach is not dynamic: if I have a new value, and hence a new lookup, I need to update the searches. I want the search to dynamically pick the correct lookup based on the value in the event. Thanks in advance, SN
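SPL cannot choose a lookup name from a field value at search time, so any fully dynamic answer needs restructuring. A sketch of one workaround (assuming lookup definitions named Account1_Collection and Account2_Collection exist over the collections, and the index name is a placeholder): merge the collections into one table, tag each row with its AccountID, and join on that.

```spl
index=your_index
| join type=left AccountID ResourceID [
    | inputlookup Account1_Collection
    | eval AccountID="Account1"
    | append [
        | inputlookup Account2_Collection
        | eval AccountID="Account2" ]
]
| table AccountID ResourceID IP
```

This still enumerates the collections, but only once in the subsearch. The cleaner long-term fix is a single collection with an AccountID column, so a plain `| lookup` on AccountID and ResourceID works and new accounts need no search changes.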
Hi, I have configured IT Essential Works (4.9.2) with the Exchange content pack (1.4.3) and TA-Exchange-ClientAccess (4.0.3). By chance I was checking PowerShell event logs on our Exchange server and saw the error below.

Log Name: Microsoft-Windows-PowerShell/Operational Source: Microsoft-Windows-PowerShell Date: 29/11/2021 11:34:13 Event ID: 4100 Task Category: Executing Pipeline Level: Warning Keywords: None User: SYSTEM Computer: XXXX.YYYY.ZZZZ Description: Error Message = Object reference not set to an instance of an object. Fully Qualified Error ID = System.NullReferenceException,Microsoft.Exchange.Management.SystemConfigurationTasks.SearchAdminAuditLog Context: Severity = Warning Host Name = ConsoleHost Host Version = 5.1.14393.4583 Host ID = 644d49a8-7f8f-4b1e-9250-959ff1a8b7b4 Host Application = Powershell -PSConsoleFile E:\Program Files\Microsoft\Exchange Server\V15\\bin\exshell.psc1 -command . 'C:\Program Files\SplunkUniversalForwarder\etc\apps\TA-Exchange-ClientAccess\bin\powershell\read-audit-logs_2010_2013.ps1' Engine Version = 5.1.14393.4583 Runspace ID = 10a2c198-89dd-47c1-99c1-4d493d35a837 Pipeline ID = 1 Command Name = Search-AdminAuditLog Command Type = Cmdlet Script Name = C:\Program Files\SplunkUniversalForwarder\etc\apps\TA-Exchange-ClientAccess\bin\powershell\read-audit-logs_2010_2013.ps1 Command Path = Sequence Number = 19 User = XXXXX\SYSTEM Connected User = Shell ID = Microsoft.PowerShell User Data: Event Xml: <Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event"> <System> <Provider Name="Microsoft-Windows-PowerShell" Guid="{A0C1853B-5C40-4B15-8766-3CF1C58F985A}" /> <EventID>4100</EventID> <Version>1</Version> <Level>3</Level> <Task>106</Task> <Opcode>19</Opcode> <Keywords>0x0</Keywords> <TimeCreated SystemTime="2021-11-29T10:34:13.679546900Z" /> <EventRecordID>5879670</EventRecordID> <Correlation ActivityID="{2DF7DE6F-E0AC-000E-036D-F92DACE0D701}" /> <Execution ProcessID="62888" ThreadID="41736" /> 
<Channel>Microsoft-Windows-PowerShell/Operational</Channel> <Computer>XXXXXX.YYYY.ZZZZ</Computer> <Security UserID="S-1-5-18" /> </System> <EventData> <Data Name="ContextInfo"> Severity = Warning Host Name = ConsoleHost Host Version = 5.1.14393.4583 Host ID = 644d49a8-7f8f-4b1e-9250-959ff1a8b7b4 Host Application = Powershell -PSConsoleFile E:\Program Files\Microsoft\Exchange Server\V15\\bin\exshell.psc1 -command . 'C:\Program Files\SplunkUniversalForwarder\etc\apps\TA-Exchange-ClientAccess\bin\powershell\read-audit-logs_2010_2013.ps1' Engine Version = 5.1.14393.4583 Runspace ID = 10a2c198-89dd-47c1-99c1-4d493d35a837 Pipeline ID = 1 Command Name = Search-AdminAuditLog Command Type = Cmdlet Script Name = C:\Program Files\SplunkUniversalForwarder\etc\apps\TA-Exchange-ClientAccess\bin\powershell\read-audit-logs_2010_2013.ps1 Command Path = Sequence Number = 19 User = XXXXXXX\SYSTEM Connected User = Shell ID = Microsoft.PowerShell </Data> <Data Name="UserData"> </Data> <Data Name="Payload">Error Message = Object reference not set to an instance of an object. Fully Qualified Error ID = System.NullReferenceException,Microsoft.Exchange.Management.SystemConfigurationTasks.SearchAdminAuditLog </Data> </EventData> </Event>   Log Name: Microsoft-Windows-PowerShell/Operational Source: Microsoft-Windows-PowerShell Date: 29/11/2021 11:34:09 Event ID: 4104 Task Category: Execute a Remote Command Level: Warning Keywords: None User: SYSTEM Computer: XXXXX.YYYY.ZZZZ Description: Creating Scriptblock text (1 of 1): ######################################################## # # Splunk for Microsoft Exchange # Exchange 2010/2013 Mailbox Store Data Definition # # Copyright (C) 2005-2021 Splunk Inc. All Rights Reserved. # All Rights Reserved # ######################################################## # # This returns the filename of the audit database - due to some # funkiness with permissions, deployment server and the local # directory, we're using the %TEMP% as the location. 
For the # NT Authority\SYSTEM account, this is normally C:\Windows\Temp # $AuditTempFile = $ENV:Temp | Join-Path -ChildPath "splunk-msexchange-mailboxauditlogs.clixml" [Console]::OutputEncoding = [Text.UTF8Encoding]::UTF8 $AuditDetails = @{} if (Test-Path $AuditTempFile) { $AuditDetails = Import-CliXml $AuditTempFile } # # Given a single audit record from the Search-MailboxAuditLog function Output-AuditRecord($Record) { $O = New-Object System.Collections.ArrayList $D = Get-Date $Record.LastAccessed -format 'yyyy-MM-ddTHH:mm:sszzz' [void]$O.Add($D) foreach ($p in $Record.PSObject.Properties) { [void]$O.Add("$($p.Name)=`"$($Record.PSObject.Properties[$p.Name].Value)`"") } Write-Host ($O -join " ") } function Output-AuditLog($Mailbox) { $Identity = $Mailbox.Identity $IdentityStr = $Identity.ToDNString() $LastSeen = (Get-Date).AddMonths(-1) if ($AuditDetails.ContainsKey($Identity)) { $LastSeen = $AuditDetails[$Identity] $AuditDetails.Remove($Identity) $AuditDetails[$IdentityStr] = $LastSeen } elseif ($AuditDetails.ContainsKey($IdentityStr)) { $LastSeen = $AuditDetails[$IdentityStr] } $LastRecord = $LastSeen Search-MailboxAuditLog -Identity $Identity -LogonTypes Owner,Delegate,Admin -ShowDetails -StartDate $LastSeen -EndDate (Get-Date)| sort LastAccessed | Foreach-Object { if ($_.LastAccessed -gt $LastSeen) { Output-AuditRecord($_) } $LastRecord = $_.LastAccessed } $AuditDetails[$IdentityStr] = $LastRecord } $Mailboxes = Get-Mailbox -Filter { AuditEnabled -eq $true } -Server $Env:ComputerName -ResultSize Unlimited $Mailboxes | Foreach-Object { If($_ -ne $null) { Output-AuditLog($_) }} # # Now that we have done the work, save off the Audit Temp File $AuditDetails | Export-CliXml $AuditTempFile ScriptBlock ID: 1e79b015-daf6-40c2-8c87-fedde7b4a866 Path: C:\Program Files\SplunkUniversalForwarder\etc\apps\TA-Exchange-Mailbox\bin\powershell\read-mailbox-audit-logs_2010_2013.ps1 Event Xml: <Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event"> <System> <Provider 
Name="Microsoft-Windows-PowerShell" Guid="{A0C1853B-5C40-4B15-8766-3CF1C58F985A}" /> <EventID>4104</EventID> <Version>1</Version> <Level>3</Level> <Task>2</Task> <Opcode>15</Opcode> <Keywords>0x0</Keywords> <TimeCreated SystemTime="2021-11-29T10:34:09.300998400Z" /> <EventRecordID>5879669</EventRecordID> <Correlation ActivityID="{2DF7DE6F-E0AC-0010-CA3C-F92DACE0D701}" /> <Execution ProcessID="38808" ThreadID="54116" /> <Channel>Microsoft-Windows-PowerShell/Operational</Channel> <Computer>XXXX.YYYY.ZZZ</Computer> <Security UserID="S-1-5-18" /> </System> <EventData> <Data Name="MessageNumber">1</Data> <Data Name="MessageTotal">1</Data> <Data Name="ScriptBlockText">######################################################## # # Splunk for Microsoft Exchange # Exchange 2010/2013 Mailbox Store Data Definition # # Copyright (C) 2005-2021 Splunk Inc. All Rights Reserved. # All Rights Reserved # ######################################################## # # This returns the filename of the audit database - due to some # funkiness with permissions, deployment server and the local # directory, we're using the %TEMP% as the location. 
For the # NT Authority\SYSTEM account, this is normally C:\Windows\Temp # $AuditTempFile = $ENV:Temp | Join-Path -ChildPath "splunk-msexchange-mailboxauditlogs.clixml" [Console]::OutputEncoding = [Text.UTF8Encoding]::UTF8 $AuditDetails = @{} if (Test-Path $AuditTempFile) { $AuditDetails = Import-CliXml $AuditTempFile } # # Given a single audit record from the Search-MailboxAuditLog function Output-AuditRecord($Record) { $O = New-Object System.Collections.ArrayList $D = Get-Date $Record.LastAccessed -format 'yyyy-MM-ddTHH:mm:sszzz' [void]$O.Add($D) foreach ($p in $Record.PSObject.Properties) { [void]$O.Add("$($p.Name)=`"$($Record.PSObject.Properties[$p.Name].Value)`"") } Write-Host ($O -join " ") } function Output-AuditLog($Mailbox) { $Identity = $Mailbox.Identity $IdentityStr = $Identity.ToDNString() $LastSeen = (Get-Date).AddMonths(-1) if ($AuditDetails.ContainsKey($Identity)) { $LastSeen = $AuditDetails[$Identity] $AuditDetails.Remove($Identity) $AuditDetails[$IdentityStr] = $LastSeen } elseif ($AuditDetails.ContainsKey($IdentityStr)) { $LastSeen = $AuditDetails[$IdentityStr] } $LastRecord = $LastSeen Search-MailboxAuditLog -Identity $Identity -LogonTypes Owner,Delegate,Admin -ShowDetails -StartDate $LastSeen -EndDate (Get-Date)| sort LastAccessed | Foreach-Object { if ($_.LastAccessed -gt $LastSeen) { Output-AuditRecord($_) } $LastRecord = $_.LastAccessed } $AuditDetails[$IdentityStr] = $LastRecord } $Mailboxes = Get-Mailbox -Filter { AuditEnabled -eq $true } -Server $Env:ComputerName -ResultSize Unlimited $Mailboxes | Foreach-Object { If($_ -ne $null) { Output-AuditLog($_) }} # # Now that we have done the work, save off the Audit Temp File $AuditDetails | Export-CliXml $AuditTempFile </Data> <Data Name="ScriptBlockId">1e79b015-daf6-40c2-8c87-fedde7b4a866</Data> <Data Name="Path">C:\Program Files\SplunkUniversalForwarder\etc\apps\TA-Exchange-Mailbox\bin\powershell\read-mailbox-audit-logs_2010_2013.ps1</Data> </EventData> </Event>   Any idea on what could be 
happening? I found some answers related to the user account running the Splunk Universal Forwarder on the server. It is currently configured as local SYSTEM, and some answers found on the internet mentioned using a domain account with Exchange permissions, but I checked the official documentation and could not find any mention of it. Thanks.
I was given a base search to manipulate and create a timechart accordingly:

base search | eval file_line = file.":".line | eval errorList = message . ":" . file_line . ":" . level | top 25 message,file_line,level by applicationBuild | table applicationBuild level count file_line message

The result was formatted like this:

build2021 ERROR 8 file1.java:111 ErrorMessage123
build2021 ERROR 4 file2.java:123 ErrorMessage456
build2021 ERROR 3 file3.java:456 ErrorMessage789

I want to plot a timechart from the above result; the output should be the top 25 error messages above, against time. I came up with:

base search | eval file_line = file.":".line | eval errorList = message.";".file_line.";".level | where errorList!="null" | timechart useother=f usenull=f count max(message,file_line,level,applicationBuild) by errorList limit=25

The legend of the result prepends "count: " to each errorList value, and I cannot remove it. Any idea how I can remove this from the legend? Or is there a better way to achieve the same result?
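The "count: " prefix appears because the timechart has more than one aggregation (count plus max(...)); with multiple aggregations Splunk labels each series with the aggregation name. A sketch with a single aggregation, which leaves plain errorList values in the legend:

```spl
base search
| eval file_line = file.":".line
| eval errorList = message.";".file_line.";".level
| where isnotnull(errorList)
| timechart useother=f usenull=f limit=25 count by errorList
```

Note that max() of several string fields is not meaningful in a timechart anyway; if the file/line/level must appear in the legend, concatenating them into errorList (as above) is the usual way to carry them along.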
Hi all, I have a text input for a table header. My requirement is that, by default, the table should show all the values, and if any letters are typed in the text box, they should be matched against the table headers, and only the columns whose header contains that substring should be displayed. I created the text box but haven't figured out how to match this substring to the header. Please help me with this.
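One sketch of this (assuming the text input sets a token named filter with a default value of *): wildcard the token inside the fields list of the panel search, so column names are matched by substring.

```spl
your base search here
| table *$filter$*
```

With the default *, every column is shown; typing e.g. err narrows the table to columns whose header contains err. The token name and default are assumptions to set on the text input in the dashboard.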
Hi, we are using a curl script to call the Splunk REST API to send data out of Splunk (to Kafka/Elasticsearch). We have 100,000+ (1 lakh+) events every second, so when calling the REST API (every 5 seconds), it times out. Sample curl command for calling the REST API:

curl -k -u admin:changeme https://localhost:8089/services/search/jobs/ -d search="search index=sample sourcetype=access_* earliest=-5m"

What is the limit on the number of events we can extract at a time through a REST API call? What are the default timeout settings, and is it possible to change them? Is there a better way to send Splunk data outside? I also tried a Python script using splunklib.client; that failed as well. Appreciate your inputs in advance. Regards, Deev
Hi, with the query below I get the count of calls whose response time is more than 10000 milliseconds:

index="ab_cs" host="aw-lx0456.vint.ent" source="cb-ss-service" AND ((RequestedURL="/man/*/details" OR REQUESTED_URL="/man/*/contacts") OR (RequestedURL="/contacts/*/details" OR REQUESTED_URL="/contacts/*/members")) AND (ResponseStatus OR HttpStatusCode) | sort -1 Timetaken | eval TimeTaken3=trim(replace(Timetaken, ",","")) | where TimeTaken3>=10000 | stats count as ResponseOver10Sec

But here I want to send an alert when the count of ResponseOver10Sec is more than 2% of total transactions. Could you please suggest an appropriate query?
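A sketch of one way to compute the percentage in a single pass (assuming the same base search and field names as above; tonumber is added so the threshold comparison is numeric): count all calls once, count the slow calls with an eval inside stats, and alert on the ratio.

```spl
index="ab_cs" host="aw-lx0456.vint.ent" source="cb-ss-service" AND ((RequestedURL="/man/*/details" OR REQUESTED_URL="/man/*/contacts") OR (RequestedURL="/contacts/*/details" OR REQUESTED_URL="/contacts/*/members")) AND (ResponseStatus OR HttpStatusCode)
| eval TimeTaken3=tonumber(trim(replace(Timetaken, ",", "")))
| stats count as TotalCalls count(eval(TimeTaken3>=10000)) as ResponseOver10Sec
| eval PctOver10Sec=round(ResponseOver10Sec * 100 / TotalCalls, 2)
| where PctOver10Sec > 2
```

The alert can then trigger on "number of results > 0"; the final where clause only returns a row when the slow calls exceed 2% of the total.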
I need a rex command to create a domain field from a website URL. Example input: https://www.youtube.com/sd/td/gs-intro. Expected output: www.youtube.com
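A sketch, assuming the URL is in a field named url: capture everything between the scheme and the first slash of the path.

```spl
your base search
| rex field=url "^https?://(?<domain>[^/]+)"
```

For https://www.youtube.com/sd/td/gs-intro this yields domain=www.youtube.com; note that a port (e.g. :8080) would be captured too, and can be excluded by using [^/:]+ instead.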
Hello! We want to integrate McAfee ePO into Splunk Cloud, but we have only found tutorials on syslogging the data. I've been looking, and I don't think it's possible to send syslog directly to Splunk Cloud. How can we do it? Thanks!
I want to send an alert when 2% of total transaction calls take more than 10000 milliseconds. Could anyone please suggest an appropriate query?
Splunk's trellis visualization documentation page shows example searches for things like count by sourcetype, and later shows trellised visualizations for multi-value items, but there are no example searches for them. My data looks like this:

{
   audit: {
     audit_enabled: Compliant,
     control_access: NotCompliant,
     firewall_on: NotCompliant,
     etc: ...
   }
}

I can create a separate search for each item in audit {} like this:

source=device_audit | stats count by audit.audit_enabled

But there are many audit items, and I'd like to trellis pie charts for each audit item without creating a separate search for each. Is there a search I can use with trellis to produce three pie charts showing the split between Compliant and NotCompliant for each of the audit items (audit_enabled/control_access/firewall_on)? Thank you.
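A sketch of one way to avoid per-field searches (assuming the JSON extracts as fields audit.audit_enabled, audit.control_access, audit.firewall_on): flatten the audit fields into name/value rows with untable, then count by both, and set the trellis split to the field name.

```spl
source=device_audit
| fields audit.*
| eval row=1
| untable row audit_item compliance
| stats count by audit_item compliance
```

With a pie chart and trellis split by audit_item, each audit field should get its own Compliant/NotCompliant pie. The untable step is the part to verify against your data; it assumes the audit.* fields survive into the results table for every event.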
Hi there, I'm sitting here trying to make sense of the different search types in Splunk (i.e. Dense, Sparse, Super-sparse, and Rare), how they affect performance, and why that is. I get that with a Dense search, e.g. when you are matching nearly every event in an index, there is no point in utilising bloom filters, because there is no need to rule out buckets to find specific events. However, why isn't it beneficial for Sparse and Super-sparse searches to make use of bloom filters?
Hi, I am trying to pull data from a CSV through a deployment app, but only the field names are getting indexed; the data itself is not. The number of records in the CSV is around 70,000. I tried through DB Connect as well, but hit the same issue there. Is there any limit on the amount of data that can be indexed at a time? If yes, where can that be verified? Thanks
We have logs coming in from one source in CEF format. How do we handle CEF-format data parsing in Splunk so that it gets automatically converted into field-value pairs? After that, I could alias those fields based on my data model needs. Kindly suggest. Thanks in advance.
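For proper CEF parsing, a dedicated add-on (e.g. the CEF Extraction Add-on for Splunk) applied to the sourcetype is the usual route. As a quick search-time sketch (the sourcetype name is a placeholder), the pipe-delimited CEF header can be split with rex and the key=value extension parsed with extract:

```spl
sourcetype=my_cef_source
| rex "CEF:\d+\|(?<dvc_vendor>[^|]*)\|(?<dvc_product>[^|]*)\|(?<dvc_version>[^|]*)\|(?<signature_id>[^|]*)\|(?<event_name>[^|]*)\|(?<severity>[^|]*)\|"
| extract kvdelim="=" pairdelim=" "
```

extract runs against _raw, so the space-separated key=value pairs in the CEF extension (src=..., dst=..., etc.) are picked up; values that themselves contain spaces would need a more careful regex-based extraction.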
Hello, I am trying to track failed logons followed by a successful one using the transaction command and the following criteria: limit the time span to 5 minutes, add a startswith so each transaction begins with a logon failure, add an endswith so each transaction ends with a logon success, and add a | where to find when the eventcount exceeds 3. This is what I have so far
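A sketch assembling those criteria (assuming Windows Security logs, where EventCode 4625 is a failed logon and 4624 a successful one, correlated per user; the index name is a placeholder):

```spl
index=wineventlog (EventCode=4624 OR EventCode=4625)
| transaction user maxspan=5m startswith="EventCode=4625" endswith="EventCode=4624"
| where eventcount > 3
```

eventcount > 3 keeps transactions with at least three failures before the closing success; adjust the index, field names, and threshold to your environment.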
Hi all, I need to compare a field against a series of error codes. I would prefer not to hard-code the error codes in the search; I would like to use a lookup table instead. I entered the error codes in a column (Name = Errors) of the table, but when I perform the search, they are not compared correctly. The column contains, for example: login.error.1004. The search:

tag=Log | lookup ServiziApp.csv ServiceName AS Service | search Functionality="Access" errorCode!=Errors

But rows whose errorCode field equals login.error.1004 are still displayed. Checking the extracted fields, the errorCode field contains login.error.1004 and the Errors field also contains login.error.1004. Thanks in advance
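Note that search errorCode!=Errors compares errorCode against the literal string "Errors", not against another field or the lookup column. A sketch of one way to filter via the lookup itself (error_codes is a hypothetical lookup definition name over the table containing the Errors column): use the lookup as a match test and keep only rows with no match.

```spl
tag=Log
| lookup ServiziApp.csv ServiceName AS Service
| search Functionality="Access"
| lookup error_codes Errors AS errorCode OUTPUT Errors AS matched_error
| where isnull(matched_error)
```

Rows whose errorCode appears in the Errors column get matched_error filled in and are dropped; everything else passes through.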