All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

I have an index that ingests scan files and assigns a sourcetype based on the folder location. There are several scans for each host, which caused some issues when I tried to join or append. I'm trying to build a search that takes the host, scan, and version in one folder, compares them against the host, scan, and version in the other, and creates a table with the host, the Evaluate scan and version, and the Authoritative scan and version, telling me whether we have the most up-to-date scans in the Authoritative folder. What's the best method to create this search? Example data:

HOST_FQDN: Host1  sourcetype: Evaluate  SCAN: IE11  Version: 2
HOST_FQDN: Host1  sourcetype: Authoritative  SCAN: IE11  Version: 3
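One possible approach (a sketch, not a tested solution — the index name your_index is a placeholder, and Version is assumed to be numeric) is to bring both sourcetypes into one search and aggregate by host and scan with stats, then compare the two versions:

```
index=your_index sourcetype IN ("Evaluate", "Authoritative")
| stats max(eval(if(sourcetype="Evaluate", Version, null()))) AS eval_version
        max(eval(if(sourcetype="Authoritative", Version, null()))) AS auth_version
        by HOST_FQDN SCAN
| eval up_to_date=if(auth_version >= eval_version, "yes", "no")
| table HOST_FQDN SCAN eval_version auth_version up_to_date
```

This avoids join and append entirely; a host/scan pair that exists only in the Evaluate folder will show a null auth_version, which is itself a useful signal.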
There is a particular section of the Splunk configuration/installation manual titled Change Administrators group membership on each host that I am having an issue with. It states:

“Confirm that all accounts that need access to the Administrators group on each host have been added to the Restricted Groups policy setting. Failure to do so can result in losing administrative access to the hosts on which you apply this GPO!”

However, implementing this removes all local admin privileges from local accounts on our client machines. Therefore, I have the following questions:

1) Am I misunderstanding the quoted text? It reads as though following this section should prevent exactly what is happening to us, namely the removal of local administration privileges.
2) In either case: for our purposes, it is vital that we have at least one local account with admin privileges on each computer. Is it required that the Splunk Access GPO remove administration privileges from local accounts, or is this just a step that would normally improve security on a network in general?
Hi, I would like to integrate our Splunk on-prem environment with our ServiceNow ITOM so that events can be sent from Splunk to the Event Management add-on for ServiceNow ITOM. Do I need to use a ServiceNow MID server? I would like my Splunk environment to initiate communication with the MID server, but I am having a hard time finding documentation on how to configure this. Thanks
I am searching a source whose events have FieldA and FieldB. I need to find the events that have specific FieldA values (x or y) AND matching FieldB values (nonspecific). My current search is:

index=source FieldA IN ("x", "y")

I'm not sure how to filter the results to show only the events that have matching FieldB values.
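If "matching" means the same FieldB value occurs in both an x event and a y event, one sketch is to count distinct FieldA values per FieldB with eventstats and keep only the groups where both appear:

```
index=source FieldA IN ("x", "y")
| eventstats dc(FieldA) AS distinct_fielda by FieldB
| where distinct_fielda=2
| fields - distinct_fielda
```

This keeps the raw events, dropping any whose FieldB value is not shared by both FieldA values.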
How can I erex a line like TRUE, FALSE, TRUE, , FALSE, FALSE, FALSE, , FALSE, FALSE with source="an imported CSV"? The multiple TRUE and FALSE values on the line belong to different column names. I am trying to create a label for each TRUE and FALSE following a reference sheet.
Hi all, hoping someone can help me. I am trying to get the Palo Alto app working. We are a Splunk Cloud customer and have this app on our search head.

When I search for eventtype=pan I see the logs, but they are NOT reclassified.

Our setup: our Palo Alto firewalls push to a syslog server on standard port 514, and this data is currently being ingested as one syslog stream via a universal forwarder, where sourcetype=syslog and index=syslog.

In inputs.conf in /opt/splunk/etc/system/local I have configured the below:

[monitor:///data/rsyslog/10.0.0.1/10.0.0.1.log]
index = pan_logs
sourcetype = pan:log
host_segment = 3

The guide states to configure your TCP outputs in /opt/splunkforwarder/etc/system/local/outputs.conf. In this file we have:

[tcpout]
indexAndForward = 1

As a cloud customer we have our company app in /opt/splunk/etc/apps/OUR_COMPANY_APP/default. Its outputs.conf lists the following (there is no inputs file):

inputs1.name.splunkcloud.com:9997, inputs2.name.splunkcloud.com:9997, inputs3.name.splunkcloud.com:9997, inputs4.name.splunkcloud.com:9997, inputs5.name.splunkcloud.com:9997, inputs6.name.splunkcloud.com:9997

The inputs file being used is in /opt/splunk/etc/apps/search/local.

The Palo Alto app states to create or modify /opt/splunkforwarder/etc/system/local/outputs.conf and add a tcpout stanza for your indexers. Could I copy over the outputs from /opt/splunk/etc/apps/OUR_COMPANY_APP/default to /opt/splunkforwarder/etc/system/local/outputs.conf?
OK, I'm trying to improve performance by replacing some join queries with stats, but I'm struggling with a filter. I have the query below, with two source types whose common field between events is 'Correlator'. In source_one I have fields 'Correlator', 'sysplex', and 'servername'. In source_detail I have 'Correlator', 'sysplex', and multiple other fields; the one for this data is SAMPLE_NAME. 'servername' in source_one could have multiple values and I want to filter on a match, so search servername=xyz*. I've tried a number of ways and I can't seem to limit results to a filter on 'servername' without losing everything else; 'sysplex', which is in both sourcetypes, filters just fine. Any thoughts would be appreciated.

index=my_index sourcetype=source_one OR sourcetype=source_detail sysplex=ABC*
| stats values(SAMPLE_NAME) AS SampleName values(SAMPLE_TIME) AS SampleTime by Correlator,SampleTime
| eval _time=strptime(SampleTime,"%Y-%m-%d %H:%M:%S.%N")
| timechart span=1m count by SampleName
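One sketch of a stats-based fix (untested; it assumes detail events share a Correlator with a source_one event): propagate servername across each Correlator group with eventstats before filtering, so the detail events, which lack a servername of their own, are not discarded by the filter:

```
index=my_index sourcetype=source_one OR sourcetype=source_detail sysplex=ABC*
| eventstats values(servername) AS servernames by Correlator
| search servernames=xyz*
| stats values(SAMPLE_NAME) AS SampleName values(SAMPLE_TIME) AS SampleTime by Correlator
| eval _time=strptime(SampleTime,"%Y-%m-%d %H:%M:%S.%N")
| timechart span=1m count by SampleName
```

The search on the multivalue servernames field matches if any of its values starts with xyz; the rest of the pipeline is unchanged.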
Hello, I would like to know what software I can use to generate traffic for my lab. Thanks.
Hello, We designed a new model for NLP. We are running the model in a Jupyter Notebook and can see that it loads correctly. The problem appears when we are fitting the model from Splunk: an error says the model was not found. We are working with the command spacy.load() after uploading the model to the container (created in an external notebook using nlp.to_disk(output_dir)). Any suggestions? Thomas
Hi! I tried removing an app from a Search Head cluster, deleting it from the deployer's shcluster/apps directory and pushing the other apps, but this doesn't work properly and the app stays on my Search Heads. Is there another way to remove it? Thanks, Mauro
Hello, We have a few URLs being monitored by a Splunk alert (query pasted below for reference) that makes use of the "Website Monitoring" add-on. It has, however, been observed that a few URLs randomly return a non-200 HTTP status code that automatically gets resolved in the next iteration. We've therefore been asked to implement logic wherein an alert is only raised if a URL fails twice consecutively.

Query:
index=urlperf sourcetype="web_ping" [| inputlookup URL_Title.csv]
| stats latest(response_code) as response_code latest(_time) as _time by url
| where response_code>=300
| eval Status="Down", Timestamp=strftime(_time,"%d/%m/%Y %H:%M:%S")
| rename response_code as "HTTP Response Code" url as URL
| table Timestamp, URL, "HTTP Response Code", Status
| dedup URL

As an example, considering the URL being monitored is "http://mywebsite.com" with a frequency of 5 minutes, the stakeholders want an alert to be raised only for case 2 and NOT for case 1. Could someone please help with how we could accomplish this through a Splunk alert?

case 1: 08:00 hrs url=http://mywebsite.com response_code=404 timed_out=False
        08:05 hrs url=http://mywebsite.com response_code=200 timed_out=False
case 2: 08:00 hrs url=http://mywebsite.com response_code=504 timed_out=False
        08:05 hrs url=http://mywebsite.com response_code=401 timed_out=False

Thank you in advance!
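One sketch (untested) uses streamstats to look at each URL's two most recent pings: count failures in a sliding window of two events per URL, then alert only when the latest window holds two failures. The alert's time range must cover at least two ping intervals (e.g. the last 15 minutes for a 5-minute ping).

```
index=urlperf sourcetype="web_ping" [| inputlookup URL_Title.csv]
| sort 0 url _time
| streamstats window=2 global=f count(eval(response_code>=300)) AS consecutive_failures by url
| stats latest(consecutive_failures) AS consecutive_failures latest(response_code) AS response_code latest(_time) AS _time by url
| where consecutive_failures=2
| eval Status="Down", Timestamp=strftime(_time,"%d/%m/%Y %H:%M:%S")
| table Timestamp, url, response_code, Status
```

Case 1 ends with a 200, so its latest two-event window holds only one failure and no row is produced; case 2's window holds two failures and triggers the alert.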
I am trying to set up the Deep Learning Toolkit on Splunk Cloud using Azure Kubernetes Service. I am able to connect to the containers and launch Jupyter Notebook; however, when I try to execute an example model, the following error message is received: Error in 'fit' command: Error while initializing algorithm "MLTKContainer": Failed to load algorithm "mltkc.MLTKContainer". The algorithm used in the example is present in the app/models folder in the Jupyter notebook. Any thoughts on what might be wrong here?
We have an AWS EMR cluster where we need to check the job waiting time with respect to time: we need to create a chart with time on the x-axis and job waiting time on the y-axis. The raw data looks like this:

{"tsu":16381197,"app":"log.prod","hst":"ip-100-**-***-***.us","lvl":"INFO","ctr":"spark-***","kns":"prod","cid":"******","pod":"data-driver","env":"prod","cna":"prod-1","msg":"21/11/28 17:15:50 INFO GenerationExecutor$: sent metrics: data.job_waiting_time: 124"}

We need data.job_waiting_time (here 124) on the y-axis and time on the x-axis; I think we can also use the log time, 21/11/28 17:15:50.
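A sketch of one way to do this (the index name your_emr_index and the 5-minute span are placeholders; it assumes the JSON msg field is auto-extracted): pull the metric value and the log timestamp out of msg with rex, then chart it:

```
index=your_emr_index "data.job_waiting_time"
| rex field=msg "data\.job_waiting_time: (?<job_waiting_time>\d+)"
| rex field=msg "^(?<log_time>\d{2}/\d{2}/\d{2} \d{2}:\d{2}:\d{2})"
| eval _time=strptime(log_time, "%y/%m/%d %H:%M:%S")
| timechart span=5m avg(job_waiting_time) AS job_waiting_time
```

If the event's own _time is already correct at index time, the second rex and the eval that re-derive it from the message text can be dropped.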
Hello everyone, I am posting this question because I didn't find any solution. I have a trial version of Splunk Enterprise, and I have already added forwarders on the servers I want to monitor. I am trying to install the Splunk Add-on for Unix and Linux on these forwarders to be able to monitor their CPU, RAM, and disk usage. The problem is that these machines run Ubuntu 20.04 without a graphical interface, and no option is available to download the .tgz file of this add-on directly via the command line, so I am unable to download this file on my forwarders. Any ideas?

PS: If no link/command is available to do so, is there another way to collect the RAM, CPU, and disk data from these forwarders? Thank you in advance!
I have disabled a few of the correlation searches and would like to delete them from the "Top Notable Events" panel on the ES Security Posture page. There are some recommendations on this, but the answers are quite old. Is there any good way to achieve this with minimal impact, given that, as I understand it, I would have to modify the KV store lookup for it?
Hi, I have index data as below, and I have KV stores per account which hold additional info. Example scenario (account numbers and corresponding KV stores):

Index data:
AccountID  ResourceID
Account1   Resource1.1
Account1   Resource1.2
Account2   Resource2.1
Account2   Resource2.2

KV stores:

Account1_Collection
ResourceID   IP
Resource1.1  1.1.0.0
Resource1.2  1.1.1.1

Account2_Collection
ResourceID   IP
Resource2.1  2.2.0.0
Resource2.2  2.2.1.1

Required output:
AccountID  ResourceID   IP
Account1   Resource1.1  1.1.0.0
Account1   Resource1.2  1.1.1.1
Account2   Resource2.1  2.2.0.0
Account2   Resource2.2  2.2.1.1

I used the approach mentioned in the answer here: Solved: How to use a variable to determine which CSV looku... - Splunk Community

... | eval keyA=if(fieldX="value1", fieldX, null())
| lookup lookupA keyA
| eval keyB=if(fieldX="value2", fieldX, null())
| lookup lookupB keyB
| eval keyC=if(fieldX="value3", fieldX, null())
| lookup lookupC keyC

but this approach is not dynamic: if I have a new value, and hence a new lookup, I need to update the searches. I want the search to dynamically pick the correct lookup based on the value in the event. Thanks in advance, SN
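SPL cannot choose a lookup name per event at search time, so one common workaround (a sketch, assuming the per-account collections can be merged into a single collection, here hypothetically called all_accounts_collection, with AccountID, ResourceID, and IP fields) is to consolidate and match on both key fields:

```
index=your_index
| lookup all_accounts_collection AccountID, ResourceID OUTPUT IP
| table AccountID, ResourceID, IP
```

New accounts then only require adding rows to the merged collection, not editing the search.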
Hi, I have configured IT Essential Works (4.9.2) with Exchange content pack (1.4.3) and  TA-Exchange-ClientAccess (4.0.3). By chance I was checking PowerShell event logs in our exchange server and I saw the error bellow. Log Name: Microsoft-Windows-PowerShell/Operational Source: Microsoft-Windows-PowerShell Date: 29/11/2021 11:34:13 Event ID: 4100 Task Category: Executing Pipeline Level: Warning Keywords: None User: SYSTEM Computer: XXXX.YYYY.ZZZZ Description: Error Message = Object reference not set to an instance of an object. Fully Qualified Error ID = System.NullReferenceException,Microsoft.Exchange.Management.SystemConfigurationTasks.SearchAdminAuditLog Context: Severity = Warning Host Name = ConsoleHost Host Version = 5.1.14393.4583 Host ID = 644d49a8-7f8f-4b1e-9250-959ff1a8b7b4 Host Application = Powershell -PSConsoleFile E:\Program Files\Microsoft\Exchange Server\V15\\bin\exshell.psc1 -command . 'C:\Program Files\SplunkUniversalForwarder\etc\apps\TA-Exchange-ClientAccess\bin\powershell\read-audit-logs_2010_2013.ps1' Engine Version = 5.1.14393.4583 Runspace ID = 10a2c198-89dd-47c1-99c1-4d493d35a837 Pipeline ID = 1 Command Name = Search-AdminAuditLog Command Type = Cmdlet Script Name = C:\Program Files\SplunkUniversalForwarder\etc\apps\TA-Exchange-ClientAccess\bin\powershell\read-audit-logs_2010_2013.ps1 Command Path = Sequence Number = 19 User = XXXXX\SYSTEM Connected User = Shell ID = Microsoft.PowerShell User Data: Event Xml: <Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event"> <System> <Provider Name="Microsoft-Windows-PowerShell" Guid="{A0C1853B-5C40-4B15-8766-3CF1C58F985A}" /> <EventID>4100</EventID> <Version>1</Version> <Level>3</Level> <Task>106</Task> <Opcode>19</Opcode> <Keywords>0x0</Keywords> <TimeCreated SystemTime="2021-11-29T10:34:13.679546900Z" /> <EventRecordID>5879670</EventRecordID> <Correlation ActivityID="{2DF7DE6F-E0AC-000E-036D-F92DACE0D701}" /> <Execution ProcessID="62888" ThreadID="41736" /> 
<Channel>Microsoft-Windows-PowerShell/Operational</Channel> <Computer>XXXXXX.YYYY.ZZZZ</Computer> <Security UserID="S-1-5-18" /> </System> <EventData> <Data Name="ContextInfo"> Severity = Warning Host Name = ConsoleHost Host Version = 5.1.14393.4583 Host ID = 644d49a8-7f8f-4b1e-9250-959ff1a8b7b4 Host Application = Powershell -PSConsoleFile E:\Program Files\Microsoft\Exchange Server\V15\\bin\exshell.psc1 -command . 'C:\Program Files\SplunkUniversalForwarder\etc\apps\TA-Exchange-ClientAccess\bin\powershell\read-audit-logs_2010_2013.ps1' Engine Version = 5.1.14393.4583 Runspace ID = 10a2c198-89dd-47c1-99c1-4d493d35a837 Pipeline ID = 1 Command Name = Search-AdminAuditLog Command Type = Cmdlet Script Name = C:\Program Files\SplunkUniversalForwarder\etc\apps\TA-Exchange-ClientAccess\bin\powershell\read-audit-logs_2010_2013.ps1 Command Path = Sequence Number = 19 User = XXXXXXX\SYSTEM Connected User = Shell ID = Microsoft.PowerShell </Data> <Data Name="UserData"> </Data> <Data Name="Payload">Error Message = Object reference not set to an instance of an object. Fully Qualified Error ID = System.NullReferenceException,Microsoft.Exchange.Management.SystemConfigurationTasks.SearchAdminAuditLog </Data> </EventData> </Event>   Log Name: Microsoft-Windows-PowerShell/Operational Source: Microsoft-Windows-PowerShell Date: 29/11/2021 11:34:09 Event ID: 4104 Task Category: Execute a Remote Command Level: Warning Keywords: None User: SYSTEM Computer: XXXXX.YYYY.ZZZZ Description: Creating Scriptblock text (1 of 1): ######################################################## # # Splunk for Microsoft Exchange # Exchange 2010/2013 Mailbox Store Data Definition # # Copyright (C) 2005-2021 Splunk Inc. All Rights Reserved. # All Rights Reserved # ######################################################## # # This returns the filename of the audit database - due to some # funkiness with permissions, deployment server and the local # directory, we're using the %TEMP% as the location. 
For the # NT Authority\SYSTEM account, this is normally C:\Windows\Temp # $AuditTempFile = $ENV:Temp | Join-Path -ChildPath "splunk-msexchange-mailboxauditlogs.clixml" [Console]::OutputEncoding = [Text.UTF8Encoding]::UTF8 $AuditDetails = @{} if (Test-Path $AuditTempFile) { $AuditDetails = Import-CliXml $AuditTempFile } # # Given a single audit record from the Search-MailboxAuditLog function Output-AuditRecord($Record) { $O = New-Object System.Collections.ArrayList $D = Get-Date $Record.LastAccessed -format 'yyyy-MM-ddTHH:mm:sszzz' [void]$O.Add($D) foreach ($p in $Record.PSObject.Properties) { [void]$O.Add("$($p.Name)=`"$($Record.PSObject.Properties[$p.Name].Value)`"") } Write-Host ($O -join " ") } function Output-AuditLog($Mailbox) { $Identity = $Mailbox.Identity $IdentityStr = $Identity.ToDNString() $LastSeen = (Get-Date).AddMonths(-1) if ($AuditDetails.ContainsKey($Identity)) { $LastSeen = $AuditDetails[$Identity] $AuditDetails.Remove($Identity) $AuditDetails[$IdentityStr] = $LastSeen } elseif ($AuditDetails.ContainsKey($IdentityStr)) { $LastSeen = $AuditDetails[$IdentityStr] } $LastRecord = $LastSeen Search-MailboxAuditLog -Identity $Identity -LogonTypes Owner,Delegate,Admin -ShowDetails -StartDate $LastSeen -EndDate (Get-Date)| sort LastAccessed | Foreach-Object { if ($_.LastAccessed -gt $LastSeen) { Output-AuditRecord($_) } $LastRecord = $_.LastAccessed } $AuditDetails[$IdentityStr] = $LastRecord } $Mailboxes = Get-Mailbox -Filter { AuditEnabled -eq $true } -Server $Env:ComputerName -ResultSize Unlimited $Mailboxes | Foreach-Object { If($_ -ne $null) { Output-AuditLog($_) }} # # Now that we have done the work, save off the Audit Temp File $AuditDetails | Export-CliXml $AuditTempFile ScriptBlock ID: 1e79b015-daf6-40c2-8c87-fedde7b4a866 Path: C:\Program Files\SplunkUniversalForwarder\etc\apps\TA-Exchange-Mailbox\bin\powershell\read-mailbox-audit-logs_2010_2013.ps1 Event Xml: <Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event"> <System> <Provider 
Name="Microsoft-Windows-PowerShell" Guid="{A0C1853B-5C40-4B15-8766-3CF1C58F985A}" /> <EventID>4104</EventID> <Version>1</Version> <Level>3</Level> <Task>2</Task> <Opcode>15</Opcode> <Keywords>0x0</Keywords> <TimeCreated SystemTime="2021-11-29T10:34:09.300998400Z" /> <EventRecordID>5879669</EventRecordID> <Correlation ActivityID="{2DF7DE6F-E0AC-0010-CA3C-F92DACE0D701}" /> <Execution ProcessID="38808" ThreadID="54116" /> <Channel>Microsoft-Windows-PowerShell/Operational</Channel> <Computer>XXXX.YYYY.ZZZ</Computer> <Security UserID="S-1-5-18" /> </System> <EventData> <Data Name="MessageNumber">1</Data> <Data Name="MessageTotal">1</Data> <Data Name="ScriptBlockText">######################################################## # # Splunk for Microsoft Exchange # Exchange 2010/2013 Mailbox Store Data Definition # # Copyright (C) 2005-2021 Splunk Inc. All Rights Reserved. # All Rights Reserved # ######################################################## # # This returns the filename of the audit database - due to some # funkiness with permissions, deployment server and the local # directory, we're using the %TEMP% as the location. 
For the # NT Authority\SYSTEM account, this is normally C:\Windows\Temp # $AuditTempFile = $ENV:Temp | Join-Path -ChildPath "splunk-msexchange-mailboxauditlogs.clixml" [Console]::OutputEncoding = [Text.UTF8Encoding]::UTF8 $AuditDetails = @{} if (Test-Path $AuditTempFile) { $AuditDetails = Import-CliXml $AuditTempFile } # # Given a single audit record from the Search-MailboxAuditLog function Output-AuditRecord($Record) { $O = New-Object System.Collections.ArrayList $D = Get-Date $Record.LastAccessed -format 'yyyy-MM-ddTHH:mm:sszzz' [void]$O.Add($D) foreach ($p in $Record.PSObject.Properties) { [void]$O.Add("$($p.Name)=`"$($Record.PSObject.Properties[$p.Name].Value)`"") } Write-Host ($O -join " ") } function Output-AuditLog($Mailbox) { $Identity = $Mailbox.Identity $IdentityStr = $Identity.ToDNString() $LastSeen = (Get-Date).AddMonths(-1) if ($AuditDetails.ContainsKey($Identity)) { $LastSeen = $AuditDetails[$Identity] $AuditDetails.Remove($Identity) $AuditDetails[$IdentityStr] = $LastSeen } elseif ($AuditDetails.ContainsKey($IdentityStr)) { $LastSeen = $AuditDetails[$IdentityStr] } $LastRecord = $LastSeen Search-MailboxAuditLog -Identity $Identity -LogonTypes Owner,Delegate,Admin -ShowDetails -StartDate $LastSeen -EndDate (Get-Date)| sort LastAccessed | Foreach-Object { if ($_.LastAccessed -gt $LastSeen) { Output-AuditRecord($_) } $LastRecord = $_.LastAccessed } $AuditDetails[$IdentityStr] = $LastRecord } $Mailboxes = Get-Mailbox -Filter { AuditEnabled -eq $true } -Server $Env:ComputerName -ResultSize Unlimited $Mailboxes | Foreach-Object { If($_ -ne $null) { Output-AuditLog($_) }} # # Now that we have done the work, save off the Audit Temp File $AuditDetails | Export-CliXml $AuditTempFile </Data> <Data Name="ScriptBlockId">1e79b015-daf6-40c2-8c87-fedde7b4a866</Data> <Data Name="Path">C:\Program Files\SplunkUniversalForwarder\etc\apps\TA-Exchange-Mailbox\bin\powershell\read-mailbox-audit-logs_2010_2013.ps1</Data> </EventData> </Event>   Any idea on what could be 
happening? I found some answers related to the user account that runs the Splunk Universal Forwarder on the server. It is currently configured as local SYSTEM, and some answers found on the internet mentioned using a domain account with Exchange permissions, but I checked the official documentation and could not find any mention of it. Thanks.
I was given a base search to manipulate and create a timechart accordingly.

base search
| eval file_line = file.":".line
| eval errorList = message.":".file_line.":".level
| top 25 message, file_line, level by applicationBuild
| table applicationBuild level count file_line message

The result was formatted as such:

build2021 ERROR 8 file1.java:111 ErrorMessage123
build2021 ERROR 4 file2.java:123 ErrorMessage456
build2021 ERROR 3 file3.java:456 ErrorMessage789

I want to plot a timechart from the above result; the output should be the top 25 error messages above against time. I came up with:

base search
| eval file_line = file.":".line
| eval errorList = message.";".file_line.";".level
| where errorList!="null"
| timechart useother=f usenull=f count max(message,file_line,level,applicationBuild) by errorList limit=25

The legend of the result appends "count: " in front of the errorList values, which I cannot remove. Any idea how I can remove this from the legend? OR is there a better way to achieve the same result?
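The "count: " prefix appears because the timechart is given more than one aggregation; with a single aggregation and a by clause, each series is named after the by-field value alone. A sketch (dropping the max(...) aggregation, whose values are already concatenated into errorList):

```
base search
| eval file_line = file.":".line
| eval errorList = message.";".file_line.";".level
| where isnotnull(errorList)
| timechart useother=f usenull=f limit=25 count by errorList
```

Note that eval string concatenation yields null when any input field is null, so the isnotnull() check replaces the errorList!="null" comparison, which tests against the literal string "null" and never matches a genuinely null result.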
Hi all, I have a text input for a table header. My requirement is that, by default, the table should show all the values, and if any letters are typed in the text box, they should be matched against the table headers so that only the columns whose names contain that substring are displayed. I created the text box but haven't figured out how to match this substring to the header. Please help me with this.
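One sketch for a Simple XML dashboard (the token name header_filter and the base search are hypothetical; untested): give the text input a default of * and wrap the token in wildcards inside the table command, so only columns whose names contain the typed substring are kept:

```
<your base search>
| table *$header_filter$*
```

With the default * every column shows; typing err makes the field list *err*, keeping only headers that contain "err". This is a wildcard match on field names rather than a true substring regex, but for "contains" filtering they behave the same.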
Hi, We are using a curl script to call the Splunk REST API to send data out of Splunk (to Kafka/ES). We have 100,000+ (1+ lakh) events every second, so when calling the REST API (every 5 seconds), it is timing out. Sample curl command for calling the REST API:

curl -k -u admin:changeme https://localhost:8089/services/search/jobs/ -d search="search index=sample sourcetype=access_* earliest=-5m"

What is the limit on the number of events we can extract at a time through a REST API call? What are the default timeout settings? Is it possible to change them? Is there a better way to send Splunk data outside? I also tried a Python script using splunklib.client; that failed as well. Appreciate your inputs in advance. Regards, Deev