All Topics

I have a lookup table with a few fields (FirstSeenDate, LastSeenDate, IP, etc.). I have a search created to show me the top 10 events in the table by count. What I want to do is add a clause to the search to filter out anything that is older than 90 days in the FirstSeenDate column.
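A minimal sketch of one way to do this, assuming the lookup is named my_lookup and FirstSeenDate is stored as a string such as 2021-10-08 (adjust the strptime format to match your data):

| inputlookup my_lookup
| where strptime(FirstSeenDate, "%Y-%m-%d") >= relative_time(now(), "-90d@d")
| top limit=10 IP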
Using fetch in ReactJS:

fetch('https://[SUBDOMAIN].splunkcloud.com:8088/services/collector/event/1.0', {
  method: 'POST',
  headers: {
    'Access-Control-Allow-Origin': 'application/json',
    'Access-Control-Allow-Credentials': 'true',
    'Access-Control-Allow-Methods': 'OPTIONS, GET, POST',
    'Access-Control-Allow-Headers': 'Content-Type, Depth, User-Agent, X-File-Size, X-Requested-With, If-Modified-Since, X-File-Name, Cache-Control',
    Accept: 'application/json',
    'Content-Type': 'application/json',
    Authorization: 'Splunk [TOKEN]'
  },
  body: JSON.stringify({
    sourcetype: '_json',
    index: 'main',
    host: 'mydata2',
    event: { foo: 'bar3', b: ['value1_3', 'value1_4'] }
  })
})
  .then((res) => res.json())
  .then((json) => {
    // eslint-disable-next-line no-console
    console.log('json---->', json);
  });

Using splunk-logging in ReactJS:

import * as SplunkLogging from 'splunk-logging';

const SplunkLogger = SplunkLogging.Logger;
const config = {
  token: [TOKEN],
  url: '[URL]:8088'
};
// eslint-disable-next-line no-var
const Logger = new SplunkLogger(config);
Logger.requestOptions.strictSSL = true;
// eslint-disable-next-line no-var
const payload = {
  // Message can be anything; doesn't have to be an object
  message: {
    temperature: '70F',
    chickenCount: 500
  }
};
console.info('Sending payload', payload);
Logger.send(payload, function (err, resp, body) {
  // If successful, body will be { text: 'Success', code: 0 }
  console.info('Response from Splunk', body);
});
Hi, I need to call JS from an <a> anchor tag, can someone please help.

<div id="showMoreRecords" align="right">
  <a>Show More Records</a>
</div>

JS file code:

require([
  'underscore',
  'backbone',
  '../app/monitorPro/components/ModalView',
  'splunkjs/mvc',
  'splunkjs/mvc/searchmanager',
  'splunkjs/mvc/simplexml/ready!'
], function(_, Backbone, ModalView, mvc, SearchManager) {
  var order = mvc.Components.get("showMoreRecords"); // getting error here
  var tokens = mvc.Components.getInstance("submitted");
  var poNumber = tokens.get("PONumber_ord");
  var detailSearch = new SearchManager({
    id: "detailSearch",
    earliest_time: "0",
    latest_time: "",
    preview: true,
    cache: false,
    search: 'index=testIndex source="testsource" PONumber=$PONumber_ord$ | table PONumber,CustomerName,RequestDate,shipperTimestamp'
  }, {tokens: true, tokenNamespace: "submitted"});

  order.on("click", function(e) {
    e.preventDefault();
    var modal = new ModalView({
      title: poNumber,
      search: detailSearch
    });
    modal.show();
  });
});
Hello Splunkers, I am using this add-on (https://splunkbase.splunk.com/app/5037/) to create a ticket in Jira as an alert action. But after the setup, giving the JIRA URL and credentials, it gives an error for this query: index=_internal sourcetype=splunkd component=sendmodalert
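To narrow that internal search down to the actual failure messages from the alert action, a sketch along these lines may help (log_level restricts it to errors, and the action field written by sendmodalert groups them per alert action):

index=_internal sourcetype=splunkd component=sendmodalert log_level=ERROR
| stats count latest(_raw) as last_error by action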
Hi All, previously there was no test environment in our Splunk setup, but now there is one, so I need to replicate all the alerts, dashboards, and reports from production to the test environment. I don't have access to the backend of Splunk. Is there a smarter way to do that instead of cloning each and every alert? Waiting for a response. Thanks in advance.
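Without backend access, one hedged starting point is the REST API from the search bar: the sketch below lists every saved search (alerts and reports) with its app, owner, and schedule, so the definitions can be reviewed and copied into the test environment:

| rest /servicesNS/-/-/saved/searches splunk_server=local
| table title eai:acl.app eai:acl.owner search cron_schedule alert_type

Dashboards can be listed the same way via | rest /servicesNS/-/-/data/ui/views.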
Hi, I have several files in an AWS S3 bucket and I have configured an input to get data from these files. Is there a way to make Splunk process them without unzipping them manually? I use Splunk Enterprise 8.2.2.1 and the AWS add-on. Thanks for your help, Saïd
I need a way to evaluate a simple math expression. The following query works, and expr evaluates to result with a value of 44.

| makeresults
| eval result = [| makeresults count=1
  | noop
  | head 1
  | eval expr = "1*2+3*4+5*6"
  | return $expr]

But I need to make the following type of query work.

| gentimes start=-1
| eval expr = "1*2+3*4+5*6"
| table expr, $expr

Is there a way to do this? Thanks.
Dear all, is it possible to display a "yes"/"no" pop-up window before performing a search from a dashboard drilldown? To prevent operational mistakes, I would like to stop the drilldown from opening the search SPL if there is a problem.
Hi team, I am using the query below to display the perc25, perc50, perc75, and perc95 values of the 'latency' field for different subscribers.

<base query> | chart perc25(latency) perc50(latency) perc75(latency) perc95(latency) by subscriber

The chart Splunk returned is below, but it is not what I want. My expected chart is below. I want:
1. 25, 50, 75, and 95 as the rating scale of the x-axis, instead of subscribers.
2. The y-axis to display the values returned by perc25, perc50, perc75, and perc95.
3. Subscribers as legends, instead of perc25, perc50, perc75, and perc95.

How do I get this chart?
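One possible way to flip the orientation is to transpose the chart output so the percentile columns become rows (and therefore the x-axis categories) while each subscriber becomes its own series; a sketch, assuming the same base query:

<base query>
| chart perc25(latency) perc50(latency) perc75(latency) perc95(latency) by subscriber
| transpose 0 header_field=subscriber column_name=percentile

A rename at the end could then tidy the percentile labels (e.g. perc25(latency)) down to 25, 50, 75, 95 if needed.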
Hi Team, thanks for your reply! My question was: I have an F5 WAF device, and we have configured the F5 WAF to send logs to a syslog server (which is Splunk) on port 514. Now I want to know: if my device is sending logs to Splunk (syslog), then what type of forwarder is my WAF device? In this case, would my WAF be a UF, an HF, or something else? Regards, Suraj
I want to get the installer below: splunk-add-on-for-unix-and-linux_523.tgz. It seems to be the same add-on as the URL below, because the downloaded file name for version 8.3.0/8.3.1 is similar to the add-on installer name I want. https://splunkbase.splunk.com/app/833/ I also checked the release history, but I was not able to find a way to get past add-on files. Maybe the install file for 5.2.3 is too old and Splunk no longer supports it. If there is a way to get a previously released install file, please share it with me. Thank you!
Hi All, Splunk Cloud is not receiving the logs from the Windows Universal Forwarder. I see the logs below from splunkd. Can you please help me resolve the issue?

Splunkd logs:
AutoLoadBalancedConnectionStrategy [3780 TcpOutEloop] - Closing stream for idx=10.236.162.24:9997
10-08-2021 09:49:11.748 +0100 INFO AutoLoadBalancedConnectionStrategy [3780 TcpOutEloop] - Connected to idx=10.236.82.132:9997, pset=0, reuse=0.
10-08-2021 09:49:17.976 +0100 INFO TailReader [2576 tailreader0] - Batch input finished reading file='C:\Program Files\SplunkUniversalForwarder\var\spool\splunk\tracker.log'
10-08-2021 09:49:47.803 +0100 INFO TailReader [2576 tailreader0] - Batch input finished reading file='C:\Program Files\SplunkUniversalForwarder\var\spool\splunk\tracker.log'
10-08-2021 09:49:51.617 +0100 INFO AutoLoadBalancedConnectionStrategy [3780 TcpOutEloop] - Closing stream for idx=10.236.82.132:9997
10-08-2021 09:49:51.618 +0100 INFO AutoLoadBalancedConnectionStrategy [3780 TcpOutEloop] - Connected to idx=10.236.162.97:9997, pset=0, reuse=0.
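A sketch of a quick check from the Splunk Cloud search side: the forwarder's own internal logs should land in _internal, so if nothing comes back for the forwarder's hostname, the connection or the cloud credentials package is the first thing to look at (the host value is a placeholder):

index=_internal sourcetype=splunkd host="<UF_HOSTNAME>"
| stats count latest(_time) as last_seen by host
| eval last_seen = strftime(last_seen, "%Y-%m-%d %H:%M:%S")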
I thought I was following OK practice, as these were customisations to collections.conf, transforms.conf, and savedsearches.conf in the local directory. But it appears they were simply removed when I upgraded the app from 5.0.0 to 5.1.0. I am working to recover the situation and have pinged the developer; the data should be recreated. Was it my fault for adding stanzas to a commercial app, or should I have been protected since I stuck to local copies?
Hi, I'm trying to build a search to find the count, min, max, and avg within the 99th percentile. All of them work apart from the count; I'm not sure if I am missing something:

index="main" source="C:\\inetpub\\logs\\LogFiles\\*"
| bin span=1d _time
| eval ResponseTime = time_taken/1000000
| eval responseTime = time_taken/1000000
| timechart span=1mon p99(responseTime) as 99thPercentile
| stats min(99thPercentile) as p99responseTimemin max(99thPercentile) as p99responseTimemax avg(99thPercentile) as p99responseTimeavg count(99thPercentile) by _time

Thanks

Joe
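If the intent is to count (and take min/max/avg of) the individual response times that fall at or below the 99th percentile, rather than aggregating the percentile series itself, a sketch along these lines may be closer; it reuses the same source and time_taken conversion, and computing the percentile over the whole search window is an assumption:

index="main" source="C:\\inetpub\\logs\\LogFiles\\*"
| eval responseTime = time_taken/1000000
| eventstats p99(responseTime) as p99
| where responseTime <= p99
| bin span=1d _time
| stats count min(responseTime) as minRT max(responseTime) as maxRT avg(responseTime) as avgRT by _time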
Can anyone help me with a step-by-step SSL configuration from UF to HF and from HF to the indexers? Any help would be appreciated.
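As a very rough outline of which files are involved (not a full procedure): on each receiving instance (HF, then indexers) the SSL listener is defined in inputs.conf, and on each sending instance (UF to HF, HF to indexers) the client side is defined in outputs.conf. The paths, port, and passwords below are placeholders for certificates you have already prepared:

# inputs.conf on the receiver (HF, and likewise on the indexers)
[splunktcp-ssl:9997]
disabled = 0

[SSL]
serverCert = $SPLUNK_HOME/etc/auth/mycerts/server.pem
sslPassword = <cert password>

# outputs.conf on the sender (UF -> HF, and HF -> indexers)
[tcpout:ssl_group]
server = receiver.example.com:9997
clientCert = $SPLUNK_HOME/etc/auth/mycerts/client.pem
sslPassword = <cert password>
sslVerifyServerCert = false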
I need to create a table that includes the filename, the domain name the file came from, the source IP, the destination IP, and the date/time stamp. What should the search query be? Also with this: find the executable they uploaded. Once found, detail the following in a single table: What was the filename? When was it uploaded? Was the upload successful? Where did it come from?
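A rough sketch of the shape such a search could take, assuming wire data (e.g. stream:http); the index, sourcetype, and field names here (file_name, site, src_ip, dest_ip, status) are placeholders that must be swapped for whatever your data actually contains:

index=main sourcetype="stream:http" http_method=POST file_name=*
| table _time file_name site src_ip dest_ip status
| sort _time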
Hello, I am looking for a simple SPL search to detect activity from users without MFA in AWS. I have the search below, which suggests I am getting somewhere, but I just need to confirm whether there is more to it.

sourcetype=cloudtrail userIdentity.sessionContext.attributes.mfaAuthenticated=false
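One possible refinement, as a sketch: limit it to interactive IAM users (assumed roles and service calls also report mfaAuthenticated=false) and summarise by user; verify the field names against your own CloudTrail events before relying on it:

sourcetype=cloudtrail userIdentity.type=IAMUser userIdentity.sessionContext.attributes.mfaAuthenticated=false
| stats count min(_time) as first_seen max(_time) as last_seen values(eventName) as api_calls by userIdentity.arn
| convert ctime(first_seen) ctime(last_seen)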
Hello, I have 4 Python scripts to parse data that we receive once a day on the Linux machine where the HF is installed. Currently, I am running my Python scripts manually every day on that Linux machine to perform that task. Is there any way I can write a cron expression to automate my Python scripts so that they run automatically once a day on that Linux machine where the HF is installed? Thank you so much, any help will be highly appreciated.
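A sketch of what crontab entries for a daily 06:00 run could look like; the interpreter path, script locations, and log files are assumptions to adapt (edit with crontab -e as the user that owns the scripts):

# minute hour day-of-month month day-of-week  command
0 6 * * * /usr/bin/python3 /opt/scripts/parse_feed_1.py >> /var/log/parse_feed_1.log 2>&1
0 6 * * * /usr/bin/python3 /opt/scripts/parse_feed_2.py >> /var/log/parse_feed_2.log 2>&1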
I want to delete this field (VID) from one of my search queries; it is not available under Field extractions. And what is the difference between (a and #)?
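If the goal is simply to drop VID from the search results, a minimal sketch is to exclude it explicitly at the end of the search (the base search here is a placeholder):

index=my_index sourcetype=my_sourcetype
| fields - VID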
Hi, I have recently integrated and migrated AWS Simple Queue Service (SQS) logs to Splunk. I am trying to search for the SQS logs in Splunk and it is not returning anything. Below is the query I am using:

index="application-index" aws_account_id="123456" sourcetype="aws:sqs"

Please correct me if I missed anything here. Thanks.
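As a first troubleshooting step, a sketch that checks what has actually landed in the index before filtering on the account id (the index name is taken from the question; widen the time range if the migration is recent):

| tstats count where index="application-index" by sourcetype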