All Topics



Hi, while running a script from my app's bin directory I execute a search query that writes a file to the location my forwarder is monitoring. I am using bin/ExecuteSplunkQuery.sh to run the query, which takes around 70 seconds to complete. The problem is that the script waits only 30 seconds and then kills my job, so my question is: where do I increase that 30-second value? I am getting the error below in the file: Splunk is taking too much time to complete search query...Quiting........itrCnt:30
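That "itrCnt:30" message is not a Splunk setting; it looks like a loop counter inside the wrapper script itself. A minimal sketch of how such a wait loop often works — the real ExecuteSplunkQuery.sh is not shown in the post, so every name here (MAX_ITER, RESULT_FILE, the completion check, the touch used to simulate completion) is an assumption:

```shell
# Hypothetical wait loop from a wrapper script like ExecuteSplunkQuery.sh.
MAX_ITER=${MAX_ITER:-90}                           # raise from 30 to cover a ~70s search
RESULT_FILE=${RESULT_FILE:-/tmp/splunk_query.out}  # file the search is expected to write

touch "$RESULT_FILE"   # demo only: simulate the search having written its output

itr=0
while [ ! -f "$RESULT_FILE" ]; do
  itr=$((itr + 1))
  if [ "$itr" -ge "$MAX_ITER" ]; then
    # This mirrors the error message in the post.
    echo "Splunk is taking too much time to complete search query...Quiting........itrCnt:$itr" >&2
    exit 1
  fi
  sleep 1
done
echo "search finished after $itr checks"
```

If the real script has a hard-coded bound of 30 iterations like this, raising that constant (or making it an environment variable as above) is where the 30-second limit would be changed.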
I'm sending my log files to Splunk from a syslog server using a universal forwarder. They are getting to the right index and the hostnames are being extracted fine. The problem is, nothing else is parsing properly. When I choose sourcetype=pan, all I get is pan_log. It's not parsing out the data into the appropriate types (i.e. threat, url, etc). Here is a copy of my inputs.conf:

[monitor://C:\Program Files (x86)\Syslogd\Logged Devices\]
host_segment = 5
sourcetype = "pan:log"
no_appending_timestamp = true
index = pan_logs
disabled = false

The source directory is formatted as follows:

C:\Program Files (x86)\Syslogd\Logged Devices\PaloAlto Firewalls\%firewall model%\%ip_address%--Syslog-2021-01-21.txt

Any thoughts?

Ian
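For context, the Palo Alto Networks Add-on typically relies on parse-time props/transforms that rewrite sourcetype=pan:log into the specific types (pan:threat, pan:traffic, etc.), which has to run on the indexers or a heavy forwarder rather than on the UF. A rough sketch of that sourcetype-rewriting mechanism — the stanza names and regex below are illustrative assumptions, not the add-on's actual contents:

```ini
# props.conf -- illustrative only
[pan:log]
TRANSFORMS-sourcetype_rewrite = force_sourcetype_pan_threat

# transforms.conf -- illustrative only
[force_sourcetype_pan_threat]
REGEX    = ,THREAT,
DEST_KEY = MetaData:Sourcetype
FORMAT   = sourcetype::pan:threat
```

If the add-on (and these transforms) are only installed on the search head, events keep the generic sourcetype, which would match the symptom described.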
I have a set of trellis items that I want to show in a dashboard. With the results, though, I want to set the height dynamically to show all the trellis structures correctly. Is there a way I can get the count returned from the job, i.e. how many trellises the job created?
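One way this is often handled in Simple XML is a <done> handler that reads $job.resultCount$ into a token and derives a panel height from it. A sketch — the search, the 150px-per-trellis factor, and the token names are assumptions:

```xml
<chart>
  <search>
    <query>index=my_index | stats count by group_field</query>
    <done>
      <!-- number of result rows == number of trellis panels -->
      <set token="trellis_count">$job.resultCount$</set>
      <eval token="panel_height">$job.resultCount$ * 150</eval>
    </done>
  </search>
  <option name="trellis.enabled">1</option>
  <option name="height">$panel_height$</option>
</chart>
```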
I am getting the error below because two files in different folders have the same first two lines, including timestamps. ERROR TailReader - File will not be read, seekptr checksum did not match (file=filename.2021-01-19.txt). Last time we saw this initcrc, filename was different. You may wish to use larger initCrcLen for this sourcetype, or a CRC salt on this source. Consult the documentation or file a support case online at http://www.splunk.com/page/submit_issue for more info. The monitoring stanza has filename.*.txt, so if I increase initCrcLen or add a crcSalt, all the files under the folders will get re-indexed. Along with crcSalt, I tried to use ignoreOlderThan, but the old files are still getting re-indexed. Example: with ignoreOlderThan=1d, yesterday's files are still re-indexed. Is there a better solution to prevent this?
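For reference, the two settings the error message suggests live in the monitor stanza in inputs.conf. A sketch — the path is a placeholder, not the poster's actual stanza — and note that changing either setting alters the CRC of already-seen files, so a one-time re-index of matching files is expected:

```ini
[monitor:///path/to/folder/filename.*.txt]
# Read more than the default 256 bytes when fingerprinting a file,
# so files that share their first lines still get distinct CRCs.
initCrcLen = 1024
# Or mix the full path into the CRC so identical content in
# different folders is treated as different files.
crcSalt = <SOURCE>
```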
All, I have a few questions related to Splunk Stream. 1) If a Windows computer has the Splunk Stream app installed and it has a UF installed, what differences in logging activity will I get between the two? 2) Does the Splunk Stream app get deployed from the deployment server just as the UF does? 3) Does Splunk Stream log just web traffic?
Hello Splunkers, I'm developing a Splunk app with about 20 dashboards (20 XML files). Each dashboard has a time picker. The thing I have noticed is that when a user clicks on a new dashboard, the time picker resets. My question is: does Splunk have some sort of global time picker, so that when a user selects a value on the time picker it will follow them across multiple dashboards? Attached below is a brief design of my application and what I am trying to do. Any suggestions are much appreciated. Thank you, Marco
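One workaround often used in Simple XML is to carry the picker's values in the link URL when navigating between dashboards, so the target dashboard's time input is pre-set. A sketch, assuming every dashboard names its time input token time_tok (the app and dashboard names below are placeholders):

```xml
<input type="time" token="time_tok" searchWhenChanged="true">
  <label>Time Range</label>
  <default>
    <earliest>-24h@h</earliest>
    <latest>now</latest>
  </default>
</input>

<!-- in a panel's drilldown, forward the current selection -->
<drilldown>
  <link target="_blank">/app/my_app/other_dashboard?form.time_tok.earliest=$time_tok.earliest$&amp;form.time_tok.latest=$time_tok.latest$</link>
</drilldown>
```

This only persists the selection along explicit navigation links, not when a user opens a dashboard directly from the menu.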
I am doing a pilot of the Okta Cloud to Splunk Cloud integration, with a view to having Okta customer authentication events show up in Splunk, using trial licenses. Apparently, I need to raise a support case to install the add-on (https://splunkbase.splunk.com/app/3682/). Also, I am unable to raise a support case using a trial account. Can someone please help?
Hi, may I know how to check the stationarity/non-stationarity of time series data in the Splunk Machine Learning Toolkit? I tried using the Kalman filter algorithm, but it gives false positives for a couple of applications. Now, to refine the model, I'm trying to use the ARIMA algorithm, so I would like to check the data for stationarity/non-stationarity in order to choose the AR (autoregressive) p, I (integrated) d, and MA (moving average) q values. Kindly help me with how to accomplish this and choose the p, d, q values. Thank you!
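The usual formal checks are unit-root tests like ADF or KPSS (available in statsmodels, which the MLTK bundles). As a rough illustration of the underlying idea only — not a substitute for those tests — a series whose mean drifts between its first and second half is likely non-stationary, suggesting d >= 1 differencing for ARIMA. The function name and tolerance below are assumptions for the sketch:

```python
# Rough stationarity heuristic: compare the means of the two halves of
# the series against its overall spread. NOT the full ADF/KPSS test.
from statistics import mean, pstdev

def looks_stationary(xs, tol=0.5):
    """Return True if the half-series means differ by less than
    tol * the overall standard deviation."""
    half = len(xs) // 2
    m1, m2 = mean(xs[:half]), mean(xs[half:])
    spread = pstdev(xs) or 1.0  # guard against constant series
    return abs(m1 - m2) / spread < tol

trend = list(range(100))             # steadily increasing -> mean drifts
level = [i % 2 for i in range(100)]  # oscillates around a fixed mean
print(looks_stationary(trend))   # False: difference the series first (d >= 1)
print(looks_stationary(level))   # True: mean is stable
```

For real p, d, q selection, the ADF test plus ACF/PACF plots (or auto-ARIMA style search) would be the standard route.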
I'm currently setting up logging from an EKS cluster into Splunk using Splunk Connect for Kubernetes. We've deployed this tool using the provided helm chart and followed the values.yaml to basically log everything it can, just so we can test, but I'm running into an issue where some logs are not getting sent to Splunk and I can't seem to figure out why. The basic EKS logs from various components are being picked up, including an application log which shows events like the below; however, only the first event/log is being picked up by Splunk (these logs are obfuscated):

{"log":" {\"MessageId\":\"80facc67-beef-1234-4567-1234b0afddcc\",\"ReceiptHandle\":\"aBc2d6Y2c1234/gZ2Tb1234/M6Tyzdf1234/76z1234gBh1234/zGb1234/M6ThB1234/Bghp12TcDqZ1234+/GBKsqxgcFWExq+GSYA=\",\"MD5OfBody\":\"049dfb3e741234\",\"Body\":\"{\\\"Records\\\":[{\\\"eventVersion\\\":\\\"2.1\\\",\\\"eventSource\\\":\\\"aws:s3\\\",\\\"awsRegion\\\":\\\"us-east-1\\\",\\\"eventTime\\\":\\\"2021-01-20T15:57:44.247Z\\\",\\\"eventName\\\":\\\"ObjectCreated:Put\\\",\\\"userIdentity\\\":{\\\"principalId\\\":\\\"AWS:BDG1234\\\"},\\\"requestParameters\\\":{\\\"sourceIPAddress\\\":\\\"1.1.1.1\\\"},\\\"responseElements\\\":{\\\"x-amz-request-id\\\":\\\"1GT6YWQERVC\\\",\\\"x-amz-id-2\\\":\\\"KGBtzde1fg6Fdz1234/uOgVxBtBdz1234\\\"},\\\"s3\\\":{\\\"s3SchemaVersion\\\":\\\"1.0\\\",\\\"configurationId\\\":\\\"test-name-here\\\",\\\"bucket\\\":{\\\"name\\\":\\\"test-name-here\\\",\\\"ownerIdentity\\\":{\\\"principalId\\\":\\\"GBT1RTYZ1234\\\"},\\\"arn\\\":\\\"arn:aws:s3:::test-name-here\\\"},\\\"object\\\":{\\\"key\\\":\\\"upload/test-name-here.xml\\\",\\\"size\\\":757,\\\"eTag\\\":\\\"6ts3Szdfa45hVx\\\",\\\"sequencer\\\":\\\"1234GBG1234\\\"}}}]}\",\"Attributes\":{\"SentTimestamp\":\"1611158266018\"}}\n","stream":"stdout","time":"2021-01-20T15:57:50.649004404Z"}

{"log":"[2021-01-20T15:57:50.648Z] INFO (LOGS/1): received message, new data: test-name-here 
upload/test-name-here.xml\n","stream":"stdout","time":"2021-01-20T15:57:50.649112304Z"}

The second log never gets to Splunk, and both of these events are in the same log file. Looking over the logs from the HEC on my receivers, I don't see any parsing errors or anything that would indicate why this second group of logs is not being picked up. I've also checked on the containers themselves and found no errors from fluentd to help explain the problem. Has anyone else run into an issue like this, and is there a solution for why some events would be parsed and sent properly while others aren't, even though they're in the same file? I found this article about a similar issue, but no resolution was provided. https://community.splunk.com/t5/All-Apps-and-Add-ons/Missing-logs-Splunk-Connect-for-Kubernetes/m-p/442300#M54362
Hi guys, I want to set tokens in one dashboard and link to another dashboard using the drilldown option. I use the drilldown option with a single value panel which just shows a text quote. The link to the other dashboard works, but there Splunk always uses the value in the panel for the tokens. I want to use the tokens set in the first dashboard. Here is my code:

<form>
  <label>Test Case</label>
  <fieldset submitButton="false">
    <input type="dropdown" token="building" searchWhenChanged="true">
      <label>RZ Building</label>
      <fieldForLabel>RZ_Building</fieldForLabel>
      <fieldForValue>RZ_Building</fieldForValue>
      <search>
        <query>index=pdu_de RZ_Building=* | dedup RZ_Building</query>
        <earliest>-24h@h</earliest>
        <latest>now</latest>
      </search>
    </input>
    <input type="dropdown" token="room">
      <label>Room</label>
      <fieldForLabel>RZ_Room</fieldForLabel>
      <fieldForValue>RZ_Room</fieldForValue>
      <search>
        <query>index=pdu_de RZ_Building=$building$ RZ_Room=* | dedup RZ_Room | sort RZ_Room</query>
        <earliest>-24h@h</earliest>
        <latest>now</latest>
      </search>
    </input>
  </fieldset>
  <row>
    <panel>
      <single>
        <search>
          <query>| makeresults | eval text1= "Blue PDU" | fields - _time</query>
          <earliest>-24h@h</earliest>
          <latest>now</latest>
        </search>
        <option name="drilldown">all</option>
        <option name="refresh.display">progressbar</option>
        <drilldown>
          <link target="_blank">/app/pdu_dev/geist_pdu_poc?form.building=$click.value$&amp;form.room=$click.value$</link>
        </drilldown>
      </single>
    </panel>
  </row>
</form>
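If the intent is to pass the values of the first dashboard's own inputs rather than the clicked value, one sketch of the drilldown link (same target dashboard as in the question) would reference the input tokens directly:

```xml
<drilldown>
  <link target="_blank">/app/pdu_dev/geist_pdu_poc?form.building=$building$&amp;form.room=$room$</link>
</drilldown>
```

$click.value$ always resolves to whatever was clicked in the panel, whereas $building$ and $room$ resolve to the dropdown selections.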
Hello, I am looking to split a log entry into tags using the link below. https://community.splunk.com/t5/Splunk-Search/How-do-I-split-a-log-entry-into-tags/m-p/50109#M12034 I would like to know whether we can use the code as-is, because I don't have tags defined. Thanks
Hello, I am trying to create a Lambda function and enable the HTTP Event Collector using the doc below. https://dev.splunk.com/enterprise/docs/devtools/httpeventcollector/useawshttpcollector/createlambdafunctionnodejs/#Configure-HTTP-Event-Collector After enabling the HTTP Event Collector, I am not able to access the URL <SPLUNK_HEC_URL:8088>/services/collector, and because of that the Lambda function is also not working. Please help. Thanks
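As a first connectivity check before involving Lambda, HEC exposes a health endpoint and accepts test events over HTTPS. A sketch with placeholder host and token (not values from the post):

```shell
# Health check -- should return an "HEC is healthy" style response
curl -k https://<SPLUNK_HEC_URL>:8088/services/collector/health

# Send a test event with the token created for the collector
curl -k https://<SPLUNK_HEC_URL>:8088/services/collector \
     -H "Authorization: Splunk <HEC_TOKEN>" \
     -d '{"event": "hello from curl"}'
```

If these fail from outside the Splunk host, the usual suspects are the HEC global settings (enabled/port/SSL) and network/firewall rules on port 8088.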
Hi guys, I want to build two panels: the first one shows the text "Si520" and the second one "Si519". From the panels I want to use the drilldown option to connect to another dashboard. I tried to use an HTML panel, but then I can't use the drilldown option. Through research I found out I need to use the matchValue option, but I don't understand how to use it. The link below shows what I want. https://docs.splunk.com/Documentation/DashApp/0.9.0/DashApp/shapes#Apply_thresholding_to_strings The visualization on the bottom left is my aim, but the code on that page doesn't help me.
<drilldown>
  <set token="tok_port57">$click.name$</set>
</drilldown>
<row depends="$tok_port57$">
  <panel>
    <title>Port 57 Details</title>
    <table>
      <search>
        <query>index=email | transaction keepevicted=true ic | eval port = if(match(sender_group,"Secure_*"),"Port_57","Port_5") | search port="$tok_port57$" | stats count by src src_host</query>
        <earliest>$time_token.earliest$</earliest>
        <latest>$time_token.latest$</latest>
        <sampleRatio>1</sampleRatio>
      </search>
      <option name="refresh.display">progressbar</option>
    </table>
  </panel>
</row>

This is the drilldown setting for the single value. However, instead of getting the field name as the token value, the keyword "result" is being passed as the token. Can someone tell me what I am doing wrong?
I got this error message: com.splunk.HttpException: HTTP 400 -- Argument "eai:acl:sharing" is not supported by this handler.

This is my code:

private JobArgs createArgs(com.splunk.Service service, final Dashboard dashboard, final Zones zones, final List<GlobalProperties> globalVars) {
    JobArgs jobargs = new JobArgs();
    jobargs.put(DASHBOARD_NAME.value(), dashboard.getName());
    jobargs.put(EAI_TYPE.value(), "views");
    jobargs.put(EAI_DATA.value(), translateData(zones, dashboard, globalVars));
    jobargs.put("eai:acl:sharing", "app");
    jobargs.put(TEMP_SPLUNK_APP.value(), zones.getSplunk_app() == null ? "search" : zones.getSplunk_app());
    return jobargs;
}

I am trying to set this "sharing" key:

<s:key name="eai:acl">
  <s:dict>
    <s:key name="app">search</s:key>
    <s:key name="can_change_perms">1</s:key>
    <s:key name="can_list">1</s:key>
    <s:key name="can_share_app">1</s:key>
    <s:key name="can_share_global">0</s:key>
    <s:key name="can_share_user">1</s:key>
    <s:key name="can_write">1</s:key>
    <s:key name="modifiable">1</s:key>
    <s:key name="owner">svc-gps</s:key>
    <s:key name="perms" />
    <s:key name="removable">1</s:key>
    <s:key name="sharing">user</s:key>
  </s:dict>
</s:key>

I don't see it supported in https://docs.splunk.com/Documentation/Splunk/7.2.0/RESTREF/RESTknowledge#data.2Fui.2Fviews.2F.7Bname.7D

Is there any other way? Thank you.

Regards, Khairul
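For what it's worth, ACL fields generally aren't accepted as arguments on the knowledge-object endpoint itself; permissions on an existing object are usually changed through its separate .../acl endpoint. A sketch of that REST call — the owner and app segments are placeholders, not values from the post:

```
POST /servicesNS/<owner>/<app>/data/ui/views/<dashboard_name>/acl
    sharing=app
    owner=<owner>
```

So the flow would be: create the view first, then issue a second POST to its /acl endpoint to set sharing.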
I want to calculate the download speed for each fetched transaction page in bytes per second. Transaction pages consist of a login page (that has user login details) and OpenLoan pages consisting of documents. I have written the query below, which gives me response time in seconds, and I would like to add one more field that gives me bytes per second for the fetched transaction pages.

index=app_iis source=*Inetpub* (host=JTCOTWCEMPLW* OR host=JTCOTWCEMPPW* OR host=JTCEP1WPLOSW* OR host=JTCEP1WPPORW*) http_method=GET Uri="*LOSWeb*" Uri="*aspx*" Time_taken!="-"
| eval APClientSource=split(Uri,"/"), APClientName=mvindex(APClientSource,1)
| search APClientSource="Login.aspx" OR APClientSource="OpenLoan.aspx"
| eval "Time_taken(s)"='Time_taken'/1000
| bin _time span=5m
| stats avg(Time_taken(s)) as avgURT, distinct_count(C_username) as usercount by host, APClientName, cs_uri_stem, _time
| eval avgURT=round(avgURT,0), StdevURT=round(StdevURT,0)
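If the IIS logs include the bytes-sent field (commonly sc_bytes — an assumption about this source, check the extracted fields), the speed could be derived per event before the stats, along these lines:

```
| eval "Time_taken(s)"='Time_taken'/1000
| eval bytes_per_sec=sc_bytes/'Time_taken(s)'
| bin _time span=5m
| stats avg(bytes_per_sec) as avgBps by host, APClientName, cs_uri_stem, _time
```

The single quotes around 'Time_taken(s)' matter: field names containing parentheses have to be quoted when referenced in eval.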
Hello, I'm trying to find out whether Enterprise Security is officially supported in a containerized environment (particularly Docker). I know Splunk Enterprise is, but I can't find any information about ES. If anyone can advise, I will be very grateful. Best regards, Lukas Mecir
I'm trying to make an alert that opens an issue in JIRA Service Desk. I filled in some fields in this action, like Project, Issue type, Summary, etc., but when the alert triggers, the issue isn't created in Jira. What could be the problem? P.S. The plugin configuration is correct; I can get project info, etc.
I want to set up the retention policy for our logs (18 months). I have edited indexes.conf to specify frozenTimePeriodInSecs; however, per the Splunk documentation, maxTotalDataSizeMB (which we have also set) takes precedence over frozenTimePeriodInSecs. Is there any workaround to leave maxTotalDataSizeMB set to a specific value and keep the logs for only 18 months, regardless of the total data size? Thanks for help, Dawid M
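For reference, an 18-month retention works out to roughly 47,304,000 seconds (547.5 days x 86400). A sketch of the relevant indexes.conf stanza — the index name and size cap below are placeholders, not values from the post:

```ini
[my_index]
frozenTimePeriodInSecs = 47304000   # ~18 months (547.5 days * 86400)
maxTotalDataSizeMB = 500000         # still rolls buckets first if this cap is hit
```

Both settings always apply: whichever limit a bucket hits first freezes it, so the size cap has to be large enough that it is never reached before the 18-month mark.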
Hey folks, I need help with field extraction. I have index=abs, source=123. When I search this in Splunk, I can see some fields are auto-extracted, for example session=12345, Status=Success,NA,NA. I tried to create a new field extraction on top of this, but it was not working. I don't want "Status=Success,NA,NA"; I need it separated as "Status=Success", "Exception=NA", "SubAPITime=NA". When I tried to create the new field at search level (UI) and at index level (index cluster), it did not work. I believe we first have to remove the existing fields and then create our new field extraction. Please help me with this one!

props.conf for the new field extraction, which is not working:

[source::123]
TRANSFORMS-extract-app_rewards = rewards_qual

transforms.conf:

[rewards_qual]
SOURCE_KEY = MetaData:Source
REGEX = ^(?P<SessionId>[^,]+),(?P<User>[^,]+),(?P<DateTime>[^,]+),(?P<View>[^,]+),(?P<AppliedFilters>[^,]+),(?P<Status>[^,]+),(?P<Exception>[^,]+),(?P<SubAPITime>[^,]+),(?P<SubAPIName>[^,]+),(?P<TransactionId>[^,]+),(?P<HANATime>[^,]+),(?P<TotalTime>.+)
FORMAT = SessionId::$1 User::$2 DateTime::$3 View::$4 AppliedFilters::$5 Status::$6 Exception::$7 SubAPITime::$8 SubAPIName::$1 TransactionId::$1 HANATime::$1 TotalTime::$1
WRITE_META = true

I tried with source and sourcetype as well.

Thanks, Dharani.
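For comparison, a sketch of what that index-time stanza often looks like when it works — this is an assumption about the intent, not a verified fix. Two things differ from the stanza in the question: SOURCE_KEY is left at its default (_raw), so the regex runs against the event text rather than the source name (MetaData:Source), and FORMAT numbers the capture groups sequentially instead of reusing $1:

```ini
[rewards_qual]
# SOURCE_KEY defaults to _raw; with MetaData:Source the regex would be
# matched against the source path instead of the event text.
REGEX = ^(?P<SessionId>[^,]+),(?P<User>[^,]+),(?P<DateTime>[^,]+),(?P<View>[^,]+),(?P<AppliedFilters>[^,]+),(?P<Status>[^,]+),(?P<Exception>[^,]+),(?P<SubAPITime>[^,]+),(?P<SubAPIName>[^,]+),(?P<TransactionId>[^,]+),(?P<HANATime>[^,]+),(?P<TotalTime>.+)
FORMAT = SessionId::$1 User::$2 DateTime::$3 View::$4 AppliedFilters::$5 Status::$6 Exception::$7 SubAPITime::$8 SubAPIName::$9 TransactionId::$10 HANATime::$11 TotalTime::$12
WRITE_META = true
```

A search-time REPORT- extraction with DELIMS (instead of an index-time TRANSFORMS-) is often the lighter-weight alternative for comma-delimited events like these.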