All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

Hello, I am getting the error message "Sorry (170037) This folder is no longer available" when trying to register for courses — now three of them, including Search Under the Hood, Data Models, and Introduction to Enterprise Security. What is going on?
Hello, While trying to deploy ES using the Deployer GUI, I wanted to enable SSL. However, I ran into the issue below:
I created Splunk macros for regular expressions matching URIs and URLs. Definitions and usage are in the article below. https://qiita.com/Joh256/private/659ef65897905890ef99 I also put them in the add-on below. https://splunkbase.splunk.com/app/6595
I created Splunk macros for regular expressions matching IPv4 and IPv6 addresses. Definitions and usage are in the article below. https://qiita.com/Joh256/private/659ef65897905890ef99 I also put them in the add-on below. https://splunkbase.splunk.com/app/6595
I created Splunk macros for regular expressions matching IPv4 addresses. Definitions and usage are in the article below. https://qiita.com/Joh256/private/659ef65897905890ef99 I also put them in the add-on below. https://splunkbase.splunk.com/app/6595
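For readers curious what such a regex looks like: the exact macro definitions live in the linked article and add-on, but a pattern in the same spirit can be sketched in Python (this particular pattern is an illustration, not the add-on's definition):

```python
import re

# One octet: 250-255, 200-249, 100-199, or 0-99 without leading zeros.
OCTET = r"(25[0-5]|2[0-4][0-9]|1[0-9]{2}|[1-9]?[0-9])"
IPV4 = re.compile(rf"{OCTET}(\.{OCTET}){{3}}")

def is_ipv4(s: str) -> bool:
    """Return True if s is a dotted-quad IPv4 address with octets 0-255."""
    return IPV4.fullmatch(s) is not None
```

The same pattern fragment can be embedded in SPL `rex` or a macro; the advantage of a macro is defining the octet alternation once.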
@ITWhisperer is correct. You should not use regex on JSON, which is structured data. In fact, you do not even need spath with raw events, because Splunk extracts JSON fields by default, so you can use untable directly. A more semantic implementation of your intention is to use the JSON functions introduced in 8.2:

```
index=jenkins_artifact source="<path to json>/statistics.json"
| eval Transaction_type = json_keys(_raw)
| foreach mode=json_array Transaction_type
    [eval jsonTrans = mvappend(jsonTrans, json_object("Transaction Name", <<ITEM>>, "pct2ResTime", json_extract(_raw, <<ITEM>> . ".pct2ResTime")))]
| fields - _raw Transaction*
| mvexpand jsonTrans
| spath input=jsonTrans
| fields - json*
```

This emulates index=jenkins_artifact source="<path to json>/statistics.json" with your mock data:

```
| makeresults
| eval _raw = "{ \"Transaction1\" : { \"transaction\" : \"Transaction1\", \"pct1ResTime\" : 3083.0, \"pct2ResTime\" : 4198.0, \"pct3ResTime\" : 47139.0 }, \"Transaction2\" : { \"transaction\" : \"Transaction2\", \"pct1ResTime\" : 1151.3000000000002, \"pct2ResTime\" : 1318.8999999999996, \"pct3ResTime\" : 6866.0 }, \"Transaction3\" : { \"transaction\" : \"Transaction3\", \"pct1ResTime\" : 342.40000000000003, \"pct2ResTime\" : 451.49999999999983, \"pct3ResTime\" : 712.5799999999997 } }"
| spath
```

The output is:

Transaction Name   pct2ResTime
Transaction1       4198
Transaction2       1318.8999999999996
Transaction3       451.49999999999983
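Outside Splunk, the same transformation — take each top-level key and pull out its pct2ResTime — is only a few lines of Python. This sketch uses the mock data from the answer and mirrors what json_keys/json_extract do; it is an illustration, not part of the original answer:

```python
import json

raw = """{
  "Transaction1": {"transaction": "Transaction1", "pct1ResTime": 3083.0, "pct2ResTime": 4198.0, "pct3ResTime": 47139.0},
  "Transaction2": {"transaction": "Transaction2", "pct1ResTime": 1151.3, "pct2ResTime": 1318.9, "pct3ResTime": 6866.0},
  "Transaction3": {"transaction": "Transaction3", "pct1ResTime": 342.4, "pct2ResTime": 451.5, "pct3ResTime": 712.58}
}"""

# json_keys(_raw) corresponds to data.keys();
# json_extract(_raw, key . ".pct2ResTime") corresponds to data[key]["pct2ResTime"].
data = json.loads(raw)
rows = [{"Transaction Name": name, "pct2ResTime": stats["pct2ResTime"]}
        for name, stats in data.items()]
```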
The targeted server/endpoint for integration with this app is the machine that you would like to run commands on. The server/endpoint itself does not need to be integrated into SOAR; rather, SOAR needs credentials/certificates/tickets to authenticate with the WinRM service on the target server/endpoint.
This app is listed as "Developer Supported", which means you may have more success by directly contacting the developer to ask if they plan to add the retrieval of archive events to future releases.

Support email: support@mimecast.com
For technical support, visit https://community.mimecast.com/s/contactsupport
Not directly in the web UI, but you can set the number of page entries by changing the URL parameters. Go to the "Events" queue, then set the page size using the dropdown to any value other than the default. You will then see your URL look like: https://yoursoar.com/browse?page=1&per_page=25&filter=new_events&status=new You can then change per_page=25 to any number. For 100 entries, set per_page=100. You can do this with Playbooks, Events, and Cases.
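The URL edit can also be scripted. A minimal Python sketch, using the example queue URL above (the hostname and parameter names come from that example, not from a SOAR API):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qs, urlencode

def set_per_page(url: str, per_page: int) -> str:
    """Rewrite the per_page query parameter of a queue URL."""
    parts = urlsplit(url)
    query = parse_qs(parts.query)
    query["per_page"] = [str(per_page)]
    return urlunsplit(parts._replace(query=urlencode(query, doseq=True)))

url = "https://yoursoar.com/browse?page=1&per_page=25&filter=new_events&status=new"
print(set_per_page(url, 100))
```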
The app was installed from Splunkbase. I tried to add an inputs.conf file to change to a custom index. The new package was rejected when I uploaded it to Splunk Cloud, even after I changed the app ID. Thank you!
Hi @fl66, you could add a new custom index through the interface and then modify your inputs to send logs to that index. Where are these inputs — on Splunk Cloud or on-premises? If on Splunk Cloud, modify them through the interface or by uploading a new version of the app; if on-premises, modify them in the on-premises installation. Ciao. Giuseppe.
Hi, I installed a Splunk app and its events are sent to the default index, but I need them to go to a custom index. I tried to create a local/inputs.conf file and repackaged the app, but the app was rejected when I uploaded it to Splunk Cloud, even after I changed the app ID. I also looked at the Splunk ACS API, but could not figure out whether it can be used to customize configuration files, or which endpoint URLs to use. Thanks in advance.
See https://community.splunk.com/t5/Splunk-Search/Upgrade-to-5-x-some-of-my-existing-searches-are-taking-longer-to/m-p/158429
Do it as soon as possible to save network bandwidth, reduce search run time, and avoid unwanted I/O on the search head, which counts toward the disk quota limit.
Can you paste a copy of your original event in a code sample format? Perhaps one of the double-quotes is wrong.
Assuming your events are as you have shown, you could do this:

```
| spath
| table _time *.pct2ResTime
| untable _time transaction pct2ResTime
| eval "Transaction Name"=mvindex(split(transaction,"."),0)
| table "Transaction Name" pct2ResTime
```

If not, please share a more accurate representation of your events, preferably in a code block (as above) to preserve the formatting of the data.
Thank you for your help, but I'm unable to produce the table with this query.
No! I did not try spath. The query I have tried so far is below. Also, could you please help with spath? I'm very new to Splunk.

```
index=jenkins_artifact source="<path to json>/statistics.json"
| rex max_match=0 "(?<keyvalue>\"[^\"]+\":\"[^\"]+\")"
| mvexpand keyvalue
| rex field=keyvalue "\"(?<key>[^\"]+)\":\"(?<value>[^\"]+)\""
| eval {key}=value
| fields - keyvalue key value _raw host eventtype index linecount source sourcetype punct splunk_server tag tag::eventtype timestamp
| untable date Transaction pct2ResTime
| where like(Transaction,"%__%")
| xyseries Transaction date pct2ResTime
```
We have a TrueSight integration with Splunk that sends results when a certain event occurs. Sometimes no events are sent, and I want to document only the first time that happens. For example:

```
Time:        0  5 10 15 20 25 30 35 40 45 50 55  0  5 10 15 20 25 30
# of Events: 3  4  0  0  0  8 15  2  0  5 55 66  0  0  0  0  0  8  9
```

I want to include a 0 value only the first time it occurs in a run of zeros, not every time we have an alert. Please assist.
Could you log in as the Splunk user on your indexer and then run btool for the stanzas relating to the TLS-secured forwarding?

```
/opt/splunk/bin/splunk btool inputs list SSL
/opt/splunk/bin/splunk btool inputs list splunktcp-ssl
/opt/splunk/bin/splunk btool server list sslConfig
```

Make sure the settings match the instructions in the article. If any values are wrong, add --debug to the btool commands to find the file that is setting them. If there are no problems there, do you see specific complaints in the forwarder's splunkd log, e.g. "Invalid certificate", or does the connection time out? Were you able to forward logs, even _internal logs, before setting up TLS?