All Topics



The list of capabilities doesn't explicitly list this. I want a particular power user to be able to use the REST API and see a result listing all users, not simply themselves. They already have the ability to execute REST, but they only ever see their own user account when running this command:

| rest /services/authentication/users splunk_server=local | table realname
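One way to investigate which capabilities the role actually carries is to query the roles endpoint — a sketch, where the role name "power" is an assumption to adjust for your environment:

```spl
| rest /services/authorization/roles splunk_server=local
| search title="power"
| table title capabilities imported_capabilities
```

Comparing the output against an admin role's capabilities can show what the power role is missing.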
I have two datasets in separate indexes that I would like to compare. For example:

dataset from search1: item1, item2, item3
dataset from search2: item1, item2, item3, item4, item5

I would like to produce a table of the items that exist in search2 and not in search1, i.e.:

itemfield    searchname
-----------------------
item4        search2
item5        search2

How would I do this?
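One possible approach, assuming the field holding the items is itemfield and the indexes are index1 and index2 (all three names are placeholders), is to subtract search1's items with a NOT subsearch:

```spl
index=index2
| stats count by itemfield
| eval searchname="search2"
| search NOT [ search index=index1 | stats count by itemfield | fields itemfield ]
| table itemfield searchname
```

Note that subsearches are subject to result limits, so for very large item sets a combined search with stats may scale better.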
Using the WebTools App from Splunkbase, I've issued the following API calls to modify the 'Service' object attribute 'Enabled' to 0. I get a 200 response, but the targeted service remains enabled.

Examples of the API calls issued:

| eval header="{\"Content-type\":\"application/json\"}"
| eval data="{\"Enabled\": 0}"
| curl method=post uri=https://localhost:8089/servicesNS/nobody/SA-ITOA/itoa_interface/service/dc3b486e-2ec5-4f09-9dab-3714fc5f536f/?is_partial_data=1 splunkauth=true debug=true headerfield=header datafield=data

| eval header="{\"Content-type\":\"application/json\"}"
| eval data="Enabled=0"
| curl method=post uri=https://localhost:8089/servicesNS/nobody/SA-ITOA/itoa_interface/service/dc3b486e-2ec5-4f09-9dab-3714fc5f536f/?is_partial_data=1 splunkauth=true debug=true headerfield=header datafield=data

To confirm the ITSI service 'enabled' state, I issue the following GET query:

| eval header="{\"Content-Type\":\"application/json\"}"
| curl method=get uri=https://localhost:8089/servicesNS/nobody/SA-ITOA/itoa_interface/service/dc3b486e-2ec5-4f09-9dab-3714fc5f536f splunkauth=true debug=true headerfield=header
| spath input=curl_message
| fields key object_type enabled permissions.user mod_source mod_timestamp

Output: the mod_source and mod_timestamp coincide with my API 'disable' attempt. I appreciate any guidance or corrections. Thank you!
How do I find the number of DNS servers issued by the DHCP servers? Also, how do I identify all DHCP servers in a particular subnet (e.g., 10.1.0.0/24)?
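As a starting point, and assuming your DHCP events carry fields such as dhcp_server, dns_server, and a client IP field like src_ip (all assumed names; adjust to your actual sourcetype and field extractions), cidrmatch can scope the subnet:

```spl
sourcetype=dhcp
| where cidrmatch("10.1.0.0/24", src_ip)
| stats dc(dns_server) AS distinct_dns_servers values(dhcp_server) AS dhcp_servers_in_subnet
```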
Using the Java class HttpEventCollectorLoggingHandler with a properties file, and implementing HttpEventCollectorErrorHandler. The reply, error code, and error text from HttpEventCollectorErrorHandler are:

Reply: Failed to connect to hecdevsplunk.company.com/11.54.0.234:8088
ErrorCode: -1
ErrorText: com.google.gson.stream.MalformedJsonException: Use JsonReader.setLenient(true) to accept malformed JSON at line 1 column 9 path $

Failed write of log event: Failed to connect to hecdevsplunk.company.com/11.54.0.234:8088

Any assistance would be appreciated. Thanks, Steve.
Hi, can someone please help me configure the Palo Alto WildFire API with Splunk Cloud?
I need to disable ITSI backfill for all KPIs. Is there a conf file stanza for this, such as backfill_enabled=false, that can be applied to all KPIs at once?
I work at Couchbase and am planning to make some minor changes to the format of an entry in the http_access.log and http_access_internal.log. I would like to work with whoever owns this plug-in to g... See more...
I work at Couchbase and am planning to make some minor changes to the format of an entry in the http_access.log and http_access_internal.log. I would like to work with whoever owns this plug-in to gauge the impact of the change, if any, and plan for what to do about it. I can be contacted at steve.watanabe@couchbase.com
Hello all, I have a field called version, which has values 1, 2, etc. for each different value of the field collection. I want to create an alert whenever the version value changes (it always goes higher), e.g. from 1 to 2 or 2 to 3. Once the value changes, the new events will carry the new version value. For instance, for collection A the version is 1; if that value later changes to 2, then the new data will have value 2, and so on.

Sample event:

{"fileName":"Bggg","id":"5d0d78","isChained":false,"metaInfo":{"author":"","copyright":"","description":"","name":"lin","noOutputFilesMessage":"","outputMessage":"","url":"","urlText":""},"packageType":1,"public":false,"runCount":4,"runDisabled":false,"subscriptionId":"5d013c84d3c465","uploadDate":"\/Date(1583943892366)\/","version":null,"workerTag":"","collections":[{"collectionId":"5dc909225c9e1a89","collectionName":"A"}],"lastRunDate":"\/Date(1583948946000-0400)\/","publishedVersionId":"5e6910d3fa3a841ee8000611","publishedVersionNumber":2,"publishedVersionOwner":{"active":true,"email":"aacom","firstName":"ha","id":"5d0136be14d3c398","lastName":"gi","sId":null,"subscriptionId":"5d0136be17c395"},"subscriptionName":"u"}

Thanks in advance.
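One hedged sketch of such an alert, assuming the events are JSON like the sample and that the version lives in publishedVersionNumber (the sample's top-level version is null, so this path, like the index name, is an assumption): compare the earliest and latest version seen per collection within the alert's time window and fire when they differ:

```spl
index=myindex
| spath output=collection path=collections{}.collectionName
| spath output=version path=publishedVersionNumber
| stats earliest(version) AS first_version latest(version) AS last_version by collection
| where last_version > first_version
```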
Data resembles this pattern:

| makeresults
| eval _raw="{\"foo\": [{\"randstring1\": {\"fqdn\" : \"ibar.example.com\"}}, {\"randstring2\": {\"fqdn\" : \"jbar.example.com\"} }]}"

I am trying to extract the two FQDNs when the containing field name foo{}.* is a random string. Any hints on how to get at this data? I've tried a few different options with spath and can't seem to get it to work. I could try a rex, but I was really hoping to avoid that. Basically, what I want at the end is a (multivalue, in this case) field whose values are ibar.example.com and jbar.example.com.
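One rex-free possibility, assuming foreach wildcard matching works against the auto-extracted field names in your Splunk version (worth verifying), is to let spath extract everything and then collect every *.fqdn into a single multivalue field:

```spl
| makeresults
| eval _raw="{\"foo\": [{\"randstring1\": {\"fqdn\" : \"ibar.example.com\"}}, {\"randstring2\": {\"fqdn\" : \"jbar.example.com\"} }]}"
| spath
| foreach "foo{}.*.fqdn" [ eval fqdn=mvappend(fqdn, '<<FIELD>>') ]
| table fqdn
```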
Hi guys! I am pretty new to this, and in my research I have not found what I am looking for, or did not recognize the answer when it was in front of me. Anyway, I am trying to create a bar chart that shows, per month: how many tickets were completed, the average days it took to assign a ticket, the average days to complete a ticket after it was assigned, and the total average days. So what I am trying to do is produce a chart like this:

Month  Completed  Avg_Days_to_Assign  Avg_Days_to_Complete_after_Assign  Total_Avg_Days
JAN    1          12                  10                                 22
FEB    1          10                  10                                 20
MAR    1          2                   2                                  4

Below is what I have done:

| dedup Work_Order_ID
| eval Days_to_Assign=round((Actual_Start_Date-Submit_Date)/86400,0)
| eval Days_to_Complete_after_Assign=round((Completed_Date-Actual_Start_Date)/86400,0)
| eval Total_Days_from_Submit_to_Completion=round((Completed_Date-Submit_Date)/86400,0)
| stats count(Status) as Completed, avg(Days_to_Assign) as Avg_Days_to_Assign, avg(Days_to_Complete_after_Assign) as Avg_Days_to_Complete_after_Assign, avg(Total_Days_from_Submit_to_Completion) as Avg_Total_Days_from_Submit_to_Completion by Completed_Date
| fieldformat Completed_Date=strftime(Completed_Date,"%b")

What I get is each ticket listed separately by month instead of just totals for each month (ignore the numbers; I plugged those in for an example):

Month  Completed  Avg_Days_to_Assign  Avg_Days_to_Complete_after_Assign  Total_Avg_Days
JAN    1          12                  10                                 22
JAN    1          10                  10                                 20
FEB    1          9                   10                                 19
FEB    1          4                   4                                  8
MAR    1          1                   1                                  2

Any help is greatly appreciated!
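The rows split per ticket because the stats is grouped by the raw Completed_Date epoch, which differs for every ticket; grouping by a derived month value instead should collapse them. A sketch reusing the search from the question (field names as given there):

```spl
| dedup Work_Order_ID
| eval Days_to_Assign=round((Actual_Start_Date-Submit_Date)/86400,0)
| eval Days_to_Complete_after_Assign=round((Completed_Date-Actual_Start_Date)/86400,0)
| eval Total_Days=round((Completed_Date-Submit_Date)/86400,0)
| eval Month=strftime(Completed_Date, "%Y-%m")
| stats count AS Completed avg(Days_to_Assign) AS Avg_Days_to_Assign avg(Days_to_Complete_after_Assign) AS Avg_Days_to_Complete_after_Assign avg(Total_Days) AS Total_Avg_Days by Month
| sort Month
```

Using "%Y-%m" rather than "%b" keeps months from different years separate and sorts chronologically.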
I am configuring Splunk to monitor Active Directory, but I am not able to ping the AD server from Splunk. How do I accomplish this? Specifically, I want to configure the Splunk Supporting Add-on for Active Directory but am unable to do so because my Splunk is on AWS and AD is on-prem. How do I do it?
@LukeMurphey According to the issue at the URL below, this problem was resolved in 2.7.5, but I am on version 2.9.2.

https://github.com/LukeMurphey/splunk-website-monitoring/issues/31

Here's the error:

2020-03-11 12:46:05,880 ERROR Exception generated when attempting to get the proxy configuration stanza=web_ping://[web input name], see url=http://lukemurphey.net/projects/splunk-website-monitoring/wiki/Troubleshooting
Traceback (most recent call last):
  File "D:\Program Files\Splunk\etc\apps\website_monitoring\bin\web_ping.py", line 941, in run_ping
    self.get_proxy_config(input_config.session_key, conf_stanza)
  File "D:\Program Files\Splunk\etc\apps\website_monitoring\bin\modular_input.zip\modular_input\shortcuts.py", line 31, in wrapper
    return function(*args, **kwargs)
  File "D:\Program Files\Splunk\etc\apps\website_monitoring\bin\web_ping.py", line 799, in get_proxy_config
    website_monitoring_config = self.get_app_config(session_key, stanza)
  File "D:\Program Files\Splunk\etc\apps\website_monitoring\bin\modular_input.zip\modular_input\shortcuts.py", line 31, in wrapper
    return function(*args, **kwargs)
  File "D:\Program Files\Splunk\etc\apps\website_monitoring\bin\web_ping.py", line 735, in get_app_config
    raise Exception("Could not get the website_monitoring configuration")
Exception: Could not get the website_monitoring configuration

The steps I performed to reach this state:
- Install Website Monitoring version 2.7.6
- Add a website input (I believe it was broken here, but I did not verify this)
- Upgrade to the latest Website Monitoring (2.9.2)
Thruput has increased to around 385 kbps because the number of inputs has grown to more than 150. What is the recommended heap size to set, and how do I set it?
Hi all, I have a cumbersome problem. I have a table built from an inputlookup search. We have n columns in this table, but the key ones are key_field and timestamp_A. We would like to run a subsearch within the inputlookup where, for each value of key_field, we check the most recent value of timestamp_B in another index. If this value is more recent than the one in timestamp_A, we update the timestamp with the B value; otherwise we keep timestamp_A. Assuming that both timestamps are in the same format, we have the following restriction: the value in timestamp_A might be empty; in that case, if a timestamp_B value exists, use it; otherwise leave it empty. Any help that could be provided is very much appreciated.
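A possible sketch, where the lookup file name and index name are placeholders: left-join the most recent B value per key, then pick between A and B with a null-safe case:

```spl
| inputlookup my_lookup.csv
| join type=left key_field
    [ search index=other_index
      | stats latest(timestamp_B) AS timestamp_B by key_field ]
| eval timestamp_A=case(
    isnull(timestamp_A), timestamp_B,
    isnotnull(timestamp_B) AND timestamp_B > timestamp_A, timestamp_B,
    true(), timestamp_A)
| fields - timestamp_B
```

If both timestamps are null, case returns null, which preserves the "leave empty" requirement. Note that join is subject to subsearch limits for large lookups.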
Hi. I have two separate searches. Search1 returns events where field1 and field2 exist:

source=x resource=foo | table field1, field2

Search2 returns events where field2 and field3 exist:

source=y resource=bar | stats count by field2, field3

Events of Search2 do not contain field1, but there is a one-to-one relation between field1 and field2, shown by the results of Search1. How do I combine these two searches into one so that all three fields (field1, field2, and field3) are shown in a single table?
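One way to combine them, relying on the stated one-to-one field1/field2 relation, is to left-join Search1's mapping onto Search2's results:

```spl
source=y resource=bar
| stats count by field2 field3
| join type=left field2
    [ search source=x resource=foo | dedup field2 | table field1 field2 ]
| table field1 field2 field3 count
```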
Hi, if my string is "asdf .\r\n asdf" and I filter on that (Add to search) I get "No results found". Any idea how to fix this? Thanks, Gunnar
Greetings! I am trying to create a visualization that tells me how much time remains before containers expire, measured from when they were detected/uploaded (or from a date I choose with a time limit). For example, these values are about to expire in 7 days: the date the containers carry is 10/03/2020, and I want Splunk to count which containers are about to expire within the time limit I choose. How can I achieve that? Something like: value / days until expiring. Thanks in advance!
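A rough sketch, assuming each container event carries an expiry date in a field named expire_date in DD/MM/YYYY format (both the field name and the format are assumptions) and that "about to expire" means within 7 days of now:

```spl
index=containers
| eval expire_epoch=strptime(expire_date, "%d/%m/%Y")
| eval days_left=round((expire_epoch - now()) / 86400, 0)
| where days_left >= 0 AND days_left <= 7
| stats count AS expiring_containers by days_left
| sort days_left
```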
Hi team! I am preparing an update process, and there is likely to be a mismatch of versions between the indexer and the universal forwarder. I just read the page below and have some questions.

https://docs.splunk.com/Documentation/Forwarder/8.0.2/Forwarder/Compatibilitybetweenforwardersandindexers

"An M in a cell indicates that this version of forwarder can send both event data and metrics data to the corresponding version of indexer."

What is this metrics data? How can I use it?

"An S in a cell indicates that this version of forwarder can send data to this version of indexer after you change the Secure Sockets Layer (SSL)/Transport Layer Security (TLS) version and cipher suite on the forwarder."

What changes do I have to make here? Thank you!
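For the "S" case, the change the documentation refers to is typically made in outputs.conf on the forwarder. A minimal, illustrative sketch only; the stanza name, server address, and TLS version are assumptions, and the compatibility matrix for your specific versions dictates the actual values:

```ini
# outputs.conf on the forwarder (illustrative values)
[tcpout:primary_indexers]
server = indexer.example.com:9997
# Restrict the forwarder to a TLS version the indexer accepts:
sslVersions = tls1.2
# A cipher suite compatible with the indexer may also need to be set:
# cipherSuite = <suite supported by both sides>
```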
Hi, I'm new to Splunk and I'm trying to collect syslog data into my indexers. I have read in the Splunk documentation that Splunk Enterprise can listen on a TCP or UDP port for data coming from the syslog service on one or more machines, but that this option is no longer available in the latest versions. Can anyone help me understand how to collect syslog logs so I can analyze them with Splunk?
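One common pattern is a network input defined in inputs.conf. A minimal sketch, where the port and sourcetype are assumptions; in production, a dedicated syslog server (e.g. syslog-ng or rsyslog) writing to files that a forwarder monitors is often preferred over direct UDP listening, since it avoids data loss during Splunk restarts:

```ini
# inputs.conf on a heavy forwarder or indexer (illustrative)
[udp://514]
sourcetype = syslog
connection_host = ip
```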