All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hi @gcusello, We are getting the error "[aapxxxx01] Streamed search execute failed because: Error in 'litsearch' command: Your Splunk license expired or you have exceeded your license limit too many times. Renew your Splunk license by visiting www.splunk.com/store or calling 866.GET.SPLUNK." I checked our daily usage and we have been violating our daily usage limit (62 GB) for a week now. Could you please help us fix this?
Hi, I am a developer who wants to build an augmented reality app using Splunk AR. I have acquired a Splunk Enterprise developer license. However, I am not sure whether the developer license will allow me to use Splunk AR. Is it possible to use a developer license with Splunk AR? Thanks, vdharmadhikari
Hello all, I am attempting to ingest data via a Python script that retrieves data from an API and then forwards the results into my personal Splunk Cloud instance using the HTTP Event Collector. I am running into an issue with the Splunk field extractions when I run my Python script (screenshot attached). The results are being sent as one event when I want them to be sent as 19 individual events, and the interesting fields also carry the date prefix ("near_earth_objects.2020-01-02") when I just want the fields to be "absolute_magnitude_h", since I intend to ingest historical data for multiple dates. Is the issue with the data formatting being sent from my Python script, or do I need to change a configuration in my Splunk Cloud instance? Any help would be greatly appreciated.
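If it helps, here is a minimal sketch (assuming a NeoWs-style response where results are nested under near_earth_objects.<date>) of splitting the API response into one HEC payload per object before sending, so Splunk receives 19 separate events and the field names lose the date prefix. The function and field names here are illustrative:

```python
def build_hec_events(api_response):
    """Split a NeoWs-style response into one HEC payload per object.

    The API nests objects under near_earth_objects.<date>; emitting each
    object as its own HEC event gives Splunk one event per object, and
    the extracted field names no longer carry the date prefix.
    """
    events = []
    for date, objects in api_response.get("near_earth_objects", {}).items():
        for obj in objects:
            obj["close_approach_date"] = date  # keep the date as a plain field
            events.append({"event": obj, "sourcetype": "_json"})
    return events

# Each payload would then be POSTed to the HEC endpoint separately
# (or batched as newline-separated JSON bodies), e.g. with requests:
#   requests.post("https://<stack>.splunkcloud.com:8088/services/collector/event",
#                 headers={"Authorization": "Splunk <hec-token>"},
#                 json=payload)
```

Sending the entire API response as one HEC payload is what produces a single event; splitting it client-side like this is usually simpler than trying to re-break it with props on the Splunk side.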
I have been attempting to search my index on my Linux VM. I have already added the data files associated with the corresponding index; however, when I go to search the index it keeps returning no events. Once I add the data again, it will find events, but only for one search. I'm not sure how to fix this; I have followed all the instructions on how to create indexes and add the data, but it's still not working.
Is it possible in Splunk to give users an option to show/hide panels within a dashboard as per their needs? So if a dashboard has 10 panels/metrics and a user wishes to see only 5, can this be done? I would appreciate it if someone could propose an idea. TIA
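One common approach in Simple XML (a sketch with hypothetical panel and token names) is the panel `depends` attribute combined with an input whose `<change>` block sets or unsets tokens; a panel whose token is unset is hidden:

```xml
<form>
  <fieldset>
    <input type="dropdown" token="view">
      <label>Panels to show</label>
      <choice value="all">All panels</choice>
      <choice value="errors">Errors only</choice>
      <change>
        <condition value="all">
          <set token="show_latency">true</set>
        </condition>
        <condition value="errors">
          <unset token="show_latency"></unset>
        </condition>
      </change>
    </input>
  </fieldset>
  <row>
    <panel depends="$show_latency$">
      <title>Latency</title>
      <!-- chart/search goes here -->
    </panel>
    <panel>
      <title>Errors</title>
      <!-- always visible -->
    </panel>
  </row>
</form>
```

With one token per panel (e.g. a multiselect or checkbox input driving several set/unset conditions), users can toggle any subset of the 10 panels.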
I am using Splunk_TA_snow in my environment. Can I configure the TA based on the user that is logged in? Here is a description of what I am looking for. Say there are two users, User A and User B. User A configures account X; User B configures account Y. Now, when I log in to Splunk as User A, I should ONLY see account X configured and should only be able to use account X. (With the current TA I see both accounts and can edit both of them, so even though I am logged in as User A, I am free to do anything using User B's account.) Splunk allows role-based configuration for each knowledge object; e.g., for eventtypes I can instruct Splunk who should be able to read or write them, but that doesn't hold true for the accounts I create in the application. Is there any way I can achieve this? Thanks in advance.
Hi all, After searching for the answer, I only found some very old posts from 2015 or so. I am trying to use the Splunk Add-on Builder to create a REST call to https://api.spotify.com/v1/me. The example gives me an OAuth v2 token for the header, but I can't seem to make it work by using OAuth as a header parameter with the auth token I've been given for my account as the value, or by passing it as a URL parameter with ?OAuth=<token>. Is there something I am not thinking of? I can't find the answer in the Splunk docs or the inputs.conf spec either. Thank you for your time as usual. Cheers, Laurent
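For what it's worth, the Spotify Web API expects the token in an `Authorization` header of the form `Bearer <token>`, not as a URL parameter. A minimal sketch of the header shape (placeholder token), which in Add-on Builder would map to a REST header named `Authorization` with value `Bearer <token>`:

```python
token = "<your-oauth2-access-token>"  # placeholder, not a real token

# Spotify's Web API wants the token in an Authorization header,
# not as a URL parameter such as ?OAuth=<token>:
headers = {"Authorization": "Bearer " + token}

# With the requests library this would be used as:
#   resp = requests.get("https://api.spotify.com/v1/me", headers=headers)
```

Also note that Spotify access tokens expire after a short time, so if the header shape is right but you still get 401s, the token may simply be stale.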
We have a massive Splunk environment and our QA process is pretty stringent when it comes to data onboarding. As part of that, we also check the "magic six" props.conf attributes, but the process of checking them is time-consuming. Hence, I'm wondering what approach others here are taking to make the process fast and efficient. Any help on this would be highly appreciated. Thanks!
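For readers following along, the "magic six" are usually taken to be LINE_BREAKER, SHOULD_LINEMERGE, TRUNCATE, TIME_PREFIX, TIME_FORMAT, and MAX_TIMESTAMP_LOOKAHEAD. A sketch of a stanza that sets all six explicitly (sourcetype name and values are placeholders for a single-line log with a leading millisecond timestamp):

```
[my_custom_sourcetype]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
TRUNCATE = 10000
TIME_PREFIX = ^
TIME_FORMAT = %Y-%m-%d %H:%M:%S.%3N
MAX_TIMESTAMP_LOOKAHEAD = 23
```

One common way to speed up QA is to make a stanza like this a required template in the onboarding request, so reviewers only diff against the template instead of reasoning about each attribute from scratch.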
I was hoping to find some fancy app/add-on that gives me all the basic Windows 10 monitoring stats: CPU, disk, network, etc. Looking at https://docs.splunk.com/Documentation/Splunk/8.1.1/Data/MonitorWindowsperformance, it gets all the data, but I would need to do the harder work and build all the dashboards manually. Then looking at https://splunkbase.splunk.com/app/742/, it seems not to have dashboards either; it just adds a bunch of inputs to also get the data in! I've got to be missing something here; there must be an app with dashboards that shows me this data?
How can I integrate my Windows web server's IIS logs into Splunk? I am planning to set up a Splunk lab using Windows Server. Can I get help from someone with this?
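As a starting point, assuming the default IIS logging path and a universal forwarder on the web server, an inputs.conf monitor stanza might look like this (the index name is a placeholder, and the sourcetype shown is the one the Splunk Add-on for Microsoft IIS documents, so check the add-on's docs before relying on it):

```
[monitor://C:\inetpub\logs\LogFiles\W3SVC*\*.log]
sourcetype = ms:iis:auto
index = web
disabled = 0
```

The forwarder would then send to your indexer via outputs.conf; the add-on's props handle the W3C header-based field extraction on the Splunk side.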
I have a time picker on my dashboard where I select a (start - end) time range, and it populates all the panels. My dashboard has 6 panels. I want to see that selected time range in a panel on my dashboard, but if I add a panel it wants me to select a panel type (what type do I need?). My time picker has the token "Time" and it is shared with all panels. How do I get the time selected in the time picker to appear on the dashboard (in a new panel) so that it shows up in my PDF report? Thank you
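Since a Simple XML time input exposes `$<token>.earliest$` and `$<token>.latest$` sub-tokens, one option (a sketch, written for a token named `Time` as described above) is an `<html>` panel, which needs no search at all:

```xml
<row>
  <panel>
    <html>
      <h3>Report window: $Time.earliest$ to $Time.latest$</h3>
    </html>
  </panel>
</row>
```

Note that for relative ranges the tokens render as the raw values (e.g. `-24h@h` and `now`); if you need human-readable timestamps, a small table panel using `| makeresults | addinfo | eval ... strftime(info_min_time, ...)` is the usual alternative.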
Hi guys, I need to find a solution to ingest a file on a network share that is managed by two servers behind a load balancer. In my case I would install a UF on each server and monitor the same file. If I have understood correctly, the UF has no kind of cluster/balancing awareness, so if the file is updated on one server I risk having the same data ingested by both of them. How can I manage this behaviour and avoid new data being ingested from the temporarily passive server? Unfortunately I cannot use a third server to monitor the file and bypass the issue. Thanks
I would like to run an audit report on who has created dashboards and reports in Splunk.  Any help is greatly appreciated! 
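One starting point is the REST endpoints for dashboards and saved searches. A sketch (note the caveat that REST reports the current owner, which is usually, but not always, the original creator):

```
| rest /servicesNS/-/-/data/ui/views
| table title eai:acl.app eai:acl.owner updated

| rest /servicesNS/-/-/saved/searches
| table title eai:acl.app eai:acl.owner next_scheduled_time
```

For an audit trail of the actual creation events, searching index=_audit for the relevant edit actions is the complementary approach, since _audit records who performed each change and when.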
Hi, I have a situation where a lookup table defines search filters that need to be used as part of a search query. The dynamic filter (data_owner_filter) is built from the original search results, and the subsearch filters are defined by the lookup table, where a filter can be either inclusive or exclusive. I have tried the following kind of approach, but the problem is that the subsearch cannot reach the value defined as data_owner_filter:

<search>
| eval data_owner_filter=mvindex(split(data_owner,"_"),1)
| search ([| inputlookup lookup_table.csv | search static_filter="use_case_1" dynamic_filter=data_owner_filter rule_type="inclusive" | fields fieldx])
| search NOT ([| inputlookup lookup_table.csv | search static_filter="use_case_1" dynamic_filter=data_owner_filter rule_type="exclusive" | fields fieldx])
| table fieldx, fieldy, data_owner

Example of the lookup table (the table can have hundreds of entries):

static_filter | dynamic_filter | rule_type | fieldx
use_case_1    | 001            | inclusive | abc*
use_case_1    | 001            | exclusive | efg*
use_case_1    | 002            | inclusive | bcd*
use_case_1    | 002            | inclusive | abc*
use_case_2    | 002            | inclusive | abc*
use_case_2    | 002            | exclusive | hij*
...

The idea behind the whole approach is to have a single lookup table handle various inclusions and exclusions for data belonging to different data owners (the owner defined by data_owner_filter), while having a single search alert configured per use case (defined by static_filter). Any suggestions on how this could be accomplished?
We just got a new Splunk Cloud instance/stack (we now have a total of 2 Splunk Cloud instances) and are attempting to send data to it from our universal forwarder. Currently the universal forwarder is only sending data to one of those instances (the original). I have installed the universal forwarder credentials from the new instance on the forwarder and can see both cloud apps in the /opt/splunkforwarder/apps directory, but I still only see forwards to the original instance when I run 'splunk list forward-server'. I have checked and made sure that the port is open on the new instance, and I'm also able to telnet to the new instance from the universal forwarder. Is there anything else I need to do to send to both instances? Eventually we won't have all the data going to both instances, as we'll be migrating some of the data over to the new one, but at the moment we want both to be identical.
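In case it's useful: cloning data to two destinations requires both tcpout groups to be active at the same time, and since each credentials app ships its own outputs.conf, only one may win the configuration merge. It's worth checking the merged view with `splunk btool outputs list --debug` and, if needed, naming both groups explicitly in a local outputs.conf. A sketch with placeholder hostnames and group names:

```
[tcpout]
defaultGroup = original_cloud, new_cloud

[tcpout:original_cloud]
server = inputs1.original-stack.splunkcloud.com:9997

[tcpout:new_cloud]
server = inputs1.new-stack.splunkcloud.com:9997
```

Listing two groups in defaultGroup makes the forwarder clone every event to both; later, selective routing per input can be done with `_TCP_ROUTING` in inputs.conf.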
Hi all, I configured an EDL and URL feed from AutoFocus by following the steps in https://splunk.paloaltonetworks.com/autofocus-and-minemeld.html. However, when I try to review the details from the macros in the link above, no results are returned. From the log file /opt/splunk/var/log/splunk/Splunk_TA_paloalto_minemeld_feed.log, I get the following entry for the EDL feed:

2021-01-05 15:29:16,550 ERROR pid=208666 tid=MainThread file=base_modinput.py:log_error:309 | Get error when collecting events.
Traceback (most recent call last):
  File "/opt/splunk/etc/apps/Splunk_TA_paloalto/bin/splunk_ta_paloalto/aob_py3/modinput_wrapper/base_modinput.py", line 128, in stream_events
    self.collect_events(ew)
  File "/opt/splunk/etc/apps/Splunk_TA_paloalto/bin/minemeld_feed.py", line 72, in collect_events
    input_module.collect_events(self, ew)
  File "/opt/splunk/etc/apps/Splunk_TA_paloalto/bin/input_module_minemeld_feed.py", line 84, in collect_events
    mmf_entries = get_feed_entries(helper, name, start, stats)
  File "/opt/splunk/etc/apps/Splunk_TA_paloalto/bin/input_module_minemeld_feed.py", line 45, in inner
    ret_val = func(*args)
  File "/opt/splunk/etc/apps/Splunk_TA_paloalto/bin/input_module_minemeld_feed.py", line 157, in get_feed_entries
    feed_entries = resp.json()
  File "/opt/splunk/etc/apps/Splunk_TA_paloalto/bin/splunk_ta_paloalto/aob_py3/requests/models.py", line 897, in json
    return complexjson.loads(self.text, **kwargs)
  File "/opt/splunk/lib/python3.7/json/__init__.py", line 348, in loads
    return _default_decoder.decode(s)
  File "/opt/splunk/lib/python3.7/json/decoder.py", line 340, in decode
    raise JSONDecodeError("Extra data", s, end)
json.decoder.JSONDecodeError: Extra data: line 1 column 4 (char 3)

For the URL feed, I get:

2021-01-08 12:12:19,748 ERROR pid=15255 tid=MainThread file=base_modinput.py:log_error:309 | Failed to get entries for "af_daily": 401 Client Error: Unauthorized for url: https://autofocus.paloaltonetworks.com/output/threatFeedResult?v=json&tr=1

I have verified/retried the credentials and the API key (for AutoFocus) to confirm that I have the correct values. Note: I do get results when accessing the EDL/URL feeds manually via cURL. Please let me know what else I can try.
I am trying to index hierarchical XML log files into Splunk. The file contains several groups of data linked by ID fields. I need to flatten out the data before indexing so queries can search the data as a flat table. Here is an example of an XML log:

<?xml version="1.0" encoding="utf-8"?>
<FleetData dateCreated="2021-01-08T15:20:07.1046931Z" xmlns:a="http://schemas.microsoft.com/2003/10/Serialization/Arrays" xmlns:i="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://www.cleverdevices.com/FleetManager">
  <Locations>
    <Location><CustomerLocationId/><Deleted>false</Deleted><LocationId>14</LocationId><Name>Orange</Name><NetworkAddress/><ObjectVersion>AAAAAAAAB+w=</ObjectVersion><OrganizationId>0</OrganizationId></Location>
    <Location><CustomerLocationId/><Deleted>false</Deleted><LocationId>32</LocationId><Name>CMF</Name><NetworkAddress/><ObjectVersion>AAAAAAAAB9k=</ObjectVersion><OrganizationId>0</OrganizationId></Location>
  </Locations>
  <VehicleTypes>
    <VehicleType><BusToolsBusType>-1</BusToolsBusType><Deleted>false</Deleted><Manufacturer>[Unassigned]</Manufacturer><Model/><ObjectVersion>AAAAD5sp1C8=</ObjectVersion><OrganizationId>-1</OrganizationId><VehicleTypeId>-1</VehicleTypeId></VehicleType>
    <VehicleType><BusToolsBusType>28</BusToolsBusType><Deleted>false</Deleted><Manufacturer>NABI</Manufacturer><Model>Transit-Cummins (T23)</Model><ObjectVersion>AAAAD5sp2kE=</ObjectVersion><OrganizationId>0</OrganizationId><VehicleTypeId>33</VehicleTypeId></VehicleType>
  </VehicleTypes>
  <Vehicles>
    <Vehicle><Deleted>false</Deleted><EffectiveOrganizationId>0</EffectiveOrganizationId><InService>true</InService><LicensePlate/><LocationId>14</LocationId><ModelYear>2019</ModelYear><ObjectVersion>AAAAD5u2qqg=</ObjectVersion><OrganizationId>-1</OrganizationId><VehicleId>3201</VehicleId><VehicleNumber>1</VehicleNumber><VehicleTypeId>-1</VehicleTypeId><Vin/></Vehicle>
    <Vehicle><Deleted>false</Deleted><EffectiveOrganizationId>-1</EffectiveOrganizationId><InService>true</InService><LicensePlate/><LocationId>32</LocationId><ModelYear>2017</ModelYear><ObjectVersion>AAAAD5u3+Os=</ObjectVersion><OrganizationId>-1</OrganizationId><VehicleId>2702</VehicleId><VehicleNumber>10</VehicleNumber><VehicleTypeId>33</VehicleTypeId><Vin/></Vehicle>
  </Vehicles>
</FleetData>

The resulting index should look like this:

dateCreated                 | VehicleId | BusToolsBusType | LocationName | VehicleInService | VehicleDeleted
2021-01-08T15:20:07.1046931 | 3201      | -1              | Orange       | True             | False
2021-01-08T15:20:07.1046931 | 2702      | 28              | CMF          | True             | False

I have searched the Splunk documentation and the Internet but cannot find any information on how to index these types of files. Any assistance with this would be greatly appreciated.
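One possible approach is to flatten outside Splunk before ingestion (e.g. from a scripted input, or a script feeding HEC). Here is a sketch using Python's standard xml.etree.ElementTree; it assumes the structure shown above, joins Vehicles to Locations and VehicleTypes by their ID fields, and each returned row would then be sent as its own event:

```python
import xml.etree.ElementTree as ET

# The sample log declares this default namespace on <FleetData>.
NS = {"fm": "http://www.cleverdevices.com/FleetManager"}

def flatten_fleet(xml_text):
    """Join Vehicles to Locations/VehicleTypes by ID into flat rows."""
    root = ET.fromstring(xml_text)
    date_created = root.get("dateCreated")

    # Build lookup tables keyed by the linking ID fields.
    locations = {
        loc.findtext("fm:LocationId", namespaces=NS): loc.findtext("fm:Name", namespaces=NS)
        for loc in root.iterfind("fm:Locations/fm:Location", NS)
    }
    vehicle_types = {
        vt.findtext("fm:VehicleTypeId", namespaces=NS): vt.findtext("fm:BusToolsBusType", namespaces=NS)
        for vt in root.iterfind("fm:VehicleTypes/fm:VehicleType", NS)
    }

    rows = []
    for v in root.iterfind("fm:Vehicles/fm:Vehicle", NS):
        rows.append({
            "dateCreated": date_created,
            "VehicleId": v.findtext("fm:VehicleId", namespaces=NS),
            "BusToolsBusType": vehicle_types.get(v.findtext("fm:VehicleTypeId", namespaces=NS)),
            "LocationName": locations.get(v.findtext("fm:LocationId", namespaces=NS)),
            "VehicleInService": v.findtext("fm:InService", namespaces=NS),
            "VehicleDeleted": v.findtext("fm:Deleted", namespaces=NS),
        })
    return rows
```

Emitting each row as a JSON line (one line per vehicle) gives Splunk events that are searchable as a flat table with no XML-path field names.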
So I'm trying to get to a HEC/syslog type environment. Please don't tell me that sending Windows events via syslog is dumb; I have a global policy issue at work where all log data must be collected and stored in syslog for 2 years, and that includes my Windows events. Here's my situation. I have a syslog-ng (3.13) server that receives Snare Windows events. It then sends the data on to a Splunk heavy forwarder (currently using the stock syslog "log {};" statement). I've installed both the Snare Expanded Syslog TA and the Splunk Windows TA (6.0) on the HF and the SH. I have also modified the inputs and props for the Windows TA to enable the data I want and do some cleanup (I did read the setup for the Windows TA). However, my data on the SH is still listed with the correct host (in this example a domain controller), but the source is the TCP port I'm receiving data on in Splunk, and the sourcetype is syslog: not snare_syslog, not winevent_syslog (or whatever that sourcetype is). I'm confused and stuck. Even if I migrate to a syslog-http-HEC environment (it's in dev right now), how do I get Splunk to properly format the incoming data with extracted fields and the correct sourcetype in this design?
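One thing worth checking: a raw TCP input defaults to sourcetype=syslog unless told otherwise, so TA props keyed to other sourcetypes never fire. A sketch of the two usual fixes on the HF (the sourcetype name below is a placeholder; use whatever the Snare TA's own props.conf expects, and note that Snare records contain the literal string MSWinEventLog, which the transform keys on):

```
# inputs.conf -- simplest: assign the sourcetype at the input
[tcp://5140]
connection_host = dns
sourcetype = snare_syslog

# Or, if the port carries mixed traffic, rewrite the sourcetype only
# for events that look like Snare records:
# props.conf
[syslog]
TRANSFORMS-snare = force_snare_sourcetype

# transforms.conf
[force_snare_sourcetype]
REGEX = MSWinEventLog
FORMAT = sourcetype::snare_syslog
DEST_KEY = MetaData:Sourcetype
```

The same logic carries over to the HEC design: HEC tokens accept a default sourcetype (or the event payload can set one per event), so the TA's extractions still key off the right sourcetype there.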
Splunk TA O365 is installed on the Splunk Cloud IDM. I created a custom index on the Splunk Cloud search head, and that created index does show up in the IDM's list of indexes. But when creating a new input in the Splunk TA O365 on the IDM, that index is not in the list of selectable indexes.
I am currently working with LDAP logs and found https://splunkbase.splunk.com/app/1151/ on Splunkbase. To configure this app in Splunk Web, is access to the LDAP server required? Or are there other suitable apps to analyze LDAP logs? I would really appreciate any solutions I can explore.