
Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

Error:

01-27-2021 08:08:46.410 -0300 WARN ScopedLDAPConnection - strategy="SIEM" LDAP Server returned warning in search for DN="OU=XX,DC=XX,DC=XX,DC=br". reason="Size limit exceeded"
01-27-2021 08:08:46.411 -0300 ERROR AdminHandler:AuthenticationHandler - Failed to retrieve a group with these settings. Consult your LDAP admin or see splunkd.log with ScopedLDAPConnection set to DEBUG for more information.

We run Splunk 8.1.1 on SUSE 12 and are trying to connect to Active Directory in order to allow access for some specific groups. The problem is that Splunk can only "see" a few groups. We have tried changing the OU and all kinds of other conditions, but the problem remains. It is not a permissions issue, because other tools using the same user can see all groups. We have more than 9,000 groups in AD, yet Splunk sees only 354 of them. We tried pointing at a static group to reduce the number of results, but Splunk is not able to find it either: it only sees older AD groups, not the new group we created for this. What are my options for troubleshooting this? Has anyone else run into it?
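The "Size limit exceeded" warning suggests the LDAP search is hitting a result-size cap. Splunk's authentication.conf has a `sizelimit` setting per LDAP strategy that controls how many entries Splunk requests; a sketch (the stanza name "SIEM" is taken from the error above, and the value is only an example):

```ini
# authentication.conf (sketch) -- raise the number of entries Splunk
# requests per LDAP search for the "SIEM" strategy.
[SIEM]
sizelimit = 10000
```

Note that the directory server enforces its own limit independently (Active Directory's default MaxPageSize is 1000), so the server-side policy may also need attention; check with your AD admin if raising `sizelimit` alone does not help.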
Hello there :) I'm pulling Azure data via the Splunk Add-on for Microsoft Office 365 and the Microsoft Azure Add-on for Splunk. Both add-ons are CIM compliant, yet RWI is not able to recognize the Azure AD sign-in logs. Is there anything else that needs to be configured? Thanks a lot.
Hi all, I would like to send individual emails, which I did using the sendresults command, but I am unable to send individual emails along with a CC.

Email     | CC
abc@g.com | manager@g.com
123@g.com | manager@g.com

From the table above, I want to send one mail per row, like this:
Mail 1 -- to: abc@g.com, cc: manager@g.com
Mail 2 -- to: 123@g.com, cc: manager@g.com
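One common pattern for per-row emails (sketched here without sendresults) is to drive Splunk's built-in `sendemail` command with `map`, which substitutes each row's field values into the inner search. Subject and message text below are placeholders:

```spl
... | table Email CC
| map maxsearches=10 search="| makeresults
    | sendemail to=\"$Email$\" cc=\"$CC$\" subject=\"Your report\" message=\"Please see your individual report.\""
```

`map` runs the inner search once per result row, so each recipient gets a separate mail with its own CC; raise `maxsearches` to at least the number of rows you expect.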
Hi, I am forwarding logs from my universal forwarder to an indexer and also to a third-party server. I have made sure inputs.conf is configured so that those logs go only to the indexer. Now I have added another third-party server in my outputs.conf, so my question is: will it send all the logs, or only the ones defined in inputs.conf?
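For context: by default, a forwarder clones every input's data to all `tcpout` groups defined in outputs.conf. To restrict an input to specific destinations, the usual mechanism is `_TCP_ROUTING` in the input stanza. A sketch (group names, hosts, and paths are examples):

```ini
# outputs.conf on the forwarder
[tcpout:splunk_indexers]
server = indexer1.example.com:9997

[tcpout:third_party]
server = thirdparty.example.com:514
sendCookedData = false

# inputs.conf -- route this monitor only to the Splunk indexers
[monitor:///var/log/app.log]
_TCP_ROUTING = splunk_indexers
```

Inputs without an explicit `_TCP_ROUTING` go to every output group, which would explain "whole logs" arriving at a newly added destination.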
Hello, I have the following search string:

index = app_events_dbdetect_actimize_event_us_uat sourcetype = txndata | rangemap field=Time_taken_AIS_tomcat 0s-to-0.05s=0-50 0.05s-to-0.10s=51-100 0.10s-to-0.15s=101-150 0.15s-to-0.20s=151-200 0.20s-to-0.30s=201-300 0.30s-to-0.50s=301-500 0.50s-to-1s=501-1000 1s-to-2s=1001-2000 2s-to-3s=2001-3000 3s-to-5s=3001-5000 5s-to-30s=5001-30000 >30s=30001-99999

which is passed to a pivot table as a dataset. The output is like this: [screenshot]. Now, there are some things that I want to know:

a) How can I sort the column values as fraasdetppu1, fraasdetppu2, fraasdetppu10, fraasdetppu20...? It currently displays them as fraasdetppu1, fraasdetppu10, fraasdetppu2, fraasdetppu21...
b) If the output has 0 values for 0s-to-0.05s (for example), can I have that range displayed anyway? I cannot find a way to show a range even when the "scored" count for it is 0.
c) The grey line that shows the totals for each server: can I add custom text under the "scored" column?
d) Can a "Total" column be added right before the server names, summing up the count values of each server for that specific "scoring" speed?

Thank you!
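On question a): the pivot UI sorts column values as strings, which is why fraasdetppu10 lands before fraasdetppu2. In plain SPL you can sort numerically by stripping the common prefix; a sketch (the field holding the server name is assumed to be called `server`, and the prefix is taken from the values shown above):

```spl
... | eval server_num = tonumber(replace(server, "^fraasdetppu", ""))
| sort 0 server_num
| fields - server_num
```

This kind of natural sort is straightforward in a search-driven table, but may not be achievable inside the pivot editor itself.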
Hi Splunkers, I tried to use a timestamp as the "rising column" but it didn't work. Please help! (Input type = Rising.) [screenshot]
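Assuming this is a Splunk DB Connect rising-column input: the rising column must appear in the WHERE clause as the checkpoint placeholder and in an ascending ORDER BY, and some database/driver combinations handle timestamp columns poorly, in which case converting the column to a number (epoch) in the query sometimes helps. A sketch with example table and column names:

```sql
-- DB Connect rising-column sketch: "?" is replaced by the saved checkpoint.
-- Table and column names are examples.
SELECT *
FROM events
WHERE event_time > ?
ORDER BY event_time ASC
```

If the timestamp type itself is the problem, a numeric surrogate (e.g. an auto-increment id, or the timestamp cast to epoch seconds) is often the more reliable rising column.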
Hi Splunkers, I have a file without timestamps that Splunk is monitoring. Whenever new logs arrive in the file, Splunk re-reads all the logs in it (old and new). How can I configure Splunk to read only the new logs appended to the file? Please help me. Best regards,
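Re-ingesting the whole file usually means Splunk's file-tracking CRC no longer matches, so it treats the file as new. If several monitored files share an identical beginning, or the writer changes the file's header, extending the CRC window can help; a sketch (path is an example):

```ini
# inputs.conf (sketch) -- widen the initial CRC window Splunk uses
# to recognize a file it has already read (default is 256 bytes).
[monitor:///path/to/file.log]
initCrcLength = 1024
```

Note the other common cause: if the producing application rewrites or replaces the entire file instead of appending to it, Splunk cannot tail it incrementally, and the fix has to happen on the producer side.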
Hello community, we are using the German localization, which is fine for our users' general ease of navigating Splunk. But the localization also leads to automatic translation of parts of the labels and input fields (e.g. multiselect inputs) in user-created dashboards. The latter is particularly irritating, because the translation isn't always a perfect fit and most of the time you get an ugly mix of translated and English values. Is there a way to prevent a dashboard from being auto-translated, without having to set the language and localization specifiers to English? Regards, Jens
Hi there, I monitor the Windows security event log from the DC via raw syslog. I can see the raw data in Splunk (without the default syslog RFC headers), and I can see that the data is in XML form. After I downloaded the Splunk Add-on for Microsoft Windows, I configured the WinEventLog sourcetype on my UDP data input (which receives only the Windows security event log from the DC). I can see that the fields are extracted with the XML headers, for example: System.EventId, EventData.LogonType. I can only receive syslog. Thanks in advance.
Hi Splunk, we have data like the table below. How do we get the result shown in the StartError, EndError and sumcall columns? I have tried 'eventstats first(datetime) as StartError last(datetime) as EndError by status_code', but the result is not what we expected. Thank you.

datetime                 | Name | app_version | status_code      | StartError               | EndError                 | sumcall
2021-01-25T11:22:34.848Z | AAAA | 1.0.0       | 403 Forbidden    | 2021-01-25T11:22:34.848Z | 2021-01-25T12:01:45.478Z | 3
2021-01-25T11:24:23.242Z | AAAA | 1.0.0       | 403 Forbidden    |                          |                          |
2021-01-25T12:01:45.478Z | AAAA | 1.0.0       | 403 Forbidden    |                          |                          |
2021-01-25T10:07:25.753Z | AAAA | 1.0.0       | 200 OK           | -                        |                          |
2021-01-26T07:55:51.835Z | BBBB | 1.0.0       | 401 Unauthorized | 2021-01-26T07:55:51.835Z | 2021-01-26T07:55:51.835Z | 1
2021-01-26T08:00:14.970Z | BBBB | 1.0.0       | 200 OK           | -                        | -                        |
2021-01-25T13:48:21.898Z | CCCC | 1.0.0       | 403 Forbidden    | 2021-01-25T13:48:21.898Z | 2021-01-25T13:48:40.131Z | 5
2021-01-25T13:48:23.851Z | CCCC | 1.0.0       | 403 Forbidden    |                          |                          |
2021-01-25T13:48:25.338Z | CCCC | 1.0.0       | 403 Forbidden    |                          |                          |
2021-01-25T13:48:38.672Z | CCCC | 1.0.0       | 403 Forbidden    |                          |                          |
2021-01-25T13:48:40.131Z | CCCC | 1.0.0       | 403 Forbidden    |                          |                          |
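For this kind of problem, `eventstats ... by status_code` collapses all error events across the whole search, whereas the desired output groups contiguous runs of the same status per Name. A streamstats-based sketch (field names taken from the table above; untested against the real data):

```spl
... | sort 0 Name datetime
| streamstats current=f last(status_code) as prev by Name
| eval new_run = if(status_code != coalesce(prev, "~none~"), 1, 0)
| streamstats sum(new_run) as run_id by Name
| eventstats min(datetime) as StartError, max(datetime) as EndError, count as sumcall by Name, run_id
| eval StartError = if(match(status_code, "^2"), "-", StartError),
       EndError   = if(match(status_code, "^2"), "-", EndError),
       sumcall    = if(match(status_code, "^2"), null(), sumcall)
```

The first streamstats detects where the status changes within each Name, the running sum numbers each contiguous run, and eventstats then computes the start, end, and count per run.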
I have a very long regex (about 12,000 characters) consisting of different hostname and IP address combinations. When I run it, Splunk shows: "Regex: regular expression is too large." From what I can tell, the regex can only accommodate 8,190 characters: in the image you can see I used the letter "a" 8,190 times, and adding one more character triggers the error. Can somebody explain why this is happening and how I can execute my regex properly?
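The limit comes from the compiled-pattern size cap in the regex engine Splunk uses, so the practical fix is usually to stop encoding thousands of hostname/IP pairs as one alternation. One common alternative is a lookup file plus a subsearch; a sketch (the lookup name, its columns, and the index are examples):

```spl
index=main
    [| inputlookup blocked_hosts.csv
     | fields hostname ip
     | format ]
```

Here `format` turns the lookup rows into an `(hostname=... AND ip=...) OR ...` filter, so the matching is done by the search language rather than a single oversized regex; a `| lookup blocked_hosts.csv hostname OUTPUT ...` followed by a null check is another variant of the same idea.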
I had this input working for a long time. After it stopped working, I reinstalled the add-on (1.2.4). Now I have a lot of data I need to import. How would you recommend setting up the input (delay_throttle, query_window_size, interval)?

[splunk@ilissplfwd05 local]$ cat inputs.conf
[ms_o365_message_trace://o365tracking]
delay_throttle = 720
index = o365
input_mode = continuously_monitor
interval = 30
office_365_account = o365tracking
query_window_size = 30
start_date_time = 2021-01-21T00:00:01
disabled = 0
Hello, the Splunk Enterprise trial license expires 60 days after we install the Splunk Enterprise instance. My question is: can I switch to the Enterprise version in the middle of the ongoing 60-day trial by providing the license? Will there be any data loss in doing so? Thanks and regards, Tanaya Mukhopadhyay.
Hi, I'm working on a simple dashboard where the user picks dates in a multiselect input; once a date is picked, it drives the data on every panel. My issue is that when I pick multiple dates, for example 27-Nov-2020, 20-Nov-2020, 30-Oct-2020, the display always ends up ordered by the dates' values (with 30-Oct-2020 first), despite the order in which I picked them. What I am looking for is to display the dates based on what I selected in the picker, regardless of their values, displayed from right to left. I tried changing the date format in the lookup to see if it would stop being sorted automatically, but to no avail: the dates always end up sorted ascending by value.
Hello, I use the search below:

`wire`
| fields AP_NAME USERNAME LAST_SEEN
| eval USERNAME=upper(USERNAME)
| eval LAST_SEEN=strptime(LAST_SEEN, "%Y-%m-%d %H:%M:%S.%1N")
| lookup aps.csv NAME as AP_NAME OUTPUT Building Country Site
| lookup fo_all HOSTNAME as USERNAME output SITE BUILDING_CODE
| eval Building=upper(Building)
| eval Site=upper(Site)
| eval SITE=upper(SITE)
| eval LAST_SEEN = strftime(LAST_SEEN, "%Y-%m-%d %H:%M")
| stats last(LAST_SEEN) as "Last check date", last(AP_NAME) as "Access point", last(Site) as "Geolocation site", last(Building) as "Geolocation building", last(SITE) as "SNOW site", last(BUILDING_CODE) as "SNOW building" by USERNAME
| where NOT ('Geolocation building' = 'SNOW building')
| rename USERNAME as Hostname
| sort -"Last check date"

As you can see from the where clause, I just need to display the rows where the Geolocation building field differs from the SNOW building field. But it works randomly: I get rows where Geolocation building equals SNOW building as well as rows where they differ. I also tried:

| where NOT like ('Geolocation building','SNOW building')
| where NOT match ('Geolocation building','SNOW building')

but neither changes anything. A second problem: I need conditional formatting on the SNOW building field, but no colors are displayed:

<format type="color" field="SNOW building">
<colorPalette type="map">{"ZB12":#4FA484,"G39":#AF575A,"ZD30":#294E70,"A50":#53A051,"E74":#B6C75A,"ZH38":#F8BE34}</colorPalette>
</format>

What is the overall problem, please?
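For string comparisons like this, "random" results are often caused by case or whitespace differences between the two building values rather than by the `where` syntax. A defensive sketch that normalizes both sides before comparing (field names from the search above):

```spl
... | eval geo_b  = upper(trim('Geolocation building')),
        snow_b = upper(trim('SNOW building'))
| where geo_b != snow_b
```

On the second problem, one plausible cause (an assumption, not confirmed from the post) is that the hex colors in the `colorPalette` map are not quoted: in Simple XML the map is JSON-like, so values such as `"ZB12":"#4FA484"` with the color in quotes would be the well-formed spelling.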
Good morning everyone! I am trying to see what components are in my Splunk environment. I just inherited a system with Splunk on it; as far as I know I am on a management server, and I am accessing Splunk Web, which I presume is the search head (that's one component down, I think). I understand Splunk Enterprise needs a forwarder, an indexer, and a search head to function correctly, but without knowing which components I have inherited, I am not really sure that it is working.

Also, I have done some initial research on a message I received upon barely logging in: "The minimum free disk space (5000MB) reached for /opt/splunk/var/run/splunk/dispatch" on an indexer. My research has shown me that: a) it's possible Splunk is forwarding to itself; b) I can remedy the error by editing the .conf file responsible for setting the minimum quota; c) I can assess the storage available and allocate more space to that directory. Knowing the above options, what do you think is best in my scenario? Again, I am super new to this environment.
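A quick way to inventory the instances visible from the search head is the REST endpoint for server info; a sketch to run in the search bar:

```spl
| rest /services/server/info
| table splunk_server serverName version server_roles
```

The `server_roles` field typically reports values such as search_head, indexer, or license_master for each instance the search head can reach, which should help map out what was inherited.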
Hi, I have simplified my query as much as possible. Basically I am looking at two issues:

1. I cannot perform the joins, because a subsearch can only return 50,000 results, which is not enough for my query to join properly (potential matches are truncated). A stats should be able to do this instead, but I am not sure how to replace both joins (also considering the next issue).
2. In my example, the first join is on the field "name". In sourcetypeA, "name" is a single, unique value. In sourcetypeB, however, that value is part of a multivalue field called "names", which contains multiple names, one of which is the "name" value from sourcetypeA. An mvexpand on sourcetypeB would resolve that, but it creates even more subsearch results (it breaks each event into multiple events), which makes issue 1 even worse.

How can I get around both these issues?

index=indexA sourcetype=sourcetypeA
| join name [search sourcetype=sourcetypeB | fields name fieldB]
| join fieldB [search sourcetype=sourcetypeC | fields fieldC ]
| table name fieldB fieldC
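The usual join-free pattern is to search both sourcetypes in the base search (no subsearch, so no 50,000-row cap) and stitch them with stats; mvexpand in the main pipeline is only bounded by memory limits, not the subsearch limit. A sketch for the first stage, using the field names from the query above:

```spl
index=indexA (sourcetype=sourcetypeA OR sourcetype=sourcetypeB)
| eval name = coalesce(names, name)      /* sourcetypeB carries the multivalue field */
| mvexpand name
| stats values(fieldB) as fieldB, dc(sourcetype) as st_count by name
| where st_count = 2                     /* keep only names seen in both sourcetypes */
```

The second stage against sourcetypeC can follow the same shape: `append` (or include it in the base search), then a second `stats ... by fieldB`, rather than a second `join`.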
Hi, I have a report that is ingested into Splunk. Because the report's format was not ingested correctly, I preprocess it into a file named reportfile.rep, which Splunk picks up every 15 minutes. The report is delayed by almost 40 minutes (the processing and transferring of the data takes time), so the timestamp indexed by Splunk is around 40 minutes later than the report's own time. Example:

sample file: reportfile.rep
ReportID=a004_012721.1400,Queue=xxx,AgentList=xxxx
ReportID=a004_012721.1400,Queue=xxx

_time = 2021-01-27T14:40:04.000+11:00

So the report was for 14:00, but _time is 14:40. Is there any way I can make the _time value be picked up from the report file's contents? I have seen some examples on Splunk Answers using transforms.conf and props.conf, but those were based on the actual file name, not the content inside the file.
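Extracting the timestamp from the event content (rather than the file name) is what `TIME_PREFIX`/`TIME_FORMAT` in props.conf are for. A sketch against the sample line `ReportID=a004_012721.1400,...`, reading `012721.1400` as 01/27/21 14:00 (the sourcetype name is an example):

```ini
# props.conf (sketch) for the preprocessed report file
[reportfile]
TIME_PREFIX = ReportID=\w+_
TIME_FORMAT = %m%d%y.%H%M
MAX_TIMESTAMP_LOOKAHEAD = 20
```

`TIME_PREFIX` skips past `ReportID=a004_`, and the format string then parses month, day, two-digit year, and HHMM; a `TZ` setting may also be needed if the report's clock is not in the indexer's timezone.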
I am trying to average the sum of power consumption readings over two days and compare that average to a third day. If the third day's total power consumption is 20% higher than the average of the previous two days, I would like to flag that day as having more power consumption than usual. My main difficulty is performing the comparison: I'm unsure whether it's possible to store data in variables the way you would in a programming language, and I'm unable to do the full search/compute/compare in one pipeline, in particular when trying to target specific dates relative to the current date. I am still relatively new to Splunk and unsure about the capabilities and syntax of this platform.
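In SPL you generally don't store variables; instead you compute the rolling baseline in the same pipeline. A sketch assuming an index with a numeric `power` field (index and field names are examples), comparing each complete day against the average of the two days before it:

```spl
index=power_metrics earliest=-3d@d latest=@d
| bin _time span=1d
| stats sum(power) as daily_total by _time
| sort 0 _time
| streamstats window=2 current=f avg(daily_total) as prev2_avg
| eval flagged = if(daily_total > 1.2 * prev2_avg, "HIGH", "normal")
```

`streamstats window=2 current=f` averages the two preceding rows only, which plays the role of the stored "previous two days" variable; the `eval` then applies the 20% threshold.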
I need to set up monitoring or alerting on Splunk license utilization. Is there a query we can use to set up an alert for license utilization, with a warning if it increases? Thanks, Sahil
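A common starting point is the license usage log on the license master, whose daily RolloverSummary events carry the bytes used (`b`) and the pool size (`poolsz`); a sketch that could back an alert (the 80% threshold is an example):

```spl
index=_internal source=*license_usage.log type=RolloverSummary earliest=-30d@d
| eval used_pct = round(b / poolsz * 100, 2)
| timechart span=1d max(used_pct) as used_pct
| where used_pct > 80
```

Saving this as an alert that triggers when results are returned gives a daily warning once utilization crosses the threshold; the Monitoring Console's license usage views are built on the same log.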