
We went through an upgrade to the latest version of Splunk Enterprise with no problems. However, when we started to upgrade the apps, specifically a Palo Alto plug-in app, Splunk crashed and now the splunkd service will not stay started. I'm not sure which logs to look at to try to figure out what happened. I have gone through the splunkd.log file, but all I see is a bunch of errors (shown below). I know that this isn't much to go on, but can anyone help walk me through troubleshooting this? Thank you in advance, Eric

02-11-2021 11:12:26.015 -0800 ERROR PersistentScript - From {"E:\Program Files\Splunk\bin\Python3.exe" "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunk_ta_mscs_rh_azureaccount.py" persistent}: from splunk.appserver.mrsparkle.lib.util import make_splunkhome_path
02-11-2021 11:12:26.015 -0800 ERROR PersistentScript - From {"E:\Program Files\Splunk\bin\Python3.exe" "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunk_ta_mscs_rh_azureaccount.py" persistent}: File "E:\Program Files\Splunk\Python-3.7\lib\site-packages\splunk\appserver\mrsparkle\lib\util.py", line 30, in <module>
02-11-2021 11:12:26.015 -0800 ERROR PersistentScript - From {"E:\Program Files\Splunk\bin\Python3.exe" "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunk_ta_mscs_rh_azureaccount.py" persistent}: import splunk.search.Parser
02-11-2021 11:12:26.015 -0800 ERROR PersistentScript - From {"E:\Program Files\Splunk\bin\Python3.exe" "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunk_ta_mscs_rh_azureaccount.py" persistent}: File "E:\Program Files\Splunk\Python-3.7\lib\site-packages\splunk\search\Parser.py", line 7, in <module>
02-11-2021 11:12:26.015 -0800 ERROR PersistentScript - From {"E:\Program Files\Splunk\bin\Python3.exe" "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunk_ta_mscs_rh_azureaccount.py" persistent}: import httplib2
02-11-2021 11:12:26.015 -0800 ERROR PersistentScript - From {"E:\Program Files\Splunk\bin\Python3.exe" "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunk_ta_mscs_rh_azureaccount.py" persistent}: File "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunktamscs\httplib2\__init__.py", line 28, in <module>
02-11-2021 11:12:26.015 -0800 ERROR PersistentScript - From {"E:\Program Files\Splunk\bin\Python3.exe" "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunk_ta_mscs_rh_azureaccount.py" persistent}: import email.FeedParser
02-11-2021 11:12:26.015 -0800 ERROR PersistentScript - From {"E:\Program Files\Splunk\bin\Python3.exe" "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunk_ta_mscs_rh_azureaccount.py" persistent}: ModuleNotFoundError: No module named 'email.FeedParser'
02-11-2021 11:12:26.513 -0800 WARN PersistentScript - Process {"E:\Program Files\Splunk\bin\Python3.exe" "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunk_ta_mscs_rh_azureaccount.py" persistent}: exited with code 1
02-11-2021 11:12:27.014 -0800 ERROR PersistentScript - From {"E:\Program Files\Splunk\bin\Python3.exe" "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunk_ta_mscs_rh_azureaccount.py" persistent}: Traceback (most recent call last):
02-11-2021 11:12:27.014 -0800 ERROR AdminManagerExternal - Received malformed XML from external handler:
02-11-2021 11:12:27.014 -0800 ERROR AdminManagerExternal - Unable to xml-parse the following data: %s
02-11-2021 11:12:27.014 -0800 ERROR PersistentScript - From {"E:\Program Files\Splunk\bin\Python3.exe" "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunk_ta_mscs_rh_azureaccount.py" persistent}: File "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunk_ta_mscs_rh_azureaccount.py", line 19, in <module>
02-11-2021 11:12:27.014 -0800 ERROR PersistentScript - From {"E:\Program Files\Splunk\bin\Python3.exe" "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunk_ta_mscs_rh_azureaccount.py" persistent}: from mscs_util import mscs_consts, get_proxy_info_from_endpoint, get_logger, check_account_isvalid
02-11-2021 11:12:27.014 -0800 ERROR PersistentScript - From {"E:\Program Files\Splunk\bin\Python3.exe" "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunk_ta_mscs_rh_azureaccount.py" persistent}: File "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunktamscs\mscs_util.py", line 13, in <module>
02-11-2021 11:12:27.014 -0800 ERROR PersistentScript - From {"E:\Program Files\Splunk\bin\Python3.exe" "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunk_ta_mscs_rh_azureaccount.py" persistent}: from splunk.appserver.mrsparkle.lib.util import make_splunkhome_path
02-11-2021 11:12:27.014 -0800 ERROR PersistentScript - From {"E:\Program Files\Splunk\bin\Python3.exe" "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunk_ta_mscs_rh_azureaccount.py" persistent}: File "E:\Program Files\Splunk\Python-3.7\lib\site-packages\splunk\appserver\mrsparkle\lib\util.py", line 30, in <module>
02-11-2021 11:12:27.014 -0800 ERROR PersistentScript - From {"E:\Program Files\Splunk\bin\Python3.exe" "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunk_ta_mscs_rh_azureaccount.py" persistent}: import splunk.search.Parser
02-11-2021 11:12:27.014 -0800 ERROR PersistentScript - From {"E:\Program Files\Splunk\bin\Python3.exe" "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunk_ta_mscs_rh_azureaccount.py" persistent}: File "E:\Program Files\Splunk\Python-3.7\lib\site-packages\splunk\search\Parser.py", line 7, in <module>
02-11-2021 11:12:27.014 -0800 ERROR PersistentScript - From {"E:\Program Files\Splunk\bin\Python3.exe" "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunk_ta_mscs_rh_azureaccount.py" persistent}: import httplib2
02-11-2021 11:12:27.014 -0800 ERROR PersistentScript - From {"E:\Program Files\Splunk\bin\Python3.exe" "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunk_ta_mscs_rh_azureaccount.py" persistent}: File "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunktamscs\httplib2\__init__.py", line 28, in <module>
02-11-2021 11:12:27.014 -0800 ERROR PersistentScript - From {"E:\Program Files\Splunk\bin\Python3.exe" "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunk_ta_mscs_rh_azureaccount.py" persistent}: import email.FeedParser
02-11-2021 11:12:27.014 -0800 ERROR PersistentScript - From {"E:\Program Files\Splunk\bin\Python3.exe" "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunk_ta_mscs_rh_azureaccount.py" persistent}: ModuleNotFoundError: No module named 'email.FeedParser'
02-11-2021 11:12:27.014 -0800 WARN PersistentScript - Process {"E:\Program Files\Splunk\bin\Python3.exe" "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunk_ta_mscs_rh_azureaccount.py" persistent}: exited with code 1
02-11-2021 11:12:27.051 -0800 ERROR AdminManagerExternal - Received malformed XML from external handler:
02-11-2021 11:12:27.051 -0800 ERROR AdminManagerExternal - Unable to xml-parse the following data: %s
02-11-2021 11:12:29.073 -0800 WARN PersistentScript - Process {"E:\Program Files\Splunk\bin\Python3.exe" "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunk_ta_mscs_rh_azureaccount.py" persistent}: exited with code 1
02-11-2021 11:12:29.574 -0800 ERROR PersistentScript - From {"E:\Program Files\Splunk\bin\Python3.exe" "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunk_ta_mscs_rh_azureaccount.py" persistent}: Traceback (most recent call last):
02-11-2021 11:12:29.574 -0800 ERROR PersistentScript - From {"E:\Program Files\Splunk\bin\Python3.exe" "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunk_ta_mscs_rh_azureaccount.py" persistent}: File "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunk_ta_mscs_rh_azureaccount.py", line 19, in <module>
02-11-2021 11:12:29.574 -0800 ERROR AdminManagerExternal - Received malformed XML from external handler:
02-11-2021 11:12:29.574 -0800 ERROR AdminManagerExternal - Unable to xml-parse the following data: %s
02-11-2021 11:12:29.574 -0800 ERROR PersistentScript - From {"E:\Program Files\Splunk\bin\Python3.exe" "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunk_ta_mscs_rh_azureaccount.py" persistent}: from mscs_util import mscs_consts, get_proxy_info_from_endpoint, get_logger, check_account_isvalid
02-11-2021 11:12:29.574 -0800 ERROR PersistentScript - From {"E:\Program Files\Splunk\bin\Python3.exe" "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunk_ta_mscs_rh_azureaccount.py" persistent}: File "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunktamscs\mscs_util.py", line 13, in <module>
02-11-2021 11:12:29.574 -0800 ERROR PersistentScript - From {"E:\Program Files\Splunk\bin\Python3.exe" "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunk_ta_mscs_rh_azureaccount.py" persistent}: from splunk.appserver.mrsparkle.lib.util import make_splunkhome_path
02-11-2021 11:12:29.574 -0800 ERROR PersistentScript - From {"E:\Program Files\Splunk\bin\Python3.exe" "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunk_ta_mscs_rh_azureaccount.py" persistent}: File "E:\Program Files\Splunk\Python-3.7\lib\site-packages\splunk\appserver\mrsparkle\lib\util.py", line 30, in <module>
02-11-2021 11:12:29.574 -0800 ERROR PersistentScript - From {"E:\Program Files\Splunk\bin\Python3.exe" "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunk_ta_mscs_rh_azureaccount.py" persistent}: import splunk.search.Parser
02-11-2021 11:12:29.574 -0800 ERROR PersistentScript - From {"E:\Program Files\Splunk\bin\Python3.exe" "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunk_ta_mscs_rh_azureaccount.py" persistent}: File "E:\Program Files\Splunk\Python-3.7\lib\site-packages\splunk\search\Parser.py", line 7, in <module>
02-11-2021 11:12:29.574 -0800 ERROR PersistentScript - From {"E:\Program Files\Splunk\bin\Python3.exe" "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunk_ta_mscs_rh_azureaccount.py" persistent}: import httplib2
02-11-2021 11:12:29.574 -0800 ERROR PersistentScript - From {"E:\Program Files\Splunk\bin\Python3.exe" "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunk_ta_mscs_rh_azureaccount.py" persistent}: File "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunktamscs\httplib2\__init__.py", line 28, in <module>
02-11-2021 11:12:29.574 -0800 ERROR PersistentScript - From {"E:\Program Files\Splunk\bin\Python3.exe" "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunk_ta_mscs_rh_azureaccount.py" persistent}: import email.FeedParser
02-11-2021 11:12:29.574 -0800 ERROR PersistentScript - From {"E:\Program Files\Splunk\bin\Python3.exe" "E:\Program Files\Splunk\etc\apps\Splunk_TA_microsoft-cloudservices\bin\splunk_ta_mscs_rh_azureaccount.py" persistent}: ModuleNotFoundError: No module named 'email.FeedParser'
02-11-2021 11:12:30.629 -0800 INFO ClientSessionsManager - Adding client: ip=10. uts=windows-x64 id=53CDFDC2-FA78-4D51-B339-B52DFC28F996 name=53CDFDC2-FA78-4D51-B339-B52DFC28F996
I'm trying to search across two indexes, correlating on a field value to return certain fields. For example, index a has a field named src_ip and index b has a field named src. The values are the same, but the field names are different. I want to use these values to correlate the data, but I also want to return fields that aren't in index a and are only located in index b.

Here's my current query:

index=a categories="media"
| where bytes_out > bytes_in
| fields _time, cs_user, src_ip, cs_auth_group, cs_host, cs_method, status, bytes_in, bytes_out, cs_User_Agent
| eval src=src_ip
| join src [ search index=b | fields log_subtype, cat]
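The core of the question is a keyed merge: normalize a.src_ip to b's key name, then attach b's extra fields. A minimal Python sketch of that logic (the function name merge_by_src and the sample records are made up for illustration; in SPL, a stats- or lookup-based approach is often preferred over join for this):

```python
def merge_by_src(index_a, index_b):
    """Attach index b's fields to index a's rows where
    a['src_ip'] == b['src'] (a dict lookup standing in for the join)."""
    b_by_src = {row["src"]: row for row in index_b}
    merged = []
    for row in index_a:
        extra = b_by_src.get(row["src_ip"], {})
        merged.append({**row,
                       "log_subtype": extra.get("log_subtype"),
                       "cat": extra.get("cat")})
    return merged

a = [{"src_ip": "10.0.0.1", "bytes_out": 900, "bytes_in": 100}]
b = [{"src": "10.0.0.1", "log_subtype": "url", "cat": "media"}]
rows = merge_by_src(a, b)
```

Rows from index a with no match in index b keep their own fields and get None for b's columns, which mirrors a left join.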
Hi Everyone,

I have one requirement. I have created a Refresh button, and clicking it shows me the latest data. But when I select a filter and then click the Refresh button, the filters are not taken into account. For example, I have a user textbox: when I put a name in it and click Refresh, it reverts to the default values. Can someone guide me on this? Below is my code:

<label>Nifi Process Dashboard</label>
<fieldset submitButton="true" autoRun="true">
<html>
<a href="nifi_process_dashboard_clone" class="btn btn-primary">Refresh</a>
</html>
<input type="time" token="field1" searchWhenChanged="true">
<label>Date/Time</label>
<default>
<earliest>-15m</earliest>
<latest>now</latest>
</default>
</input>
<input type="text" token="process_tok1">
<label>Processor Id</label>
<default>*</default>
</input>
<input type="text" token="ckey" searchWhenChanged="true">
<label>Parent Chain</label>
<default></default>
<prefix>parent_chain="*</prefix>
<suffix>*"</suffix>
<initialValue></initialValue>
</input>
<input type="text" token="usr">
<label>User</label>
<default>*</default>
</input>
</fieldset>

Thanks in advance.
I have a requirement to write a custom curl command, stream the output as success/fail, and sendemail if the output is success. I have a Python script that achieves this and is pretty simple:

import os
result = os.popen("curl -ILk https://user-prdserver.domain.com/wrapper.json --resolve user-prdserver@domain.com:$port$:$vip$ -H 'Host:$host$' -v").read()
print(result)

I also have a pretty good reference on how to set it up. But I am not sure how to convert this script to meet Splunk's custom search command requirements, so that it is compatible with SPL and can stream the output. When I add it as-is and run it from SPL as ...my search | curlit, I get:

Could not locate the time (_time) field on some results returned from the external search command 'curlit'.
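The warning says that some results emitted by the command lack a _time field. A minimal sketch of the result shape a custom command should emit, in plain Python (run_check and the status logic are hypothetical; a real command would subclass the Splunk Python SDK's search command classes and perform the actual HTTP call):

```python
import time

def run_check(url):
    """Hypothetical wrapper: run a check and shape one result for Splunk.

    The point is the field shape: each emitted result should be a flat
    mapping that includes _time, which is what the warning in the
    question complains about.
    """
    # Placeholder for the curl/HTTP call; a real version would use
    # subprocess or urllib to hit the endpoint and inspect the response.
    status = "success"
    return {
        "_time": time.time(),   # without this, Splunk warns about _time
        "url": url,
        "result": status,
    }

record = run_check("https://user-prdserver.domain.com/wrapper.json")
```

Stamping each record with the current epoch time at emission is the simplest choice; if the checked endpoint returns its own timestamp, that value could be used instead.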
Hello!

I am sorry if this issue has already been addressed. Several topics talk about it, but I haven't been able to adapt them to my situation, and I am new to Splunk. I have data that always comes in the same form:

WorkInfo="Job:Initialize job|Result:succeeded|TaskName:|TaskVersion:|IssuesMessages: <br>
Job:Checkout to s|Result:succeeded|TaskName:|TaskVersion:|IssuesMessages: <br>
Job:VisualStudioTestPlatformInstaller|Result:succeeded|TaskName:VisualStudioTestPlatformInstaller|TaskVersion:1.151.3|IssuesMessages: <br>
Job:TestComplete adapter install|Result:succeeded|TaskName:InstallTestCompleteAdapter|TaskVersion:1.73.382|IssuesMessages: <br>
Job:Tests Run : Campagne Globale|Result:skipped|TaskName:VSTest|TaskVersion:2.170.1|IssuesMessages: <br>
Job:Post-job: Checkout Talent to s|Result:succeeded|TaskName:|TaskVersion:|IssuesMessages: <br>
Job:Finalize Job|Result:succeeded|TaskName:|TaskVersion:|IssuesMessages: <br>
Job:Tests Run 1 Tests|Result:failed|TaskName:VSTest|TaskVersion:2.170.1|IssuesMessages:[error] Test Run Failed. \\ [warning] Vstest failed with error. Check logs for failures. There might be failed tests. \\ failed with exit code 1 \\ [error] Vstest failed with error. Check logs for failures. There might be failed tests. <br>
Job:QA TestsUI|Result:failed|TaskName:|TaskVersion:|IssuesMessages: <br>
Job:Report build status|Result:succeeded|TaskName:|TaskVersion:|IssuesMessages: <br>
Job:QA TestsUI_Deploy tc2|Result:failed|TaskName:|TaskVersion:|IssuesMessages:"

So we have several blocks separated by a <br>. In each block I have: Job, Result, TaskName, TaskVersion, IssuesMessages, and each piece of information is separated by a pipe |. I would like to display them as a table with one row per block and one column per field.

I'm not comfortable with regular expressions, and all my attempts with split and makemv delim have been unsuccessful. Thank you for your help!
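The parsing itself is just two nested splits plus a split on the first colon. A small Python sketch of that logic (parse_workinfo is a made-up name; in SPL the same idea maps to splitting on <br> and then extracting the Key:Value pairs):

```python
def parse_workinfo(raw):
    """Split a WorkInfo value into one dict per job block.

    Blocks are separated by <br>; fields within a block are separated
    by | and use Key:Value syntax. We split on the first colon only,
    since job names and messages may themselves contain colons.
    """
    rows = []
    for block in raw.split("<br>"):
        block = block.strip()
        if not block:
            continue
        row = {}
        for field in block.split("|"):
            key, _, value = field.partition(":")
            row[key.strip()] = value.strip()
        rows.append(row)
    return rows

sample = ("Job:Initialize job|Result:succeeded|TaskName:|TaskVersion:|IssuesMessages: <br> "
          "Job:Tests Run 1 Tests|Result:failed|TaskName:VSTest|TaskVersion:2.170.1|"
          "IssuesMessages:[error] Test Run Failed.")
rows = parse_workinfo(sample)
```

Each dict in rows then maps directly onto one row of the desired table.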
I'm running a query to label memory thresholds for our app clusters. I would like to create a field called "eff_mem_threshold" based on the number of blades and the app name. But for the life of me I can't figure out why this case statement isn't working: it only returns the eff_mem_threshold value of the first pair for each app, regardless of the blade count. I've added an example below the case statement.

Case statement query:

| eval eff_mem_threshold=case(
    APP_NAME="EXCH_AD" and Blades<=5, 40,
    APP_NAME="EXCH_AD" and Blades>=17, 46,
    APP_NAME="EXCH_AD" and Blades>=6 and Blades<=16, 44,
    APP_NAME="VCO" and Blades<=5, 56,
    APP_NAME="VCO" and Blades>=17, 64,
    APP_NAME="VCO" and Blades>=6 and Blades<=16, 61,
    APP_NAME="SQL" and Blades<=5, 68,
    APP_NAME="SQL" and Blades>=17, 78,
    APP_NAME="SQL" and Blades>=6 and Blades<=16, 74)

What I see:

APP_NAME  Blades  eff_mem_threshold
EXCH_AD   15      40
EXCH_AD   4       40
SQL       17      68
SQL       9       68
VCO       17      56
VCO       4       56

What I'd want to see:

APP_NAME  Blades  eff_mem_threshold
EXCH_AD   15      44
EXCH_AD   4       40
SQL       17      78
SQL       9       74
VCO       17      64
VCO       4       56
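SPL's case() returns the value of the first condition that evaluates true, so the symptom (always getting the first pair's value) suggests the Blades comparisons themselves are misbehaving, which commonly happens when Blades is a string rather than a number (wrapping with tonumber(Blades) in SPL is the usual fix). A Python sketch of the intended first-match-wins logic, assuming numeric blade counts (eff_mem_threshold is a made-up helper name):

```python
def eff_mem_threshold(app_name, blades):
    """First-match-wins lookup mirroring SPL case().

    `blades` must be numeric; in SPL, a Blades field extracted as a
    string can make comparisons like Blades<=5 misbehave, so the SPL
    equivalent may need tonumber(Blades).
    """
    rules = [
        ("EXCH_AD", lambda b: b <= 5, 40),
        ("EXCH_AD", lambda b: b >= 17, 46),
        ("EXCH_AD", lambda b: 6 <= b <= 16, 44),
        ("VCO", lambda b: b <= 5, 56),
        ("VCO", lambda b: b >= 17, 64),
        ("VCO", lambda b: 6 <= b <= 16, 61),
        ("SQL", lambda b: b <= 5, 68),
        ("SQL", lambda b: b >= 17, 78),
        ("SQL", lambda b: 6 <= b <= 16, 74),
    ]
    # Evaluate rules in order and return the first matching value,
    # exactly as SPL case() does.
    for app, cond, value in rules:
        if app == app_name and cond(blades):
            return value
    return None
```

With numeric comparisons, the rule order in the question already produces the desired thresholds.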
I'm trying to configure my forwarder on a Windows server to send the Web Application Proxy logs. I'm using this format, but it doesn't seem to be sending. I also added AD FS and that worked. Syntax is as follows:

[WinEventLog://AD FS/Admin]
disabled = 0

[WinEventLog://Web Application Proxy/Admin]
Disabled = 0

I couldn't find a documented list of all the different Windows logs. Any help would be appreciated.
I have an index for which I want to retain 45 days of events. I have multiple values set under indexes.conf for it:

frozenTimePeriodInSecs = 3888000
maxDataSize = auto_high_volume
maxHotBuckets = 10
maxWarmDBCount = 15
maxTotalDataSizeMB = 512000

Although the desired retention is 45 days, the index is only able to retain 21 days of data. Please note that I do understand the meaning of all these parameters and how they may affect overall retention. But I am looking for some method or search query that can show or evaluate which of the parameters above is taking precedence (based on ingested volume, bucket count, etc.) for the index and rolling buckets to frozen, thereby letting only 21 days of data be retained.
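Buckets freeze when either the time limit or the total size limit is hit, whichever comes first. A back-of-the-envelope sketch of that comparison (the 24 GB/day ingest figure is purely hypothetical; the real number would come from license usage or dbinspect-style searches):

```python
def effective_retention_days(frozen_secs, max_total_mb, daily_ingest_mb):
    """Rough estimate of which limit freezes data first.

    time_days: retention allowed by frozenTimePeriodInSecs.
    size_days: retention allowed by maxTotalDataSizeMB at a given
    daily ingest rate. The smaller of the two wins.
    """
    time_days = frozen_secs / 86400
    size_days = max_total_mb / daily_ingest_mb
    return min(time_days, size_days), ("time" if time_days <= size_days else "size")

# Settings from the question, with a hypothetical ~24 GB/day ingest:
days, limit = effective_retention_days(3888000, 512000, 24000)
```

With these numbers the time limit allows 45 days (3888000 / 86400) but the size cap allows only about 21 days, which would match the observed behavior if the index really ingests at that rate.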
I'm setting up an alert that will run every business day at 9 AM and trigger only if the sum of a field is 0 for two consecutive business days. To do that, I want to set the time range for my search to return events from the last two business days. For example, if it is Monday, it should return data for Thursday and Friday; if it is Tuesday, it should return data for Friday and Monday. I tried to use "earliest" and "latest" at the beginning of my search, but I can't get it to work. Any help would be welcome!
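The calendar logic the alert needs is "the two most recent weekdays strictly before today". A Python sketch of that rule (in Splunk itself this would have to be expressed with relative time modifiers or computed inside the search; the function name is made up, and holidays are not handled):

```python
from datetime import date, timedelta

def last_two_business_days(today):
    """Return the two most recent business days strictly before `today`.

    Monday -> [Thursday, Friday]; Tuesday -> [Friday, Monday]; etc.
    Weekends are skipped; public holidays are not handled.
    """
    days = []
    d = today
    while len(days) < 2:
        d -= timedelta(days=1)
        if d.weekday() < 5:      # 0-4 = Mon-Fri
            days.append(d)
    return sorted(days)

# Monday 2021-02-15: the previous business days are Thu 11th and Fri 12th.
result = last_two_business_days(date(2021, 2, 15))
```

The earliest/latest boundaries for the search would then be midnight at the start of the first returned day and midnight after the second.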
Hello, I would like some help. I am trying to combine 2 events from my index and 2 events coming from a lookup file into 2 table rows, like this:

index=blala ..
| table env host start end duration week yymm
| append [ | inputlookup mylookup.csv
    | eval st_time=strptime(startdate, "%Y-%m-%d")
    | eval en_time=strptime(enddate, "%Y-%m-%d")
    | addinfo
    | where info_min_time>= st_time AND info_max_time<=en_time ]
| table env host start end duration week yymm
| eval env = "DEV"
| table env host start end duration week yymm

This is the rough output:

env  host   start                end                  duration  week     yymm
DEV  host1  02/06/2021:10:29:52  02/06/2021:11:20:16  50        2021-05
DEV                                                                      2105
DEV  host2  02/06/2021:10:29:33  02/06/2021:11:07:42  38        2021-05
DEV                                                                      2105

And this is what I am trying to accomplish:

env  host   start                end                  duration  week     yymm
DEV  host1  02/06/2021:10:29:52  02/06/2021:11:20:16  50        2021-05  2105
DEV  host2  02/06/2021:10:29:33  02/06/2021:11:07:42  38        2021-05  2105

I tried several commands, but I am unable to do so. Thank you in advance. Regards, Harry
Hi, when using the query window size parameter in the input to retrieve Azure AD sign-ins, the backoff time is not applied. For example, if the query limit is 10 minutes and the checkpoint (the last event retrieved) is from now - 5 minutes, then the query sent to the Graph endpoint covers the window between now() - 5 minutes and now() + 5 minutes. Shouldn't the backoff time also apply when using a query limit? Thanks.
Hi Team, we have a service in Splunk that calls 3 different APIs, does some business logic, and responds with a result code (P, W, F). My events look somewhat like the ones below; interaction-id is the common field.

event1: myservice transaction begins
event2: myservice calls first-api
event3: myservice call to first-api is successful
event4: myservice calls second-api
event5: myservice calls to second-api is success
event6: myservice calls third-api
event7: myservice calls to third-api is success
event8: myservice is respond with result code 'W'

Now I need a table with these columns:

_time     interaction-id     is first-api successful?  is second-api successful?  is third-api successful?  Final Code
sometime  someinteractionId  Yes                       yes                        yes                       W
          " "                No                        yes                        yes                       X

Please help me with the query.
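Structurally this is a pivot: group the raw events by interaction-id, flag each API whose success message appears, and carry the final code. A Python sketch of that grouping (summarize, the message-matching substrings, and the sample events are illustrative only; in SPL the same shape typically comes from stats with conditional eval expressions grouped by interaction-id):

```python
import re

def summarize(events):
    """Pivot (interaction_id, message) pairs into one row per id.

    An API is flagged "Yes" once a success-style message for it is
    seen; the final code is pulled from the "result code 'X'" message.
    """
    rows = {}
    for iid, msg in events:
        row = rows.setdefault(
            iid, {"first": "No", "second": "No", "third": "No", "code": None})
        for api in ("first", "second", "third"):
            if f"{api}-api is success" in msg:   # matches "success"/"successful"
                row[api] = "Yes"
        m = re.search(r"result code '(\w)'", msg)
        if m:
            row["code"] = m.group(1)
    return rows

events = [
    ("id-1", "myservice call to first-api is successful"),
    ("id-1", "myservice calls to second-api is success"),
    ("id-1", "myservice calls to third-api is success"),
    ("id-1", "myservice is respond with result code 'W'"),
]
table = summarize(events)
```

Each value in table is then one output row; _time would come from the earliest or latest event in the group.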
Hi! I'm trying to blacklist events with code 4672 and with SubjectUserSid DOMAIN\SRV-XXX-AAA-99$. I've tried this line:

blacklist2 = $XmlRegex="<EventID>4672<\/EventID>.*<Data Name='SubjectUserSid'>DOMAIN\\SRV\-XXX\-AAA\-99\$"

but it isn't working.

Example:

<Event xmlns='http://schemas.microsoft.com/win/2004/08/events/event'><System><Provider Name='Microsoft-Windows-Security-Auditing' Guid='{54849625-5478-4994-a5ba-3e3b0328c30d}'/><EventID>4672</EventID><Version>0</Version><Level>0</Level><Task>12548</Task><Opcode>0</Opcode><Keywords>0x8020000000000000</Keywords><TimeCreated SystemTime='2021-02-11T15:18:48.860247900Z'/><EventRecordID>350156866</EventRecordID><Correlation ActivityID='{af83069e-fb2f-000b-110a-83af2ffbd601}'/><Execution ProcessID='188' ThreadID='41464'/><Channel>Security</Channel><Computer>srv-xxx-aaa--99.DOMAIN.LOCAL</Computer><Security/></System><EventData><Data Name='SubjectUserSid'>DOMAIN\SRV-XXX-AAA-99$</Data><Data Name='SubjectUserName'>SRV-XXX-AAA-99$</Data><Data Name='SubjectDomainName'>DOMAIN</Data><Data Name='SubjectLogonId'>0x88feea93</Data><Data Name='PrivilegeList'>SeSecurityPrivilege SeBackupPrivilege SeRestorePrivilege SeTakeOwnershipPrivilege SeDebugPrivilege SeSystemEnvironmentPrivilege SeLoadDriverPrivilege SeImpersonatePrivilege SeDelegateSessionUserImpersonatePrivilege</Data></EventData></Event>

I've tested this on regex101 and it matches, but in Splunk it isn't working. Any suggestion will be appreciated.
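The pattern itself can be sanity-checked outside Splunk; the sketch below runs it against a trimmed copy of the rendered event with Python's re. If it matches here (as it does on regex101), the problem is likely on the Splunk side, e.g. whether the input is configured with renderXml so the blacklist sees XML at all, rather than in the regex:

```python
import re

# The blacklist pattern from the question, as a raw string.
pattern = r"<EventID>4672<\/EventID>.*<Data Name='SubjectUserSid'>DOMAIN\\SRV\-XXX\-AAA\-99\$"

# Trimmed copy of the rendered event (Splunk renders XML events on one line,
# so the default non-newline-matching . in .* is not an issue here).
event = ("<Event><System><EventID>4672</EventID><Channel>Security</Channel></System>"
         "<EventData><Data Name='SubjectUserSid'>DOMAIN\\SRV-XXX-AAA-99$</Data>"
         "</EventData></Event>")

match = re.search(pattern, event)
```

A match here only proves the regex is sound; it does not exercise Splunk's own blacklist parsing of the `$XmlRegex=` stanza, which has its own quoting rules.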
Hello All, may I request your help with the query below?

I have two fields, "customertripid" and "success". Customertripid holds a unique id for a transaction; the transaction offers re-attempts on the same customertripid, so one transaction equals one customertripid.

Problem: Success=False
I want to capture all the events with a unique customertripid where success=false (including those which eventually passed on reattempts). I want to count them and use that count to compute a percentage.

Success=Pass
That is giving correct counts, basically picking up the last attempt on each customertripid, either 'passed' or 'failed'.

See the count here of fails when 'success=false' and when 'success=true'.

Regards, Nishant
Hi guys, I've been trying to integrate Splunk with LDAP, but I'm encountering this error:

What could this be?

Thanks in advance.
We are trying to set up the Okta Identity Cloud Add-on for Splunk as described at https://splunkbase.splunk.com/app/3682/#/details. We can see the following error in the internal log:

2021-02-11 14:11:58,524 DEBUG pid=15786 tid=MainThread file=connectionpool.py:_make_request:437 |.com:443 "GET /api/v1/users?filter=lastUpdated+gt+%221970-01-01T00%3A00%3A00.000Z%22+and+lastUpdated+lt+%222021-02-11T14%3A11%3A53.270Z%22&limit=1000 HTTP/1.1" 401 None
2021-02-11 14:11:58,525 DEBUG pid=15786 tid=MainThread file=base_modinput.py:log_debug:288 | metric=user | message=_okta_client returned response to our request rid=YCU7LobAly6BohSnrIgL3gAADBs
2021-02-11 14:11:58,526 ERROR pid=15786 tid=MainThread file=base_modinput.py:log_error:309 | Get error when collecting events.
Traceback (most recent call last):
  File "/TA-Okta_Identity_Cloud_for_Splunk/bin/ta_okta_identity_cloud_for_splunk/aob_py2/modinput_wrapper/base_modinput.py", line 128, in stream_events
    self.collect_events(ew)
  File "TA-Okta_Identity_Cloud_for_Splunk/bin/okta_identity_cloud.py", line 68, in collect_events
    input_module.collect_events(self, ew)
  File "TA-Okta_Identity_Cloud_for_Splunk/bin/input_module_okta_identity_cloud.py", line 829, in collect_events
    users = _collectUsers(helper)
  File "/TA-Okta_Identity_Cloud_for_Splunk/bin/input_module_okta_identity_cloud.py", line 448, in _collectUsers
    users = _okta_caller(helper, resource, params, method, opt_limit)
  File "/TA-Okta_Identity_Cloud_for_Splunk/bin/input_module_okta_identity_cloud.py", line 249, in _okta_caller
    response = _okta_client(helper, url, params, method)
  File "/TA-Okta_Identity_Cloud_for_Splunk/bin/input_module_okta_identity_cloud.py", line 411, in _okta_client
    response.raise_for_status()
  File "TA-Okta_Identity_Cloud_for_Splunk/bin/ta_okta_identity_cloud_for_splunk/aob_py2/requests/models.py", line 940, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
HTTPError: 401 Client Error: Unauthorized for url: ..com/api/v1/users?filter=lastUpdated+gt+%221970-01-01T00%3A00%3A00.000Z%22+and+lastUpdated+lt+%222021-02-11T14%3A11%3A53.270Z%22&limit=1000

As the client said, the API token was made by a super admin user with all the permissions. Please advise.
I used the query index=testindex _raw=* and it successfully returned 200+ results. However, when I added stats, as in index=testindex _raw=* | stats count by host, no results were returned. Is there anything missing in how I use the stats command? Below is the Splunk search result of the first query (without stats):

21/02/11 21:23:45.000   2021_2_10-15_0_0_,1274423072.0
Major = 1274423072.0
Time = 2021_2_10-15_0_0_
host = splunktest
index = testindex
source = C:\git\splunktest\first.txt
sourcetype = csv

21/02/11 21:23:45.000   2021_2_10-14_59_0_,1274423072.0
Major = 1274423072.0
Time = 2021_2_10-14_59_0_
host = splunktest
index = testindex
source = C:\git\splunktest\first.txt
sourcetype = csv

21/02/11 21:23:45.000   2021_2_10-14_58_0_,1274423072.0
Major = 1274423072.0
Time = 2021_2_10-14_58_0_
host = splunktest
index = testindex
source = C:\git\splunktest\first.txt
sourcetype = csv
Hello,

So I have a report in a panel that has 65 columns. The first column contains information that needs to stay frozen when scrolling from left to right. I have tried the following approach:

<panel id="Scoring_Server_Totals_Day">
  <html depends="$alwaysHideCSS$">
    <style>
      #Scoring_Server_Totals_Day {
        width:85% !important;
        font-size: 85%;
      }
      #Table_Scoring_Server_Totals_Day div [data-view="views/shared/results_table/ResultsTableMaster"] td:nth-child(1) {
        position: fixed;
      }
      #Table_Scoring_Server_Totals_Day div [data-view="views/shared/results_table/ResultsTableMaster"] th:nth-child(1) {
        position: fixed;
      }
    </style>
  </html>
  <table id="Table_Scoring_Server_Totals_Day">
    <title>Txns scored per server &amp; instance today</title>
    <search ref="Scoring Server Totals"></search>
    <option name="count">50</option>
    <option name="drilldown">none</option>
  </table>
</panel>

However, the output is defective: it overlaps the Serv01:::1 details when it should have kept the "Scored" column separate. Although left-to-right scrolling works, the frozen cell gets stuck when scrolling up and down. I have tried using position: sticky instead of position: fixed, but it is ignored and fails to remain sticky when scrolling left to right. Any idea what's wrong in the code?

Thanks, Gabriel
Hi, has anyone worked with Assets and Identities in Splunk Enterprise Security? I already have the "Splunk Supporting Add-on for Active Directory" app installed. From the app I run connection tests and they are successful, but when I enter Splunk ES I do not see any Assets and Identities information. What should I check?
Hi All,

I am trying to generate output using the stats command, where I want to display a table like this:

Hostname  FTName   Total  Error Code  Error_Count  Error_rate%
ABC       some_ft  1000   8945        300          30.0

I used the query below, which gives me output without the Error Code column. If I add Error_code to the stats by clause, it gives the total count for that error code, but I want Total to be the total requests that the FT received, and out of those, error code 8945 accounted for 300 errors. How can I achieve this?

index=xyz sourcetype=app_team log_message.FT=some_ft
| rename log_message.CODE as FTCODE
| stats count as Total_Requests, count(eval(FTCODE=="8945")) as Errors by server_host, log_message.FT
| eval Error_rate=round(Errors/Total_Requests*100,2)."%"
| rename log_message.FT as FT

Current output:

Hostname  FT       Total_Requests  Errors  Error_rate
ABC       some_ft  259             14      5.41
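The key idea is that the error code must stay out of the grouping key: group only by host and FT, count everything for the total, and count conditionally for the errors. A Python sketch of that aggregation (error_rates and the sample events are made up; this mirrors stats count plus count(eval(...)) in SPL):

```python
def error_rates(events):
    """Group (host, ft, code) events by (host, ft).

    Total counts every request; Errors counts only code 8945, so the
    code of interest never splits the group the way adding it to the
    by-clause would.
    """
    out = {}
    for host, ft, code in events:
        key = (host, ft)
        total, errors = out.get(key, (0, 0))
        out[key] = (total + 1, errors + (1 if code == "8945" else 0))
    return {k: {"Total": t, "Errors": e, "Error_rate": round(e / t * 100, 2)}
            for k, (t, e) in out.items()}

# 300 error responses and 700 successes for one host/FT pair:
events = [("ABC", "some_ft", "8945")] * 300 + [("ABC", "some_ft", "0")] * 700
stats = error_rates(events)
```

Since 8945 is a constant here, it can simply be echoed as a literal "Error Code" column in the final table rather than aggregated.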