All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


My organization is struggling to incorporate data science into existing security processes. I'm having a hard time finding resources that help me assess the maturity level of data science in my environment and mature it further, with possible use cases and strategies to focus on. Does anyone know of any resources out there to help my organization head in the right direction?
Hi, our system holds XML logs and, the way they are structured, some values are held inside a common set of name/value attribute pairs which repeats a number of times within the XML. The index name is 'applogs'. Example XML:

<RECORD>
  <ORDER>
    <OrderDate>21-11-2022</OrderDate>
    <OrderRef>12345678</OrderRef>
    <OrderAttributes>
      <OrderAttributeName>Attribute1</OrderAttributeName>
      <OrderAttributeValue>Value1</OrderAttributeValue>
      <OrderAttributeName>Attribute2</OrderAttributeName>
      <OrderAttributeValue>Value2</OrderAttributeValue>
      <OrderAttributeName>Attribute3</OrderAttributeName>
      <OrderAttributeValue>Value3</OrderAttributeValue>
    </OrderAttributes>
  </ORDER>
</RECORD>

I want to extract the individual attributes to display in a table, something like this:

OrderDate   OrderRef  Attribute1  Attribute2  Attribute3
21-11-2022  12345678  Value1      Value2      Value3

I have tried spath but am not able to pull out the Attribute1/Value1 pair, as there are multiple instances of the OrderAttributeName and OrderAttributeValue tags, so I have hit the buffers. Any suggestions on how I can make it work?
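To make the goal concrete, here is a sketch of the kind of query I imagine, pairing the repeated tags by position with mvzip; the field paths are my assumption, on the hope that spath returns the repeated tags as multivalue fields:

```spl
index=applogs
| spath path=RECORD.ORDER.OrderDate output=OrderDate
| spath path=RECORD.ORDER.OrderRef output=OrderRef
| spath path=RECORD.ORDER.OrderAttributes.OrderAttributeName output=name
| spath path=RECORD.ORDER.OrderAttributes.OrderAttributeValue output=value
| eval pair=mvzip(name, value, "=")
| mvexpand pair
| rex field=pair "^(?<attr_name>[^=]+)=(?<attr_value>.*)$"
| eval {attr_name}=attr_value
| stats values(Attribute*) as Attribute* by OrderDate OrderRef
```

mvzip joins the two multivalue fields element by element, so this would only work if the name and value tags always appear in matching order.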
I have a dashboard with a dropdown select. From this dropdown, once I select an API such as "/api/apiresponse/search/", the search results show this:

2000-1-1 1:0:0.00 INFO : logType=API_RESPONSE, duration=100, request={"headers":"Accept":"application/json","Content-Type":"application/json"},"method":"POST", "body":{"body"},"parameters":{},"uri":"/api/apiresponse/search/"}, configLabel=, requestId=Thisoneismatching11111, response={"headers":{"statusCode":"OK"}, requestUri=/api/apiresponse/search/, threadContextId=Thisoneismatching22222, message=COMPLETED request /api/apiresponse/search/
source = /apps/logs/api_response.log sourcetype = response_log

This is my search query for the API response:

index=main *_RESPONSE | spath input=request | spath input=response | lookup abc.csv uri OUTPUT opName | search Name="$Nme$" opName="$opeNme$" uri="$apis$"

The downstream response log looks like this:

2000-1-1 1:0:0.00 INFO logType=DOWNSTREAM_RESPONSE, duration=100, request={"headers":{"Accept":"application/json","Content-Type":"application/json"},"method":"POST", "body":{"uri":"https://abcdefg.com/downresponseservice/api/downresponse"}, configLabel=, requestId=Thisoneismatching11111, response={"OK":{"statusCode":"OK"}}, requestUri=https://abcdefg.com/downresponseservice/api/downresponse, threadContextId=Thisoneismatching22222, message=<<< Outbound REST response
source = /apps/logs/downstream_response.log sourcetype = response_log

Is there a way to get the downstream response in the same way? All the API logs and downstream logs have two matching fields, requestId and threadContextId. For the selected API, I need to get both the api_response logs and the downstream logs related to only that API.
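What I am picturing is something that groups both log types on the shared keys, along these lines (just a sketch, reusing the field names from my samples):

```spl
index=main (logType=API_RESPONSE OR logType=DOWNSTREAM_RESPONSE)
| spath input=request
| spath input=response
| stats values(eval(if(logType="API_RESPONSE", requestUri, null()))) as api_uri
        values(eval(if(logType="DOWNSTREAM_RESPONSE", requestUri, null()))) as downstream_uri
        values(duration) as duration
        by requestId threadContextId
| search api_uri="$apis$"
```

The final search would keep only the requestId/threadContextId groups whose API leg matches the API selected in the dropdown.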
Hello, I have a table with a custom Splunk query and a custom click action on a cell. It works fine if I select any time filter other than "Real Time". Using real time, the click doesn't work: the query appears to be recreated every second, and when I click, the action is not triggered. How can I fix that? Thanks
I’m looking to get in touch with the developer of the Splunk Add-on for Salesforce Streaming API to see if the source can be shared or made open source. Does anyone know how I can contact them?
Please help: I created this summary index and it was working, but when I checked the data for the next day, it doesn't show any data.
Hi there, I would like to connect my ESET server to SC4S to send syslog messages. I know that ESET is not listed among the supported known vendors. Is it possible to connect ESET to SC4S anyway?

thanks and regards, pawelF
Delay between index time and search time: there is a delay of 10 hours.

index=test_shift "*10987867*" | eval indextime=strftime(_indextime,"%d/%m/%Y %H:%M:%S") | table _raw _time indextime

_time is 2022-11-15 13:42:31
indextime is 2022-11-15 23:27:33

The environment is clustered, and even if one indexer were down, the delay shouldn't be there. Kindly suggest what the root cause might be so that I can check my environment. Thanks in advance; your answer will be very helpful for me.
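One check I've run to narrow it down measures whether the lag is constant or variable per host (a sketch):

```spl
index=test_shift
| eval lag_seconds = _indextime - _time
| stats count min(lag_seconds) as min_lag max(lag_seconds) as max_lag avg(lag_seconds) as avg_lag by host sourcetype
```

As I understand it, a near-constant lag close to a whole number of hours usually points at a timezone or timestamp-parsing mismatch, while a variable lag points at queueing or forwarding delay.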
I am trying to compare a static column (Baseline) with multiple host columns, and if there is a difference I need to highlight that cell in red.

Component  BASELINE     HOSTA        HOSTB        HOSTC
GPU        20           20           5            7
GPU1       5            7            7            5
FW         2.4.2        2.4.2        2.4.2        2.4.3
IP         1.1.1.1      1.1.1.2      1.1.1.1      1.1.1.1
ID         [234 , 336]  [234 , 336]  [134 , 336]  [234 , 336]

<form theme="dark">
<label>Preos Firmware Summary - Liquid Cooled</label>
<fieldset submitButton="false">
<input type="multiselect" token="tok_host" searchWhenChanged="true">
<label>Host</label>
<valueSuffix>,</valueSuffix>
<fieldForLabel>host</fieldForLabel>
<fieldForValue>host</fieldForValue>
<search>
<query>index=pre Type=Liquid_Cooled | stats count by host | dedup host</query>
<earliest>-90d@d</earliest>
<latest>now</latest>
</search>
<default>*</default>
<delimiter> </delimiter>
<choice value="*">All</choice>
</input>
<input type="multiselect" token="tok_component" searchWhenChanged="true">
<label>Component</label>
<choice value="*">All</choice>
<default>*</default>
<fieldForLabel>Component</fieldForLabel>
<fieldForValue>Component</fieldForValue>
<search>
<query>index=pre Type=Liquid_Cooled host IN ($tok_host$) "IB HCA FW" OR *CPLD* OR BMC OR SBIOS OR *nvme* OR "*GPU* PCISLOT*" OR *NVSW*
| rex field=_raw "log-inventory.sh\[(?&lt;id&gt;[^\]]+)\]\:\s*(?&lt;Component&gt;[^\:]+)\:\s*(?&lt;Hardware_Details&gt;.*)"
| rex field=_raw "log-inventory.sh\[\d*\]\:\s*CPLD\:\s*(?&lt;Hardware&gt;[^.*]+)"
| rex field=_raw "log-inventory.sh\[\d*\]\:\s*BMC\:\s*version\:\s*(?&lt;Hardware1&gt;[^\,]+)"
| rex field=_raw "log-inventory.sh\[\d*\]\:\s*SBIOS\s*version\:\s*(?&lt;Hardware2&gt;[^ ]+)"
| rex field=_raw "log-inventory.sh\[\d*\]\:\s*nvme\d*\:.*FW\:\s*(?&lt;Hardware3&gt;[^ ]+)"
| rex field=_raw "VBIOS\:\s*(?&lt;Hardware4&gt;[^\,]+)"
| rex field=_raw "NVSW(\d\s|\s)FW\:\s*(?&lt;Hardware5&gt;(.*))"
| rex field=_raw "IB\s*HCA\sFW\:\s*(?&lt;Hardware6&gt;(.*))"
| eval output = mvappend(Hardware,Hardware1,Hardware2,Hardware3,Hardware4,Hardware5,Hardware6) |
replace BMC WITH "BMC and AUX" in Component
| search Component IN("*")
| stats latest(output) as output latest(_time) as _time by Component host
| fields - _time
| eval from="search"
| join Component
    [| inputlookup FW_Tracking_Baseline.csv
    | search Component!=*ERoT* Component!=PCIeRetimer* Component!="BMC FW ver"
    | table Component Baseline
    | eval from="lookup"
    | rename Baseline as lookup_output
    | fields lookup_output Component output]
| stats count(eval(lookup_output==output)) AS case BY host Component output lookup_output
| replace 1 WITH "match" IN case
| replace 0 WITH "No match" IN case
| stats values(Component) as Component by host lookup_output case output
| stats count by Component
| dedup Component</query>
<earliest>-90d@d</earliest>
<latest>now</latest>
</search>
<valueSuffix>"</valueSuffix>
<delimiter> ,</delimiter>
<valuePrefix>"</valuePrefix>
</input>
</fieldset>
<row>
<panel>
<table>
<search>
<query>index=preos_inventory sourcetype = preos_inventory Type=Liquid_Cooled host IN ($tok_host$) "IB HCA FW" OR *CPLD* OR BMC OR SBIOS OR *nvme* OR "*GPU* PCISLOT*" OR *NVSW*
| rex field=_raw "log-inventory.sh\[(?&lt;id&gt;[^\]]+)\]\:\s*(?&lt;Component&gt;[^\:]+)\:\s*(?&lt;Hardware_Details&gt;.*)"
| rex field=_raw "log-inventory.sh\[\d*\]\:\s*CPLD\:\s*(?&lt;Hardware&gt;[^.*]+)"
| rex field=_raw "log-inventory.sh\[\d*\]\:\s*BMC\:\s*version\:\s*(?&lt;Hardware1&gt;[^\,]+)"
| rex field=_raw "log-inventory.sh\[\d*\]\:\s*SBIOS\s*version\:\s*(?&lt;Hardware2&gt;[^ ]+)"
| rex field=_raw "log-inventory.sh\[\d*\]\:\s*nvme\d*\:.*FW\:\s*(?&lt;Hardware3&gt;[^ ]+)"
| rex field=_raw "VBIOS\:\s*(?&lt;Hardware4&gt;[^\,]+)"
| rex field=_raw "NVSW\d\s*FW\:\s*(?&lt;Hardware5&gt;(.*))"
| rex field=_raw "IB\s*HCA\sFW\:\s*(?&lt;Hardware6&gt;(.*))"
| eval output = mvappend(Hardware,Hardware1,Hardware2,Hardware3,Hardware4,Hardware5,Hardware6)
| replace BMC WITH "BMC and AUX" in Component
| stats latest(output) as output latest(_time) as _time by Component host
| eval from="search"
|
fields - _time
| chart values(output) by Component host limit=0
| fillnull value="No Data"
| join Component
    [ | inputlookup FW_Tracking_Baseline.csv
    | search Component!=*ERoT* Component!=PCIeRetimer* Component!="BMC FW ver"
    | table Component Baseline
    | eval from="lookup"
    | fields Baseline Component output]
| fields Component Baseline *
| fillnull value="No Data"</query>
<earliest>-90d@d</earliest>
<latest>now</latest>
</search>
<option name="count">50</option>
<option name="drilldown">none</option>
<option name="refresh.display">progressbar</option>
</table>
</panel>
</row>
</form>
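One idea I've been considering for the comparison itself: compute an explicit match/mismatch flag per host column with foreach, which a color format could then key off. A sketch only; it assumes the table ends up with one column per host plus the Baseline column, and that the host columns share a prefix like HOST:

```spl
| foreach HOST* [ eval status_<<FIELD>> = if('<<FIELD>>' == Baseline, "match", "mismatch") ]
```

foreach expands <<FIELD>> once per column matching HOST*, so each host would get its own status field to drive the red highlighting.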
Hi, I'm trying to update app permissions to attach a role to the app, but no luck so far. I tried the .update() method with the correct parameters, but it doesn't look like it handles the access parameters 'read' and 'write'. Is there another method that might work?
Hello there, I have been trying to use splunk check-integrity to check the integrity of some indexes. I get this error:

Integrity check error for bucket with path=..\Splunk\var\lib\splunk\defaultdb\db\b_1668784284_1668613842_9, Reason=Journal has no hashes.

I do know that events indexed before adding enableDataIntegrityControl=true to indexes.conf are not covered, but I thought the buckets created after enabling enableDataIntegrityControl=true would have a hash; however, that is not the case. Am I missing something?

Thank you.
Hello All, when using "stats count by column1, column2, column3, column4" I get the result below.

Existing table:

column1   column2  column3  column4
XXXXXXXX  YYYYY    A        123
XXXXXXXX  YYYYY    B        123
XXXXXXXX  YYYYY    C        123
XXXXXXXX  YYYYY    D        123
XXXXXXXX  YYYYY    E        123

Whereas I need this result:

column1   column2  column3  column4
XXXXXXXX  YYYYY    A        123
                   B
                   C
                   D
                   E

Could somebody please help me with the query? Thanks,
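A sketch of the kind of post-processing I imagine might do this, blanking a value whenever it repeats the previous row (streamstats carries the prior row's values forward):

```spl
| stats count by column1 column2 column3 column4
| fields - count
| streamstats current=f window=1 last(column1) as prev1 last(column2) as prev2 last(column4) as prev4
| eval column1=if(column1==prev1, "", column1)
| eval column2=if(column2==prev2, "", column2)
| eval column4=if(column4==prev4, "", column4)
| fields - prev1 prev2 prev4
```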
  I have a log file with events that indicate activities in a server. I am interested in the Login and Logout activities - I need to create a report of active sessions. I managed to order the events so that I can get Login-Logout events consecutively for each user.    
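What I have so far stops at the ordering step; the pairing I picture would be something like this (a sketch only; the field names activity and user, and the index, are assumptions about my extraction):

```spl
index=server_logs activity IN ("Login", "Logout")
| transaction user startswith=eval(activity=="Login") endswith=eval(activity=="Logout")
| table user _time duration activity
```

transaction would stitch each user's consecutive Login/Logout pair into one event and compute the session duration for the report.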
Hi, splunkers. I am just wondering about this phase; it's not critical for me right now. I am trying to integrate an Apache web server with Splunk ITSI, but I am stuck at the service-creation phase because

| savedsearch DA-ITSI-WEBSERVER-WebServer_Entity_Search

is not giving any results.

Actually, my real question is about the KPI searches. I want to show website 4xx errors on the default services page, but the Website-4xx Errors search includes tag=activity, and I can't find that tag in any add-on. For that reason, my website 4xx error search is not working properly. How do I fix this issue? Also, this tag is not included in the Splunk ITSI content pack.

Thank you.
Hi everyone, I am trying to set an attribute to true for all elements having a certain ID, when 2 defined activities are present for that ID. In my opinion, the corresponding SQL query would be:

Update t set isvalid = true where id in (select id from t group by id having activity = 'a' and activity = 'b')

A result might look like:

id   activity  istrue
001  a         true
001  b         true
001  c         true
002  a         false
002  c         false
002  d         false
003  a         true
003  b         true

Is there an option to execute this in SPL?

Thanks, Lukas
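In SPL, what I imagine is collecting each id's activities and then testing that both 'a' and 'b' are present, roughly like this (a sketch):

```spl
| eventstats values(activity) as acts by id
| eval istrue=if(mvcount(mvfilter(acts=="a" OR acts=="b"))==2, "true", "false")
| fields - acts
```

eventstats attaches the full set of activities to every row of the same id, so each row can be flagged without collapsing the table the way stats would.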
Hi, I have a dashboard with a base search and a number of chain searches. My base search is very long and the chain searches are just different stats commands. However, the dashboard does not render the results unless I also place a stats command in the base search. This is where I am running into trouble, as I need to find a stats command that is generic enough to go before all the unique stats commands for each panel. Example:

Base search: index = ABC .......
Chain search 1: | stats count by XYZ | head 10
Chain search 2: | stats count by MNO | head 10

This renders when I open the query with "Open in Search", but no results are generated for any panel on the dashboard for the same queries. The dashboard panels only render when I add a stats command to the base search, like:

Base search: index = ABC ....... | stats count by GHI

However, this stats in the base search precludes me from adding the individual stats command for each panel. Is there any generic stats command I can add to the base search? Thanks!
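One alternative I've read about, rather than a dummy stats: keep the base search non-transforming but end it with an explicit fields command, so events (with the fields the panels need) are passed down to the chained searches. Something like:

```spl
Base search:    index = ABC ....... | fields _time XYZ MNO
Chain search 1: | stats count by XYZ | head 10
Chain search 2: | stats count by MNO | head 10
```

As I understand it, base searches that return raw events are subject to an event-count limit for post-processing, so trimming to only the needed fields matters.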
Hi. I'm trying to apply a rule that drops some events and, at the same time, keeps only certain events on the indexers. Here we are. props.conf:

[mysourcetype]
TRANSFORMS-filter = drop

transforms.conf:

[drop]
REGEX = drop_event1|drop_event2|drop_eventX
DEST_KEY = queue
FORMAT = nullQueue

This is the standard way of dropping, and it works!

But, at the same time, I can't find a way to make both the drop and the keep transformations work. props.conf:

[mysourcetype]
TRANSFORMS-filter = drop,filter

transforms.conf:

[drop]
REGEX = drop_event1|drop_event2|drop_eventX
DEST_KEY = queue
FORMAT = nullQueue

[filter]
REGEX = get_event1|get_event2|get_eventX
DEST_KEY = queue
FORMAT = indexQueue

I would like to tell Splunk 8: FIRST, drop all events matching the regex "drop_event1|drop_event2|drop_eventX"; SECOND, keep only the events matching the regex "get_event1|get_event2|get_eventX". It does not work! After correctly dropping, Splunk keeps everything (".*") except, as said, "drop_event1|drop_event2|drop_eventX". Any suggestions?
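Reading the routing docs, I wonder if I should instead use the documented "discard everything, then keep" pattern, where the first transform sends all events to the null queue and the second routes the keepers back to the index queue, in this order:

```ini
# props.conf
[mysourcetype]
TRANSFORMS-filter = setnull, setparsing

# transforms.conf
[setnull]
REGEX = .
DEST_KEY = queue
FORMAT = nullQueue

[setparsing]
REGEX = get_event1|get_event2|get_eventX
DEST_KEY = queue
FORMAT = indexQueue
```

Transforms run left to right, and the last one whose REGEX matches an event wins for queue routing, so the keep rule has to come second.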
I am using the outlier visualization in my dashboard to detect outliers during business hours, from 5 A.M. to 7 P.M. When I run the query using other visualizations, like a line chart, the tooltip displays the correct time; but when I use the outlier visualization, the tooltip displays the wrong time, even though the graph values themselves are correct. How do I solve this issue?
What is the cause of, and the solution for, the following error?

ERROR HttpClientRequest - HTTP client error=Connection closed by peer while accessing server=https://aws-ix-s2ioadata-backet-598294183213.s3-ap-northeast-1.amazonaws.com for request=https://aws-ix-s2ioadata-backet-598294183213.s3-ap-northeast-1.amazonaws.com/warmdata/ioaadsecurity/dma/f7/bb/812~2B94CFE8-9DC3-4E45-9ADF-CAF58B0CDDFE/89513704-8894-4CFC-AC58-9BF7D36B3B59_DM_Splunk_SA_CIM_Web/guidSplunk-2B94CFE8-9DC3-4E45-9ADF-CAF58B0CDDFE/metadata_checksum.
Hello there. I tried to set up perfmon inputs to capture state of my windows 10 test box. Aaaaand. It's not working. And I have no idea how I can debug it further. The inputs seem to be defined properly (I don't understand why there are two identical definitions for perfmon://CPU and perfmon://Processor but while testing I tried running with just one perfmon input enabled and the result was the same so it's definitely not the result of overlapping inputs). PS C:\Program Files\SplunkUniversalForwarder\bin> .\splunk.exe btool inputs list perfmon://CPU [perfmon://CPU] counters = % Processor Time; % User Time; % Privileged Time; Interrupts/sec; % DPC Time; % Interrupt Time; DPCs Queued/sec; DPC Rate; % Idle Time; % C1 Time; % C2 Time; % C3 Time; C1 Transitions/sec; C2 Transitions/sec; C3 Transitions/sec disabled = 0 host = dziura index = winmetrics instances = * interval = 300 mode = multikv object = Processor useEnglishOnly = true PS C:\Program Files\SplunkUniversalForwarder\bin> .\splunk.exe btool inputs list perfmon://Process [perfmon://Process] counters = % Processor Time; % User Time; % Privileged Time; Virtual Bytes Peak; Virtual Bytes; Page Faults/sec; Working Set Peak; Working Set; Page File Bytes Peak; Page File Bytes; Private Bytes; Thread Count; Priority Base; Elapsed Time; ID Process; Creating Process ID; Pool Paged Bytes; Pool Nonpaged Bytes; Handle Count; IO Read Operations/sec; IO Write Operations/sec; IO Data Operations/sec; IO Other Operations/sec; IO Read Bytes/sec; IO Write Bytes/sec; IO Data Bytes/sec; IO Other Bytes/sec; Working Set - Private disabled = 0 host = dziura index = winmetrics instances = * interval = 300 mode = multikv object = Process useEnglishOnly = true [perfmon://Processor] counters = % Processor Time; % User Time; % Privileged Time; Interrupts/sec; % DPC Time; % Interrupt Time; DPCs Queued/sec; DPC Rate; % Idle Time; % C1 Time; % C2 Time; % C3 Time; C1 Transitions/sec; C2 Transitions/sec; C3 Transitions/sec disabled = 0 host = 
dziura index = winmetrics instances = * interval = 300 mode = multikv object = Processor useEnglishOnly = true The list inputstatus shows: C:\Program Files\SplunkUniversalForwarder\bin\splunk-perfmon.exe exit status description = exited with code -1 time closed = 2022-11-21T09:27:17+0100 time opened = 2022-11-21T09:27:14+0100 I raised logging level for modularinputs and execprocessor to DEBUG but still it's not helpful: 11-21-2022 09:27:10.491 +0100 DEBUG ModularInputs [6028 MainThread] - Found scheme="perfmon". 11-21-2022 09:27:10.491 +0100 DEBUG ModularInputs [6028 MainThread] - Locating script for scheme="perfmon"... 11-21-2022 09:27:10.491 +0100 DEBUG ModularInputs [6028 MainThread] - No regular file="C:\Program Files\SplunkUniversalForwarder\etc\system\windows_x86_64\bin\perfmon.bat". 11-21-2022 09:27:10.492 +0100 DEBUG ModularInputs [6028 MainThread] - No regular file="C:\Program Files\SplunkUniversalForwarder\etc\system\windows_x86_64\bin\perfmon.cmd". 11-21-2022 09:27:10.492 +0100 DEBUG ModularInputs [6028 MainThread] - No regular file="C:\Program Files\SplunkUniversalForwarder\etc\system\windows_x86_64\bin\perfmon.py". 11-21-2022 09:27:10.492 +0100 DEBUG ModularInputs [6028 MainThread] - No regular file="C:\Program Files\SplunkUniversalForwarder\etc\system\windows_x86_64\bin\perfmon.js". 11-21-2022 09:27:10.492 +0100 DEBUG ModularInputs [6028 MainThread] - No regular file="C:\Program Files\SplunkUniversalForwarder\etc\system\windows_x86_64\bin\perfmon.exe". 11-21-2022 09:27:10.492 +0100 DEBUG ModularInputs [6028 MainThread] - No regular file="C:\Program Files\SplunkUniversalForwarder\etc\system\windows_x86_64\bin\perfmon.bat". 11-21-2022 09:27:10.492 +0100 DEBUG ModularInputs [6028 MainThread] - No regular file="C:\Program Files\SplunkUniversalForwarder\etc\system\windows_x86_64\bin\perfmon.cmd". 
11-21-2022 09:27:10.492 +0100 DEBUG ModularInputs [6028 MainThread] - No regular file="C:\Program Files\SplunkUniversalForwarder\etc\system\windows_x86_64\bin\perfmon.py". 11-21-2022 09:27:10.492 +0100 DEBUG ModularInputs [6028 MainThread] - No regular file="C:\Program Files\SplunkUniversalForwarder\etc\system\windows_x86_64\bin\perfmon.js". 11-21-2022 09:27:10.492 +0100 DEBUG ModularInputs [6028 MainThread] - No regular file="C:\Program Files\SplunkUniversalForwarder\etc\system\windows_x86_64\bin\perfmon.exe". 11-21-2022 09:27:10.493 +0100 DEBUG ModularInputs [6028 MainThread] - No regular file="C:\Program Files\SplunkUniversalForwarder\etc\system\bin\perfmon.bat". 11-21-2022 09:27:10.493 +0100 DEBUG ModularInputs [6028 MainThread] - Found script ""C:\Program Files\SplunkUniversalForwarder\etc\system\bin\perfmon.cmd"" to handle scheme "perfmon". 11-21-2022 09:27:10.614 +0100 DEBUG ModularInputs [6028 MainThread] - Introspecting scheme=perfmon: exited: status=done, exit=0 11-21-2022 09:27:10.614 +0100 DEBUG ModularInputs [6028 MainThread] - XML scheme path "\scheme\script": "script" -> "splunk-perfmon.path" 11-21-2022 09:27:10.614 +0100 DEBUG ModularInputs [6028 MainThread] - XML endpoint path "\scheme\endpoint\id": "id" -> "win-perfmon" 11-21-2022 09:27:10.614 +0100 DEBUG ModularInputs [6028 MainThread] - Setting up values from introspection for scheme "perfmon". 11-21-2022 09:27:10.614 +0100 DEBUG ModularInputs [6028 MainThread] - Locating script for scheme="perfmon"... 11-21-2022 09:27:10.614 +0100 DEBUG ModularInputs [6028 MainThread] - No regular file="C:\Program Files\SplunkUniversalForwarder\etc\system\windows_x86_64\bin\splunk-perfmon.path". 11-21-2022 09:27:10.614 +0100 DEBUG ModularInputs [6028 MainThread] - No regular file="C:\Program Files\SplunkUniversalForwarder\etc\system\windows_x86_64\bin\splunk-perfmon.path.exe". 
11-21-2022 09:27:10.614 +0100 DEBUG ModularInputs [6028 MainThread] - No regular file="C:\Program Files\SplunkUniversalForwarder\etc\system\windows_x86_64\bin\splunk-perfmon.path". 11-21-2022 09:27:10.614 +0100 DEBUG ModularInputs [6028 MainThread] - No regular file="C:\Program Files\SplunkUniversalForwarder\etc\system\windows_x86_64\bin\splunk-perfmon.path.exe". 11-21-2022 09:27:10.615 +0100 DEBUG ModularInputs [6028 MainThread] - No regular file="C:\Program Files\SplunkUniversalForwarder\etc\system\bin\splunk-perfmon.path". 11-21-2022 09:27:10.615 +0100 DEBUG ModularInputs [6028 MainThread] - No regular file="C:\Program Files\SplunkUniversalForwarder\etc\system\bin\splunk-perfmon.path.exe". 11-21-2022 09:27:10.615 +0100 DEBUG ModularInputs [6028 MainThread] - No regular file="C:\Program Files\SplunkUniversalForwarder\etc\system\bin\splunk-perfmon.path". 11-21-2022 09:27:10.615 +0100 DEBUG ModularInputs [6028 MainThread] - No regular file="C:\Program Files\SplunkUniversalForwarder\etc\system\bin\splunk-perfmon.path.exe". 11-21-2022 09:27:10.615 +0100 DEBUG ModularInputs [6028 MainThread] - Found script ""C:\Program Files\SplunkUniversalForwarder\bin\scripts\splunk-perfmon.path"" to handle scheme "perfmon". 11-21-2022 09:27:10.615 +0100 DEBUG ModularInputs [6028 MainThread] - For scheme "perfmon" found script "splunk-perfmon.path" at path ""C:\Program Files\SplunkUniversalForwarder\bin\scripts\splunk-perfmon.path"" 11-21-2022 09:27:10.615 +0100 DEBUG ModularInputs [6028 MainThread] - Setting "id" to "win-perfmon". 
11-21-2022 09:27:11.098 +0100 DEBUG ExecProcessor [12380 ExecProcessor] - In configure(), looking at stanza: [script://C:\Program Files\SplunkUniversalForwarder\bin\splunk-perfmon.exe] -> {host -> dziura, source -> perfmon, sourcetype -> perfmon} 11-21-2022 09:27:11.098 +0100 DEBUG ExecProcessor [12380 ExecProcessor] - Stanza='script://C:\Program Files\SplunkUniversalForwarder\bin\splunk-perfmon.exe' isModInput=true isIntrospectionInput=false 11-21-2022 09:27:11.098 +0100 DEBUG ExecProcessor [12380 ExecProcessor] - getInterpreterPathFor(): scriptPath=C:\Program Files\SplunkUniversalForwarder\bin\splunk-perfmon.exe pyVersStr= 11-21-2022 09:27:11.098 +0100 DEBUG ExecProcessor [12380 ExecProcessor] - After normalization script is ""C:\Program Files\SplunkUniversalForwarder\bin\splunk-perfmon.exe"" 11-21-2022 09:27:11.098 +0100 DEBUG ExecProcessor [12380 ExecProcessor] - stanza=script://C:\Program Files\SplunkUniversalForwarder\bin\splunk-perfmon.exe interval=18446744073709551.615 11-21-2022 09:27:11.098 +0100 DEBUG ExecProcessor [12380 ExecProcessor] - Creating an ExecedCommand, cmd='"C:\Program Files\SplunkUniversalForwarder\bin\splunk-perfmon.exe"', args={"C:\Program Files\SplunkUniversalForwarder\bin\splunk-perfmon.exe"}, runViaShell=false 11-21-2022 09:27:11.098 +0100 DEBUG ExecProcessor [12380 ExecProcessor] - ExecProcessorSharedState::addToRunQueue() path='"C:\Program Files\SplunkUniversalForwarder\bin\splunk-perfmon.exe"' restartTimerIfNeeded=0 11-21-2022 09:27:11.098 +0100 DEBUG ExecProcessor [12380 ExecProcessor] - adding ""C:\Program Files\SplunkUniversalForwarder\bin\splunk-perfmon.exe"" to runqueue 11-21-2022 09:27:11.098 +0100 DEBUG ExecProcessor [12380 ExecProcessor] - cmd='"C:\Program Files\SplunkUniversalForwarder\bin\splunk-perfmon.exe"' Added to run queue 11-21-2022 09:27:11.098 +0100 DEBUG ExecProcessor [12380 ExecProcessor] - Creating InputStatusHandler for group="modular input commands" key="C:\Program 
Files\SplunkUniversalForwarder\bin\splunk-perfmon.exe" 11-21-2022 09:27:11.098 +0100 DEBUG ExecProcessor [12380 ExecProcessor] - Done configuring ExecedCommand: command='"C:\Program Files\SplunkUniversalForwarder\bin\splunk-perfmon.exe"' runViaShell=0 tickStarted=0 running=0 state=WAITING_ON_RUNQUEUE interval=18446744073709551.615 11-21-2022 09:27:14.883 +0100 DEBUG ExecProcessor [12380 ExecProcessor] - Running: "C:\Program Files\SplunkUniversalForwarder\bin\splunk-perfmon.exe" on PipelineSet 0 11-21-2022 09:27:14.883 +0100 DEBUG ExecProcessor [12380 ExecProcessor] - PipelineSet 0: Created new ExecedCommandPipe for ""C:\Program Files\SplunkUniversalForwarder\bin\splunk-perfmon.exe"", uniqueId=5 11-21-2022 09:27:16.532 +0100 DEBUG ExecProcessor [12380 ExecProcessor] - PipelineSet 0: Got EOF from ""C:\Program Files\SplunkUniversalForwarder\bin\splunk-perfmon.exe"", uniqueId=5 11-21-2022 09:27:17.048 +0100 DEBUG ExecProcessor [12380 ExecProcessor] - PipelineSet 0: Ran script: "C:\Program Files\SplunkUniversalForwarder\bin\splunk-perfmon.exe", took 2.172 seconds to run, 0 bytes read 0 events read, status=done, exit=4294967295 11-21-2022 09:27:17.048 +0100 DEBUG ExecProcessor [12380 ExecProcessor] - PipelineSet 0: Destroying ExecedCommandPipe for ""C:\Program Files\SplunkUniversalForwarder\bin\splunk-perfmon.exe"" id=5 11-21-2022 09:27:17.048 +0100 DEBUG ExecProcessor [12380 ExecProcessor] - cmd='"C:\Program Files\SplunkUniversalForwarder\bin\splunk-perfmon.exe"' Not added to run queue  The only relevant entry here is the line with "exit=4294967295" which corresponds to the inputstatus message that the process exited with -1. But I still don't know why. I accept that the reason may be completely on the windows side but I would like to be able to diagnose why. Oh, and yes - I did try the lodctr.exe /r - nothing changes. The UF is running as LOCAL SYSTEM so it should not have permission issues. Also I can run perfmon.msc and it's showing the counters properly. 
Any more debug ideas? I'm stuck.