Getting Data In

Why is perfmon data not getting forwarded to the indexer when data for all the other sources/sourcetypes gets forwarded and indexed?

sdubey_splunk
Splunk Employee

What was done as part of troubleshooting?

Checked the indexer and found no IO issues.

Restarted Splunk on the myPRODServer server (universal forwarder).

Found no errors (Splunk started successfully).

Logged in to the search head.

Search: index=perfmon host=myPRODServer -> we do not see any data.

grep perfmon metrics.log -> no messages/logs found.

Checked splunkd.log and did not find any conclusive errors.

Collected the output of the command "splunk list inputstatus" and found the error below:

Issue perfmon.exe exited with code -1

C:\Program Files\SplunkUniversalForwarder\bin\splunk-perfmon.exe

exit status description = exited with code -1

time closed = 2018-10-10T14:20:37+0800

time opened = 2018-10-10T14:20:36+0800

Exit code "-1", i.e. a negative value, is undefined.

Requested the customer to restart the Splunk daemon in debug mode (splunk start --debug) and collect a diag.

Found no conclusive errors in the diag (the diag was collected while Splunk was running in debug mode).

No entry/log for perfmon in metrics.log.

All the other sources/sourcetypes work, but perfmon data is not getting forwarded to the indexer.
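
For context, perfmon collection on a universal forwarder is driven by [perfmon://...] stanzas in inputs.conf. A minimal sketch of such a stanza is shown below; the stanza name, counters, interval, and index are illustrative only and will differ in a real deployment (for example, the Splunk Add-on for Windows ships its own stanzas):

    [perfmon://CPU]
    object = Processor
    counters = % Processor Time; % User Time
    instances = *
    interval = 10
    index = perfmon
    disabled = 0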

1 Solution

sdubey_splunk
Splunk Employee

Summary of the fix below:

  1. perfmon was not collecting CPU/Memory statistics.

  2. We installed the Sysinternals Suite (https://docs.microsoft.com/en-us/sysinternals/downloads/sysinternals-suite) and found that splunk-perfmon was not getting started, hence data was not being collected.

  3. We found the article https://social.technet.microsoft.com/wiki/contents/articles/19374.windows-performance-monitor-unable... (Windows Performance Monitor: "Unable to add these counters") and applied the fix. After applying the fix, the customer started seeing the raw perfmon counter data getting ingested. This was an issue on the OS side rather than in Splunk.

Error/Symptoms: Windows Performance Monitor: "Unable to add these counters"

Steps to Fix:
We have to rebuild Performance Counters with LODCTR from an elevated command prompt.

  1. Launch Command Prompt as Administrator (right-click > Run as administrator).

  2. Drop into the C:\WINDOWS\System32 directory by typing CD\ and then CD Windows\System32.

  3. To rebuild your resource counters, type the following command:
    lodctr /r
    This will rebuild your counter values and may take a few moments, so please be patient.

  4. If you wish to query the counters to make sure they were correctly set, use:
    lodctr /q
    This will give you something like the following:
    Performance Counter ID Queries [PERFLIB]:
    Base Index: 0x00000737 (1847)
    Last Counter Text ID: 0x000031D2 (12754)
    Last Help Text ID: 0x000031D3 (12755)

[.NET CLR Data] Performance Counters (Enabled)
DLL Name: %systemroot%\system32\netfxperf.dll
Open Procedure: OpenPerformanceData
Collect Procedure: CollectPerformanceData
Close Procedure: ClosePerformanceData

[.NET CLR Networking] Performance Counters (Enabled)
DLL Name: %systemroot%\system32\netfxperf.dll
Open Procedure: OpenPerformanceData
Collect Procedure: CollectPerformanceData
Close Procedure: ClosePerformanceData
and so on.
  5. Now reopen the Performance Monitor and check it. You will see that the error was fixed.

Note 1:

If the query shows a provider that is Disabled, use this command to enable it:
lodctr /e:<provider name>

For example (suppose in the query above the Performance Counters provider is disabled):
lodctr /e:Performance Counters

After enabling the performance counters, the customer was able to ingest/search CPU/Memory statistics.
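
As a quick sanity check after applying the fix (the install path, index name, and time range below are illustrative and depend on the local configuration), one can confirm on the forwarder that splunk-perfmon.exe now stays running and, on the search head, that events are arriving:

    On the forwarder (elevated command prompt):
    tasklist /fi "imagename eq splunk-perfmon.exe"
    "C:\Program Files\SplunkUniversalForwarder\bin\splunk" list inputstatus

    On the search head:
    index=perfmon host=myPRODServer earliest=-15m | stats count by sourcetype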



AF_Ops
New Member

The lodctr /r command seemed to have resolved the problem of not getting CPU/RAM perfmon data from one of my servers, but logical disk data is still missing.

I am a little confused about the step "Now reopen the Performance Monitor and check it. You will see that the error was fixed." Which Performance Monitor? The built-in Windows one? I can see Logical Disk in the Windows PerfMon, but that data isn't going into Splunk.

Just for background, the same UF config is working as expected on another server. Both servers are MS Server 2012 R2.
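
One way to narrow this down on the affected forwarder (the commands below are standard Windows and Splunk CLI utilities; the stanza name that matters depends on your inputs.conf or Windows add-on configuration) is to confirm that the LogicalDisk counters exist at the OS level and that the corresponding perfmon input is enabled in the effective configuration:

    rem List the LogicalDisk counters registered with the OS
    typeperf -q LogicalDisk

    rem Show the effective perfmon input stanzas and the files they come from
    "C:\Program Files\SplunkUniversalForwarder\bin\splunk" btool inputs list perfmon --debug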
