Getting Data In

Powershell Script on UF - high CPU Usage

Bar_Ronen
Loves-to-Learn Lots

Hi,

I have 4 PowerShell scripts I wrote for MSSQL servers, simple Invoke-Sqlcmd-style queries against the database to get its health state (running queries, resource usage, etc.) and send the output as JSON to Splunk.
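For reference, a minimal sketch of what such a script might look like, assuming the standard Invoke-Sqlcmd cmdlet from the SqlServer module; the server name, query, and column names here are illustrative placeholders, not the actual scripts from the post:

```powershell
# Minimal MSSQL health-check sketch (assumed names; adapt to the real query).
# Requires the SqlServer module: Install-Module SqlServer
Import-Module SqlServer

# Illustrative query: currently running user requests and their resource usage.
$query = @"
SELECT r.session_id, r.status, r.cpu_time, r.total_elapsed_time, r.logical_reads
FROM sys.dm_exec_requests AS r
WHERE r.session_id > 50;
"@

# Local connection on the monitored server (connection options are assumptions).
$rows = Invoke-Sqlcmd -ServerInstance "localhost" -Query $query

# Emit one compact JSON object per row so Splunk can line-break events cleanly.
foreach ($row in $rows) {
    $row | Select-Object session_id, status, cpu_time, total_elapsed_time, logical_reads |
        ConvertTo-Json -Compress
}
```

Selecting the named columns before ConvertTo-Json avoids serializing the extra DataRow bookkeeping properties into the event.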

These are really short scripts, and when I run them manually on the server they finish very quickly.

But when the scripts run from the input:

script://runpowershell.cmd script_name.ps1

the powershell.exe processes take much longer to finish (1-2 minutes), and during that period the CPU sits at 100% (watching live in Task Manager, the 4 powershell.exe processes are at the top of the list when sorted by CPU usage, descending).

I can't understand why.

Notes:

I use the common runpowershell.cmd wrapper method to execute powershell.exe with the -ExecutionPolicy Bypass flag, to avoid script-execution errors.
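For context, such a wrapper typically looks something like the sketch below (this is an assumption about its contents, not necessarily the exact file in use). Adding -NoProfile and -NonInteractive is worth trying here, since loading user/machine profiles on every launch is a common cause of slow, CPU-heavy powershell.exe startup under scripted inputs:

```bat
@echo off
REM Run the .ps1 passed as the first argument from this wrapper's directory.
REM -NoProfile skips profile loading; -NonInteractive prevents prompts from hanging the input.
powershell.exe -ExecutionPolicy Bypass -NoProfile -NonInteractive -File "%~dp0%~1"
```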

I'm aware of the SQL Server Add-on and the DB Connect method (I took the SQL queries from the add-on templates), but I'm going to monitor hundreds of MSSQL servers, and I didn't want to configure hundreds of DB Connect connections and inputs, one per server: a single HF would be a single point of failure for all MSSQL monitoring, a performance bottleneck, and a lot of configuration work for hundreds of servers.

So I'm converting the DB Connect SQL template queries to PS scripts to deploy from the DS, so each MSSQL UF will run the query locally and send the output to Splunk.
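For completeness, the scripted input stanza deployed from the DS would look roughly like this; the app name, interval, index, and sourcetype are assumptions for illustration:

```
[script://.\bin\runpowershell.cmd script_name.ps1]
interval = 300
sourcetype = mssql:health:json
index = mssql
disabled = 0
```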
