Hello everyone,
I saw multiple posts about this but couldn't really understand the architecture behind it.
We have 3000 forwarders checked in to the deployment server. We have Windows and Linux server classes, and we have also created apps to push configuration files to all the forwarders.
Now we would like to implement resource monitoring through the agent. I wrote a batch script that queries CPU and memory usage every minute, writes the output to a txt file, and has that file forwarded to the Splunk indexer.
I know that to run the script on the remote machines, I need to place it under $SPLUNK_HOME/etc/apps/MYAPP/bin.
The question is, how do I do that? Is it as simple as putting the script in the app and restarting the Splunk agent, so the agents pick it up once it's there? Am I right? And what configuration do I need, specifically, to make the script run?
I am new to Splunk and haven't had the chance to take any training, as the company did not provide any. I learn as I go.
Thank you
Hi yongphang, you'll want to use the deployment server to deliver the app to the forwarders: http://docs.splunk.com/Documentation/Splunk/6.2.5/Updating/Updateconfigurations
One thing to keep in mind: the Universal Forwarder does not come with Python bundled. Your scripted inputs will have to use native OS utilities, or otherwise have any dependencies already satisfied.
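For the *nix side, here's a minimal sketch of what such a scripted input could look like using only native Linux utilities (vmstat, free, awk, date); the script name and field names are just illustrative, not anything Splunk requires:

#!/bin/sh
# Emit one line of CPU/memory usage per run; Splunk indexes whatever the script writes to stdout.
TS=$(date "+%Y-%m-%dT%H:%M:%S%z")
CPU_IDLE=$(vmstat 1 2 | tail -1 | awk '{print $15}')
MEM=$(free -m | awk '/^Mem:/ {print "mem_total_mb=" $2 " mem_used_mb=" $3}')
echo "$TS cpu_idle_pct=$CPU_IDLE $MEM"

On Windows you'd do the equivalent in a batch script built on native tools such as typeperf or wmic, for the same reason: there is no Python on the UF.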
Let me know if this makes sense 😄
Add a script:// entry to inputs.conf (<-- click that for docs); it'll automagically index the script's stdout.
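As a rough sketch (the script names are hypothetical, and your Windows and *nix forwarders would get separate apps/server classes anyway), the inputs.conf in the deployed app could look something like this, assuming the script sits in the app's bin directory:

# Windows forwarders
[script://.\bin\poll_resources.bat]
interval = 60
sourcetype = resource_usage
disabled = 0

# *nix forwarders
[script://./bin/poll_resources.sh]
interval = 60
sourcetype = resource_usage
disabled = 0

interval is in seconds, so 60 matches your once-a-minute requirement. Once the app is pushed and the forwarder restarts, the forwarder runs the script on that interval and indexes its stdout; no output file or monitor stanza is needed for this approach.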
If you have a mix of Windows and Linux deployment servers, make sure the Linux forwarders are supplied by a Linux deployment server so the script's x bit stays set.
Thanks Martin, what do you mean by the x bit being set?
I don't know anything about that; can you explain a little here?
I appreciate it.
Linux file permissions - Read, Write, eXecute. IIRC those permissions are carried over to the UF, and a shell script without the bit set won't execute.
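For example (hypothetical path and script name), on a *nix deployment server you'd set the bit on the copy that gets pushed out, so the clients receive it already executable:

chmod +x $SPLUNK_HOME/etc/deployment-apps/MYAPP/bin/poll_resources.sh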
It is a mixture of Windows and Unix/Linux; I was creating a batch script for Windows and a bash script for Unix/Linux.
I got the output file monitored under inputs.conf; how do I get the script itself to run? Is that also under inputs.conf?
How can I tell it's running? And how about having Splunk ingest the output directly instead of picking up the txt file?
Thank you
Is it a Windows or Linux deployment server? I recall there were some issues with Windows deployment servers not being able to correctly transmit the x bit that allows script execution.
Did you put the script into inputs.conf? Did you put a monitor entry into inputs.conf to monitor the files created by the script?
Consider having Splunk read the script's output directly, instead of writing to a file and monitoring that file.
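A sketch of both options, using the same hypothetical names as above. Monitoring the file your script writes would look like:

[monitor://C:\Temp\resource_usage.txt]
sourcetype = resource_usage
disabled = 0

Reading the input directly just means using the script:// stanza sketched earlier instead: Splunk runs the script on the interval and indexes its stdout, with no intermediate txt file to manage. To check that it's running, search your index for the sourcetype, or look at splunkd.log on the forwarder, where the ExecProcessor component logs scripted-input problems.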
Here is a link on how to fix the execution problem if you deploy from a Windows deployment server to *nix clients: http://answers.splunk.com/answers/70039/windows-deployment-server-to-nix-deployment-client-permissio...