Getting Data In

monitor a directory with new files that get created dynamically

ogazitt
Explorer

I am a newbie Splunk user running 4.3.2 on Windows (Azure).

My setup is to run an indexer/search-head on a VM role (in AWS terms, a vanilla WS2008R2 AMI with the full Splunk distribution installed) and a universal forwarder on each of my web/worker role instances (in AWS terms, each role instance is a VM). An Azure startup script runs the Splunk UF MSI with elevated (local admin) permissions on each web/worker role as it gets deployed, and I pass it the forwarding server information (indexer-vm-role-name:9997). This all works fine.
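For reference, that install step amounts to a command along these lines (the MSI filename and exact property values are illustrative, not copied from my actual startup script):

    msiexec /i splunkuniversalforwarder-4.3.2-x64-release.msi RECEIVING_INDEXER="indexer-vm-role-name:9997" AGREETOLICENSE=Yes /quiet

RECEIVING_INDEXER is the UF installer property that configures forwarding to the indexer at install time, which is how the forwarding server information gets passed in.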

When an Azure role comes up, it creates a new local resource directory (a directory on the C: drive named with a newly assigned GUID, so I don't know its name ahead of time). I shell-exec (using Process.Start()) "splunk add monitor <directory>", which generates output confirming that the directory is now monitored. I save that output to a file ("splunkinit.log") that shows up on the search-head, so I know forwarding and monitoring are set up correctly.
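Concretely, the shell-exec is along these lines (the UF install path, the -auth note, and where I drop the output are a sketch rather than my exact code):

    using System.Diagnostics;
    using System.IO;

    class SplunkSetup
    {
        // Registers the role's freshly created local resource directory as a
        // monitored input by shelling out to the universal forwarder's CLI.
        public static void AddSplunkMonitor(string directoryToMonitor)
        {
            var psi = new ProcessStartInfo
            {
                // Default UF install path; adjust to wherever the forwarder landed.
                FileName = @"C:\Program Files\SplunkUniversalForwarder\bin\splunk.exe",
                // The CLI may prompt for credentials; appending "-auth user:password"
                // avoids the prompt in an unattended startup task.
                Arguments = "add monitor \"" + directoryToMonitor + "\"",
                UseShellExecute = false,
                RedirectStandardOutput = true,
                RedirectStandardError = true
            };

            using (var process = Process.Start(psi))
            {
                // Capture the CLI's confirmation output and write it to splunkinit.log
                // inside the newly monitored directory, so it gets forwarded and the
                // success message is visible from the search-head.
                string output = process.StandardOutput.ReadToEnd() + process.StandardError.ReadToEnd();
                process.WaitForExit();
                File.AppendAllText(Path.Combine(directoryToMonitor, "splunkinit.log"), output);
            }
        }
    }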

My issue is that once I start creating additional files (containing JSON-formatted trace information) in that directory, they don't show up on the indexer/search-head.

Do I need to shell-exec a "splunk add monitor" for each new file I create in this directory? If that's true, what's the point of monitoring a directory? I worry that if I really have to do this, the overhead of my tracing mechanism will be prohibitive given how heavily I use it (I log EVERYTHING from my app).

Are there any other solutions? I started reading about writing files to the spool directory / sinkhole, but my issue is that the web/worker role processes that generate the JSON traces run under a restricted user account (which is also dynamically created), so I can't write from those processes into any directory that Splunk owns.
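As I understand it, that approach means adding a batch input stanza to the forwarder's inputs.conf along these lines (the path is just an example); files dropped into that directory get indexed once and then deleted:

    [batch://C:\SplunkSpool]
    move_policy = sinkhole

Which is exactly where the restricted-account problem bites, since the writing process needs permission to create files in that spool directory.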

Solution

ogazitt
Explorer

It turns out I had multiple issues, but the main one was that my process hadn't closed the JSON log files yet. So while Splunk did see them and add them to the monitored inputs, they were still write-locked and Splunk couldn't read them. That's what a *nix guy gets for running on Windows 😉
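For anyone hitting the same thing: the fix on the app side is to either close the trace files promptly once they're complete, or open them with a share mode that lets another process read them while they're still being written. A rough C# sketch of the latter (names are illustrative):

    using System.IO;
    using System.Text;

    class TraceFiles
    {
        // Open a trace file for appending with an explicit FileShare.Read so that
        // another process (e.g. the universal forwarder) can open and read it even
        // while it's still open for writing here.
        public static StreamWriter OpenTraceFile(string path)
        {
            var stream = new FileStream(path, FileMode.Append, FileAccess.Write, FileShare.Read);

            // AutoFlush so each trace line hits disk promptly instead of sitting in a buffer.
            return new StreamWriter(stream, Encoding.UTF8) { AutoFlush = true };
        }
    }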
