Getting Data In

Monitor a directory where new files are created dynamically

ogazitt
Explorer

I am a newbie Splunk user running 4.3.2 on Windows (Azure).

My setup is to run an indexer/search head on a VM role (in AWS terms, a vanilla WS2008R2 AMI with the full Splunk distribution installed), and to run a universal forwarder on each of my web/worker role instances (in AWS terms, each role instance is a VM). An Azure startup script runs the Splunk UF MSI with elevated (local admin) permissions on each of the web/worker roles as they are deployed, and I pass it the forwarding server information (indexer-vm-role-name:9997). This all works fine.

When an Azure role comes up, it creates a new local resource directory (a directory on the C drive named with a newly assigned GUID, so I don't know it ahead of time). I shell-exec (using Process.Start()) "splunk add monitor <directory>", which generates output confirming that the directory is now monitored. I save that output to a file ("splunkinit.log") which shows up on the search head, so I know I have forwarding and monitoring set up correctly.
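The add-monitor step above could be sketched as follows. This is a minimal sketch in Python rather than the C#/Process.Start() call used in the original; the helper names and the log path are hypothetical, and it assumes the splunk executable is reachable on the PATH:

```python
import subprocess

def build_monitor_cmd(splunk_exe, directory):
    # Build the CLI invocation equivalent to: splunk add monitor <directory>
    return [splunk_exe, "add", "monitor", directory]

def add_monitor(splunk_exe, directory, log_path):
    # Run the command and append its confirmation output to a log file,
    # mirroring the "splunkinit.log" step described above.
    cmd = build_monitor_cmd(splunk_exe, directory)
    result = subprocess.run(cmd, capture_output=True, text=True)
    with open(log_path, "a") as log:
        log.write(result.stdout)
        log.write(result.stderr)
    return result.returncode
```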

My issue is that once I start creating additional files in that directory (which contain JSON-formatted trace information), they don't show up on the indexer/search-head.

Do I need to shell-exec a "splunk add monitor" for each new file I create in this directory? If so, what's the point of monitoring a directory? I worry that if I really have to do this, the overhead of my tracing mechanism will be prohibitively expensive for my usage density (I log EVERYTHING from my app).

Are there any other solutions? I started reading about writing files to the spool directory / sinkhole, but my issue is that the web/worker role processes that generate the JSON traces run under a restricted user account (which is also dynamically created), so I can't write from those processes into any directory that Splunk owns.

1 Solution

ogazitt
Explorer

It turns out I had multiple issues, but the main one was that my process hadn't closed the JSON log files yet. So while Splunk did see them and add them to the monitored inputs, they were write-locked and Splunk couldn't read them. That's what a *nix guy gets for running on Windows 😉
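One way to avoid the write-lock problem is to write each trace to a temporary file and rename it into the monitored directory only after it is fully written and closed, so the monitor never encounters a half-written, locked file. A minimal sketch, assuming this write-then-rename pattern (the function and file names are hypothetical, not part of the original setup):

```python
import json
import os
import tempfile

def write_trace(directory, filename, record):
    # Write the JSON trace to a temporary file first, then atomically
    # rename it into the monitored directory. The monitor only ever
    # sees a fully written, closed file, so it is never write-locked.
    fd, tmp_path = tempfile.mkstemp(dir=directory, suffix=".tmp")
    with os.fdopen(fd, "w") as f:
        json.dump(record, f)
        f.flush()
        os.fsync(f.fileno())
    final_path = os.path.join(directory, filename)
    os.replace(tmp_path, final_path)
    return final_path
```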

