Getting Data In

UF on a standalone laptop > USB transfer > indexed into a Splunk instance on a separate standalone machine

Joshua2
Observer

I have around 60 standalone Windows laptops that are not networked.

I am looking to install a UF to capture the Windows logs and have them stored on the local drive "c:\logs".

The logs will then be transferred to a USB drive for archiving and indexed into Splunk for NIST 800 compliance, e.g. login success/failure.

I am struggling to find the correct syntax for the UF to save logs locally, as the configuration asks for a host and port.
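
For reference, the standard outputs.conf on a UF looks roughly like the sketch below, which is where the host and port it asks for would go (the hostname and port here are placeholders, not my real setup):

[tcpout]
defaultGroup = primary_indexers

[tcpout:primary_indexers]
server = indexer.example.com:9997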

 

josh


gcusello
SplunkTrust

Hi @Joshua2 ,

as @KendallW also said, this isn't how Splunk is designed to work: you cannot store data locally on a UF.

The UF has a local cache that stores data if the indexers aren't available, but only for a short time, and it isn't possible to copy the cached logs to a USB drive.

You should review your requirements with a Splunk Certified Architect or a Splunk Professional Services specialist to find a solution: e.g. send logs to a local syslog server, or copy them to text files (using a script) and then store them on the USB drive. But as I said, this solution must be designed by an expert; it isn't a question for the Community.

Ciao.

Giuseppe


KendallW
Contributor

Hi @Joshua2, I won't judge the solution design, but the intended use of the Universal Forwarder is to forward logs, not to store them locally for later manual transfer.

You can do this with the UF by setting up local indexing on each machine; however, you would have to pay for license usage as the data is indexed at the UF tier, and then pay for license usage again when it is transferred and indexed on your Splunk Enterprise instance later. So you'd be paying twice to index the same data.
Also note that there are performance implications for local indexing, and there are very limited parsing options on the UF, so you'd need to set up parsing later at the indexer anyway.

If you're OK with that option, you can do it by setting the indexAndForward setting in outputs.conf:

[tcpout]
defaultGroup = local_indexing

[tcpout:local_indexing]
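# index the data locally on this instance as well as forwarding it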
indexAndForward = true

 

A better option to store the logs locally would be to use a third-party log collection tool like Fluentd or Logstash, or to write your own PowerShell scripts.

Ideally you would use Splunk for its intended purpose by directly forwarding the logs from the 60 UFs to a Splunk indexer (or heavy forwarder); however, I understand that may not be possible in this case.
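
If you do end up forwarding directly, the Windows Security events relevant to the login success/failure requirement (event IDs 4624/4625) would typically be collected with an inputs.conf stanza along these lines; the index name below is just a placeholder:

[WinEventLog://Security]
disabled = 0
index = wineventlog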
