Getting Data In

Splunk Enterprise in a VM environment

hsasecadmin
New Member

I have a virtual environment with 70 VMs, and I installed a trial version of Splunk to use as a SIEM. I also have a DC that all the VMs are joined to. I don't know how I can collect all the log data into a specific file and send that file to a Splunk receiver. Please help me to do that.
This is the first time I have used Splunk.


Richfez
SplunkTrust

Updated slightly: You DID specify the barest pieces of your environment: 70 domain-attached VMs.

Your use case of using Splunk as a SIEM is very important here and will drive much of where you might be aiming.

Let me try backing into an answer for you.

Suppose you are a Windows only shop (or nearly "only", anyway). From the tiny little bit of information, I'm guessing you are most likely a fairly typical Small to Medium Business running Windows, AD and all that good stuff. You have a couple of SQL boxes, a few file servers and a few dozen servers for apps. This could be completely wrong, at which point just skip to the bottom.

For what you've said is your end goal, I'd suggest using whatever you use for automating client installs (SCCM?) and pushing out the UF pre-configured to talk to your deployment server and send in the basic Event Logs (System, Application and Security). You'll probably expand that later into some other logs, but it'll do for a start. You'll also want at least one DC sending in AD information if you are running AD, and at some point you'll have to find a solution for your DNS and possibly DHCP (all that and more is covered in the Splunk App for Windows Infrastructure setup as well, which I would recommend).
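For those basic Event Logs, the inputs.conf you bake into that pre-configured UF (or push from the deployment server) could be as simple as the sketch below. The index name is just an example here, not something Splunk creates for you; use whatever index you actually set up.

# inputs.conf pushed to the Windows forwarders
[WinEventLog://Security]
disabled = 0
index = wineventlog

[WinEventLog://Application]
disabled = 0
index = wineventlog

[WinEventLog://System]
disabled = 0
index = wineventlog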

Of course using deployment clients means you'll need to install a deployment server to manage those deployment clients.
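As a rough sketch of how those pieces point at each other (the hostnames and app name below are placeholders, not anything Splunk ships with): each forwarder carries a deploymentclient.conf naming the deployment server, and the deployment server's serverclass.conf decides which apps (bundles of .conf files) each client gets.

# deploymentclient.conf on each forwarder (baked into the install)
[deployment-client]

[target-broker:deploymentServer]
targetUri = deploy-server.example.local:8089

# serverclass.conf on the deployment server
[serverClass:all_windows]
whitelist.0 = *

[serverClass:all_windows:app:windows_eventlog_inputs]
restartSplunkd = true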

That set of inputs off 70 servers will make a Splunk indexer moderately busy, so the starting point is to determine what hardware the indexer will run on. Will it be physical? Virtual? Either can be made to work, but Splunk has some fairly high minimum specs EVEN if it's a VM. Probably the most important and hardest part is ensuring it has enough IOPS on storage, even when everything else is hammering the disks. I'd plan on setting up two indexers in a cluster so you have high availability. You'll probably also want at least one Search Head.
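If you do go the two-indexer cluster route, the core of it is a few server.conf stanzas; treat the values below as illustrative only (the hostname and pass4SymmKey are placeholders):

# server.conf on the cluster master
[clustering]
mode = master
replication_factor = 2
search_factor = 2
pass4SymmKey = CHANGE_ME

# server.conf on each indexer (peer)
[replication_port://9887]

[clustering]
mode = slave
master_uri = https://cluster-master.example.local:8089
pass4SymmKey = CHANGE_ME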

But that's only half the battle. The other half is your networking equipment, which needs to be handled differently. For those devices you'll want to set up a syslog-ng or rsyslog server (or Kiwi Syslog for Windows, whatever), point all of their syslog output at it, and then collect those files via a UF and send them in. Firewalls are especially important. This will require configuration of both the devices and Splunk to make sure all that data is being ingested properly.
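One common pattern (the paths and index name below are only examples) is to have the syslog server write one file tree per device and then monitor that tree with the UF, letting host_segment pull the device name out of the path:

# rsyslog/syslog-ng writes per-device files, e.g. /var/log/network/<hostname>/syslog.log

# inputs.conf on the UF running on the syslog box
[monitor:///var/log/network]
disabled = false
sourcetype = syslog
index = network
# the 4th path segment (/var/log/network/<hostname>/...) becomes the host field
host_segment = 4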

You'll need to think about your retention for the different types of data, too, and how you'll want access divvied up. This will help you determine how to arrange your indexes and how much disk space you'll need overall (which is a different problem from how much disk speed you'll need). For example, 2 years of retention at 25 GB/day would require about 5 TB of space on each indexer (with room to spare), so your IOPS will probably be more important than raw capacity.
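To show where a number like that comes from, using the common rule of thumb that Splunk's on-disk footprint is very roughly half the raw data size, and assuming the data is spread across two indexers rather than fully replicated (replication would roughly double it):

25 GB/day x 365 days x 2 years   = ~18,250 GB of raw data
x ~0.5 typical on-disk footprint = ~9 TB indexed
/ 2 indexers                     = ~4.5 TB per indexer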

And you'll then need to consider licensing. I have an environment significantly larger, but not so much larger that I can't at least guess at the size of license you'll need. I'd say - and again this is entirely a guess! - that you'd need 10 GB/day minimum, probably closer to 20 GB/day. It could be 5 GB/day or it could be 150 GB/day; you'll have to start the process to determine that yourself. Your Splunk Sales Rep can help with this too. Did I mention you should make friends with your sales rep? Call them up, chat with them a bit.

You'll also want to check out Splunk Education for some training on managing this environment.

So, in the space of a few minutes I've outlined needing 3 servers (probably physical) of reasonable specs, plus a more moderately provisioned (probably virtual) License Master/Deployment Server and a Cluster Master, a few dozen disks underneath the indexers, a licensing cost that isn't insignificant, and an investment in training and effort.

Can you do this on less? Very possibly. You could probably get by with one server doing indexing and searching plus one moderate VM as your License Master/DS, and as long as you stay below, oh, 50 GB/day or something it'll probably be OK, as long as you don't underprovision the server. It probably won't save you much time down the road, but it'll be a bit faster to set up, especially if you are new to all this. The point is there is actual work involved in just determining what your hardware requirements will be.

sundareshr
Legend

Here's some good documentation on how you can do this.

http://docs.splunk.com/Documentation/Splunk/6.4.2/Data/Getstartedwithgettingdatain


Richfez
SplunkTrust

hsasecadmin,

It sounds like you are jumping in with both feet and no life jacket. A hearty "Welcome" is in order and I applaud you, but you might want to start just a little smaller and expand into the rest. We can help with that.

But in order to really help you well, we need more information on your environment - is it mostly Windows, mostly Linux, one old AS/400, "we do web hosting", do you run AD or an SSO? "70 VMs" is useful, but what's in them? With that information we can get you started and point you to the right docs for your situation.

Also your use case: "to use it as a SIEM" is a great use of Splunk, but it's a HUGE topic, and again your environment will determine what exactly you can do and how you should proceed. If you want Splunk as a SIEM, I would very heartily recommend getting in touch with your Splunk Sales Rep and starting a conversation about training and having Splunk Professional Services come help you.

Thanks, looking forward to your responses!

skoelpin
SplunkTrust

Welcome to Splunk @hsasecadmin

There are many ways to get data into Splunk, but the easiest way would be to set up a forwarder on the remote host and have it monitor a directory and send new data to the indexer. First you need to go download the Splunk Universal Forwarder and install it on the remote host. After you've installed it, you then need to set up your inputs.conf and outputs.conf (you could also have set this up during the install if you specified it).

So go to %SPLUNK_HOME%/etc/system/local and create a file called inputs.conf that looks like this:

[monitor:///PATH_TO_LOG_FILE*.log]
# enable this input
disabled = false
# index the events should land in (must already exist on the indexer)
index = THE_INDEX_YOU_SPECIFIED
# optional: override the host field reported for these events
host = REMOTE_HOSTNAME

Then set up your outputs.conf, which points the forwarder at the indexer.

[tcpout]
# name of the target group defined below
defaultGroup = INDEXER_IP_9997

[tcpout:INDEXER_IP_9997]
# your indexer's address and receiving port (9997 is the conventional default)
server = INDEXER_IP:9997

# optional stanza for per-server settings; it can stay empty
[tcpout-server://INDEXER_IP:9997]

Then go to your %SPLUNK_HOME%/bin folder and run splunk restart.
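On a Windows universal forwarder that usually means something like this (default install paths shown; adjust if you installed elsewhere):

cd "C:\Program Files\SplunkUniversalForwarder\bin"
.\splunk.exe restart

# or on a Linux forwarder
/opt/splunkforwarder/bin/splunk restart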

Look at the deployment server to streamline this process
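For example, rather than hand-editing deploymentclient.conf, you can point an already-installed forwarder at a deployment server from its bin directory (the hostname below is a placeholder):

.\splunk.exe set deploy-poll deploy-server.example.local:8089
.\splunk.exe restart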

skoelpin
SplunkTrust

Yes, you will need to install the forwarder on every remote host you want to forward data from. Once the forwarder is installed, you can use the Splunk deployment server to configure the forwarders so you don't have to do it manually on each machine. I've also linked older threads below where users have created scripts to install forwarders on many remote machines without the manual labor (one common approach is sketched after the links).

What operating systems are your remote hosts running?

https://answers.splunk.com/answers/100989/forwarder-installation-script.html
https://answers.splunk.com/answers/34896/simple-installation-script-for-universal-forwarder.html
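For Windows hosts, the usual trick in those scripts is a silent MSI install, something along these lines, where the MSI filename, indexer address, and deployment server address are placeholders you'd substitute:

msiexec.exe /i splunkforwarder-x64.msi AGREETOLICENSE=Yes ^
    RECEIVING_INDEXER="INDEXER_IP:9997" ^
    DEPLOYMENT_SERVER="DEPLOY_SERVER_IP:8089" /quiet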

hsasecadmin
New Member

Thanks a lot for the reply.
As I told you, I have 70 virtual machines joined to a domain controller. Is it necessary to install the universal forwarder on all the VMs, or only on the DC?


Richfez
SplunkTrust

Yes. Probably everywhere, since you want to use it as a SIEM.


skoelpin
SplunkTrust

See my answer above
