Getting Data In

Trying to understand the basics.

rboursaw
New Member

I am very new to Splunk and am trying to figure out if it will help with some of our monitoring needs. My goal is to have a central web interface where I can see information about all of our servers across our customer environments. I am picturing a scenario where a server at each customer site collects information from all the other servers there, and a forwarder then sends it back to the main Splunk server within my network. I've been reading a lot of the documentation but am not totally sure this can be accomplished with this application. We do not have direct access into our customers' networks, so having something collect the info and then send it back over port 443 (or similar) to a host within our network would be preferred. Any direction would be greatly appreciated.

Thank you,

Bob


gnovak
Builder

Hi. I'll describe my environment a bit, as I think it may be close to what you're looking for.

We have 4 main indexers set up with distributed search. This lets us use one central Splunk server to search across multiple indexers and treat it as our central "console", so to speak.

Let's just say for this example the 4 main indexers are:

New York (Search Head)
Chicago
Texas
Miami

We have the lightweight forwarder set up on every machine, in each location, that has logs we want to monitor and index. For example, all of our webservers in each location run lightweight forwarders.

On each lightweight forwarder we have a custom inputs.conf file that specifies which files and directories to monitor and which sourcetype Splunk will assign to them.

For example:

[monitor:///opt/log/.../web_server/web_server.log]
disabled = false
sourcetype = WEB

This means any web_server.log under a web_server directory in /opt/log will automatically be indexed in Splunk under the sourcetype WEB. That lets you find it easily by just searching sourcetype=WEB in Splunk.

Once logs are specified in inputs.conf, they are indexed at the Splunk server for that location. For example, the Miami Splunk server holds all the log data for the servers in Miami, the Chicago server holds the data for the servers in Chicago, the Texas server holds the data for the machines in Texas, and so on.
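For completeness: each forwarder knows where to send its data via an outputs.conf file. A minimal sketch for the Miami site (the hostname and receiving port below are placeholders, not from my actual setup):

[tcpout]
defaultGroup = miami_indexer

[tcpout:miami_indexer]
server = splunk-miami.example.com:9997

The local indexer has to be listening on that port, which you can enable under Manager > Forwarding and receiving on the indexer.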

The last part is setting up distributed search, so that a Splunk server in, say, New York can search its own indexes along with those in Miami, Chicago, and Texas at the same time, and present all the data from all sources in one window.

On the main Splunk server in New York, you add the other servers as search peers. You do this by going to Manager > Distributed Search > Search Peers, adding an entry each for Chicago, Miami, and Texas.

On the Chicago, Miami, and Texas Splunk servers, go to Manager > Distributed Search > Distributed search setup, turn on distributed search, and enable broadcasting to other Splunk servers. We also specified a heartbeat multicast IP address and port.
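Under the hood, adding those search peers writes to distsearch.conf on the search head. A sketch of what it ends up looking like (the hostnames are placeholders; 8089 is the default Splunk management port):

[distributedSearch]
servers = chicago.example.com:8089, miami.example.com:8089, texas.example.com:8089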

More info is here

http://www.splunk.com/base/Documentation/4.1.6/Admin/Whatisdistributedsearch

Hope this helps! This is how my environment is at the moment....


araitz
Splunk Employee

Sure, Splunk can help you! Let me point you to some links in our documentation to get you on your way.

You will use a forwarder, most likely a light forwarder, installed on the customer's equipment and network. Usually, it takes no more than 5 minutes to install and configure a light forwarder to monitor files, performance, and other data and send the information on to a Splunk server.

As you mentioned, you will want to configure SSL encryption and mutual authentication to better secure your customer's data in transit. You should be able to provide an app or add-on to your customers that has the necessary configuration files and certificates to allow input and secure forwarding.
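For the secure-forwarding piece, the forwarder side is again an outputs.conf stanza. A hedged sketch (the server name, port, and certificate paths are placeholders; the actual certificates would come from the app or add-on you distribute):

[tcpout:central_indexer]
server = splunk-central.example.com:9997
sslCertPath = $SPLUNK_HOME/etc/auth/client.pem
sslRootCAPath = $SPLUNK_HOME/etc/auth/cacert.pem
sslPassword = password
sslVerifyServerCert = true

The receiving indexer needs a matching SSL input, configured with an [splunktcp-ssl:<port>] stanza in its inputs.conf.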

At Splunk we use a multi-tiered approach similar to the one you describe, and so do our customers; the site-level collector is what we call an intermediate forwarder.
