Getting Data In

Heavy Forwarder Forwarding Question

Engager

I am a Splunk novice and have created a Splunk indexer cluster in a Windows environment. I have two heavy forwarders gathering event log data from machines in each heavy forwarder's specific subnet. When I log onto either indexer cluster member or the search head, I can see that event log data from both heavy forwarders is being collected into the Main index – good so far.

Now, on the Master Node, I updated the indexes.conf file (located at \etc\master-apps\_cluster\local) and created two new indexes – one named Cat and one named Dog. After distributing the configuration bundle, both indexers in the cluster now show the Cat and Dog indexes – this part is good too.

I cannot for the life of me figure out how to get one heavy forwarder to send all of the event log data it collects to the Cat index instead of the Main index, and the other heavy forwarder to send its data to Dog instead of Main. Can someone help me? I appreciate any assistance. Thanks!


Re: Heavy Forwarder Forwarding Question

Splunk Employee

On the heavy forwarder, modify the inputs.conf file and set "index =" in each data source's stanza to define which index (Cat, Dog, Main, etc.) it should go to:

http://docs.splunk.com/Documentation/Splunk/latest/Admin/Inputsconf

The inputs.conf spec documentation linked above has great examples of this and further information.
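As a concrete illustration, an inputs.conf stanza on a heavy forwarder collecting a local Windows event log might look like this (a sketch only – the channel and index name are this thread's examples; adjust to your own):

```ini
# Hypothetical inputs.conf stanza on the heavy forwarder.
# [WinEventLog://Security] collects the local Security event log;
# "index = Cat" routes those events to the Cat index instead of Main.
[WinEventLog://Security]
disabled = 0
index = Cat
```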



Re: Heavy Forwarder Forwarding Question

Engager

PDAIGLE came up with the correct answer in one of the comments below!


Re: Heavy Forwarder Forwarding Question

Ultra Champion

Something like this should work in inputs.conf:

[tcp://<host>:<port>]
connection_host = DNS
disabled = 0
sourcetype = <sourcetype>
source = <source>
index = <index name>

Re: Heavy Forwarder Forwarding Question

Engager

Thank you for the responses! If I am to understand this correctly, I need to have a stanza for every host that is forwarding its event logs to the heavy forwarder? If I have 500 hosts sending event logs to one heavy forwarder, I will need 500 stanzas in that heavy forwarder's inputs.conf file? This means that every time a machine is added to that subnet and receives the universal forwarder via SCCM, I will need to go to the heavy forwarder and update the inputs.conf file? I hope that this is wrong.

I was able to edit the inputs.conf file to import perfmon data into the Dog and Cat indexes; however, I was not able to get the event logs forwarded to the heavy forwarders from several hundred machines into the proper index (this data still shows up in the Main index). I wish there were simply a place in the inputs.conf file where I could change this default from the Main index, say Index = Dog or Cat, to get the logs to it.

I tried the following inputs.conf file to get the event logs forwarded to the Dog index instead of Main, based on ddrillic's response.

[default]
host = SplunkHF-Dog01

[tcp://SplunkHF-Dog01:9997]
connection_host = DNS
disabled = 0
sourcetype = WinEventLog:Security
source = WinEventLog:Security
index = Dog

What am I doing wrong? What else can I try?


Re: Heavy Forwarder Forwarding Question

Ultra Champion

No, you just need to set index = for all your existing inputs. There is no need to define whole new inputs.

Or, if you want to apply it to all your inputs, you can specify it under the [default] stanza, making sure there are no explicit index = main settings elsewhere.
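For example (a sketch; note that a per-stanza index = setting still overrides the default):

```ini
# inputs.conf on the heavy forwarder: route every input to the Dog
# index unless an individual input stanza overrides it.
[default]
index = Dog
```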


Re: Heavy Forwarder Forwarding Question

Engager

FrankVl, I appreciate the response! I added index = Dog to my default stanza and the logs are still going to Main (and I did reboot the heavy forwarder as well as the members of the indexer cluster). Let me make sure that I have everything correct...

The inputs.conf file I am editing is in $SPLUNK_HOME/etc/system/local.

The entire inputs.conf file is:
[default]
host = SplunkHF-Dog01
index = Dog

The original inputs.conf file that was there was just missing the index line.

No other settings have been changed in the heavy forwarder's conf files. Should I update another conf file? Since this conf file is in the local folder, shouldn't its settings take precedence over any other inputs.conf files? Thanks for the help!!


Re: Heavy Forwarder Forwarding Question

Engager

I know that I have to put index = Dog somewhere in an inputs.conf file, but where? I added it to the [default] stanza of my inputs.conf file in $SPLUNK_HOME/etc/system/local and rebooted, to no avail. What this boils down to is: how do I change the heavy forwarder's default index from Main to Dog?


Re: Heavy Forwarder Forwarding Question

Ultra Champion

Can you share the inputs.conf for the actual inputs that you configured? It could be that those contain an explicit "index = main" setting.

If you don't know where to find those inputs.conf files, try the following on your heavy forwarder from $SPLUNK_HOME/bin/:

./splunk cmd btool inputs list --debug

Then share the output of that command here (mask any sensitive information if needed).
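To narrow that output down to just the index assignments, the btool output can be piped through a filter. A sketch for the Windows environment described in this thread (assumes %SPLUNK_HOME% is set; on Linux, substitute grep -i for findstr /i):

```shell
cd %SPLUNK_HOME%\bin
splunk cmd btool inputs list --debug | findstr /i "index"
```

The --debug flag makes btool print which file each setting comes from, so any stray index = main will show its source path.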


Re: Heavy Forwarder Forwarding Question

Engager

FrankVl, I truly appreciate your help! Unfortunately, my Splunk system is completely disconnected from the Internet; however, I have run the tool.

Using the command you gave, I compared the output from a similar heavy forwarder that has not been set to point to Dog against the heavy forwarder that points to Dog. Every place that shows index = Dog on the one heavy forwarder shows index = default on the other, yet both are dumping all of their contents into Main. In fact, on the heavy forwarder pointing to Dog, there is no index setting pointing to Main or default. I even rebooted the heavy forwarder again and still no luck.

Do you have any other ideas? In the meantime, I can start the process of getting the command's output over to the Internet. Again, thank you!
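One detail worth noting (an assumption about this setup, not something confirmed in the thread): event log data arriving from universal forwarders over splunktcp is already "cooked," so an index = setting in the receiving heavy forwarder's inputs.conf does not re-route it; the index chosen on the universal forwarder's own inputs.conf sticks. Two common approaches are to set index = Dog in the inputs.conf deployed to the universal forwarders themselves, or to rewrite the index key on the heavy forwarder with props.conf and transforms.conf, roughly like this (the stanza name route_to_dog is a placeholder):

```ini
# props.conf on the heavy forwarder: apply the rewrite to this sourcetype
[WinEventLog:Security]
TRANSFORMS-routedog = route_to_dog

# transforms.conf on the heavy forwarder: match every event and
# overwrite its destination index with "Dog"
[route_to_dog]
REGEX = .
DEST_KEY = _MetaData:Index
FORMAT = Dog
```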
