We have a sample local ".txt" log file stored on the Heavy Forwarder, in its /tmp/ folder, that we want to analyse.
For this purpose, a sourcetype has been configured on the Heavy Forwarder to parse the log as we wish.
All of this was set up from the web interface.
Initially, we made the mistake of creating the index on the Heavy Forwarder so we could assign it from the "Input Settings" step of the "Add Data" menu. We later learned that this is not the right approach.
So, we created an index named "test" on our cluster master, and it was replicated correctly to the two peer indexers. The index now exists, but it contains no data, and it does not appear on the Search Head.
Unfortunately, when assigning the destination index from the "Add Data" menu of the Heavy Forwarder, the "test" index that was created on the indexers does not appear in the list.
In addition, even when the index was created on both the indexers and the Heavy Forwarder, the events did not reach the indexers after selecting the "test" index. In that case the index did appear in the HF menu, I assume because it had been created locally there.
Thanks for your answer. Yes, sorry, I'll try to make it more straightforward. I think the first question should be:
- What index should I choose in the "Add Data" menu of the HF (the screenshot)? The index "test" does not appear there, even though it is created on the indexers. By the way, the same happens with other indexes.
If the source is on the HF, then select the index on the indexers where you want to send the data. If the source is on some UF (somewhere other than this HF), then you shouldn't select any index here. In most cases you should set the index in inputs.conf on the source system only.
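As a sketch, setting the index in inputs.conf on the source system might look like the following (the file path and sourcetype name are placeholders for whatever you actually configured):

```ini
# inputs.conf on the source system (here, the HF)
[monitor:///tmp/sample.txt]
index = test
sourcetype = my_custom_sourcetype
disabled = false
```

This way the destination index travels with the event metadata, and no index selection in the GUI is needed.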
When you are using the GUI on the HF, you must first add that index via the HF's GUI (Settings -> Indexes -> New Index, or similar). When you define an index on an indexer, it exists only on that indexer (or on all peers, if it was pushed from the cluster master). You must copy the definition (via indexes.conf, or by creating it with the GUI) manually to every other node where you want it to be selectable in the GUI.
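A minimal indexes.conf stanza you could copy to the HF (and SH) so the index name shows up in their GUIs might look like this; the path settings are the usual required attributes, though on a forwarding-only HF no events should actually land in them:

```ini
# indexes.conf on the HF / SH, so "test" appears in the GUI
[test]
homePath   = $SPLUNK_DB/test/db
coldPath   = $SPLUNK_DB/test/colddb
thawedPath = $SPLUNK_DB/test/thaweddb
```

With forwarding configured in outputs.conf and local indexing left disabled, this definition only makes the name selectable; the events themselves are still sent to the indexers.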
The source is a file on the HF, where it has been parsed. My goal is to send those parsed events to an index located on the indexers.
I believe you mean creating the same index locally on the HF. Correct me if I'm wrong, but wouldn't that store the data locally on the HF rather than sending it on to the indexers?
See my previous answer for how to send all events to the indexers. With that in place, you can create the index definition on your HF without it storing events locally. As I said, this is best practice in a distributed Splunk environment.
I'm not 100% sure what your question is 😉
In real life, the only place where those indexes must be configured is the indexer(s). Everywhere else, the definitions are just for your information and for selecting targets in the GUI; in most cases you don't need them at all, especially when you do your configuration with files. If you are using only the GUI, it's easiest to create/copy the definitions from the IDX side to the SH and HF side as well, so that you can select the index names via the GUI when needed.
In a distributed environment, you should configure all instances other than the indexers to send all logs/events to the indexers. Here are the instructions:
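As a rough sketch of what that forwarding configuration looks like on the HF, an outputs.conf along these lines sends everything to the indexing tier (the group name, hostnames, and port are placeholders for your environment; 9997 is the conventional receiving port):

```ini
# outputs.conf on the HF: forward all events to the indexers
[tcpout]
defaultGroup = primary_indexers

[tcpout:primary_indexers]
server = idx1.example.com:9997, idx2.example.com:9997

# Make sure the HF does not also index events locally
[indexAndForward]
index = false
```

The indexers must of course be configured to listen on the matching receiving port.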
After that all those events should be on your indexers.