Deployment Architecture

Why is the collect command not working?

ramarcsight
Explorer

Hello everyone,
I have one search head (SH) and two indexers (IDX). I run a search on the SH and, using the "collect" command, push the results to index=sql, which is available on IDX1 and IDX2 (load balanced based on availability).
It was pushing results until May 28, and after that it stopped.

I ran the search separately and it still does not push the results,
but it says "Successfully wrote file to '/opt/splunk/var/spool/splunk/63g3hs73g37sh_events.stash_new'."

So what is happening?

Is there any way I can see whether the data is getting pushed to IDX1 or IDX2?

Thank you


jkat54
SplunkTrust

The load balancing sounds awkward too. Are you using a load balancer in front of your indexers? Splunk uses what is known as software load balancing: a "poor man's" (simpler, cheaper) alternative to an F5 or a dedicated NLB. Splunk sends to one indexer for x amount of time, then to the other for the same amount of time, and keeps switching back and forth.
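For reference, that switching behavior lives in outputs.conf on the forwarding instance. A minimal sketch; the server names reuse the ones from this thread, and the 30-second value is the documented default, not something verified here:

```
# outputs.conf -- software load balancing between two indexers
[tcpout:indexer_group]
server = splunk-site1-indexer01:9997, splunk-site2-indexer01:9997
# switch to the next indexer in the list every 30 seconds
autoLBFrequency = 30
```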


Sahr_Lebbie
Path Finder

@jkat54 is right. I recently ran into this issue trying to summarize data from a search peer: I needed to enable forwarding. In my case, I just pushed our forwarding app out from the deployment server (DS). That allowed the stash file that collect creates to be forwarded to the indexing layer. Now collect works wonderfully.
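For anyone reading later: the "forwarding app" here is essentially just an outputs.conf delivered by the deployment server. A minimal sketch, with an app name and indexer hostnames of my own invention:

```
# deployment-apps/sh_forwarding/local/outputs.conf  (hypothetical app name)
[tcpout]
defaultGroup = indexer_tier

[tcpout:indexer_tier]
server = idx1.example.com:9997, idx2.example.com:9997
```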

@ramarcsight hopefully you have yours solved now

🙂


jkat54
SplunkTrust

Two of the most likely issues would be:
1. Forwarding to the indexers has been disabled, or otherwise circumvented.
2. The inputs.conf stanza for the stash input has been malformed or overridden.

Another possibility is that the OS is giving the files the splunk service user creates the wrong permissions.
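Each of those can be checked on the search head; a sketch, assuming a default /opt/splunk install (btool prints the effective merged config):

```
# 1. is forwarding still configured and enabled?
/opt/splunk/bin/splunk btool outputs list --debug

# 2. is the spool (stash) input intact?
/opt/splunk/bin/splunk btool inputs list --debug | grep -i spool

# 3. do files in the spool directory have sane ownership/permissions?
ls -l /opt/splunk/var/spool/splunk/
```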


jkat54
SplunkTrust

Those are the Splunk indexers to which data generated on your search head is forwarded. They are where the index named sql needs to be... but it sounds to me like you may be confusing things a bit.

When you use | collect index=sql

Splunk writes a "stash" file, which is then read by your search head and indexed. Since your search head is forwarding, though, it doesn't index it locally; instead it sends it to those two Splunk servers on port 9997. Assuming you have an index named sql on those indexers, you should be able to find the data you've collected by searching for "index=sql".
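That flow can also be verified from the search bar; a sketch (the # lines are annotations, not SPL, and "sql" is simply the index from this thread):

```
# did the forwarder on the search head report errors?
index=_internal sourcetype=splunkd component=TcpOutputProc

# which indexer actually received the collected events?
index=sql | stats count by splunk_server
```

The second search directly answers "did it go to IDX1 or IDX2": splunk_server names the indexer each event was retrieved from.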


ramarcsight
Explorer

Yes, ping gave me results.
The hosts splunk-site1-indexer01:9997 and splunk-site2-indexer01:9997 are 1.1.1.1 and 2.2.2.2, which have index=sql. So the question is: why is the collect command not pushing results?


ramarcsight
Explorer

Yes, I understood that,
but I created index=sql on the 1.1.1.1 and 2.2.2.2 IP addresses.

As far as I know, they are not named
splunk-site1-indexer01:9997
splunk-site2-indexer01:9997

First, I want to know what these indexers are:
splunk-site1-indexer01:9997
splunk-site2-indexer01:9997

Is there any way I can log in to those indexers?

My whole point is: how do I check that splunk-site1-indexer01:9997 and splunk-site2-indexer01:9997 are 1.1.1.1 and 2.2.2.2?
Please help me.


jkat54
SplunkTrust

You can go to a command prompt and run this command:

nslookup splunk-site1-indexer01

This will give you the IP address if those names are in DNS. If it says "non-existent domain", then try this command at the command prompt:

ping splunk-site2-indexer01

It will give you the last IP your system has for that server.

Do these IPs match the same servers that are in your search peers?

If not, you’re forwarding the stashed data to indexers that you are not searching.


ramarcsight
Explorer

Both of them give no results:

1. ping: "request could not find the host"
2. nslookup: "non-existent domain"


jkat54
SplunkTrust

Can you paste the results of the ping command?

Do you have a Splunk admin who can help you?

I'm uncomfortable recommending any changes at this point, because you don't sound like the person who should be making changes.

I highly recommend you contact Splunk Support.


ramarcsight
Explorer

ping is giving me IP addresses, sir.
In /etc/hosts on Linux, the IP addresses and hostnames are mapped.
I still cannot understand why the collect command is not working.

Thank you very much for your patience with me.


jkat54
SplunkTrust

The collect command sends the stash file to the indexers shown in your data-forwarding list (the servers with port 9997 that you shared), and you're telling me that your search head isn't configured to search the indexers you're sending the data to.


jkat54
SplunkTrust

So you can either add those other indexers as search peers, or change your forwarding to send to IDX1 and IDX2.
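Both options can be done from the search head CLI; a sketch with placeholder credentials and the IPs from this thread:

```
# option 1: add the indexers you forward to as search peers
/opt/splunk/bin/splunk add search-server https://1.1.1.1:8089 -remoteUsername admin -remotePassword 'changeme'

# option 2: repoint forwarding at the peers you already search
/opt/splunk/bin/splunk add forward-server 1.1.1.1:9997
/opt/splunk/bin/splunk remove forward-server splunk-site1-indexer01:9997
```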


ramarcsight
Explorer

The SH is forwarding to IDX1 and IDX2, but the collect command is not working.
Sorry for the late reply, but I think this is out of my scope.



ramarcsight
Explorer

I went to Settings
-- Forwarding and receiving
-- Configure forwarding
-- App context = All, Owner = Any

I get this:
splunk-site1-indexer01:9997  enabled
splunk-site2-indexer01:9997  enabled -- what are these??

So should I add my IDX1 (sql01) here?


jkat54
SplunkTrust

Yes, your indexers should be listed on that page.

If you don't recognize those two servers, then it's possible someone changed your config to point to these two new servers.

You'll want to check the search peers in your distributed search setup too. If your search head isn't searching the indexers where the data is going... then obviously the search will fail.

Just thought of another common cause: does your role have access to the index? Is your role inheriting another role that has search filters which exclude the index?


ramarcsight
Explorer
1. I can search the indexers.
2. If I run index=sql, I can see data until May 28; only after that is it not pushing data to IDX1 or IDX2 (both of them contain index=sql).
3. In the distributed search setup I can see that IDX1 and IDX2 are up and successful.
4. There is no problem with the role; this has worked for me many times, sir.

The question lies in how to know that these two indexers under Forwarding and receiving, splunk-site1-indexer01:9997 and splunk-site2-indexer01:9997, are my IDX1 (1.1.1.1) and IDX2 (2.2.2.2).


ramarcsight
Explorer

And in the distributed search setup I can see:

Splunk instance name    Peer URI
IDX1                    1.1.1.1
IDX2                    2.2.2.2

But it does not show:

Splunk instance name           Peer URI
splunk-site1-indexer01:9997    2.2.2.2
splunk-site2-indexer01:9997    1.1.1.1

Does that mean both are different?
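One way to settle this from the search head itself is to resolve each forwarding hostname (getent consults both DNS and /etc/hosts) and compare the result with the peer URIs. A minimal sketch; the helper name resolve_ipv4 is mine:

```shell
#!/bin/sh
# Print the first IPv4 address a name resolves to (via DNS or /etc/hosts),
# or nothing at all if the name does not resolve.
resolve_ipv4() {
    getent ahostsv4 "$1" | awk '{print $1; exit}'
}

# If these print 1.1.1.1 and 2.2.2.2, the forwarding targets are the
# same machines as the search peers IDX1 and IDX2; otherwise they differ.
resolve_ipv4 splunk-site1-indexer01
resolve_ipv4 splunk-site2-indexer01
```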
