Hi,
Can someone help me with this error message?
Could it be caused by this file and its size? Can I delete it?
Replicating 265 MB should not be a problem.
Can you check the connectivity of the search peers on the search head, under Settings -> Distributed search -> Search peers?
Is the status healthy?
Hi,
It looks fine, but I have seen it several times in the "Failed" state.
It even flips intermittently between "Successful" and "Failed".
@thambisetty what do you think is the cause?
Sorry for the insistence, could someone give me recommendations? 😭
Does this happen regularly?
What kind of environment do you have?
Do you have the MC (Monitoring Console) in use, so you could check what happened there?
r. Ismo
Does this happen regularly?
Yes
What kind of environment do you have?
2 Search Heads
2 Indexers
2 Heavy Forwarders
I have access to the Monitoring Console, but there are several menus and submenus. Which option should I check, exactly?
From what I have investigated, the problem occurs because the file called the bundle is very large, which causes the error message. My question now is: can this bundle file be slimmed down?
Hi
maybe these previous answers can help you:
Basically this could mean that you have e.g. a massive CSV lookup file that you are trying to use in your search, and the timeouts arise because of that.
r. Ismo
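If an oversized lookup turns out to be the culprit, one common remedy is to exclude it from the knowledge bundle via a replication blacklist in distsearch.conf on the search head. A minimal sketch, assuming a hypothetical app `myapp` with an oversized lookup `big_lookup.csv` (the stanza key `excludebiglookup` is an arbitrary name, and the value is a regex matched against paths relative to $SPLUNK_HOME/etc):

```ini
# distsearch.conf on the search head (e.g. $SPLUNK_HOME/etc/system/local/)
# Excludes the matched file from the knowledge bundle replicated to indexers.
[replicationBlacklist]
# Hypothetical app/lookup names -- replace with your actual oversized file.
excludebiglookup = apps[/\\]myapp[/\\]lookups[/\\]big_lookup\.csv
```

Note that a blacklisted lookup is no longer available to searches running on the indexers, so this only fits lookups that are not needed there. Something like `find $SPLUNK_HOME/etc/apps -name '*.csv' -size +50M` can help spot oversized candidates. Restart the search head (or wait for the next bundle replication) for the change to apply.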