map and sendmail commands in search head clustering

yutaka1005
Builder

In my environment, I have built a search head cluster consisting of three search heads and one deployer.

In addition, I have an alert that uses the "map" and "sendmail" commands to send an individual email for logs that meet certain conditions, roughly as sketched below.
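For reference, a minimal sketch of the kind of search the alert runs (the index, sourcetype, and address below are placeholders, not my actual values):

    index=main sourcetype=app_logs ERROR
    | stats count by host
    | map maxsearches=10 search="| makeresults | sendemail to=\"ops@example.com\" subject=\"Alert for $host$\" message=\"$host$ logged $count$ error events\""

Since map dispatches one subsearch per input row, a single result row should produce exactly one email.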

However, when I checked this morning, only one alert had fired, and even though the result was a single row, two emails had been sent.

When I checked the internal logs, the message below appeared in the internal logs of two of the search heads at almost exactly the same time (a gap of about 0.4 seconds).
"INFO sendemail:128 - Sending email..."

From this, I assume that the same search ran on both search heads.

Is there a workaround for this behavior?
Also, are the "sendmail" and "map" commands not recommended in a search head cluster?
Could that be the cause?

1 Solution

HiroshiSatoh
Champion

I think both the map command and the sendmail command work fine in a clustered environment. Could the cause be duplicate job launches or duplicated data?
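As one way to check for duplicate job launches, you can look at the scheduler log (the saved search name below is a placeholder):

    index=_internal sourcetype=scheduler savedsearch_name="your_alert_name"
    | stats count by host, status

If the same scheduled run shows up on more than one member, the alert is being dispatched twice.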


tkomatsubara_sp
Splunk Employee

You also need to check, on the mail server side (for example, in syslog), whether the requests are actually arriving.

yutaka1005
Builder

Thank you for your answer.

It turned out the cause was that the alert was running in duplicate…
I saw it as soon as I checked the jobs.
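For reference, one way to list the jobs from SPL (the alert name is a placeholder; the Activity > Jobs page in the UI shows the same information):

    | rest /services/search/jobs
    | search label="your_alert_name"
    | table splunk_server sid label dispatchState

Note that rest queries the local instance and its search peers by default, so in a cluster you may need to run this on each member.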

