Getting Data In

How can I create a search and alert if a port is down or not listening?

sekhar463
Path Finder

I have events for port 443 listening.

How can I create a search and alert if the port is down or not listening?

Below are some sample events:

 

10/10/22 10:35:40.312 AM
2022-10-10 11:35:40.312 transport=TCP dest_ip=[::] dest_port=443 pid=4 appname=System
host = GBLONICORE01V
source = C:\Program Files\SplunkUniversalForwarder\etc\apps\Splunk_TA_windows_850\bin\win_listening_ports.bat
sourcetype = Script:ListeningPorts

10/10/22 10:35:40.312 AM
2022-10-10 11:35:40.312 transport=TCP dest_ip=0.0.0.0 dest_port=443 pid=4 appname=System
host = GBLONICORE01V
source = C:\Program Files\SplunkUniversalForwarder\etc\apps\Splunk_TA_windows_850\bin\win_listening_ports.bat
sourcetype = Script:ListeningPorts

10/10/22 9:35:40.006 AM
2022-10-10 10:35:40.006 transport=TCP dest_ip=[::] dest_port=443 pid=4 appname=System
host = GBLONICORE01V
source = C:\Program Files\SplunkUniversalForwarder\etc\apps\Splunk_TA_windows_850\bin\win_listening_ports.bat
sourcetype = Script:ListeningPorts

10/10/22 9:35:40.006 AM
2022-10-10 10:35:40.006 transport=TCP dest_ip=0.0.0.0 dest_port=443 pid=4 appname=System
host = GBLONICORE01V
source = C:\Program Files\SplunkUniversalForwarder\etc\apps\Splunk_TA_windows_850\bin\win_listening_ports.bat
sourcetype = Script:ListeningPorts

 


gcusello
SplunkTrust

Hi @sekhar463,

at first you should know the list of hosts to monitor and save this list as a lookup called e.g. perimeter.csv, containing at least one column: host.
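
For example, a minimal sketch of perimeter.csv (the hostnames below are placeholders, not your real servers):

host
server01
server02
server03

Upload it in Splunk Web under Settings > Lookups > Lookup table files so that inputlookup can read it.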

then you could run a search like the following (assuming the above data are indexed in the windows index with sourcetype=ListeningPort):

index=windows sourcetype=ListeningPort dest_port=443
| eval host=lower(host)
| stats count BY host
| append [ | inputlookup perimeter.csv | eval host=lower(host), count=0 | fields host count ]
| stats sum(count) AS total BY host
| where total=0

If you want to monitor more ports, you can add another column to the lookup with the port(s) to monitor for each server and modify the search as shown below:

index=windows sourcetype=ListeningPort [ | inputlookup perimeter.csv | dedup dest_port | fields dest_port ]
| eval host=lower(host)
| stats count BY host dest_port
| append [ | inputlookup perimeter.csv | eval host=lower(host), count=0 | fields host dest_port count ]
| stats sum(count) AS total BY host dest_port
| where total=0
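
In that case perimeter.csv would have one row per host/port pair to watch; a minimal sketch (hostnames and ports are placeholders):

host,dest_port
server01,443
server02,443
server02,8443

The subsearch on the first line pulls the list of ports from the same lookup, so the base search only reads events for the ports you actually monitor.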

Ciao.

Giuseppe


sekhar463
Path Finder

Hi @gcusello, thanks for your response.

For now I want to check 3 hosts and only monitor port 443 for them.

We are using the main index for this port data.

I have now created a lookup for these hosts, with only the host column.

Is there anything else I need to update?


gcusello
SplunkTrust

Hi @sekhar463,

you could create an alert for just those 3 hosts, but I suggest using the lookup, to have a more flexible solution.

so you could run something like this:

index=main sourcetype=ListeningPort dest_port=443
| eval host=lower(host)
| stats count BY host
| append [ | inputlookup perimeter.csv | eval host=lower(host), count=0 | fields host count ]
| stats sum(count) AS total BY host
| where total=0

if you don't want to use the lookup, you could use:

index=main sourcetype=ListeningPort dest_port=443
| eval host=lower(host)
| stats count BY host
| append [ | makeresults | eval host="your_host1", count=0 | fields host count ]
| append [ | makeresults | eval host="your_host2", count=0 | fields host count ]
| append [ | makeresults | eval host="your_host3", count=0 | fields host count ]
| stats sum(count) AS total BY host
| where total=0
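
If you prefer a single append instead of three, an equivalent sketch is (your_host1, your_host2 and your_host3 are still placeholders for your real hostnames):

| append [ | makeresults count=3 | streamstats count AS n | eval host=case(n=1, "your_host1", n=2, "your_host2", n=3, "your_host3"), count=0 | fields host count ]

Either way, the appended count=0 rows guarantee that every monitored host reaches the final stats, so a host with no dest_port=443 events ends up with total=0 and is returned.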

Ciao.

Giuseppe


sekhar463
Path Finder
index=main sourcetype=ListeningPort dest_port=443
| eval host=lower(host)
| stats count BY host
| append [ | makeresults | eval host="your_host1", count=0 | fields host count ]
| append [ | makeresults | eval host="your_host2", count=0 | fields host count ]
| append [ | makeresults | eval host="your_host3", count=0 | fields host count ]
| stats sum(count) AS total BY host
| where total=0

Will this show results when the port is not listening or the port is down?


gcusello
SplunkTrust

Hi @sekhar463,

this search displays one or more of these three hosts when there isn't any log containing "dest_port=443" for them.

Ciao.

Giuseppe


sekhar463
Path Finder

What trigger condition can I use for this search to alert?

 

index=main sourcetype=Script:ListeningPorts dest_port=443
| eval host=lower(host)
| stats count BY host
| append [ | makeresults | eval host="gblonicore01vd", count=0 | fields host count ]
| append [ | makeresults | eval host="gbwokicore01vd", count=0 | fields host count ]
| append [ | makeresults | eval host="gblonicore01vt", count=0 | fields host count ]
| append [ | makeresults | eval host="gbwokicore01vt", count=0 | fields host count ]
| append [ | makeresults | eval host="gblonicore01v", count=0 | fields host count ]
| append [ | makeresults | eval host="gbwokicore01v", count=0 | fields host count ]
| stats sum(count) AS total BY host
| where total=0

 


gcusello
SplunkTrust

Hi @sekhar463,

no results from the above search means that all the hosts have logs for that port.

If one host appears in the list, it means that there isn't any log with that port for that host.

So the trigger condition for the alert is: number of results greater than 0 (each result is a host whose count for that port is 0).
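
As a sketch of the alert definition, assuming you schedule it every 15 minutes with an email action (the stanza name, schedule, lookup and recipient below are examples, not required values), the savedsearches.conf entry could look roughly like this:

[Port 443 not listening]
search = index=main sourcetype=Script:ListeningPorts dest_port=443 \
| eval host=lower(host) \
| stats count BY host \
| append [ | inputlookup perimeter.csv | eval host=lower(host), count=0 | fields host count ] \
| stats sum(count) AS total BY host \
| where total=0
enableSched = 1
cron_schedule = */15 * * * *
dispatch.earliest_time = -15m
dispatch.latest_time = now
counttype = number of events
relation = greater than
quantity = 0
alert.track = 1
action.email = 1
action.email.to = you@example.com

In the UI this corresponds to the trigger condition "Number of Results" is greater than 0.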

Ciao.

Giuseppe


sekhar463
Path Finder

So, as per the above: based on the events I can see, results will be there as long as port 443 is listening.

If there are no events for 443 on a host, the alert will fire, because we keep the hosts where the count is 0.

Is my understanding correct?


gcusello
SplunkTrust

Hi @sekhar463,

yes, that's correct.

Good for you, see you next time!

Ciao and happy splunking

Giuseppe

P.S.: Karma Points are appreciated 😉
