
Splunk Web went down

Splunk_U
Path Finder

Something weird happened on the server: splunkd was running, but Splunk Web was down. A restart brought everything back to normal, but I need to know why Splunk Web went down. I found nothing suspicious in splunkd.log or web_service.log.

Is there any way to find out why Splunk Web was down?

1 Solution

yannK
Splunk Employee

Check the _internal index for the web_service.log events. Do you see anything just before the stop?

Otherwise, if you are on Linux, check /var/log/messages for any "Out Of Memory / OOM" events; the system can kill a process.
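For the /var/log/messages check, a minimal sketch (assuming a RHEL/CentOS-style syslog path; adjust LOGFILE for your distro):

```shell
# Scan the system log for kernel OOM-killer activity around the time
# splunkweb died. /var/log/messages is the RHEL/CentOS path; on
# Debian/Ubuntu check /var/log/syslog instead.
LOGFILE="${LOGFILE:-/var/log/messages}"
grep -iE 'oom-killer|out of memory' "$LOGFILE" 2>/dev/null || true
```

A hit here means the kernel, not Splunk, terminated the process, which is why nothing shows up in splunkd.log or web_service.log.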


Splunk_U
Path Finder

Yeah, I know that... but SOS was not installed on the Splunk server when the issue happened. That's why I'm asking: by looking at event_count, total_run_time, etc., can we understand the memory consumption, or at least the relation between memory consumption and event_count or total_run_time?


yannK
Splunk Employee

The SOS app and its ps_sos scripts will show you the CPU/memory usage of expensive searches:
http://splunk-base.splunk.com/apps/29008/sos-splunk-on-splunk
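Even without SOS, the _audit index records a completion event for every search, carrying the very fields mentioned above (total_run_time, event_count). A rough sketch of a search to surface the heaviest recent searches (the field names are standard for audit events, but verify on your version):

```
index=_audit action=search info=completed
| table user, total_run_time, event_count, search
| sort - total_run_time
```

Note that total_run_time and event_count only correlate loosely with memory: a search returning few events can still be memory-hungry, which is why SOS is suggested for actual per-process figures.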


Splunk_U
Path Finder

If my search has event_count=20000, is it eating more memory than one with event_count=100?


Splunk_U
Path Finder

I have understood that the root cause is out of memory, but is there any way to check the memory consumption of a search in Splunk?


Splunk_U
Path Finder

I have checked the messages and found this:
sisidsdaemon invoked oom-killer: gfp_mask=0x201da, order=0, oom_adj=0, oom_score_adj=0
and after that:
Out of memory: Kill process 3936 (python) score 730 or sacrifice child
So it looks like Splunk Web went down due to out of memory.
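A "Kill process ... (python)" victim is consistent with splunkweb, which runs as a Python process. To keep an eye on its footprint going forward, a generic ps sketch (not a Splunk-specific tool) can be run periodically:

```shell
# Show resident (RSS) and virtual (VSZ) memory, in KB, for python
# processes; on a Splunk server splunkweb appears here. The [p]ython
# bracket trick keeps this pipeline's own awk process out of the match.
ps -eo pid,rss,vsz,comm,args | awk 'NR==1 || /[p]ython/'
```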
