All Posts



I want it as a dashboard panel. It works as-is in the search Visualization tab.
Hello @richgalloway, thank you for your support. However, I would like to demonstrate that my NDR solution uses a centralized server called "Brain" to gather logs from network sensors. To achieve this, the best approach would seem to be to establish a channel between the heavy forwarder and this NDR Brain. What are your recommendations for setting that up?
Hello, I just created a new index on the cluster master for a newly integrated log source, but I cannot find this new index on the heavy forwarders when configuring it for a new data input. Any recommendations for such a situation?
Hi, here https://github.com/splunk/splunk-ansible is one way to use Ansible to install a Splunk environment. With Ansible you can use both approaches to add nodes to a cluster (edit the conf files and restart the nodes, and/or use CLI commands). Which way is better is a different story that depends on your needs and the software you use. Since you need to add several nodes now (and probably more later), scripting is definitely the right way to manage that. Personally, I install even individual nodes with Ansible. And of course you should use e.g. git to store and version your configuration. r. Ismo
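As a hedged illustration of the git suggestion, a minimal sketch of putting a config directory under version control (the demo path and conf contents here are placeholders; on a real node you would run this in $SPLUNK_HOME/etc):

```shell
# Put a (demo) Splunk config directory under version control with git.
mkdir -p /tmp/splunk-etc-demo/system/local
cd /tmp/splunk-etc-demo
printf '[general]\nserverName = idx01\n' > system/local/server.conf

git init -q
git add system/local
git -c user.email=demo@example.com -c user.name=demo \
    commit -q -m "Baseline Splunk configuration"

# Each later config change becomes a reviewable, revertable commit.
git log --oneline
```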
Thank you for your reply. I am using both UF and HF.
Hi, it's just as @gcusello said. You cannot create an HA environment without an indexer cluster. That needs a minimum of three nodes: a manager node and at least two peers. Here is the https://www.splunk.com/en_us/pdfs/tech-brief/splunk-validated-architectures.pdf document, which you should read and use as base instructions for how the different Splunk architectures work and what purposes they are targeted at. r. Ismo
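As a rough illustration of that minimum topology, the server.conf clustering stanzas look something like the sketch below (hostnames, port, and the key are placeholders, not values from this thread; older Splunk versions use mode = master and master_uri instead):

```
# server.conf on the manager node
[clustering]
mode = manager
replication_factor = 2
search_factor = 2
pass4SymmKey = <your_cluster_key>

# server.conf on each of the (at least two) peer indexers
[clustering]
mode = peer
manager_uri = https://cm.example.com:8089
pass4SymmKey = <your_cluster_key>
```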
Hi, beyond those examples from @richgalloway, I would suggest setting up your own development server (you can ask Splunk for a developer and/or dev/test license) and learning with that. That way you can do e.g. data onboarding much more easily without disturbing your production environment. Also create your apps etc. in this dev environment and then install the apps into Splunk Cloud. r. Ismo
If/when you have strictly followed those requirements and steps, it should work as expected without any further steps. How to test it is described under "Try out the visualization". And remember that when you change it, you usually must restart Splunk to get the new version into use!
Hi, I hope that you have created a separate app for those. If not, then it's time to create one! After that you can just copy and install this app onto the new host. If/when you have CLI access to the host (in development you always have), the easiest way to pack the app is on the CLI: splunk package app <your app>. Splunk then tells you where it put that package. You just need to copy that package and then install it on your other nodes. r. Ismo
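A hedged sketch of that round trip (the app name and package path are placeholders; the printed package location will differ on your system):

```
# On the dev host: package the app; Splunk prints the path of the .spl it creates
$SPLUNK_HOME/bin/splunk package app your_app

# Copy the package to the target host, then install it there:
$SPLUNK_HOME/bin/splunk install app /tmp/your_app.spl

# Restart so the new app is picked up
$SPLUNK_HOME/bin/splunk restart
```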
I built a custom visualisation. I want to move it to the dashboard as a panel. How do I do that?
My visualisation is built following this framework https://docs.splunk.com/Documentation/Splunk/9.1.2/AdvancedDev/CustomVizTutorial
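For reference, a custom visualization built with that framework can be embedded in a Simple XML dashboard with a <viz> element whose type is app_name.viz_name. A minimal hedged sketch (the app name, viz name, and search are hypothetical placeholders, not taken from this thread):

```xml
<dashboard>
  <row>
    <panel>
      <title>My custom visualization</title>
      <viz type="my_viz_app.my_viz">
        <search>
          <query>index=_internal | stats count BY sourcetype</query>
          <earliest>-24h</earliest>
          <latest>now</latest>
        </search>
      </viz>
    </panel>
  </row>
</dashboard>
```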
Hi, on RHEL 8.9 you should use this version: Enable boot-start on machines that run systemd. And before that you must change the ownership of all files Splunk uses to the user splunk (or whatever user you run Splunk as). r. Ismo
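A minimal sketch of those two steps, assuming Splunk is installed in /opt/splunk and runs as the user splunk (adjust both to your environment):

```
# Run as root. First hand ownership of the whole install to the splunk user:
chown -R splunk:splunk /opt/splunk

# Then enable systemd-managed boot-start for that user:
/opt/splunk/bin/splunk enable boot-start -systemd-managed 1 -user splunk
```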
Hi, when an indexer in a cluster goes down, the CM tries to spread its buckets across your remaining indexers. This means the surviving indexers divide among themselves the buckets that the failed indexer held. In a normal situation you shouldn't change SF & RF; instead you should fix the failed indexer and bring it back into the cluster. After that the cluster will rebalance the buckets across all available indexers. Basically, the indexer cluster is working as it should.

For that reason, though, you should have enough spare space on those peers to handle the situation where one indexer (or even more, depending on your RF & SF) is down for some time. The other option is that your peers freeze some buckets to keep enough space for normal operations; of course this requires that you are using volumes with a suitable maximum size limit. r. Ismo
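A hedged sketch of what such a size-capped volume looks like in indexes.conf (paths, the size value, and the index name are illustrative placeholders):

```
# indexes.conf -- cap total disk usage via a volume
[volume:primary]
path = /opt/splunk/var/lib/splunk
maxVolumeDataSizeMB = 500000

[main]
homePath = volume:primary/defaultdb/db
coldPath = volume:primary/defaultdb/colddb
thawedPath = $SPLUNK_DB/defaultdb/thaweddb
```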
Hi Team, I am trying the below query. It shows all servers as up; I stopped one server to test, and it does not show a Down status. Please find the query below:

index="_internal"
| eval host=lower(host)
| stats count BY host
| append [ | eval host=lower(host) ]
| eval status=if(total=0,"Down","up")
| table host status

Please let me know the exact query for that.
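For comparison, one common pattern for this kind of up/down check is to append a reference list of expected hosts and then aggregate. A hedged sketch (the lookup file expected_hosts.csv, with a host column, is hypothetical; the time window is arbitrary):

```
index=_internal earliest=-15m
| eval host=lower(host)
| stats count BY host
| append [| inputlookup expected_hosts.csv | eval host=lower(host), count=0]
| stats max(count) AS total BY host
| eval status=if(total=0, "Down", "Up")
| table host status
```

The idea is that hosts present only in the lookup keep total=0 and show as Down, while hosts that sent events in the window show as Up.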
Hi, in short: just select the wanted visualisation from the selection list and ensure that your data has the format that visualisation needs. If your wanted visualisation isn't there, then look at what you can find on Splunkbase. See more at https://docs.splunk.com/Documentation/Splunk/9.1.2/Viz/Visualizationreference . On Splunkbase there is https://splunkbase.splunk.com/app/1603 , which shows you, with examples, how to use the different visualisations. r. Ismo
I have a visualization in Splunk Search -> Visualization. I want this visualization as a Splunk dashboard panel. How do I do it?
Hi @syaseensplunk, it's always better to use sourcetype, also because I'm not sure that you can use the Kubernetes container; you can only use sourcetype, host, and source. Sorry if I ask you again: where did you put these conf files? Ciao. Giuseppe
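The three stanza types Giuseppe mentions look like this in props.conf (the stanza values and the transform name are hypothetical placeholders; `...` is props.conf wildcard syntax, matching any characters including path separators):

```
# props.conf -- match by sourcetype, host, or source
[my_sourcetype]
TRANSFORMS-example = my_transform

[host::k8s-node-*]
TRANSFORMS-example = my_transform

[source::/var/log/containers/...]
TRANSFORMS-example = my_transform
```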
There is none. However, I was able to make it work with the <source_type>. Any help is much appreciated!
When one of the indexers fails, I have a problem with the growth of buckets on the working indexer. If one of the indexers is not available, should I change the bucket policy? I currently have RF:2 SF:2
Trying to connect an independent Splunk Stream forwarder to the Stream forwarder on the Search Head:

2023-12-27 13:52:47 ERROR [140302542051072] (HTTPRequestSender.cpp:1459) stream.SplunkSenderHTTPEventCollector - (#1) Failing over to disk
2023-12-27 13:52:48 WARN [140302542051072] (HTTPRequestSender.cpp:717) stream.SplunkSenderHTTPEventCollector - (#2) Resetting blocked connection
2023-12-27 13:52:48 WARN [140302542051072] (HTTPRequestSender.cpp:717) stream.SplunkSenderHTTPEventCollector - (#3) Resetting blocked connection
2023-12-27 13:52:48 WARN [140302542051072] (HTTPRequestSender.cpp:1485) stream.SplunkSenderHTTPEventCollector - (#2) TCP connection failed: Operation canceled
2023-12-27 13:52:48 WARN [140302542051072] (HTTPRequestSender.cpp:1485) stream.SplunkSenderHTTPEventCollector - (#3) TCP connection failed: Operation canceled

With SSL enabled on the Search Head Stream forwarder: same error. With SSL disabled: same error. What could be the problem?

[Independent stream forwarder]
inputs.conf:
[streamfwd://streamfwd]
splunk_stream_app_location = http://192.168.0.111:8000/en-us/custom/splunk_app_stream/

streamfwd.conf:
[streamfwd]
port = 8889
ipAddr = 192.168.0.112

[Stream forwarder Search Head]
splunk_httpinput/local/input.conf:
[http://streamfwd]
disabled = 0
token = 5aa1c706-2bcd-4f90-857b-636c8afab1f5
index = streamfwd
indexes = streamfwd
sourcetype = stream

[http]
disabled = 0
port = 8088
enableSSL = 1

* OS for both the Search Head Stream forwarder and the independent stream forwarder: Linux (CentOS 7)
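Since those errors indicate the forwarder cannot reach the HTTP Event Collector, one hedged troubleshooting step is to test the HEC endpoint directly with curl from the forwarder host (replace <search_head_ip> with the host actually running the HTTP input; the token below is the one quoted in the post):

```
curl -k "https://<search_head_ip>:8088/services/collector/event" \
  -H "Authorization: Splunk 5aa1c706-2bcd-4f90-857b-636c8afab1f5" \
  -d '{"event": "hec connectivity test", "index": "streamfwd"}'

# A healthy endpoint responds with something like:
# {"text":"Success","code":0}
```

If this curl also fails, the problem is network/SSL/HEC configuration rather than the Stream app itself.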