All Posts
Hi @tokitamiki, I would recommend checking out the following page, which has some really useful information about the ports used and the connectivity between the servers: https://help.splunk.com/en/splunk-enterprise/administer/inherit-a-splunk-deployment/9.3/inherited-deployment-tasks/components-and-their-relationship-with-the-network

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing.
We are storing data in a Splunk lookup file on one of the forwarders. In our distributed Splunk architecture, this lookup data is not forwarded to the indexers or the search head, so it is not available for search or enrichment. How can we sync or transfer this lookup data from the forwarder to the search head (or indexers) so that it can be used across the distributed environment?
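One common pattern for this (a sketch only — the app name, index, file paths, and field names below are assumptions, not from your environment) is to index the CSV on the forwarder and rebuild the lookup on the search head with a scheduled search:

```
# inputs.conf on the forwarder (path and index are hypothetical)
[monitor://$SPLUNK_HOME/etc/apps/my_app/lookups/assets.csv]
sourcetype = asset_lookup_csv
index = lookup_sync
crcSalt = <SOURCE>

# props.conf on the forwarder, so the CSV header row is parsed into fields
[asset_lookup_csv]
INDEXED_EXTRACTIONS = csv
```

Then, on the search head, a scheduled search rewrites the lookup from the indexed events (the host/ip/owner columns are placeholders for whatever your CSV contains):

```
index=lookup_sync sourcetype=asset_lookup_csv
| dedup host
| table host ip owner
| outputlookup assets.csv
```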
Hello. I am building a Splunk Enterprise deployment: License Manager, Heavy Forwarder, Cluster Manager, Indexer, Search Head Cluster Deployer, Search Head, and Deployment Server. I want to know which ports each Splunk server uses to communicate with the others, e.g. License Manager to Heavy Forwarder over TCP port 8086. Which manual documents these connections? Thank you.
I have created a pipeline for filtering data coming into sourcetype=fortigate_traffic, and I would like to add a further exclusion to the data coming into this sourcetype. How can this be done? Nested pipelines, or some other method? For example, the first pipeline is:
NOT (dstport IN ("53") AND dstip IN ("10.5.5.5"))
I need to add another pipeline such as:
NOT (dstport IN ("80","443") AND app IN (xyz,fgh,dhjkl,.....))
Has anyone done anything similar to this? Please guide. Thanks.
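For what it's worth, both exclusions can usually live in one pipeline rather than being nested. A sketch in SPL2 pipeline syntax (assuming an Edge Processor-style pipeline with the standard $source/$destination placeholders; the remaining apps in your list are elided, as in the question):

```
$pipeline = | from $source
    | where NOT (dstport IN ("53") AND dstip IN ("10.5.5.5"))
    | where NOT (dstport IN ("80", "443") AND app IN ("xyz", "fgh", "dhjkl"))
    | into $destination;
```

Chained where clauses are ANDed together, so an event passes only if it matches neither exclusion, which should give the same result as nesting two pipelines.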
@BJ17 Currently, I don't think there is any built-in option to migrate older detections to the new versioning format (in ES 8.1) without encountering these errors. As a workaround, can you manually add a UUID-style string as the detection_id for your existing detections in savedsearches.conf and test whether this resolves the issue?

E.g.:
[detection_name]
detection_id = d6f2b006-0041-11ec-8885-acde48001122

Regards,
Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving karma. Thanks!
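If you have many detections to patch, one quick way to mint UUID-style strings for each stanza is Python's standard uuid module. A minimal sketch — the detection names below are placeholders, not real stanzas from your savedsearches.conf:

```python
import uuid

# Placeholder stanza names; substitute your real detection names.
detections = [
    "Threat - Suspicious Login - Rule",
    "Threat - Rare Process - Rule",
]

for name in detections:
    # uuid4() produces a random UUID in the same 8-4-4-4-12 format
    # as the example detection_id above.
    print(f"[{name}]")
    print(f"detection_id = {uuid.uuid4()}")
    print()
```

Paste the generated stanza fragments into savedsearches.conf and reload, then re-test saving the detections.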
A CSR is required for installing an SSL certificate on GlassFish. After generating the CSR file, open it in a text editor such as Notepad to check for any spelling mistakes or incorrect details. Once verified, send the CSR to a Certificate Authority (CA) for validation. The time to receive your certificate will vary based on the type of validation chosen. After receiving the certificate, import it along with your private key into the GlassFish keystore. The installation steps are as follows:
1) Unzip and extract the certificate files
2) Upload the extracted files to the GlassFish server
3) Import the certificate chain into the default GlassFish keystore
4) Enter the keystore passwords
5) Update the configuration on your server
If you want a detailed walkthrough, or you run into errors, this article may be useful: https://certera.com/kb/how-to-install-an-ssl-certificate-on-glassfish/. Hope it helps!
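Steps 3 and 4 above are typically done with the JDK's keytool. A rough command sketch — the domain path, file names, and keystore password are assumptions for your environment; s1as is GlassFish's default key alias, and keytool will prompt for the keystore password at each step:

```
cd ${GLASSFISH_HOME}/glassfish/domains/domain1/config

# Import the CA root and any intermediate certificates first
keytool -import -trustcacerts -alias rootca -file root.crt -keystore keystore.jks
keytool -import -trustcacerts -alias intermediate -file intermediate.crt -keystore keystore.jks

# Import your issued certificate against the alias of the existing private key
keytool -import -trustcacerts -alias s1as -file your_domain.crt -keystore keystore.jks

# Restart the domain so the new certificate is picked up
asadmin restart-domain domain1
```

Importing the issued certificate must use the same alias as the key pair that generated the CSR, otherwise keytool stores it as a trusted certificate rather than completing the key entry.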
Hi, thank you so much for your sample. I can now generate my dashboards with multiple values.
We can recreate the rules without errors, but I'm looking for a way to do it without changing the rule name.
Sorry, something happened when posting! Here we go:

{
    "title": "ImageDashboardStudio",
    "description": "",
    "inputs": {},
    "defaults": {
        "dataSources": {
            "ds.search": {
                "options": {
                    "queryParameters": {}
                }
            }
        }
    },
    "visualizations": {
        "viz_frONH0n1": {
            "options": {
                "markdown": "This is markdown\n$imgSearch:result._imageMarkdown$"
            },
            "type": "splunk.markdown"
        },
        "viz_vNXqSiui": {
            "dataSources": {
                "primary": "ds_2vTXdmuT"
            },
            "options": {
                "count": 20,
                "dataOverlayMode": "none",
                "drilldown": "none",
                "showInternalFields": false,
                "showRowNumbers": false
            },
            "type": "splunk.table"
        }
    },
    "dataSources": {
        "ds_2vTXdmuT": {
            "name": "imgSearch",
            "options": {
                "enableSmartSources": true,
                "query": "|makeresults | eval imageUrl=\"https://beta.dashpub.online/screenshots/608f9a7d4726e06206c78ccbb488832f.jpg\"\n| eval imageUrl=mvappend(imageUrl,\"https://beta.dashpub.online/screenshots/f31dc5c4a2e5e76312c9b190c7ef7bfb.jpg\")\n| foreach imageUrl mode=multivalue\n    [| eval _imageMarkdown=mvappend(_imageMarkdown,\"![](\".<<ITEM>>.\")\")]\n",
                "queryParameters": {
                    "earliest": "-24h@h",
                    "latest": "now"
                }
            },
            "type": "ds.search"
        }
    },
    "layout": {
        "globalInputs": [],
        "layoutDefinitions": {
            "layout_1": {
                "options": {
                    "display": "auto",
                    "height": 1440,
                    "width": 1440
                },
                "structure": [
                    {
                        "item": "viz_vNXqSiui",
                        "position": { "h": 250, "w": 720, "x": 0, "y": 0 },
                        "type": "block"
                    },
                    {
                        "item": "viz_frONH0n1",
                        "position": { "h": 820, "w": 370, "x": 740, "y": 10 },
                        "type": "block"
                    }
                ],
                "type": "absolute"
            }
        },
        "options": {},
        "tabs": {
            "items": [
                {
                    "label": "New tab",
                    "layoutId": "layout_1"
                }
            ]
        }
    }
}
I wasn't able to find the updated dashboard source with multivalue handling and the logic for creating markdown for each command. I’d really appreciate it if you could share it.
Hi @phupn1510, I've updated the search to be multivalue and added logic to dynamically create the markdown content using a foreach command. I think this is the closest we can get to what you are looking for.

Note that you cannot embed these images in a table. Currently this expects a single row, but you could use the stats command to concatenate multiple rows into a single row, so that a single variable contains all the markdown for the images.

Also, I've used an _imageMarkdown field; the leading underscore means the field will not display in the table, in case you want to display the other data without rendering the markdown as plain text in the table. It is also possible in Dashboard Studio to drag the table off the side of the canvas if you do not want to display it but still want the search to run to generate the list of images.

I hope this helps!
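The stats-based concatenation mentioned above could look something like this (a sketch to append to the existing imgSearch query; the space delimiter is an assumption — adjacent markdown images render side by side):

```
| stats list(_imageMarkdown) AS _imageMarkdown
| eval _imageMarkdown=mvjoin(_imageMarkdown, " ")
```

stats collapses all rows into one, and mvjoin flattens the multivalue field into a single string, so the $imgSearch:result._imageMarkdown$ token resolves to one block of markdown covering every image.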
@bowesmana  I don't think so:  
No, I can't try this, because we need the custom styling and behaviour for this use case, and we can't do that in Dashboard Studio. This is for user input: big buttons which, when pressed, 'send data' to Splunk. A normal Classic dashboard search is run with a collect command at the end. So no JS is used to run the searches, only for dashboard function.
Ah, yes. Cribl. I looked at their product earlier for other issues. It seems like a good product and solution, but as you point out, it is another product to manage. Supposedly a Logstash instance in front of our heavy forwarders could also do the trick, but that likewise means another product, and I assume the log shipping would then become HEC traffic, which in my (very limited, but still) experience also means having to deal with a new log format and sourcetype for a standard Linux audit log.
Unfortunately not. It's also not really an error, just no response from the server after the request with the search string.
@BJ17 Could you try recreating one of your existing detections in the new ES app (8.1) and check whether you're able to update and save it successfully?

Regards,
Prewin
Does the error give any indication what's going on?   
Splunk got rid of DSP, which did that, and I'm aware that the aggregation features of DSP are something EP is hopefully going to support at some point. Note that Cribl could do this if you really wanted to go that route, although it would entail another tech stack if you don't already use it.
@haph Could you try using Dashboard Studio? Some custom CSS or JavaScript used in Classic dashboards may not work well with Safari/iOS.

Regards,
Prewin