All Posts

Hi @Rim-unix, what do you mean by DR indexers? First of all, I suppose you have an indexer cluster, is that correct? In any case, you should design a multisite indexer cluster where the secondary site is on AWS. To do this, I'd suggest engaging Splunk PS or a certified Splunk Architect. Ciao. Giuseppe
For DR purposes you should use the multisite cluster option. For more, see https://docs.splunk.com/Documentation/SVA/current/Architectures/M2M12
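As a rough sketch of what that multisite option looks like on the cluster manager (the site names and factors below are placeholder values, and older Splunk versions use master instead of manager for the mode setting), server.conf might contain something like:

[general]
site = site1

[clustering]
mode = manager
multisite = true
available_sites = site1,site2
site_replication_factor = origin:2,total:3
site_search_factor = origin:1,total:2

Each indexer then declares its own site in its [general] stanza and points at the cluster manager; the actual replication and search factors should be sized against the M2M12 reference architecture linked above.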
Dear Cisco, today I'm not able to see the list of snapshots in many SaaS Controllers (at least 10 Controllers to which I have access). It seems that snapshots were saved until 11:00 PM CET yesterday. I opened a ticket with severity S2 three hours ago, but I haven't received any information. The status page doesn't report any issues. Could you post some updates about this? Thanks, Alberto
Is this a new feature in Dashboard Studio? From which version does it work?
Hi Team, we are planning to build a DR Splunk indexer on AWS Cloud. Could you give detailed instructions for creating the DR Splunk indexer? Thanks & Regards, Ramamohan
Hi @zksvc, you're asking about data quality: how is the data ingested into Splunk? Is this input the same one Sophos uses? Then you should analyze whether there's some difference caused by the hostname extraction: IP instead of hostname, FQDN or short hostname, uppercase or lowercase? You should perform an analysis on the hostnames, and Splunk gives you all the tools to search and analyze them. Ciao. Giuseppe
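As a sketch of that kind of hostname analysis (the index name below is a placeholder for wherever the Sophos data lands), something like this surfaces case, FQDN, and naming differences per device:

index=sophos_index
| eval host_norm=lower(mvindex(split(host, "."), 0))
| stats latest(_time) AS last_seen, values(host) AS raw_host_values, dc(host) AS host_variants BY host_norm
| convert ctime(last_seen)

If the same device shows up with several raw_host_values (for example ABC, abc.corp.local, and an IP address), that alone can explain why Splunk and Sophos Central disagree about a host.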
Dear everyone, I would like to create a custom correlation search to identify hostnames that have not been updated in a month (30 days) or longer. However, upon finalizing my query, I encountered a discrepancy in the data. For instance, I found that the hostname "ABC" has not been updated for 41 days; however, when I checked in Sophos Central via the website, it indicated "No Devices Found." How is Splunk able to read this data while Sophos Central reports that the device is not found? Thank you for your assistance.
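For reference, a minimal sketch of that kind of stale-host search, assuming a placeholder index name for the Sophos data; note that this reports when a host last sent an event to Splunk, which is not the same as the device still existing in Sophos Central, so a host deleted from Sophos Central can keep appearing here until its old events age out:

| tstats latest(_time) AS last_seen WHERE index=sophos_index BY host
| eval days_since_update=round((now() - last_seen) / 86400, 0)
| where days_since_update >= 30
| convert ctime(last_seen)
| sort - days_since_update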
Splunk status is all good, but the Splunk logs look like this one. I don't know why the mongodb service is not running; I checked. Also, when I try to telnet to port 8191, the connection is refused.
@tomdaniel
<style>
  /* Hide the "Populating..." text */
  .loading-msg { display: none !important; }
</style>
<dashboard>
  <!-- Your dashboard content here -->
</dashboard>
I just tried doing it in a dashboard: I inserted the different searches as the dropdown values and used the token in the search, and it worked. Thank you very much.
Hi, I have installed the add-on on the search head and indexers, but it is not working. I am using the first log format, which is W3C. The time in W3C is UTC, but we need it in GMT+3. The second log format, which is IIS, is not parsed by any sourcetype available in the add-on.
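For the timezone part, a minimal props.conf sketch, with a placeholder for whatever sourcetype the add-on assigns to the W3C logs: TZ declares the timezone the log was written in (UTC here), while the GMT+3 view is a per-user display setting in Splunk Web, not something rewritten at index time.

[your_w3c_sourcetype]
TZ = UTC

Splunk stores timestamps as epoch values, so once they are interpreted as UTC correctly, setting the user's time zone preference to GMT+3 shows the desired local time without re-indexing.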
Hi @Aedah, as @MuS said, xlsx isn't a format that can be uploaded or read directly; you must convert it to csv or txt. There still isn't a direct converter from Excel to Splunk; there's an app to export results to Excel, but not to import from it. Ciao. Giuseppe
Hi @intosplunk, let me understand: how do you set variable 1/2/3, using a dropdown or based on a condition inside the search? If based on a dropdown, you can insert the different searches as the dropdown values. If based on a condition, you should share your searches so we can understand how to build your complex search. Ciao. Giuseppe
Hey Splunkers, I'm trying to create a conditional search that runs on the same index but uses different search terms according to a variable I have that can take one of three values. It is supposed to be something like this: index=my_index, variable=1/2/3; if variable=1 then run search1, if variable=2 then run search2, if variable=3 then run search3. I tried multiple ways but they didn't work, so I'm trying to get some help here.
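One way to sketch this in plain SPL (the makeresults stub, the variable value, and the three search fragments below are all placeholders) is to have a subsearch return a field literally named search, which gets spliced into the outer search as its terms:

index=my_index
    [| makeresults
     | eval variable=1
     | eval search=case(variable==1, "search1_terms", variable==2, "search2_terms", variable==3, "search3_terms")
     | fields search ]

In a dashboard, the same effect is simpler with a dropdown whose option values are the three search fragments and a token in the base search, which is the approach that worked in the reply above.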
Hi @rahulkumar, all these operations happen before indexing, so you'll index the events as they were before ingestion into Logstash and have all the metadata you need; you can then apply all the parsing rules from the standard add-ons (the ones from Splunkbase) and run all your searches. Indeed, these operations exist only to apply the standard parsing rules, because you could also search the logs in the original Logstash format, just without the parsing, tagging, and normalization rules. Let me know if I can help you more, or please accept an answer for the other people in the Community. Ciao and happy splunking, Giuseppe. P.S.: Karma Points are appreciated
You can also use tstats with prestats=t and count:
| tstats prestats=t count where index IN (network, proxy) by index _time span=1h
| timechart span=1h count by index
As I said, read the dbinspect docs, specifically the section about the "cached" parameter. You can get that info, but it will only pertain to the local copies of the buckets, not the cached metadata. Which makes sense, since cached metadata only contains the general "boundaries" of the bucket, not the data distribution inside it.
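For reference, a minimal sketch of pulling that per-bucket information (the index name is a placeholder, and the exact behavior of the cached flag is described in the dbinspect docs mentioned above):

| dbinspect index=your_index cached=false
| table bucketId, state, startEpoch, endEpoch, eventCount, sizeOnDiskMB

The startEpoch and endEpoch values are the bucket "boundaries" referred to in the reply.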
Hi @earl-b, to achieve redundant cluster manager synchronization in a Splunk environment hosted on a Linux server, you can use the rsync utility. This tool syncs the primary cluster manager's configuration to the standby cluster manager. To run the synchronization manually, execute the following rsync command on the primary cluster manager:
rsync -avz --delete /path/to/source user@target_node_ip:/path/to/destination
For automated configuration synchronization, you can use the crontab utility to schedule the rsync command at regular intervals, ensuring continuous synchronization between the two cluster managers.
@zksvc Did you check splunk status or check the Splunk logs?
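A couple of hedged starting points for checking the logs in this KV store case (the sourcetype and component names below are the ones Splunk typically uses in its _internal index; verify them in your environment), since port 8191 is the KV store's mongod port:

index=_internal sourcetype=mongod
| sort - _time

and

index=_internal sourcetype=splunkd component=KVStore* (log_level=WARN OR log_level=ERROR)
| sort - _time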
@perfeng Try this:
{
  "type": "splunk.table",
  "dataSources": { "primary": "ds_zn4Nlcdc" },
  "title": "Some title",
  "options": {
    "columnFormat": {
      "name": { "width": 109 },
      "team": { "width": 60 },
      "url": {
        "drilldown": "customUrl",
        "url": "$row.url.value$",
        "newTab": true
      }
    },
    "headerVisibility": "fixed"
  },
  "description": "Some description.",
  "eventHandlers": [
    {
      "type": "drilldown.customUrl",
      "options": {
        "url": "$row.url.value$",
        "newTab": true
      }
    }
  ],
  "context": {},
  "containerOptions": {},
  "showProgressBar": false,
  "showLastUpdated": false
}