All Posts

Hi @gcusello, nice idea my friend, thanks for your answer. Thanks, Zake
"I suppose that you have an Indexer Cluster, is it correct?" No. "You should design a multisite Indexer Cluster where the secondary site is on AWS." Yes, we are planning a multisite Indexer Cluster; the DR site is US-WEST-2 (Oregon).
Hi @zksvc , you could extract from Splunk the list of hostnames with a simple search: index=* | stats count BY host. Then you could process these results, e.g. using nslookup to get the hostnames when you have the IPs and vice versa; at the same time, when you have an FQDN, you could extract the hostname using a regex, but it depends on your data. In this way, you could have a list of hosts whose logs are monitored by Splunk and match them against the Sophos list using e.g. Excel. Otherwise, if you plan to ingest Sophos logs into Splunk, you can do this match in Splunk. Ciao. Giuseppe
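As a rough sketch of that approach (the lookup name sophos_devices and its hostname field are assumptions, to be replaced with your own Sophos export uploaded as a lookup):
index=* earliest=-30d
| stats latest(_time) AS last_seen BY host
| eval host_short=lower(mvindex(split(host,"."),0))
| lookup sophos_devices hostname AS host_short OUTPUT hostname AS sophos_match
| eval in_sophos=if(isnull(sophos_match),"no","yes")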
I have not set up the ingest from Sophos to Splunk yet. I am currently looking to create a custom correlation search. However, if you know how to verify the data, please let me know. The query I've crafted clearly identifies all the necessary details such as hostname, IP, and username. The issue of uppercase/lowercase is not a problem, as it only requires output without the need to compare data. I've been quite troubled trying to sort this out, which has led me to this point.
Hi @Rim-unix , what do you mean by DR Indexers? First of all, I suppose that you have an Indexer Cluster, is that correct? Anyway, you should design a multisite Indexer Cluster where the secondary site is on AWS. To do this, I suggest engaging Splunk PS or a certified Splunk Architect. Ciao. Giuseppe
For DR purposes you should use the multisite cluster option. See more: https://docs.splunk.com/Documentation/SVA/current/Architectures/M2M12
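A rough sketch of the multisite settings involved (site names, factors and keys are placeholders; the attribute names below are for recent Splunk versions, where older releases use master / master_uri instead of manager / manager_uri):
On the cluster manager, server.conf:
[general]
site = site1
[clustering]
mode = manager
multisite = true
available_sites = site1,site2
site_replication_factor = origin:2,total:3
site_search_factor = origin:1,total:2
pass4SymmKey = <your_key>
On each AWS (DR) peer, server.conf:
[general]
site = site2
[clustering]
mode = peer
manager_uri = https://<cluster-manager>:8089
pass4SymmKey = <your_key>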
Dear Cisco, today I'm not able to see the list of snapshots in many SaaS Controllers (at least 10 Controllers to which I have access). It seems that snapshots were saved until yesterday at 11:00 PM CET. I opened a ticket with severity S2 three hours ago, but I haven't received any information. The status page doesn't report any issues. Could you post some updates about this? Thanks, Alberto
Is this a new feature in Dashboard Studio? From which version does it work?
Hi Team, we are planning to build a DR Splunk indexer on AWS Cloud. Could you give detailed instructions for creating the DR Splunk indexer? Thanks & Regards, Ramamohan
Hi @zksvc , you're asking about data quality: how are the data ingested in Splunk? Is this input the same as the one in Sophos? Then you should analyze whether there's some difference caused by the hostname extraction: IP instead of hostname, FQDN or hostname, uppercase or lowercase? You should perform an analysis on the hostnames, and Splunk gives you all the tools to search and analyze them. Ciao. Giuseppe
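For instance, a quick sketch to see which forms the host field takes (purely illustrative):
index=*
| stats count BY host
| eval host_form=case(match(host,"^\d{1,3}(\.\d{1,3}){3}$"),"ip", match(host,"\."),"fqdn", true(),"short name")
| stats sum(count) AS events dc(host) AS hosts BY host_form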
Dear Everyone, I would like to create a custom correlation search to identify hostnames that have not been updated in 30 days or longer. However, upon finalizing my query, I encountered a discrepancy in the data. For instance, I found that the hostname "ABC" has not been updated for 41 days; however, when I checked in Sophos Central via the website, it indicated "No Devices Found." I am asking how Splunk is able to see this data while Sophos Central reports that the device is not found. Thank you for your assistance.
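A minimal sketch of this kind of search (the index name and the 30-day threshold are just examples):
| tstats latest(_time) AS last_seen where index=sophos by host
| eval days_since_update=round((now()-last_seen)/86400)
| where days_since_update>=30
| convert ctime(last_seen)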
Splunk status shows everything is good, but in the Splunk logs it looks like this. I don't know why the mongod (KV store) service is not running; I checked. Also, when I try to telnet to port 8191, the connection is refused.
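For anyone hitting the same thing, two quick checks that usually narrow this down (paths assume a default install):
$SPLUNK_HOME/bin/splunk show kvstore-status
tail -50 $SPLUNK_HOME/var/log/splunk/mongod.log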
@tomdaniel
<style>
  /* Hide the "Populating..." text */
  .loading-msg { display: none !important; }
</style>
<dashboard>
  <!-- Your dashboard content here -->
</dashboard>
I just tried doing it in a dashboard, inserted the different searches as dropdown values, and used the token after the search, and it worked. Thank you very much.
Hi, I have installed the add-on on the search head and indexers, but it is not working. I am using the first log format, which is W3C. The time in W3C is UTC, but we need it in GMT+3. For the second log format, which is IIS, it is not parsed by any sourcetype available in the add-on.
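A minimal props.conf sketch for the W3C timestamps (the sourcetype name is a placeholder): if the log timestamps are written in UTC, tell Splunk that explicitly, and let each user's time zone preference in their account settings handle the GMT+3 display:
[your_iis_w3c_sourcetype]
TZ = UTC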
Hi @Aedah , as @MuS said, xlsx isn't a format that can be uploaded or read directly; you must convert it to CSV or TXT. There still isn't a direct converter from Excel to Splunk, even if there's an app to export results to Excel, but not to import. Ciao. Giuseppe
Hi @intosplunk , let me understand: how do you set the variable to 1/2/3: using a dropdown, or based on a condition inside the search? If it's based on a dropdown, you can insert the different searches in the dropdown values. If it's based on a condition, you should share your searches so we can understand how to build your complex search. Ciao. Giuseppe
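As an illustration of the dropdown option in Simple XML (the choice values below are made-up search fragments to adapt):
<input type="dropdown" token="search_terms">
  <label>Variable</label>
  <choice value="sourcetype=web status=500">1</choice>
  <choice value="sourcetype=app log_level=ERROR">2</choice>
  <choice value="sourcetype=db action=slow_query">3</choice>
</input>
and in the panel search use the token directly:
<query>index=my_index $search_terms$</query>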
Hey Splunkers, I'm trying to create a conditional search that will run on the same index but will have different search terms according to a variable I have that can take one of three values. It is supposed to be something like this:
index = my_index
variable = 1/2/3
if variable=1 then run search1
if variable=2 then run search2
if variable=3 then run search3
I tried multiple ways but they didn't work, so I'm trying to get some help here.
Hi @rahulkumar , all these operations happen before indexing, so you'll index the events as they were before ingestion into Logstash and, having all the metadata you need, you can apply all the parsing rules from the standard add-ons (the ones from Splunkbase) and run all the searches. Indeed, these operations are only there to apply the standard parsing rules, because you could also search the logs using the original Logstash format, but without the parsing, tagging and normalization rules. Let me know if I can help you more, or please accept one answer for the other people of the Community. Ciao and happy splunking, Giuseppe P.S.: Karma Points are appreciated
You can also use tstats with prestats=t and count: | tstats prestats=t count where index IN (network,proxy) by index _time span=1h | timechart span=1h count by index