All Posts



Update: I removed the old expired license (which was incorrectly detected as a PRODUCTION license when it is actually Non-Production):

/opt/splunk/bin/splunk remove license <license_hash>

Then I went back to Splunk Web:
1. Log in to Splunk Web as an admin.
2. Navigate to Settings > Licensing.
3. Add the new license.

The new license was added successfully. Thanks to both @kiran_panchavat and @PrewinThomas for your help.
Hi,

I have seen many recommendations to rebuild and migrate with the existing data and configuration. It is a bit confusing for me as a new Splunk user, so I would appreciate some guidance. These are the instances:

1x Search Head
3x Indexers
3x Heavy Forwarders
1x License server
1x Deployment server
Current version: 9.3.2

Assuming the hostname/IP could be the same or different for the rebuild, what is the best way to perform the rebuild and migration with the existing data and configuration?

Same hostname/IP:
- Copy the entire contents of the $SPLUNK_HOME directory from the old server to the new server
- Install each Splunk component on its new server

Different hostname/IP:
- Copy the entire contents of the $SPLUNK_HOME directory from the old server to the new server
- Install each Splunk component on its new server
- Update the individual .conf files of instances that use a new hostname
- Update each instance to point to its respective instance roles

Also, could I install a newer version of Splunk when rebuilding and migrating, without going through 9.3.2 first? For testing purposes, I'll be trying this on one all-in-one (AIO) instance due to space limitations.
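As a rough sketch of the "same hostname/IP" copy step, simulated here with throwaway directories (a real migration would stop Splunk first and copy the real /opt/splunk between hosts, for example with rsync -aH; the paths and file names below are placeholders, not your actual layout):

```shell
# Simulated $SPLUNK_HOME copy using throwaway directories.
# In a real migration: stop Splunk on the old host, copy /opt/splunk
# to the new host preserving ownership/permissions, then start Splunk.
OLD="$(mktemp -d)/splunk"   # stands in for the old server's /opt/splunk
NEW="$(mktemp -d)/splunk"   # stands in for the new server's /opt/splunk

# Fake a minimal Splunk directory tree on the "old" server.
mkdir -p "$OLD/etc/system/local" "$OLD/var/lib/splunk"
echo "[general]" > "$OLD/etc/system/local/server.conf"

# The core of the migration: an archive-mode copy of the whole tree.
cp -a "$OLD" "$NEW"

# Configs and index data came across together.
ls "$NEW/etc/system/local"
```

The key point the sketch illustrates is that configuration (etc/) and index data (var/lib/splunk/) travel together when you copy the whole $SPLUNK_HOME, which is why this works for a like-for-like rebuild.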
Please check this out,  https://splunk.my.site.com/customer/s/article/Can-Splunk-Ingest-Windows-etl-Formated-Files 
@azer271

I'm sure you are well aware of how buckets move in Splunk Enterprise.

DDAS (Dynamic Data Active Searchable)
Equivalent to hot + warm + cold buckets. Data is searchable and stored in Splunk-managed infrastructure. It is controlled by the Searchable Retention (days) setting per index.

DDAA (Dynamic Data Active Archive)
Equivalent to frozen buckets. Data is archived but can be restored for search (for up to 30 days), and it is managed by Splunk. Please note that there is a restoration limit: up to 10% of your DDAS entitlement at any time.

So, Searchable Retention defines how long data remains in the searchable tier (the hot/warm/cold equivalent), and Dynamic Data Storage determines what happens after that. Also, if you exceed 100%, Splunk elastically expands DDAS to retain the data, but consistent overages can impact search performance and potentially incur additional cost as well.

You can refer to the links below for detailed info:
https://splunk.my.site.com/customer/s/article/Details-for-DDAS-and-DDAA
https://www.splunk.com/en_us/blog/platform/dynamic-data-data-retention-options-in-splunk-cloud.html?locale=en_us

Regards,
Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a Karma. Thanks!
Hi @BoscoBaracus,
good for you, see you next time!
Let us know if we can help you more, or, please, accept one answer for the other people of the Community.
Ciao and happy splunking
Giuseppe
P.S.: Karma Points are appreciated by all the contributors
@peterow

Sure, feel free to post your results. Consider giving karma or marking it as the accepted solution (if it resolved your issue); it helps recognize the efforts of volunteers and encourages continued contributions to the community.

Thanks
Prewin
Hello! I'm new to Splunk Cloud. Could you please explain the difference between hot, warm, cold and thawed buckets in Splunk Enterprise and Splunk Cloud? I understand that in Splunk Enterprise, a bucket moves through several states (from hot to thawed). However, when I click on a new index in Splunk Cloud, I only see "Searchable retention (days)" and "Dynamic Data Storage". Does this mean that the amount of data that can be searched in the hot and warm buckets before it goes to cold is basically equal to the searchable retention (days)? Does Dynamic Data Storage basically equate to the cold, frozen and thawed buckets (as in Splunk Enterprise)?

Furthermore, in the Splunk Cloud Monitoring Console, I can see DDAS and DDAA in the 'License' section. What exactly are these, and what is their relationship with data retention? What happens if DDAS/DDAA exceeds 100%? Does this affect search performance, or does Splunk Cloud simply not allow you to search the data? Thanks.
@ws

Absolutely. You'll need to update the outputs.conf file on all forwarders that send data to this server. Additionally, if this server is functioning as a deployment server, make sure to update the deploymentclient.conf file on the relevant clients as well.

You can also consider a DNS alias approach (depending on your environment) if you anticipate changing hostnames again in the future; that way the forwarders' Splunk configs don't need to be touched.

Regards,
Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a Karma. Thanks!
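As a sketch, the stanzas involved might look like this (the hostname new-indexer.example.com, the output group name, and the ports are placeholders; 9997 and 8089 are only the common defaults for receiving and management, so check your own environment):

```ini
# outputs.conf on each forwarder: point the output group at the new name.
[tcpout:primary_indexers]
server = new-indexer.example.com:9997

# deploymentclient.conf on each deployment client: update the
# deployment server's management URI.
[target-broker:deploymentServer]
targetUri = new-indexer.example.com:8089
```

With the DNS alias approach mentioned above, these files would instead reference the alias, so only the DNS record changes on a future rename.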
Hi Prewin, Sorry for my late response. I will follow your suggested steps below. Will update you on my results. Regards, Peter
Hi Kiran, Thanks for sharing. Mine is a Dev system that had expired. So, I am surprised to see the error. I'll take your suggestion to remove the old license and try again. Regards, Peter
@tkrprakash

You can try the one below:

index=a sourcetype=*b*
| lookup exclude_paths.csv path AS path httpcode AS httpcode OUTPUT path AS matched_path
| where isnull(matched_path)

You can also try it with a subsearch:

index=a sourcetype=*b* NOT [ | inputlookup exclude_paths.csv | fields path httpcode ]

Regards,
Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a Karma. Thanks!
@PrewinThomas Thanks for providing the information. I'll take note of it in case the domain change is required.

But if there happen to be deployment clients connected, I'll need to update their outputs.conf to the new destination, right?
Hi @tkrprakash,
do you want to exclude from the results events that match the full paths contained in the lookup, or only a part of them?

If you want to use the full path and you have two extracted fields in your results called "path" and "http_code", you could run something like this:

index=a sourcetype=*b* NOT [ | inputlookup your_lookup.csv | fields path http_code ]
| ...

If the fields in your main search have different names, you must rename them in the subsearch to be sure to match the field names from the main search.

If instead the path in the lookup must match only a part of the path field, you should run something like this:

index=a sourcetype=*b* NOT [ | inputlookup your_lookup.csv | rename path AS query | fields query http_code ]
| ...

Ciao.
Giuseppe
Hi All, I have an input lookup file with 2 fields: the first field contains a path and the second field is an httpcode for that path.
Example: /s/a/list 403 ; /s/b/list 504
I need help forming a search query that excludes the paths in this input lookup file, matched together with their httpcode. When I run a query like index=a sourcetype=*b*, it needs to exclude the path and specific httpcode combinations from the file and display output for the other paths and httpcodes. Please help.
@ielshahrori

When you enable HTTPS and access Splunk Web via a public IP, browsers attempt to validate the SSL certificate. The default Splunk self-signed certificate:
- Has a Common Name (CN) set to localhost or the server's hostname
- Does not match the public IP address
- Is not trusted by browsers

Replace the Default Self-Signed Certificate

Option A: Use a DNS Hostname
Assign a DNS name (e.g., splunk.test.com) pointing to your public IP. Generate a certificate for that hostname using a commercial CA (e.g., DigiCert, Sectigo) or a free CA like Let's Encrypt. Access Splunk Web via https://splunk.test.com:8000

Option B: Use a Self-Signed Cert with the Public IP as CN
Generate a self-signed certificate with the CN set to your public IP, then install the root certificate on client machines to avoid trust warnings.
Note: I would not recommend this for production or external access due to browser limitations, trust issues, compliance and best practices.
Refer: https://help.splunk.com/en/splunk-enterprise/administer/manage-users-and-security/9.3/secure-splunk-platform-communications-with-transport-layer-security-certificates/how-to-create-and-sign-your-own-tls-certificates

Install the Certificate in Splunk
Place the new certificate and key files on your Splunk server, then edit $SPLUNK_HOME/etc/system/local/web.conf:

[settings]
enableSplunkWebSSL = true
serverCert = /opt/splunk/etc/auth/custom_ssl/splunk_cert.pem
privKeyPath = /opt/splunk/etc/auth/custom_ssl/splunk_key.pem

Then restart Splunk.

You can also refer to https://docs.splunk.com/Documentation/Splunk/9.4.2/Security/SecureSplunkWebusingasignedcertificate

Regards,
Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a Karma. Thanks!
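For Option B, a minimal sketch of generating such a self-signed certificate with OpenSSL might look like this (203.0.113.10 is a placeholder documentation IP, and the -addext flag needs OpenSSL 1.1.1 or later):

```shell
# Generate a self-signed cert whose CN and subjectAltName both carry
# the public IP (placeholder 203.0.113.10), valid for one year.
openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
  -keyout splunk_key.pem -out splunk_cert.pem \
  -subj "/CN=203.0.113.10" \
  -addext "subjectAltName = IP:203.0.113.10"

# Inspect the subject before pointing web.conf at the files.
openssl x509 -in splunk_cert.pem -noout -subject
```

The IP goes into subjectAltName as well as the CN because modern browsers validate against subjectAltName, not the CN alone.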
@ez-secops-awn

If you'd like to see native support for this feature, I recommend reaching out to VirusTotal directly by emailing contact@virustotal.com
You can also submit a feature request through their contact form to ensure it's considered through all available channels: https://www.virustotal.com/gui/contact-us/premium-services
Support contact details: https://docs.virustotal.com/docs/vt4splunk-guide

Regards,
Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a Karma. Thanks!
You can also try daniel.knights@hyperion3.com.au
@laura

The app is restricted to authorized users only. It's developed by https://www.hyperion3.com.au
You can reach them at contact@hyperion3.com.au

Regards,
Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a Karma. Thanks!
@siv

You can consider using mvfilter as well:

| makeresults
| eval Field1="Value 1"
| eval Field2=split("Value 1,Value 2,Value 3", ",")
| eval Field2=mvfilter(match(Field2, "^(?!Value 1$).*"))

Regards,
Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a Karma. Thanks!
The simplest way is to use = rather than match, unless you really need match. The problem with match is that your data may contain characters that are significant to the regular expression; e.g. if Value 1 is Va*ue, then that won't work well with match.

You can do it like this. The example shows simple fields in Field1, 2 and 3, and then one with regex-significant characters, demonstrating that Field6 works but Field7 does not:

| makeresults
| fields - _time
| eval Field1="Value 1", Field2=split("Value 1,Value 2,Value 3", ",")
| eval Field3=mvmap(Field2, if(Field1=Field2, null(), Field2))
| eval Field4="Odd[Regex Chars?]Value 1", Field5=split("Odd[Regex Chars?]Value 1,Value 2,Value 3", ",")
| eval Field6_using_equals=mvmap(Field5, if(Field4=Field5, null(), Field5))
| eval Field7_using_match=mvmap(Field5, if(match(Field4,Field5), null(), Field5))