All Posts

Hi @siv

How about:

| makeresults
| eval Field1="value1", Field2="value1 value2 value3"
| eval Field2=split(Field2," ")
| foreach Field2 mode=multivalue
    [| eval result=mvappend(result, if(<<ITEM>>!=Field1, <<ITEM>>, null()))]
| eval Field2=result
| fields - result

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing.
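If you're on Splunk 8.0 or later, mvmap can do the same filtering in a single eval; a minimal sketch (not part of the original answer) reusing the same field names, relying on mvmap dropping elements for which the expression returns null():

| makeresults
| eval Field1="value1", Field2="value1 value2 value3"
| eval Field2=split(Field2," ")
| eval Field2=mvmap(Field2, if(Field2!=Field1, Field2, null()))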
Hi @laura

The main developer for the app is Daniel Knights at Hyperion 3. Their email is daniel.knights@hyperion3.com.au, so it's worth reaching out directly; perhaps cc contact@hyperion3.com.au too, or use their contact form at https://www.hyperion3.com.au/

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing.
Hi @ws

For rebuilding and migrating your Splunk Enterprise setup to a new site while preserving existing data and configurations, either of your mentioned paths would work. This assumes a compatible OS/architecture between the old and new servers.

Personally, I'd use the same version as your existing deployment for the new site and upgrade once complete; that way you're doing a migration rather than a transformation, which is less risky. It also means there won't be any unknown config changes when copying the contents of $SPLUNK_HOME.

You may want to look at using something like rsync for copying the $SPLUNK_DB paths over from the old servers to the new ones, which might take some time depending on your data retention size/configuration. You could move the bulk of this first and then copy the config.

If you're able to keep the same hostnames and switch the DNS over, or retain the same IPs, this will obviously reduce a lot of additional work; otherwise you will need to go through the various servers to update things like deploymentclient.conf for clients connecting to the DS etc.

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing.
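As a rough sketch of the rsync approach (paths assume a default Linux install under /opt/splunk, and "newhost" is a placeholder): do an initial bulk copy while the old server is still running, then stop Splunk on it and run a final incremental pass so the copy is consistent:

# initial bulk copy while the old server is still running
rsync -avP /opt/splunk/var/lib/splunk/ newhost:/opt/splunk/var/lib/splunk/

# stop Splunk on the old server, then run a final pass to catch changes
/opt/splunk/bin/splunk stop
rsync -avP --delete /opt/splunk/var/lib/splunk/ newhost:/opt/splunk/var/lib/splunk/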
Update: I removed the old expired license (it was incorrectly detected as a PRODUCTION license when it is Non-Production):

/opt/splunk/bin/splunk remove license <license_hash>

Then I went back to Splunk Web:
1. Log in to Splunk Web as an admin.
2. Navigate to Settings > Licensing.
3. Add the new license.

New license successfully added.

Thanks both @kiran_panchavat and @PrewinThomas for your help.
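In case it helps anyone following along: you can list the installed licenses and their hashes from the CLI before removing one (a sketch; run as the user that owns the Splunk install):

/opt/splunk/bin/splunk list licenses
# each entry includes the license hash to pass to "splunk remove license"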
Hi,

I have seen there are many recommendations to rebuild and migrate with the existing data and configuration. It's a bit confusing for me as a new Splunk user, so I would appreciate some guidance. The following are the instances:

1x Search Head
3x Indexer
3x Heavy Forwarder
1x License server
1x Deployment server

Current version: 9.3.2

Assuming the hostname/IP could be the same or different for the rebuild, what is the best way to perform the rebuild and migration with the existing data and configuration?

Same hostname/IP:
- Copy the entire contents of the $SPLUNK_HOME directory from the old server to the new server
- Install all instances for the new Splunk components onto the new servers

Different hostname/IP:
- Copy the entire contents of the $SPLUNK_HOME directory from the old server to the new server
- Install all instances for the new Splunk components onto the new servers
- Update the individual .conf files of instances if using new hostnames
- Update individual instances to point to their respective instance roles

And could I install a newer version of Splunk without going to 9.3.2 when rebuilding and migrating?

For testing purposes, I'll be trying it on one all-in-one (AIO) instance for the rebuild/migration due to space limitations.
Please check this out,  https://splunk.my.site.com/customer/s/article/Can-Splunk-Ingest-Windows-etl-Formated-Files 
@azer271

I'm sure you are well aware how buckets move in Splunk Enterprise.

DDAS (Dynamic Data Active Searchable)
- Equivalent to hot + warm + cold buckets.
- Data is searchable and stored in Splunk-managed infrastructure.
- Controlled by the Searchable Retention (days) setting per index.

DDAA (Dynamic Data Active Archive)
- Equivalent to frozen buckets.
- Data is archived but can be restored for search (for up to 30 days) and is managed by Splunk.
- Please note that there is a restoration limit: up to 10% of your DDAS entitlement at any time.

So, Searchable Retention defines how long data remains in the searchable tier (the hot/warm/cold equivalent), and Dynamic Data Storage handles what happens after that. Also, if you exceed 100%, Splunk elastically expands DDAS to retain data, but consistent overages can impact search performance and potentially incur additional cost as well.

You can refer below for detailed info:
https://splunk.my.site.com/customer/s/article/Details-for-DDAS-and-DDAA
https://www.splunk.com/en_us/blog/platform/dynamic-data-data-retention-options-in-splunk-cloud.html?locale=en_us

Regards,
Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a Karma. Thanks!
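If you want to see the searchable retention configured per index from a search, here is a sketch (assuming your Cloud role is allowed to run rest searches, and that frozenTimePeriodInSecs is the setting backing Searchable Retention):

| rest /services/data/indexes
| eval searchable_retention_days = frozenTimePeriodInSecs / 86400
| table title searchable_retention_days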
Hi @BoscoBaracus,

Good for you, see you next time!

Let us know if we can help you more, or, please, accept one answer for the other people of the Community.

Ciao and happy splunking

Giuseppe

P.S.: Karma Points are appreciated by all the contributors
@peterow

Sure. Feel free to post your results. Consider giving karma or marking it as the accepted solution (if it resolved your issue), as it helps recognize the efforts of volunteers and encourages continued contributions to the community.

Thanks
Prewin
Hello! I'm new to Splunk Cloud. Could you please explain the difference between hot, warm, cold and thawed buckets in Splunk Enterprise and Splunk Cloud? I understand that in Splunk Enterprise, a bucket moves through several states (from hot to thawed). However, when I create a new index in Splunk Cloud, I only see "Searchable retention (days)" and "Dynamic Data Storage". Does this mean that the amount of data that can be searched in the hot and warm buckets before it goes to cold is basically equal to the searchable retention (days)? Does Dynamic Data Storage basically equate to the cold, frozen and thawed buckets (as in Splunk Enterprise)?

Furthermore, in the Splunk Cloud Monitoring Console, I can see DDAS and DDAA in the 'License' section. What exactly are these, and what is their relationship with data retention? What happens if the DDAS/DDAA exceeds 100%? Does this affect search performance, or does Splunk Cloud simply not allow you to search the data? Thanks.
@ws

Absolutely. You'll need to update the outputs.conf file on all forwarders that send data to this server. Additionally, if this server is functioning as a deployment server, make sure to update the deploymentclient.conf file on the relevant clients as well.

You can also consider a DNS alias approach (depending on your environment) if you anticipate changing hostnames again in the future, so the forwarders' Splunk configs don't need touching each time.

Regards,
Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a Karma. Thanks!
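For reference, these are the two stanzas involved; a sketch with placeholder hostnames and ports (substitute your real values):

# outputs.conf on each forwarder
[tcpout:primary_indexers]
server = new-indexer.example.com:9997

# deploymentclient.conf on each deployment client
[target-broker:deploymentServer]
targetUri = new-ds.example.com:8089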
Hi Prewin, Sorry for my late response. I will follow your suggested steps below. Will update you on my results. Regards, Peter
Hi Kiran, Thanks for sharing. Mine is a Dev system that had expired. So, I am surprised to see the error. I'll take your suggestion to remove the old license and try again. Regards, Peter
@tkrprakash

You can try the below:

index=a sourcetype=*b*
| lookup exclude_paths.csv path AS path httpcode AS httpcode OUTPUT path AS matched_path
| where isnull(matched_path)

Also you can try with a subsearch:

index=a sourcetype=*b* NOT [ | inputlookup exclude_paths.csv | fields path httpcode ]

Regards,
Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a Karma. Thanks!
@PrewinThomas Thanks for providing the information. I'll take note of it in case I need to perform the domain change.

But if there happen to be deployment clients connected, I'll need to update their outputs.conf to point to the new destination, right?
Hi @tkrprakash,

Do you want to exclude from the results events that match the full paths contained in the lookup, or only a part of them?

If you want to use the full path and you have two extracted fields in your results called "path" and "http_code", you could run something like this:

index=a sourcetype=*b* NOT [ | inputlookup your_lookup.csv | fields path http_code ] | ...

If the fields in your main search have different names, you must rename them in the subsearch to be sure to match the field names from the main search.

If instead the path in the lookup must match only a part of the path field, you should run something like this:

index=a sourcetype=*b* NOT [ | inputlookup your_lookup.csv | rename path AS query | fields query http_code ] | ...

Ciao.
Giuseppe
Hi All,

I have an input lookup file with 2 fields: the first field contains a path and the second field is an httpcode for that path.

Example: /s/a/list 403 ; /s/b/list 504

I need help forming a search query that excludes events matching the path/httpcode pairs in this input lookup file. When I run a query like index=a sourcetype=*b*, it needs to exclude the path and specific httpcode combinations from the file and display output for the other paths and httpcodes. Please help.
@ielshahrori

When you enable HTTPS and access Splunk Web via a public IP, browsers attempt to validate the SSL certificate. The default Splunk self-signed certificate:
- Has a Common Name (CN) set to localhost or the server's hostname
- Does not match the public IP address
- Is not trusted by browsers

Replace the Default Self-Signed Certificate

Option A: Use a DNS Hostname
- Assign a DNS name (e.g., splunk.test.com) pointing to your public IP.
- Generate a certificate for that hostname using a commercial CA (e.g., DigiCert, Sectigo) or a free CA like Let's Encrypt.
- Access Splunk Web via https://splunk.test.com:8000

Option B: Use a Self-Signed Cert with the Public IP as CN
- Generate a self-signed certificate with the CN set to your public IP.
- Install the root certificate on client machines to avoid trust warnings.
- Note: I would not recommend this for production or external access due to browser limitations, trust issues, compliance and best practices.

Refer: https://help.splunk.com/en/splunk-enterprise/administer/manage-users-and-security/9.3/secure-splunk-platform-communications-with-transport-layer-security-certificates/how-to-create-and-sign-your-own-tls-certificates

Install the Certificate in Splunk
- Place the new certificate and key files on your Splunk server.
- Edit $SPLUNK_HOME/etc/system/local/web.conf:

[settings]
enableSplunkWebSSL = true
serverCert = /opt/splunk/etc/auth/custom_ssl/splunk_cert.pem
privKeyPath = /opt/splunk/etc/auth/custom_ssl/splunk_key.pem

- Then restart Splunk.

Also you can refer to https://docs.splunk.com/Documentation/Splunk/9.4.2/Security/SecureSplunkWebusingasignedcertificate

Regards,
Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a Karma. Thanks!
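If you go with Option B, here is a minimal OpenSSL sketch for generating a self-signed cert that includes the public IP in the Subject Alternative Name (the IP 203.0.113.10 and the output filenames are placeholders; -addext requires OpenSSL 1.1.1 or later):

# self-signed cert valid for 365 days with the public IP in CN and SAN
openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
  -keyout splunk_key.pem -out splunk_cert.pem \
  -subj "/CN=203.0.113.10" \
  -addext "subjectAltName=IP:203.0.113.10"

Modern browsers check the SAN rather than the CN, which is why the IP needs to appear there as well.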
@ez-secops-awn

If you'd like to see native support for this feature, I recommend reaching out to VirusTotal directly by emailing contact@virustotal.com

You can also submit a feature request through their contact form to ensure it's considered through all available channels:
https://www.virustotal.com/gui/contact-us/premium-services

Support contact details: https://docs.virustotal.com/docs/vt4splunk-guide

Regards,
Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a Karma. Thanks!
Also you can try daniel.knights@hyperion3.com.au