@azer271 I'm sure you are well aware of how buckets move in Splunk Enterprise.

DDAS (Dynamic Data Active Searchable): equivalent to hot + warm + cold buckets. Data is searchable and stored in Splunk-managed infrastructure, and retention is controlled by the Searchable Retention (days) setting per index.

DDAA (Dynamic Data Active Archive): equivalent to frozen buckets. Data is archived but can be restored for search (for up to 30 days) and is managed by Splunk. Please note that there is a restoration limit: up to 10% of your DDAS entitlement at any time.

So the Searchable Retention defines how long data remains in the searchable tier (the hot/warm/cold equivalent), and Dynamic Data Storage handles what happens after that. Also, if you exceed 100%, Splunk elastically expands DDAS to retain data, but consistent overages can impact search performance and potentially incur additional cost as well. (See the retention-check sketch below.)

You can refer to these for detailed info:
https://splunk.my.site.com/customer/s/article/Details-for-DDAS-and-DDAA
https://www.splunk.com/en_us/blog/platform/dynamic-data-data-retention-options-in-splunk-cloud.html?locale=en_us

Regards,
Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a Karma. Thanks!
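If you want to check the configured searchable retention per index from a search, here is a minimal sketch using the data/indexes REST endpoint (an assumption: your Splunk Cloud role must be permitted to query that endpoint, and in Cloud the frozenTimePeriodInSecs setting is what backs Searchable Retention (days)):

| rest /services/data/indexes
| eval searchable_retention_days = round(frozenTimePeriodInSecs / 86400)
| table title searchable_retention_days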
Hi @BoscoBaracus,
good for you, see you next time!
Let us know if we can help you more, or, please, accept one answer for the other people of the Community.
Ciao and happy splunking
Giuseppe
P.S.: Karma Points are appreciated by all the contributors
@peterow Sure. Feel free to post your results. Consider giving karma or marking it as the accepted solution (if it resolved your issue); it helps recognize the efforts of volunteers and encourages continued contributions to the community.

Thanks
Prewin
Hello! I'm new to Splunk Cloud. Could you please explain the difference between hot, warm, cold and thawed buckets in Splunk Enterprise and Splunk Cloud? I understand that in Splunk Enterprise, a bucket moves through several states (from hot to thawed). However, when I click on a new index in Splunk Cloud, I only see "Searchable retention (days)" and "Dynamic Data Storage". Does this mean that the amount of data that can be searched in the hot and warm buckets before it goes to cold is basically equal to the searchable retention (days)? Does Dynamic Data Storage basically equate to the cold, frozen and thawed buckets (as in Splunk Enterprise)?

Furthermore, in the Splunk Cloud Monitoring Console, I can see DDAS and DDAA in the 'License' section. What exactly are these, and what is their relationship with data retention? What happens if DDAS/DDAA exceeds 100%? Does this affect search performance, or does Splunk Cloud simply not allow you to search the data? Thanks.
@ws Absolutely. You'll need to update the outputs.conf file on all forwarders that send data to this server. Additionally, if this server is functioning as a deployment server, make sure to update the deploymentclient.conf file on the relevant clients as well (a sketch of both files follows below). You can also consider a DNS alias approach (depending on your environment) if you anticipate changing hostnames again in the future, so the forwarders' Splunk configs don't need to be touched.

Regards,
Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a Karma. Thanks!
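A minimal sketch of the two stanzas involved (the group name primary_indexers and the hostname splunk-new.example.com are placeholders; adjust host, ports and stanza names to your environment):

outputs.conf on each forwarder:

[tcpout:primary_indexers]
server = splunk-new.example.com:9997

deploymentclient.conf on each deployment client:

[target-broker:deploymentServer]
targetUri = splunk-new.example.com:8089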
Hi Prewin, Sorry for my late response. I will follow your suggested steps. Will update you on my results. Regards, Peter
Hi Kiran, Thanks for sharing. Mine is a Dev system whose license had expired, so I am surprised to see the error. I'll take your suggestion to remove the old license and try again. Regards, Peter
@tkrprakash You can try the one below:

index=a sourcetype=*b*
| lookup exclude_paths.csv path AS path httpcode AS httpcode OUTPUT path AS matched_path
| where isnull(matched_path)

You can also try with a subsearch:

index=a sourcetype=*b* NOT [ | inputlookup exclude_paths.csv | fields path httpcode ]

Regards,
Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a Karma. Thanks!
@PrewinThomas Thanks for providing the information. I'll take note of it if I need to perform the domain change.

But if there happen to be deployment clients connected, I'll need to update outputs.conf to point to the new destination, right?
Hi @tkrprakash,
do you want to exclude from the results events that match the full paths contained in the lookup, or only a part of them?

If you want to use the full path and you have two extracted fields in your results called "path" and "httpcode", you could run something like this:

index=a sourcetype=*b* NOT [ | inputlookup your_lookup.csv | fields path httpcode ]
| ...

If the fields in your main search have different names, you must rename them in the subsearch to be sure to match the field names from the main search.

If instead the path in the lookup must match only a part of the path field, you should run something like this:

index=a sourcetype=*b* NOT [ | inputlookup your_lookup.csv | rename path AS query | fields query httpcode ]
| ...

Ciao.
Giuseppe
Hi All, I have an input lookup file with 2 fields: the first field contains a path and the second field is an httpcode for that path. Example: /s/a/list 403; /s/b/list 504. I need help forming a search query that excludes the entries in this input lookup file, matching on the httpcode. When I run a query like index=a sourcetype=*b*, it needs to exclude the path and specific httpcode pairs from the file and display output for the other paths and httpcodes. Please help.
@ielshahrori When you enable HTTPS and access Splunk Web via a public IP, browsers attempt to validate the SSL certificate. The default Splunk self-signed certificate:
- Has a Common Name (CN) set to localhost or the server's hostname
- Does not match the public IP address
- Is not trusted by browsers

Replace the default self-signed certificate:

Option A: Use a DNS hostname. Assign a DNS name (e.g., splunk.test.com) pointing to your public IP. Generate a certificate for that hostname using a commercial CA (e.g., DigiCert, Sectigo) or a free CA like Let's Encrypt, then access Splunk Web via https://splunk.test.com:8000.

Option B: Use a self-signed cert with the public IP as CN. Generate a self-signed certificate with the CN set to your public IP, and install the root certificate on client machines to avoid trust warnings (see the openssl sketch below). Note: I would not recommend this for production or external access due to browser limitations, trust issues, compliance and best practices.

Refer: https://help.splunk.com/en/splunk-enterprise/administer/manage-users-and-security/9.3/secure-splunk-platform-communications-with-transport-layer-security-certificates/how-to-create-and-sign-your-own-tls-certificates

Install the certificate in Splunk: place the new certificate and key files on your Splunk server, then edit $SPLUNK_HOME/etc/system/local/web.conf:

[settings]
enableSplunkWebSSL = true
serverCert = /opt/splunk/etc/auth/custom_ssl/splunk_cert.pem
privKeyPath = /opt/splunk/etc/auth/custom_ssl/splunk_key.pem

Then restart Splunk.

You can also refer to https://docs.splunk.com/Documentation/Splunk/9.4.2/Security/SecureSplunkWebusingasignedcertificate

Regards,
Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a Karma. Thanks!
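For Option B, a minimal openssl sketch (assumptions: OpenSSL 1.1.1+ for the -addext flag, and 203.0.113.10 as a placeholder public IP):

openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
  -keyout splunk_key.pem -out splunk_cert.pem \
  -subj "/CN=203.0.113.10" \
  -addext "subjectAltName=IP:203.0.113.10"

Modern browsers validate the subjectAltName rather than the CN, so the IP entry there is what actually avoids the mismatch warning.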
@ez-secops-awn If you'd like to see native support for this feature, I recommend reaching out to VirusTotal directly by emailing contact@virustotal.com. You can also submit a feature request through their contact form to ensure it's considered through all available channels: https://www.virustotal.com/gui/contact-us/premium-services

Support contact details: https://docs.virustotal.com/docs/vt4splunk-guide

Regards,
Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a Karma. Thanks!
You can also try daniel.knights@hyperion3.com.au
@laura The app is restricted to authorized users only. It's developed by https://www.hyperion3.com.au - you can reach them at contact@hyperion3.com.au

Regards,
Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a Karma. Thanks!
@siv You can consider using mvfilter as well:

| makeresults
| eval Field1="Value 1"
| eval Field2=split("Value 1,Value 2,Value 3", ",")
| eval Field2=mvfilter(match(Field2, "^(?!Value 1$).*"))

Regards,
Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a Karma. Thanks!
The simplest way is to use = rather than match, unless you need match. The problem with match is if your data contains anything that might be significant to the regular expression, e.g. if Value 1 is Va*ue, then that won't work well with match. You can do it like this - the example shows simple fields in Field1, 2 and 3, and then one with regex-significant characters, and shows that Field6 works but not Field7:

| makeresults
| fields - _time
| eval Field1="Value 1", Field2=split("Value 1,Value 2,Value 3", ",")
| eval Field3=mvmap(Field2, if(Field1=Field2, null(), Field2))
| eval Field4="Odd[Regex Chars?]Value 1", Field5=split("Odd[Regex Chars?]Value 1,Value 2,Value 3", ",")
| eval Field6_using_equals=mvmap(Field5, if(Field4=Field5, null(), Field5))
| eval Field7_using_match=mvmap(Field5, if(match(Field4,Field5), null(), Field5))
There are many ways to look at search performance, particularly of Windows event log data. You should get comfortable with understanding the search job properties page. In particular, look at the phase1 search and the scanCount. scanCount is the number of events that were scanned to return the results.

With Windows event log data particularly, you should understand that in order for the search to know if the process_name field is not what you want, it has to look at all events, because process_name is a field that is mapped by the Windows TA. Minimising the number of events you look at (scanCount) will always help performance.

Look at this presentation that shows how to use TERM() effectively. That can be a significant benefit to your searches. https://conf.splunk.com/files/2020/slides/PLA1089C.pdf

For example, your initial search

index=windows source=XmlWinEventLog:Security process_name=ipconfig.exe

can most likely be significantly improved just by writing

index=windows source=XmlWinEventLog:Security TERM(ipconfig) process_name=ipconfig.exe

because instead of pulling every event out to see if the Windows TA has mapped a piece of the raw event to the process_name field, it will ONLY look at the events that have the term ipconfig in the raw event. Given that ipconfig will be a less frequently used command, your scanCount will drop significantly. In the search log from the inspect job page, search for LISPY and you can see how the parser has interpreted your search.

In your other example of the != vs NOT, take a look at the phase0 search in the job properties. You will no doubt see a significant difference in the expanded search.

There are other forms of "filters", such as subsearches and lookups, but I would say that there is not often a one-size-fits-all approach to optimising your searches. It frequently depends on your data and the event count and cardinality of values you get back for fields you're trying to exclude. Lookups are often a good way to filter data, particularly when your data is still being searched in the index tier, i.e. before a transforming command has sent the data to the search head. So, it can be more efficient to do this type of logic

index=windows source=XmlWinEventLog:Security
| lookup process_names.csv process_name OUTPUT is_this_one_i_want
| where isnotnull(is_this_one_i_want)

which will keep only the events whose process_name is included in your lookup and drop the rest. Note that this is a poor example, as it would grab all events and then filter, but the point is that it can be more efficient to first limit your data set in the primary search and then filter using a lookup to remove other events, rather than writing an up-front really complex set of conditions.
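On the != vs NOT point, a small illustration of the semantic difference, reusing the hypothetical search from above:

index=windows source=XmlWinEventLog:Security process_name!=ipconfig.exe
index=windows source=XmlWinEventLog:Security NOT process_name=ipconfig.exe

The first only matches events where process_name exists and has a different value; the second also matches events where process_name is missing entirely, so the two can return different result sets and expand to different phase0 searches.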
Thanks for the clarification, this helps a lot. Do you happen to know how I can get in touch with the developer to request access? I couldn’t find a contact listed on Splunkbase. Appreciate your help!
I found some more information. When I go: Apps -> DBX -> search -> save as alert, I get the Output Name field. But if I go: Apps -> other app (like Search & Reporting) -> search -> save as alert, I don't get the Output Name field.

Any ideas what could cause that?

Kind Regards,
Andre