Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

All Posts

Hi, I'm unable to search the BOTSv3 dataset on my local Splunk machine; it throws an error like: Configuration initialization for C:\Program Files\Splunk\etc took longer than expected (16532ms) when dispatching a search with search ID 1751901306.48. This might indicate an issue with underlying storage performance or the knowledge bundle size. If you want this message displayed more or less often, change the value of the 'search_startup_config_timeout_ms' setting in "limits.conf" to a lower or higher number. How can I resolve this issue? Thanks
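For reference, the message itself names the 'search_startup_config_timeout_ms' setting in limits.conf. A minimal sketch of raising the threshold (the value is illustrative, and I'm assuming the setting sits under the [search] stanza; note this only changes how often the warning appears, it does not fix slow storage):

# %SPLUNK_HOME%\etc\system\local\limits.conf
[search]
# Only warn if configuration initialization takes longer than 30s
search_startup_config_timeout_ms = 30000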
I just found the events in Splunk; they ended up in a different index (osnix). Any idea why the SC4S parser was not triggered? The tags indicate that SC4S correctly identified CITRIX as the log source. Thanks
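To confirm exactly what metadata SC4S assigned to those events, a quick triage search may help (index name taken from the post; adjust the time range as needed):

index=osnix | stats count by index, sourcetype, source, host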
Hi, a little update. Splunk upgraded to 9.3.3, SC4S upgraded to version 3.37, and the Splunk Citrix add-on upgraded to 8.2.3. The issue remains: I see the CITRIX syslog packets reaching SC4S, but nothing is forwarded to Splunk.
Hi everyone, I'm currently working with a Splunk distributed clustered environment (v9.4.1), with 3 indexers, 3 search heads and 1 cluster master, on RHEL. I recently added a second 500GB disk to each indexer in order to separate hot/warm and cold bucket storage. I have set up and mounted the 500GB disks, expecting to split storage between /indexes and /coldstore. I also edited the indexes.conf file on the cluster master; an example is shown below:

[bmc]
homePath = /indexes/bmc/db
coldPath = /coldstore/bmc/colddb
thawedPath = $SPLUNK_DB/bmc/thaweddb
repFactor = auto
maxDataSize = auto_high_volume

I then applied the cluster bundle and performed a rolling restart just in case. Even though (I think) I have configured everything correctly, when I navigate to the cluster master GUI and go to Settings → Indexer Clustering → Indexes, the Indexes tab is empty, with none of the default indexes or the custom indexes that I had made. Has anyone encountered this behaviour where indexes do not appear in the Clustering UI, despite a valid indexes.conf and bundle deployment?
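One way to sanity-check that the bundle actually reached the peers, as a sketch (paths assume a default /opt/splunk install):

# On the cluster master: confirm the last bundle push succeeded
/opt/splunk/bin/splunk show cluster-bundle-status

# On an indexer: confirm the [bmc] stanza is visible and which file it comes from
/opt/splunk/bin/splunk btool indexes list bmc --debug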
Hi @PickleRick @isoutamo  Thanks for taking the time to reply. Much appreciated. I'd prefer to take the upgrade path of 8.2 -> 9.0.x -> 9.2.7; unfortunately, 9.0 does not support AL2023. Please take a moment to review the migration log file, as I didn't see anything alarming (you may download it from the link below). I did disable THP and verified that the ulimits match the recommended settings. https://limewire.com/d/PDWiS#NfyxSpwkrX Also, log ingestion is working properly. It's been a week since I upgraded our test instance of Splunk to 9.2.7 and there have been no issues. The health stats in Settings->Monitoring Console report no issues. And we don't use the workload management feature in Splunk, so from a cgroup compatibility standpoint we are covered.

With regards to systemd, it was set up using the commands below on AL2023:

sudo /opt/splunk/bin/splunk enable boot-start -user splunk -systemd-managed 1
sudo systemctl daemon-reload
sudo systemctl start splunkd
sudo systemctl enable splunkd
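If you want to double-check the unit after that, the standard systemd queries apply (a quick sketch, assuming the unit is named splunkd as in the commands above):

# Confirm the unit is running and enabled at boot
sudo systemctl status splunkd
sudo systemctl is-enabled splunkd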
I am going through all the sources, and I am getting a lot of useful information. Thanks so much!
Hi @isoutamo  We want to ingest data from applications hosted in the cloud; we can't use a UF or HF since we can't install agents on them. The "add-ons" mentioned are applications available in the Splunk Enterprise UI, similar to "Search and Reporting". They help pull data from AWS S3 logging accounts (AWS Add-on) and Azure Event Hub (Azure Add-on). My problem is that these add-ons pull everything available in the logging accounts, which includes logs unrelated to my application. What I want is to filter the logs so that I only ingest the logs pertaining to my application.
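One common approach (a sketch, not specific to either add-on) is to drop unwanted events at parse time with a props/transforms pair on the first full Splunk instance that sees the data; the sourcetype name and regex below are placeholders to adapt:

# props.conf
[aws:cloudtrail]
TRANSFORMS-dropother = drop_unrelated_events

# transforms.conf
[drop_unrelated_events]
# Send anything NOT mentioning our app to the nullQueue (placeholder regex)
REGEX = ^(?!.*my-app-name).*
DEST_KEY = queue
FORMAT = nullQueue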
There are several things which "work" but which are unsupported and might bite you here and there at some point. Just saying. As @isoutamo pointed out (I admit I didn't bother to check this one), a straight jump to 9.2 from your old version isn't supported, so while the migration seems to have gone well, you might have skipped some step normally performed on migration to 9.0, for example, which later versions might rely on. Again - you might get away with doing unsupported things if you're lucky. But you might not. And debugging will be more difficult later if you have issues lingering from a few versions back.
Is there any specific reason why you must use your own regex to extract the domain?  There are much more mature/robust algorithms, including Splunk's built-in transforms, e.g., url:

| extract url
| rex field=domain "(?<domain>.+)(?::(?<port>\d+))$"
| rename proto as scheme

(The url transform results in a field named "domain" that contains both domain and port. This is why I add a second extraction to separate port from domain. It also gives a field "proto" which you call scheme.) Here are some mock data for you to play with and compare with real data:

| makeresults format=csv data="_raw
http://www.google.com/search?q=what%20about%20bob
https://yahoo.com:443/
ftp://localhost:23/
ssh://1234:abcd:::21/"
``` data emulation above ```

They should give:

_raw | domain | port | q | scheme | uri | url
http://www.google.com/search?q=what%20about%20bob | www.google.com |  | what%20about%20bob | http | /search?q=what%20about%20bob | http://www.google.com/search?q=what%20about%20bob
https://yahoo.com:443/ | yahoo.com | 443 |  | https | / | https://yahoo.com:443/
ftp://localhost:23/ | localhost | 23 |  | ftp | / | ftp://localhost:23/
ssh://1234:abcd:::21/ | 1234:abcd:: | 21 |  | ssh | / | ssh://1234:abcd:::21/
@dtaylor  You can use the one below:

(?i)(?:https?|ftp|hxxp)s?:\/\/(?:www\.)?(?P<domain>[a-zA-Z0-9\-\.]+)

Regards, Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving karma. Thanks!
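For context, in SPL that pattern would typically be applied with rex; a minimal sketch, assuming the URL lives in a field called url:

| rex field=url "(?i)(?:https?|ftp|hxxp)s?:\/\/(?:www\.)?(?P<domain>[a-zA-Z0-9\-\.]+)"
| table url domain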
Unfortunately, I haven't found a fix for this yet. I hope someone will share the solution so I can mark it as the solution and help other people.
SPL-268481 is a bug we encountered in Enterprise 9.1 that is also present in 9.2. We have a very large SHC environment with 6 indexer clusters and a total of >1500 indexers across these 6 clusters. The issue:

- we would add an indexer back to an indexer cluster (e.g. it had hardware fixed)
- the indexer would join the cluster again
- the search heads would briefly REMOVE ALL/almost all indexers (not just the ones that were in the SAME indexer cluster being added back)
- then each SHC would add the indexers back
- most or all of the SHC heads would repeat this process, so over a many-minute period you could have searches that were not searching all possible indexers

For each head the time period where all indexers were removed was less than a minute, BUT it meant that searches would run and find NO indexers/fewer indexers to search. The solution provided by Splunk that worked is to add a setting to distsearch.conf (and btw the setting is not documented and not in distsearch.conf.spec, so you would get a btool warning, I am told):

[distributedSearch]
useIPAddrAsHost = false

I am sharing this solution in case you encountered the issue.
My apologies @dtaylor  Not had my morning coffee yet.. how about this?

(?:http|ftp|hxxp)s?:\/\/([\p{L}\d-]+(?:\.[\p{L}\d-]+)*)

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing
Hi @npgandlove  The mvfile replacementType in Eventgen expects the sample file to be a valid CSV with at least two columns, and the column index in your replacement (:2) must refer to the correct column (1-based index). The error "0 columns" usually means Eventgen can't parse the file as CSV or the column index is out of range. You mentioned that you "created a file with a single line of items separated by a comma" - perhaps try the following nodename.sample:

host01,10.11.0.1
host02,10.12.0.2
host03,10.13.0.3

Then Eventgen should take column 2 (the IP) for your substitutions.
Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing
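For completeness, the matching eventgen.conf side would look roughly like this (the stanza name, token regex and sample path are placeholders - adjust them to your app):

[mysample.log]
# Replace each occurrence of the token with column 2 of nodename.sample
token.0.token = ##nodename##
token.0.replacementType = mvfile
token.0.replacement = $SPLUNK_HOME/etc/apps/myapp/samples/nodename.sample:2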
Thank you for offering to help me. I tested your example regex, and like with your screenshot, it looks like I'm getting a lot more matches for the domain group than just the domain. I need the domain group to only match the domain of a URL. I apologize if this wasn't clear. The end goal is to use the expression in an automation which takes in URLs, parses out the domain, performs a DNS lookup on the domains, and judges whether a domain is hosted locally based on the IP.
The <SPLUNK_HOME>/var/log/splunk/mlspl.log file may give you more details about the error.
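That log is also indexed internally, so you can read it from Splunk itself; a quick sketch:

index=_internal source=*mlspl.log*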
Hi @dtaylor  You could try with the following:

(?i)(?P<scheme>(?:http|ftp|hxxp)s?(?::\/\/|-3A__|%3A%2F%2F))?(?:%[\da-f][\da-f])?(?P<domain>[\p{L}\d\–]+(?:\.[\p{L}\d\–]+)*)(@|%40)?(?:\b| |[[:punct:]]|$)

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing
Hi,  You must provide a `model` parameter in your `| ai` command: https://docs.splunk.com/Documentation/MLApp/5.6.0/User/Aboutaicommand#Parameters_for_the_ai_command Also, I assume that you have configured the required details on the Connections Management page: https://docs.splunk.com/Documentation/MLApp/5.6.0/User/Aboutaicommand#Connection_Management_page
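For illustration only - the model value and any parameter names other than model are assumptions on my part, so check them against the linked docs - an invocation might look something like:

| ai model="my_configured_model" prompt="Summarize the following events"

where my_configured_model is whatever connection you set up on the Connections Management page.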
This may not be the best place to ask given my issue isn't technically Splunk related, but hopefully I can get some help from people smarter than me anyway.

(?i)(?P<scheme>(?:http|ftp|hxxp)s?(?:://|-3A__|%3A%2F%2F))?(?:%[\da-f][\da-f])?(?P<domain>(?:[\p{L}\d\-–]+(?:\.|\[\.\]))+[\p{L}]{2,})(@|%40)?(?:\b| |[[:punct:]]|$)

The above regex is a template I'm working from (lol, I'm not nearly good enough to write this). While it's not too hard to read and see how it works, in a nutshell, it matches on the domain of a URL and nothing else. It does this by first looking for the optional beginning 'https://' and storing that in the 'scheme' group. Following that, it parses the domain that comes next. For example, the URL 'https://community.splunk.com/t5/forums/postpage/board-id/splunk-search' would match 'community.splunk.com'.

My issue is that the way it looks for domains following the 'scheme' group requires the domain to end in a TLD (.com, .net, etc). Unfortunately, internal services used by my company don't use a TLD, and this causes the regex not to catch them. I want to modify the regex expression above to also detect URLs like 'https://mysite/resources/rules/123456', wherein the domain would be 'mysite'. I've attempted to do so, but with my limited understanding of how regex really works, my attempts lead to too many matches, as shown below:

(?i)(?P<scheme>(?:http|ftp|hxxp)s?(?::\/\/|-3A__|%3A%2F%2F))?(?:%[\da-f][\da-f])?(?P<domain>((?:[\p{L}\d\-–]+(?:\.|\[\.\]))+)?[\p{L}]{2,})(@|%40)?(?:\b| |[[:punct:]]|$)

I tried to throw in an extra non-capturing group within the named 'domain' group and make the entire first half of the 'domain' group optional, but it leads to matches beyond the domain. Thank you to whomever may be able to assist. This doesn't feel like it should be such a difficult thing, but it's been vexing me for hours.
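One sketch of a fix, for the next reader: the over-matching comes from making both the scheme and the dotted prefix optional, which lets the pattern match any bare word in the text. Requiring the scheme to be present while dropping the TLD requirement keeps single-label hosts anchored to an actual URL (an illustrative variant, not tested against the poster's full data):

(?i)(?P<scheme>(?:http|ftp|hxxp)s?(?:://|-3A__|%3A%2F%2F))(?:%[\da-f][\da-f])?(?P<domain>[\p{L}\d\-–]+(?:(?:\.|\[\.\])[\p{L}\d\-–]+)*)(@|%40)?(?:\b| |[[:punct:]]|$)

With the scheme mandatory, 'https://mysite/resources/rules/123456' yields 'mysite', and 'https://community.splunk.com/t5/forums/postpage/board-id/splunk-search' still yields 'community.splunk.com'. The trade-off is that bare domains with no scheme are no longer matched at all.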
I have tried all of the above suggestions and am still getting the following error when trying to install MLTK: Error during app install: failed to extract app from C:\Program Files\Splunk\var\run\c6ae5d0a07047977.tar.gz to C:\Program Files\Splunk\var\run\splunk\bundle_tmp\878c329ad1cecad1: Operation did not complete successfully because the file contains a virus or potentially unwanted software. I am running the Enterprise trial version on a Win11 box. In fact, I was not able to find the file C:\Program Files\Splunk\var\run\c6ae5d0a07047977.tar.gz, nor any file in the extracted files (I downloaded the PSC zip from splunk) with a .tar or .tar.gz extension. I am in the middle of a Coursera course and am stuck because I can't install PSC or MLTK. Help please.
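That "contains a virus or potentially unwanted software" message is the Windows error for an antivirus block. If the scanner is Microsoft Defender (an assumption on a default Win11 box), one common workaround is to exclude the Splunk directory from scanning, run from an elevated PowerShell, and only if your security policy allows it:

# Exclude the Splunk install tree from Defender real-time scanning
Add-MpPreference -ExclusionPath "C:\Program Files\Splunk"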