
Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

All Posts

I couldn't find any other cause or solution. I don't have any problems with Splunk operations, so I'm just continuing to use it.
FYI - I got the same problem installing on an Ubuntu 22.04 VM. Splunkd is up and running though, so perhaps, as suggested above, this is a red herring?
Hi @RobertCEG

Pass the list of email addresses to the "add_to_list" utility block as a list/array, not as a single comma-delimited string.

Use a playbook block (e.g., "Format" or "Custom Function") to ensure your email addresses are output as a list/array, then connect that output directly to the "add_to_list" block. Example code for a Custom Function:

    def add_emails_to_list(email_string):
        # Split the comma-separated string into a list of trimmed addresses
        return [email.strip() for email in email_string.split(',')]

Then pass the resulting list to "add_to_list". If you pass a single string (even a comma-separated one), SOAR treats it as one row with multiple columns; passing a list/array adds each value as a new row. Check the output type from your previous block and make sure it is a list, not a string.

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification

Your feedback encourages the volunteers in this community to continue contributing.
Hi @Xiaorq

Just to check: is it IT Essentials Work (ITEW) that you see installed? If you install ITSI but do not apply the ITSI license to your environment, I believe it reverts to ITEW (see https://splunk.my.site.com/customer/s/article/ITSI-app-reverted-to-IT-Essential-Work-IT-W-and-does-not-show-premium-features).

Can you confirm whether you have installed your ITSI-specific license?

The install location depends on your environment configuration/architecture; please see https://docs.splunk.com/Documentation/ITSI/4.20.0/Install/InstallDD for more info.
Hi @shashigari

Sorry, it isn't clear to me which search is having the issue, and I'm not sure why you are doing a makeresults followed by an append. Are you specifying the earliest/latest in your subsearch/append search? Please post your full search with the issue.
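While you gather the full search, one thing worth ruling out: time modifiers written after the append may not constrain the subsearch the way you expect. A hedged SPL sketch with the earliest/latest pinned explicitly inside the append subsearch (the index name, macro, and time values are illustrative, taken from this thread):

```
| makeresults count=0
| append
    [ search index="my_index"
      earliest="04/11/2025:12:10:01" latest="04/11/2025:12:20:01"
      `mymacro`
    | table _time IP ]
```

Specifying earliest/latest inside the subsearch removes any ambiguity about which time range the appended search actually runs over.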
This is an indication of inefficient bucket use, meaning buckets roll before they fill up. This can happen when indexers restart often, but in this case I suspect it's just a matter of the main index getting very few events before maxHotSpanSecs is reached and the bucket rolls to warm.

The answer for buckets that are known to contain few events is to set maxDataSize to a value that makes the bucket at least 50% full before it rolls. The default bucket size is 750MB. The dbinspect command can tell you the current size of buckets to give you an idea of how to set maxDataSize.

A best practice is to not use the main index at all. All incoming data should go into a custom index, leaving main empty (and never needing to roll).
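To gauge how full the buckets actually are before tuning anything, a hedged SPL sketch using dbinspect (the index name is illustrative; run it over a time range covering the buckets you care about):

```
| dbinspect index=main
| stats count AS buckets avg(sizeOnDiskMB) AS avg_mb max(sizeOnDiskMB) AS max_mb BY state
```

If the average bucket size comes back at only a few MB, a maxDataSize value a little above that in the index's indexes.conf stanza (it is specified in MB, e.g. maxDataSize = 100) would keep buckets reasonably full before they roll; the exact value here is an assumption to be tuned per index.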
I have the same question
We've added documentation to dev.splunk.com to cover Custom REST Endpoints. (Apologies for the thread necromancy, but somehow this is still one of the top search hits for this topic, 14 years later.)
I have a list of email addresses being returned by a query that I want to use to update a custom list. My goal is to have one value per row. If I add a utility block "add_to_list" to my playbook, then all the values get added in as a single row, with a separate value per column. I assume this is because the values being returned are seen as a single long comma-delimited list. What is the best practice for ensuring my playbook is updating the custom list with just one value per row, and adding new rows for each value in my list?
Hi @Gururaj1

The UF does not have a web UI. Check /opt/splunkforwarder/var/log/splunk/splunkd.log to see if the server is running (it should update quite regularly).
@marisstella - Please accept the answer from @isoutamo by clicking "Accept as Solution" if it helped you resolve or understand your query, so that future Splunk Community users can benefit from your question as well.
Hello @livehybrid, thank you for the update. I would have ignored this if I were able to access the Splunk UF web page, but here the web page times out. I tested the exact scenario with a Splunk HF and Splunk Enterprise 9.4.1, and it works perfectly fine (no errors during installation). Am I missing something here? Thank you anyway; if you find a resolution to this, please do let me know. As the cause is unknown, I'm not sure how to tackle it.
Good morning, I have a query like this:

| makeresults count=0
| append [ search index="my_index" ]

When I use it to set up an alert like

earliest="04/11/2025:12:10:01" latest="04/11/2025:12:20:01" `mymacro` | table _time IP

it is not picking up the events in that time frame. However, when I expand to 8 hours from the dropdown, it shows results. Can anyone help with an approach for this issue?
Hi @Gururaj1

Just to check: apart from the errors you mentioned, does Splunk install correctly? Those errors don't necessarily mean there is an issue. It's likely the part of the Debian preinst script which calls a "temp_splunk-preinstall" file. That script is looking in those locations to do *something*: "find" is usually used to locate a file or folder and then act on it, such as updating permissions. If the find returns no files, the script continues, finally finishing with "complete", at which point I'd expect your install to be complete. Since your splunk validate command returned a success, I think these "errors" are benign. Their existence is either a mistake in that script, or it could be for something we aren't necessarily aware of. Either way, if the files get installed, then I'm confident this isn't an issue.
Oh, thanks for the confirmation.
Hi @wm

Given that you are using the same connection details and the servers should be set up the same, it feels like the issue could be either network related or authentication related, as you have proven Splunk to be working.

Try to establish a telnet connection between your Splunk host and the non-working database host on the relevant DB port. If this works, it demonstrates that the firewall is allowing the connection.

Double-check the authentication credentials: are you using different passwords for the two databases, or are they using a domain account? If a domain account, does it have the relevant permissions for the non-working server?

Are there any logs you can see on the database server around the connection attempt?
Hi @tech_g706

The last version listed as supporting Windows 2008 was 7.2.10 (see https://docs.splunk.com/Documentation/Splunk/7.2.10/Installation/Systemrequirements), which lists it as deprecated; it was removed in the next version. UF 7.2.10 actually went out of Splunk support in April 2021 (with P3 support ending in 2023).

The following matrix of supported forwarder-to-indexer versions might also be useful: https://docs.splunk.com/Documentation/VersionCompatibility/current/Matrix/Compatibilitybetweenforwardersandindexers

Unfortunately, it looks like you might already be on the latest version known to work with Windows 2008. A version not being listed doesn't necessarily mean it won't work with 2008, but it won't be supported and there may be issues we aren't aware of.
I wouldn't expect any modern UF to be supported on that platform, or even to work. That Windows release is so far out of support that I'd be more concerned about the OS itself than the UF version.
Hi @tech_g706,

There isn't any 8.x version certified as compatible with Windows 2008/R2 and supported by Splunk. The 9.4.x version probably runs on Windows 2008/R2, but it isn't certified or supported by Splunk. The only one who can formally answer your question is Splunk Support.

Ciao.
Giuseppe
Although it looks like you should be able to use the "move" link in "Lookup definitions", you cannot. The best way to achieve this is to create a new KV Store collection in the desired location, then copy the KV Store data into it:

| inputlookup my_old_kvstore
| outputlookup my_new_kvstore

I did create a Splunk Idea to resolve this issue. Please vote for it here.