All Posts
@BRFZ Check these community links:
https://www.reddit.com/r/Splunk/comments/sgeuhl/whats_the_deal_with_uf_hf_ssl_certs/
https://community.splunk.com/t5/Getting-Data-In/Why-is-my-Windows-Forwarder-SSL-Configuration-not-forwarding/m-p/591594
@BRFZ Yes, it's correct.
@kiran_panchavat, Thank you for your response. I reviewed the documentation, and I found the following phrase for configuring SSL on both indexers and forwarders (non-Windows):

"On forwarders that do not run on Windows, open the server.conf configuration file for editing. Add the following stanza and settings to the file:

[sslConfig]
sslRootCAPath = <absolute path to the certificate authority certificate>"

Based on this, it seems that the CA configuration is required only on non-Windows systems (Linux). Could you please confirm whether this configuration is needed for Windows forwarders as well? Thank you for your help!
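For reference, a minimal sketch of what the equivalent stanza might look like on a Windows universal forwarder, assuming the default install path and a CA file named cacert.pem (both are placeholders, not from the documentation quote above):

[sslConfig]
sslRootCAPath = C:\Program Files\SplunkUniversalForwarder\etc\auth\mycerts\cacert.pem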
Thanks for the info, I really appreciate the help.
@BRFZ You will need:

- A client certificate (e.g., client.pem) for the forwarder, including the private key.
- The Certificate Authority (CA) certificate (e.g., cacert.pem) that signed the indexer's server certificate.

1. Place these files in a secure directory on the Windows machine, such as C:\Program Files\SplunkUniversalForwarder\etc\auth\mycerts\. Create the mycerts folder if it doesn't exist.

2. Open or create the outputs.conf file in C:\Program Files\SplunkUniversalForwarder\etc\system\local\ and add the following configuration to specify the indexer(s) and enable TLS:

[tcpout]
defaultGroup = my_indexers

[tcpout:my_indexers]
server = <indexer_hostname>:9997
clientCert = C:\Program Files\SplunkUniversalForwarder\etc\auth\mycerts\client.pem
sslPassword = <password_for_client_certificate>
sslRootCAPath = C:\Program Files\SplunkUniversalForwarder\etc\auth\mycerts\cacert.pem
sslVerifyServerCert = true
sslCommonNameToCheck = <indexer_common_name>
useClientSSLCompression = true

3. Open or create the server.conf file in C:\Program Files\SplunkUniversalForwarder\etc\system\local\ and add the following to specify the CA certificate for verifying the indexer:

[sslConfig]
sslRootCAPath = C:\Program Files\SplunkUniversalForwarder\etc\auth\mycerts\cacert.pem

This ensures the forwarder trusts the CA that signed the indexer's certificate.

4. Restart the forwarder:

cd "C:\Program Files\SplunkUniversalForwarder\bin"
splunk restart

5. Check the forwarder's logs for errors in C:\Program Files\SplunkUniversalForwarder\var\log\splunk\splunkd.log. Look for messages related to TcpOutputProc or SSL/TLS issues (e.g., X509Verify or SSLCommon errors).

https://community.splunk.com/t5/Getting-Data-In/Why-is-my-Windows-Forwarder-SSL-Configuration-not-forwarding/m-p/591237

Your forwarder would need SSL certs and configuration as well to enable SSL communication with your SSL-enabled indexer. This documentation will give you all the details: https://docs.splunk.com/Documentation/Splunk/9.4.1/Security/ConfigureSplunkforwardingtousesignedcertificates
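To double-check that the settings were picked up after the restart, btool can print the effective configuration (run from the bin directory as above):

splunk btool outputs list tcpout --debug
splunk btool server list sslConfig --debug

And recent SSL errors from the forwarder can be searched with something like this (the host name is a placeholder):

index=_internal host=<forwarder_hostname> log_level=ERROR (TcpOutputProc OR SSLCommon OR X509Verify)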
@BRFZ Your forwarder would need SSL certs and configuration as well to enable SSL communication with your SSL-enabled indexer. This documentation will give you all the details: https://docs.splunk.com/Documentation/Splunk/9.4.1/Security/ConfigureSplunkforwardingtousesignedcertificates
Hi @BRFZ I think the best place to start for setting up SSL/TLS is https://docs.splunk.com/Documentation/Splunk/9.4.1/Security/ConfigureSplunkforwardingtousesignedcertificates. Please let me know how you get on, and consider adding karma to this or any other answer if it has helped. Regards, Will
Hello, I’ve been reviewing the documentation for configuring SSL/TLS on a Splunk forwarder, but I couldn’t find the specific steps for setting it up on a Windows machine. Would anyone be able to provide the procedure or a link to the relevant documentation? Best regards,
In Splunk Cloud, you likely need admin or developer permissions to modify it. Since Splunk Cloud has restrictions on direct file modifications, you may need to package your changes as an app update and upload it via the App Manager. After modifying the file, try clearing your browser cache and refreshing Splunk (Ctrl + Shift + R). If the changes don't apply, restarting the app or reloading Splunk Web may help. For a deeper dive into this subject and to get all the necessary information, head over to the link here: https://mycompany.filemail.com/d/tlshgwozrbegjkz
Hi @spisiakmi Try adding the following to your search — is this what you are looking for?

| append [| gentimes start=-1 increment=1m]
| eval _time=coalesce(starttime, _time)
| sort 0 _time
| filldown state
| eval count=1
| timechart latest(count) by state

Here is the full search I used, which loads in some sample data:

| makeresults count=12
| streamstats count as row_number
| eval _time=case(
    row_number==1, strptime("2025-03-23T13:25:33.000+0100", "%Y-%m-%dT%H:%M:%S.%3Q%z"),
    row_number==2, strptime("2025-03-23T13:21:46.000+0100", "%Y-%m-%dT%H:%M:%S.%3Q%z"),
    row_number==3, strptime("2025-03-23T13:05:01.000+0100", "%Y-%m-%dT%H:%M:%S.%3Q%z"),
    row_number==4, strptime("2025-03-23T11:23:35.000+0100", "%Y-%m-%dT%H:%M:%S.%3Q%z"),
    row_number==5, strptime("2025-03-23T11:23:19.000+0100", "%Y-%m-%dT%H:%M:%S.%3Q%z"),
    row_number==6, strptime("2025-03-23T11:21:41.000+0100", "%Y-%m-%dT%H:%M:%S.%3Q%z"),
    row_number==7, strptime("2025-03-23T11:20:04.000+0100", "%Y-%m-%dT%H:%M:%S.%3Q%z"),
    row_number==8, strptime("2025-03-23T11:19:57.000+0100", "%Y-%m-%dT%H:%M:%S.%3Q%z"),
    row_number==9, strptime("2025-03-23T10:47:01.000+0100", "%Y-%m-%dT%H:%M:%S.%3Q%z"),
    row_number==10, strptime("2025-03-23T10:46:55.000+0100", "%Y-%m-%dT%H:%M:%S.%3Q%z"),
    row_number==11, strptime("2025-03-23T10:46:28.000+0100", "%Y-%m-%dT%H:%M:%S.%3Q%z"),
    row_number==12, strptime("2025-03-23T10:46:21.000+0100", "%Y-%m-%dT%H:%M:%S.%3Q%z")
)
| eval state=case(
    row_number==1, "Störung",
    row_number==2, "Verteilzeit",
    row_number==3, "Personal fehlt",
    row_number==4, "Produktion ON",
    row_number==5, "Wartung",
    row_number==6, "Störung",
    row_number==7, "Produktion OFF",
    row_number==8, "Produktion ON",
    row_number==9, "Produktion OFF",
    row_number==10, "Produktion ON",
    row_number==11, "Verteilzeit",
    row_number==12, "Verteilzeit"
)
| eval dlt=case(
    row_number==1, null(),
    row_number==2, "227.000",
    row_number==3, "1005.000",
    row_number==4, "6086.000",
    row_number==5, "16.000",
    row_number==6, "98.000",
    row_number==7, "97.000",
    row_number==8, "7.000",
    row_number==9, "1976.000",
    row_number==10, "6.000",
    row_number==11, "27.000",
    row_number==12, "7.000"
)
| append [| gentimes start=-1 increment=1m]
| eval _time=coalesce(starttime, _time)
| sort 0 _time
| filldown state
| eval count=1
| timechart latest(count) by state

Please let me know how you get on and consider adding karma to this or any other answer if it has helped. Regards, Will
@BRFZ As @livehybrid and @gargantua explained, those links and materials will help you understand ES better at your own pace. That said, if you have already ingested your data sources into Splunk (on-prem or Splunk Cloud), your ES should be able to use that data. ES comes with a number of out-of-the-box dashboards, and these rely on CIM compliance of your data sources. Refer to the requirements here if you plan to use any of these dashboards. I suggest reviewing your use cases and seeing how you can make use of the data models for improved searches and triage. If you want the search results to be available in the Incident Review screen for triage and analysis, you would need to create/configure your detections/rules/alerts as correlation searches.
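If it helps, a minimal savedsearches.conf sketch of what a correlation search looks like under the hood — the stanza name, search, and parameter values here are hypothetical, and the exact keys should be verified against your ES version:

[Hypothetical - Excessive Failed Logins]
search = | tstats count from datamodel=Authentication where Authentication.action="failure" by Authentication.src | rename Authentication.src as src | where count > 10
cron_schedule = */15 * * * *
enableSched = 1
action.correlationsearch.enabled = 1
action.correlationsearch.label = Hypothetical - Excessive Failed Logins
action.notable = 1
action.notable.param.rule_title = Excessive failed logins from $src$
action.notable.param.severity = medium

In practice most people create these through the ES Content Management UI rather than editing the conf file directly.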
Is there any update on this topic as of 2025? Does anyone have an example script which sends SMS, please?
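Not a full update, but one script-free option that has been around for a long time is SPL's built-in sendemail command pointed at a carrier's email-to-SMS gateway. A sketch — the gateway address below is a made-up placeholder, and your Splunk instance needs a mail server configured:

| makeresults
| eval message="Disk usage above 90% on host01"
| sendemail to="15551234567@sms.example-carrier.net" subject="Splunk alert" sendresults=true inline=true

For richer integrations, a webhook alert action posting to an SMS provider's HTTP API is the more common route these days.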
Hi, here is the data:

| delta _time as dlt
| eval dlt=abs(dlt)
| table _time, state, dlt

"_time",state,dlt
"2025-03-21T13:25:33.000+0100","Störung",
"2025-03-21T13:21:46.000+0100",Verteilzeit,"227.000"
"2025-03-21T13:05:01.000+0100","Personal fehlt","1005.000"
"2025-03-21T11:23:35.000+0100","Produktion ON","6086.000"
"2025-03-21T11:23:19.000+0100",Wartung,"16.000"
"2025-03-21T11:21:41.000+0100","Störung","98.000"
"2025-03-21T11:20:04.000+0100","Produktion OFF","97.000"
"2025-03-21T11:19:57.000+0100","Produktion ON","7.000"
"2025-03-21T10:47:01.000+0100","Produktion OFF","1976.000"
"2025-03-21T10:46:55.000+0100","Produktion ON","6.000"
"2025-03-21T10:46:28.000+0100",Verteilzeit,"27.000"
"2025-03-21T10:46:21.000+0100",Verteilzeit,"7.000"

There are 7 different signals. Each state comes from the system as an impulse at a specific timestamp and represents the state of a workplace. The interval between these signals is the delta (dlt), i.e., the duration of the previous state. There is guaranteed to be no overlapping. I would like to visualise a bar chart of these durations on a timeline, e.g., the last 24h. See the example (duration.jpg). Each start of a colour is in fact the timestamp of the state. If anyone has an idea, please share — it would help me a lot.
Hi, when you are matching without SOURCE_KEY you are using _raw. Are you sure that this information is in the event itself, or is it in a metadata field? If it's somewhere other than _raw, you must point SOURCE_KEY at that field. It's also good to use a capture group, especially when you have this kind of if-then/if-then-else selection. Also, as @livehybrid said, if you have REGEX = .* then it must be the first transformation in the list, as it catches everything. One excellent place to test your regex against your data is regex101.com — there you can ensure that your expressions are correct and match as you expect. r. Ismo
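For illustration, a minimal transforms.conf sketch with SOURCE_KEY pointing at the source path instead of _raw — the stanza name and regex are hypothetical:

[route_nix_messages]
# match against the file path, not the event text
SOURCE_KEY = MetaData:Source
REGEX = /var/log/(messages|secure)
DEST_KEY = _MetaData:Index
FORMAT = os_linux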
Hi, basically yes, as @livehybrid already said. There are also other ways to store end users' answers in Splunk, but which is best depends on your use case. Those are e.g. KV store, CSV file, DB Connect, etc. If you need a more specific answer, then you should tell us your use case and what you are looking for. r. Ismo
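As a quick sketch of the CSV option (the lookup file name user_answers.csv is hypothetical), a search driven by dashboard tokens could append each answer:

| makeresults
| eval user="jdoe", answer="approved"
| outputlookup append=true user_answers.csv

The same outputlookup call works against a KV store collection if the lookup is defined in transforms.conf as a KV store lookup.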
Hi @sureshkumaar I think one of the issues here is that your transforms are in the wrong order. A list of TRANSFORMS is applied in order and does not stop once the criteria of one are met; in this situation it would apply the route_fortigate_traffic index change, then route_nix_messages, which would just set it to os_linux regardless. Change the order of these to first set os_linux and then override to the nw_fortigate index if appropriate (see the sketch after this list). That being said, it sounds like neither of your transforms is being applied? Please could you confirm the following:

- Does [source::.../TUC-*/OOB/TUC-*(50M)*.log] definitely match your file name? Please could you provide a sample filename/path?
- Are you sending the data from a Universal Forwarder (UF) or Heavy Forwarder (HF)? Data from a HF won't be reparsed using this approach.

Please let me know how you get on and consider adding karma to this or any other answer if it has helped. Regards, Will
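To make the ordering concrete, a props.conf sketch under that assumption (the source stanza is copied from the post; transform names as discussed):

[source::.../TUC-*/OOB/TUC-*(50M)*.log]
# applied left to right: set the generic os_linux index first, then override for Fortigate traffic
TRANSFORMS-routing = route_nix_messages, route_fortigate_traffic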
Hi. Should you define separate volumes for cold and hot/warm even if you have only one physical volume where the data is? As usual, it depends. If you can be absolutely sure that you will never have/need a separate volume for cold over the lifecycle of your system (including when you migrate to a new one), then you can keep those in the same volume. BUT if there is even a small possibility that you will later get tiered storage on your system, then it's easier to migrate those to a separate physical volume later when this information is already in your indexes.conf. Personally, in most cases I use separate hot/warm, cold, and summary volumes even when only one physical media type is present in the nodes. BUT you should calculate how much space you have and divide it over those volumes, PLUS leave enough free space for the filesystem cache etc. Without that last part your I/O performance will suffer if the filesystem gets too full. And be sure that all usable space on that physical storage is defined as volumes and no separate FS part is used from it (like SPLUNK_DB etc.). r. Ismo
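A minimal indexes.conf sketch of that approach on a single filesystem — paths, sizes, and the index name are placeholders; note that thawedPath cannot reference a volume:

[volume:hotwarm]
path = /opt/splunk/var/lib/splunk
maxVolumeDataSizeMB = 400000

[volume:cold]
# same physical filesystem, separate logical volume and size cap
path = /opt/splunk/var/lib/splunk
maxVolumeDataSizeMB = 300000

[myindex]
homePath = volume:hotwarm/myindex/db
coldPath = volume:cold/myindex/colddb
thawedPath = $SPLUNK_DB/myindex/thaweddb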
Hi @urvishah Are you wanting to allow users to write data to a Splunk index without being logged in to Splunk itself? If so, you would need an external site/system which then sends the collected form data to a HEC receiver to be indexed in Splunk. If you want something within Splunk itself then you can use a dashboard with multiple text inputs/dropdowns etc. You can then use the tokens from these inputs to build a search with a "collect" statement that would populate your index.

| makeresults
| eval name=$name|s$
| eval formField1=$formField1|s$
| collect index=yourIndex

Please let me know how you get on and consider adding karma to this or any other answer if it has helped. Regards, Will
Sure can. So, PickleRick and you basically made the same argument, which helped a lot. I'm not going to duplicate the whole reply, hope that's OK. I can still split the same "disk" into warm and cold by defining different amounts of storage, or just dump everything in the same volume and let Splunk figure it out. Thanks for the feedback.
Hi, from what I have read so far, Splunk forms can be used to fetch/filter data based on the user's requirements, in which case the data is already present in Splunk. However, I wish to insert data into a specific index in Splunk. Can this also be done using Splunk forms?