All Posts


Hi @Bedrohungsjäger  Can I check what port configuration you have in SC4S? Have you set your port with SC4S_LISTEN_ZSCALER_LSS_TCP_PORT? (For more info on setup please see https://splunk.github.io/splunk-connect-for-syslog/1.90.1/sources/Zscaler/ but you may have already seen this!)

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification

Your feedback encourages the volunteers in this community to continue contributing.
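For reference, a minimal sketch of what the env_file entry might look like (the path and port number here are illustrative assumptions, not your actual setup):

```shell
# /opt/sc4s/env_file (path may differ in your deployment)
# Listen for Zscaler LSS traffic on a dedicated TCP port (5514 is just an example)
SC4S_LISTEN_ZSCALER_LSS_TCP_PORT=5514
```

After changing the env_file, SC4S normally needs a restart for the new listener to take effect.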
SC4S and not S4cs, apologies for the typo.
Hey folks, I'm ingesting ZPA logs into Splunk using the Zscaler LSS service. I believe the configuration is correct based on the documentation; however, the sourcetype is coming up as sc4s fallback and the logs are unreadable. It's confirmed that the logs are streaming to the HF. Can anyone who's done a similar configuration advise?
In ITSI, when using NEAP to trigger email alerts, Splunk ITSI automatically appends footer text to the email body. Even though we remove the footer text via the email alert action configuration and save successfully, it reappears when the configuration is reopened. This issue persists despite the general email settings in Splunk not containing any footer text. We also need "HTML & Plain Text" enabled because we are using tokens and multiple links (service analyzer and viewing the episode), and without it these are not rendered correctly. We cannot use just "Plain Text" for that reason. If anyone has any ideas or suggestions, that would be much appreciated.
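One possible workaround to sketch (the app path below is an assumption; check which app context your email alert action actually resolves in with btool): the email footer is controlled by the footer.text setting in alert_actions.conf, so pinning a blank value in a local file at a higher-precedence layer may stop it reappearing:

```
# $SPLUNK_HOME/etc/apps/SA-ITOA/local/alert_actions.conf (app path is an assumption)
[email]
# Override the default footer with a blank value
footer.text =
```

You can verify which file wins with `$SPLUNK_HOME/bin/splunk btool alert_actions list email --debug`.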
Hi @sarit_s6  Sorry, the text in Dashboard Studio can only be set at a visualization level, not for a specific column within your table.
Hi @danielbb  The Splunk dashboards are rendered with ReactJS, so I'm not sure you'd have much success getting this to work with Java. The framework can be found at https://splunkui.splunk.com/Packages/visualizations/ if you're interested in looking into it. One thing you might be able to do is publish your dashboard; however, this relies on scheduling the search, so it's not very real-time. Another option would be to use dashpub, which gives a little more control. Either way, you would need to render these within a web view component in your Java app. Otherwise, as @richgalloway mentioned, you could use the REST API, but this would only return the results, not the visuals.
The REST API lets you run searches and retrieve the results, but it cannot render visualizations. Rendering is normally done by the web browser, so if you're not using one, Splunk visualizations are not possible. The customer would have to use the data fetched via the API in another tool to create charts.
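To illustrate, a minimal sketch of running a search over the REST API with curl (hostname, credentials, and the search itself are placeholders; this uses the export endpoint, which streams results back as they are produced):

```shell
# Placeholders: replace splunk.example.com, admin, and changeme with real values
curl -k -u admin:changeme \
  https://splunk.example.com:8089/services/search/jobs/export \
  -d search="search index=_internal | stats count by sourcetype" \
  -d output_mode=json
```

The response contains result rows only; any charting would then have to be done client-side, for example in the Java application.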
We have a case of a customer who developed a dashboard within Splunk with 10 panels (visualizations). He uses the dashboard from outside Splunk with a Java program that does web scraping, and then produces the appropriate real-time reports in the code. Now, due to authentication issues, we would like to convert this to use the REST API. Do we have any other options, like embedding or anything else?
Ok, let me try to get some better sample data. Believe I have it here. While this is only one ID, the data has multiple IDs, and it spans multiple months.

| makeresults count=1
| eval ID="10001", _time=strptime("2025-06-01 08:00:00", "%Y-%m-%d %H:%M:%S"), billing_date="2025-06-01", cost=100.50, code="product1"
| append [| makeresults | eval ID="10001", _time=strptime("2025-06-01 10:15:00", "%Y-%m-%d %H:%M:%S"), billing_date="2025-06-01", cost=120.75, code="product2"]
| append [| makeresults | eval ID="10001", _time=strptime("2025-06-01 13:30:00", "%Y-%m-%d %H:%M:%S"), billing_date="2025-06-01", cost=140.00, code="product3"]
| append [| makeresults | eval ID="10001", _time=strptime("2025-06-02 10:15:00", "%Y-%m-%d %H:%M:%S"), billing_date="2025-06-01", cost=130.75, code="product2"]
| append [| makeresults | eval ID="10001", _time=strptime("2025-06-02 13:30:00", "%Y-%m-%d %H:%M:%S"), billing_date="2025-06-01", cost=150.00, code="product3"]
| append [| makeresults | eval ID="10001", _time=strptime("2025-06-01 08:10:00", "%Y-%m-%d %H:%M:%S"), billing_date="2025-06-01", cost=102.50, code="product1"]
| append [| makeresults | eval ID="10001", _time=strptime("2025-06-01 10:15:00", "%Y-%m-%d %H:%M:%S"), billing_date="2025-06-01", cost=125.75, code="product2"]
| append [| makeresults | eval ID="10001", _time=strptime("2025-06-01 13:30:00", "%Y-%m-%d %H:%M:%S"), billing_date="2025-06-01", cost=145.00, code="product3"]
| append [| makeresults | eval ID="10001", _time=strptime("2025-06-02 10:15:00", "%Y-%m-%d %H:%M:%S"), billing_date="2025-06-01", cost=135.75, code="product2"]
| append [| makeresults | eval ID="10001", _time=strptime("2025-06-02 13:30:00", "%Y-%m-%d %H:%M:%S"), billing_date="2025-06-01", cost=155.00, code="product3"]
| append [| makeresults | eval ID="10001", _time=strptime("2025-05-01 10:15:00", "%Y-%m-%d %H:%M:%S"), billing_date="2025-05-01", cost=125.75, code="product2"]
| append [| makeresults | eval ID="10001", _time=strptime("2025-05-01 13:30:00", "%Y-%m-%d %H:%M:%S"), billing_date="2025-05-01", cost=145.00, code="product3"]
| append [| makeresults | eval ID="10001", _time=strptime("2025-05-02 10:15:00", "%Y-%m-%d %H:%M:%S"), billing_date="2025-05-01", cost=135.75, code="product2"]
| append [| makeresults | eval ID="10001", _time=strptime("2025-05-02 13:30:00", "%Y-%m-%d %H:%M:%S"), billing_date="2025-05-01", cost=155.00, code="product3"]
| append [| makeresults | eval ID="10001", _time=strptime("2025-05-01 10:15:00", "%Y-%m-%d %H:%M:%S"), billing_date="2025-05-01", cost=120.75, code="product2"]
| append [| makeresults | eval ID="10001", _time=strptime("2025-05-02 13:30:00", "%Y-%m-%d %H:%M:%S"), billing_date="2025-05-01", cost=140.00, code="product3"]
| append [| makeresults | eval ID="10001", _time=strptime("2025-05-02 10:15:00", "%Y-%m-%d %H:%M:%S"), billing_date="2025-05-01", cost=130.75, code="product2"]
| append [| makeresults | eval ID="10001", _time=strptime("2025-05-02 13:30:00", "%Y-%m-%d %H:%M:%S"), billing_date="2025-05-01", cost=150.00, code="product3"]
Hello, in Splunk Dashboard Studio, is it possible to change the font size of a specific column? Thanks
Hi @schose  Were you able to resolve the issue of the navigation menu labels updating? I'm running into the same issue.
Hi @splunkreal  If using the raw endpoint, then _raw will be whatever is sent from the source. Different Splunkbase / custom apps can perform different field extractions depending on the source of the data. Are you sending a particular type of log, or data from a specific vendor/tool, via Kafka? I'd be happy to investigate whether there is an appropriate add-on for it. Note, however, that passing through Kafka may mean the data is no longer in its original format, so it might not extract correctly and might need further work. Please let us know what format the source data is in and I'd be happy to help.
Hey @uagraw01, I understand that you might not have liked the solution, but it wasn't ChatGPT-based. If you had read it carefully, you'd see I mentioned that I used it as a solution myself and it worked perfectly in my scenario. This community is for helping every member, and that's what I honestly tried to do. If you had tried the solution, you would have seen how it helps with the situation. Anyway, I hope you get the solution you need outside of ChatGPT. Happy Splunking! Thanks, Tejas.
Since your multivalue fields appear to be coming from one lookup (Alarm_list_details_scada_mordc.csv), you could try something like this:

| lookup Alarm_list_details_scada_mordc.csv component_type_id AS statistical_subject OUTPUTNEW operational_rate technical_rate maximum_duration minimum_duration alarm_severity
| eval row=mvrange(0,mvcount(operational_rate))
| mvexpand row
| foreach operational_rate technical_rate maximum_duration minimum_duration alarm_severity
    [| eval <<FIELD>>=mvindex(<<FIELD>>,row) ]
| fields - row
@isoutamo is exactly right about the 9.2 changes! To troubleshoot this further, check a few things to understand why the forwarders aren't connecting properly to the DS:

- Test connectivity from each forwarder using telnet or netcat to make sure they can actually reach the deployment server on port 8089.
- Examine serverclass.conf on the deployment server to verify that your forwarders match the whitelist criteria and that client matching is configured properly. Often the issue is that the serverclass isn't set up to recognize your specific forwarders.
- On the forwarder side, run btool against deploymentclient to see what configuration is actually being applied. This shows whether there are conflicting settings or whether deploymentclient.conf isn't pointing where you expect.
- If your deployment server forwards its internal logs to your indexer, you might also need the indexAndForward settings in outputs.conf on the DS, as this can affect how deployment client data appears in the management UI after 9.2.

Just to confirm: are you also managing your search head and indexer through the deployment server? And is this truly a distributed setup with separate VMs, or multiple Splunk instances on one box? That architecture detail might help explain what you're seeing. If this helps, please upvote!
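For reference, a minimal serverclass.conf sketch of the client-matching piece described above (the class name, whitelist pattern, and app name are illustrative assumptions, not your actual values):

```
# serverclass.conf on the deployment server
[serverClass:linux_forwarders]
whitelist.0 = uf-*.example.com

[serverClass:linux_forwarders:app:outputs_app]
restartSplunkd = true
```

On the forwarder, `$SPLUNK_HOME/bin/splunk btool deploymentclient list --debug` shows the effective settings and which file each one comes from.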
Hi, have you looked at https://docs.splunk.com/Documentation/Splunk/9.4.2/Updating/Upgradepre-9.2deploymentservers ? There were some changes in 9.2 to how the DS stores client information. This also leads to a situation where you see those deployment clients on your SH, as it gets that information from your indexer's indexes (I suppose you have forwarded all logs to the indexer). r. Ismo
Normally Splunk listens on all of your node's IPs unless you define it differently, so from Splunk's point of view you don't need to do anything. But usually there are firewalls between network segments and also on the hosts themselves. I suppose you are running this on Linux, so check whether firewalld, iptables, or some other host-based firewall is in use. Then open ports 8000 (for the GUI) and 9997 (for input data from UFs), or whatever port you are using for ingesting data. Depending on your OS (Windows/Linux + distro) this is done in different ways. Also check whether your sources (UFs) are in different network segments, and if your users (laptops/workstations) are in a different segment, open access to those ports from there too. There could also be web proxies in use that don't allow traffic to your Splunk server GUI (port 8000).
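As an example, on a host running firewalld (an assumption; adjust accordingly if you use iptables or nftables instead), opening those ports might look like this:

```shell
# Run as root; ports match the defaults mentioned above
firewall-cmd --permanent --add-port=8000/tcp   # Splunk Web GUI
firewall-cmd --permanent --add-port=9997/tcp   # data in from universal forwarders
firewall-cmd --reload
```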
@heathramos Thanks for the update, glad it worked out. 
FYI, it looks like the dashboards are now working. Changing the datamodel at every step and adding the index reference fixed the issue. Thanks for the help.
I modified your answer, @uagraw01, moving your SPL inside a </> block in the editor. That way it's much more readable. Can you check that it is still correct and nothing went missing? In future, could you use that </> block whenever you add SPL, dashboards, etc.? With it we can ensure that what we see is what you wrote, not something the editor has changed!