All Posts

@sankaraniyan1  Check this https://docs.splunk.com/observability/en/gdi/monitors-messaging/apache-kafka.html 
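If it helps, here is a minimal sketch of what the Kafka monitor might look like in the Splunk OpenTelemetry Collector's agent config. The file path, receiver name, port, and cluster name below are assumptions; check the linked docs for the exact options your collector version supports.

  # Hypothetical snippet to merge into /etc/otel/collector/agent_config.yaml
  # (default path for the Linux splunk-otel-collector package); values are placeholders:
  #
  # receivers:
  #   smartagent/kafka:
  #     type: collectd/kafka
  #     host: localhost
  #     port: 7099              # Kafka broker JMX port
  #     clusterName: my-kafka-cluster
  #
  # Add smartagent/kafka to the metrics pipeline's receivers list, then restart:
  sudo systemctl restart splunk-otel-collector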
@jkamdar  Can you send the inputs.conf and props.conf files? Also, please use btool to check whether there are any duplicate inputs.conf configurations. You can run the following:
/opt/splunk/bin/splunk btool inputs list --debug
This command displays the full path to each inputs.conf file that Splunk is reading from, making it easier to identify any duplicates.
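For example, to see every stanza that sets an index and which inputs.conf file each setting comes from (the install path below assumes a default Linux layout):

  /opt/splunk/bin/splunk btool inputs list --debug | grep -E '\[|index ='
  # btool --debug prefixes every line with the .conf file it came from, so two
  # stanzas matching the same logs (one with index = linux, one with index = esx)
  # show up side by side.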
Hi @jkamdar , it's really difficult to debug your issue without access to your conf files and your data! Could you share your inputs.conf and props.conf? Ciao. Giuseppe
@gcusello  I tried the btool commands on my rsyslog server: splunk btool inputs list and splunk btool inputs list --debug | grep index. The files I found are configured properly. Not sure where to look next.
How can I gather application data streamed via Kafka into Splunk Observability?
Continuing to get this error on Palo Alto Networks application v. 8.1.3: TypeError: mvc.createService is not a function at eval (eval at _runScript (dashboard_1.1.js), <anonymous>:40:21) I removed the dashboards.js file and am still seeing it. Any ideas?
Yes, I wanted to be able to upload a conf file from the search head to the deployment server so that it would then be pulled by the UFs, but as you said that's not possible through the REST API or the GUI. Can you provide any references on how to handle credentials safely using Splunk encryption, so I don't leave credentials unprotected?
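One commonly used option, sketched here with placeholder names, is to keep the secret in Splunk's encrypted credential store (storage/passwords) via the REST API and have your script read it back at runtime, instead of leaving it in a plain-text conf file:

  # Create an encrypted credential in an app's credential store (app, realm, and
  # user names are placeholders):
  curl -k -u admin https://localhost:8089/servicesNS/nobody/my_app/storage/passwords \
       -d name=my_api_user -d password='S3cret!' -d realm=my_realm
  # List stored credentials; the clear-text value is only returned to identities
  # with the appropriate capability:
  curl -k -u admin https://localhost:8089/servicesNS/nobody/my_app/storage/passwords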
Not seeing the fields will stop your filter from working; the question is why they are missing.
Hi @smanojkumar  Sorry, I got my wires crossed. It doesn't look like this is possible with XML dashboards; however, you can probably achieve this design much more easily with a Dashboard Studio dashboard. Is there anything preventing you from switching to Dashboard Studio for this? Thanks Will
Thanks for the detailed response. To clarify, this is meant as an audit trail for a few users with very limited technical expertise, and I agree with your sentiments. I'm doing this as an exploratory exercise, although I'm leaning towards this being a maintenance nightmare and am exploring other solutions for providing the data. I'll play around with the JSON string and/or lookups as in your examples. Thanks!
Hello M2024X_Ray, Did you get an answer from Splunk staff about the Windows Server 2025 compatibility by any chance?
Hi. A little late, but you should try going to Settings / User interface / Views, finding your dashboard, and clicking on its name. It will open the XML version of it with the header. version="2" tells you it is a Dashboard Studio dashboard. Try adding your script call there: <dashboard script="App_Name:script_name.js" version="2" theme="dark">
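For reference, when a dashboard does accept a script attribute (as Simple XML dashboards do), the "App_Name:script_name.js" reference is resolved against that app's appserver/static directory, so the file has to exist there and Splunk Web's static cache usually needs a refresh before it is picked up. App and file names below are placeholders:

  mkdir -p $SPLUNK_HOME/etc/apps/App_Name/appserver/static
  cp script_name.js $SPLUNK_HOME/etc/apps/App_Name/appserver/static/
  # Bust the client-side cache, e.g. by visiting https://<splunk_web>/en-US/_bump
  # (or by restarting Splunk Web), so the new static file is served.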
Yes, the fields are correct, but the values coming out are the same as what I was getting with the spath statement. Is there any difference if I use props.conf and transforms.conf instead?
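For what it's worth, a minimal props.conf sketch (the app and sourcetype names are hypothetical) that makes Splunk extract the JSON fields automatically at search time, so you get the same values without piping to spath in every search:

  cat <<'EOF' >> $SPLUNK_HOME/etc/apps/my_app/local/props.conf
  [my_json_sourcetype]
  KV_MODE = json
  EOF
  # KV_MODE = json tells Splunk to perform automatic JSON field extraction at
  # search time on the search head; the resulting values match spath output.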
Thank you @tscroggins, We have 2 Logstash servers, so I took one of them and made a conf file that sends data from Elastic to Splunk via HEC. The only issue now is that Logstash is running out of heap memory due to the size of the transfers. Working on fixing the pipeline now. Thanks again for the suggestions!
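If this is a standard Logstash package install, the heap is set in jvm.options; a minimal sketch of raising it (paths and sizes are examples, tune them to your host):

  # Assumes the default package layout with -Xms1g/-Xmx1g in jvm.options:
  sudo sed -i 's/^-Xms1g/-Xms4g/; s/^-Xmx1g/-Xmx4g/' /etc/logstash/jvm.options
  sudo systemctl restart logstash
  # Lowering pipeline.batch.size in logstash.yml can also reduce peak heap usage
  # when individual transfers are large.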
The inputs are unchecked now. disabled = 0 in local/inputs.conf as well. 443/tcp is allowed in the firewall. There is still no data. Is there anything I am missing? Thank you everyone for your help!
API Token POST request: [attachment not captured]
Internal log: [attachment not captured]
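If this is an HEC input, a quick way to separate network/token problems from input configuration problems is to POST a test event straight to the collector endpoint. Host, port, and token below are placeholders; HEC listens on 8088 by default unless you have moved it to 443:

  curl -k https://<splunk_host>:8088/services/collector/event \
       -H "Authorization: Splunk <hec_token>" \
       -d '{"event": "hec smoke test", "sourcetype": "manual_test"}'
  # A {"text":"Success","code":0} response means the token and listener are fine
  # and the problem is upstream; anything else points at the token, port, or SSL
  # settings.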
I don't have a resolution here. This is the documentation, and the issue is around certs, but I still can't work out where I'm going wrong: Upgrade the KV store server version - Splunk Documentation. I'm just going to wait for a new version where this is resolved.
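Before retrying, it can help to confirm what splunkd itself reports about the KV store and to look for the specific TLS error in the MongoDB log (paths below assume a default Linux install under /opt/splunk):

  /opt/splunk/bin/splunk show kvstore-status
  # Note the reported status and server version, then check why mongod rejects the cert:
  tail -n 100 /opt/splunk/var/log/splunk/mongod.log
  grep -iE 'ssl|certificate' /opt/splunk/var/log/splunk/splunkd.log | tail -n 50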
@gcusello thanks for the quick response.
>> first of all, are you sure that you are analyzing only the new data and not the old data as well?
Yes, I have changed the time picker to the last 15 or 60 minutes to make sure it's all recent data.
>> Also, are you sure that you're receiving logs from the same host?
Yes, this is a very small deployment and I have only one ESX server.
>> Anyway, use btool
I meant to try btool but ended up posting the question before trying it. I will do that now.
Hi @jkamdar , first of all, are you sure that you are analyzing only the new data and not the old data as well? Anyway, use btool ( https://docs.splunk.com/Documentation/Splunk/9.4.0/Troubleshooting/Usebtooltotroubleshootconfigurations ) to debug your configurations, because there's probably another input. Also, are you sure that you're receiving logs from the same host? Ciao. Giuseppe
Hi @Zorghost , first of all, there's a typo: it's not auditrial but audittrail. Then, analyzing the results of your search, I see some interesting fields: _time, user, dest, action, info. But I don't think that you need external help for this! Ciao. Giuseppe
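As an illustration only (the index and sourcetype below are assumptions; swap them for wherever your audittrail data actually lives), a simple per-user roll-up of those fields might look like:

  /opt/splunk/bin/splunk search 'index=_audit sourcetype=audittrail earliest=-24h
    | stats count by user, action, info'
  # The stats clause just summarizes the fields listed above per user; adjust the
  # field list and time range to match your own audit source.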
I have ESX hosts sending logs to rsyslog, which are then ingested into Splunk. Originally, I configured it to ingest all logs (my Linux servers and ESX) into one index called linux. Later, I created a new index called "esx" and modified inputs.conf on my rsyslog server, adding index = esx to the stanzas for all the ESX hosts and esxvcenter, then restarted the Splunk forwarder. However, it looks like I am getting data in both indexes, linux and esx. I have checked all possible inputs.conf files on my rsyslog server but can't find anything that directs ESX logs to the linux index. Any help troubleshooting the issue would be appreciated.
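One quick way to narrow this down is to compare which source files feed each index for the same ESX host; if the same events also land in a catch-all file written by rsyslog, or a second, broader monitor stanza still matches them, it shows up immediately in the source column. The host name below is a placeholder:

  /opt/splunk/bin/splunk search 'index=linux OR index=esx host=<esx_host> earliest=-60m
    | stats count by index, host, source, sourcetype'
  # If the index=linux rows point at a different source path (for example a generic
  # /var/log/remote/messages file), the duplication is coming from the rsyslog rules
  # rather than from inputs.conf.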