All Posts

Hi, I am having an issue trying to make a version of the Search app's filtering timeline work in my Dashboard Studio dashboard alongside other visualizations. I have set a token interaction on click to update the global_time.earliest value to the time that is clicked on the chart. However, I am running into an issue where I cannot set the global_time.latest value by clicking again on the timechart. If I set up a second token interaction to capture the latest time, it just sets it to the same value as the earliest, all on the first click. I'm trying to filter down to each bar's span on the timechart, which is 2 hours (| timechart span=2h ...). Like the Search app's version, this timechart is meant to be a filtering tool that only narrows the search times of the other visualizations once it is set. Setting the earliest token works perfectly fine; the problem is entirely with latest. I just need to know how, or whether, it is possible. Thank you!!
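For reference, the working earliest-token interaction described above corresponds to a drilldown.setToken event handler in the Dashboard Studio source JSON. A minimal sketch is below; the visualization id (viz_timeline), data source id (ds_timeline), and chart type are assumptions, and note that every token listed in a handler fires on every click, which is consistent with the behavior described (a second token set to the same key gets the same value on the first click):

```json
{
  "visualizations": {
    "viz_timeline": {
      "type": "splunk.column",
      "dataSources": { "primary": "ds_timeline" },
      "eventHandlers": [
        {
          "type": "drilldown.setToken",
          "options": {
            "tokens": [
              { "token": "global_time.earliest", "key": "row._time.value" }
            ]
          }
        }
      ]
    }
  }
}
```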
Contact Splunk Support for versions not available on the web site.
Hey mates, I'm new to Splunk, and while ingesting data from my local machine to Splunk this message shows up: "The TCP output processor has paused the data flow. Forwarding to host_dest=192.XXX.X.XX inside output group default-autolb-group from host_src=MRNOOXX has been blocked for blocked_seconds=10. This can stall the data flow towards indexing and other network outputs. Review the receiving system's health in the Splunk Monitoring Console. It is probably not accepting data." Kindly help me. Thank you
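For context, this warning usually means the receiving side (the indexer) is down, not listening on the receiving port, or has full queues, rather than a problem on the forwarder itself. Forwarding is a pairing roughly like the sketch below; port 9997 is the conventional default and an assumption here:

```
# On the indexer -- inputs.conf: enable a receiving port
[splunktcp://9997]
disabled = 0

# On the forwarder -- outputs.conf: point at the indexer
[tcpout:default-autolb-group]
server = 192.XXX.X.XX:9997
```

Checking that the indexer is reachable on that port from the forwarder, and that the splunktcp input is enabled, is a reasonable first step before digging into queue health in the Monitoring Console.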
Hi there, We have a standalone Splunk instance v8.2.2.1 deployed on a RHEL server which is EOL; we wish to migrate to a newer OS, Amazon Linux (AL) 2023, rather than performing an in-place upgrade. Instead of using the most recent version of Splunk Enterprise, we wish to take a more conservative approach and choose 9.0.x (we have UFs on older versions 7.x and 8.x). Please let me know where I can download the 9.0.x version of Splunk Enterprise, as it's not here: https://www.splunk.com/en_us/download/previous-releases.html   Thanks!
Hi @tscroggins, I have appended the intermediate and root certificates to cacert.pem. After this, the error is no longer observed.
As many mentioned on this post, even if I were able to get Splunk to read the log file, it would end up with duplicate logs, or I might lose events if the UF reads too slowly. The solution is to write a custom script that can handle the log's behaviour: when it's "full", it starts overwriting the oldest events. This custom script allows Splunk to ingest events and can help handle the duplicate logs. As for loss of events through overwriting, I don't have a bulletproof solution beyond ensuring the events are ingested into Splunk faster than they are written. If necessary, you could also use the script to tail the log and write out a new log file to aid in this.   Many thanks for the insights on UF behaviour for this weird log.
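The core of such a script is deciding, on each poll of the circular log, which lines are new and should be appended to a normal append-only file for the UF to monitor. A minimal sketch of that deduplication logic (the function name and strategy are my own, not from any Splunk tooling):

```python
def unseen_lines(prev, curr):
    """Return the lines from curr that have not been emitted yet.

    prev: list of lines captured on the previous poll of the circular log.
    curr: list of lines read on the current poll.

    Strategy: if the previous snapshot is a prefix of the current one,
    the file simply grew, so emit the tail. Otherwise the log wrapped
    and overwrote old events; anchor on the last previously-seen line
    if it still exists, else emit everything (accepting possible loss,
    as noted above).
    """
    if not prev:
        return list(curr)
    if curr[:len(prev)] == prev:            # normal append-only growth
        return curr[len(prev):]
    anchor = prev[-1]
    if anchor in curr:                      # wrapped, but anchor survived
        return curr[curr.index(anchor) + 1:]
    return list(curr)                       # wrapped past our anchor
```

In practice the script would poll the circular log on an interval, call unseen_lines against the previous snapshot, and append the result to the new file that the UF monitors, which keeps ingestion ahead of the overwrite as long as polling is frequent enough.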
Hi _olivier_, Yes, of course: on your server, go to the Monitoring Console, then under the Settings menu select "General Setup", and there you can set the server roles.    Kind regards. 
Hello @Satyams14, If you plan to stream WAF logs to Event Hubs and wish to use a Splunk-supported add-on, you can also consider using the Splunk Add-on for Microsoft Cloud Services (#3110 - https://splunkbase.splunk.com/app/3110). It is a supported add-on and can fetch logs directly from the Event Hub. Thanks, Tejas.   --- If the above solution helps, an upvote is appreciated..!! 
Hi @_olivier_ , don't attach a new question to an old one, even if it's on the same topic: open a new request, so you will be more likely to receive an answer. Ciao. Giuseppe
Hi @Satyams14  This app is created by Splunk (but is not a Splunk-supported app) - not created by Microsoft. Having said that, I believe that it IS the "go-to" app for Azure feeds/onboarding. For a good overview of getting-data-in (GDI) for Azure, check out https://docs.splunk.com/Documentation/SVA/current/Architectures/AzureGDI (which lists this app).
Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing
Hi @Satyams14 , as you can read at https://splunkbase.splunk.com/app/3757, this isn't an official app by Splunk or Microsoft:
- it was created by "Splunk Works",
- it isn't supported, even though it has 64,900 downloads,
- you can find it on GitHub.
Ciao. Giuseppe
Hi @hendriks, this is an old post, but can you remember the steps to add the indexserver role?    Thanks.
Hello, Can someone confirm whether this is an official app by Microsoft or a third-party app? I want to integrate Azure WAF logs into my Splunk indexer.   Thanks and regards, Satyam
Hi @tanjil  As you are a Splunk Cloud customer, you are entitled to a "0-byte" license which allows you to use non-indexing components without restriction (e.g. auth/KV store/forwarding/accessing previously indexed data, etc.) - check out https://splunk.my.site.com/customer/s/article/0-byte-license-for-Deployment-Server-or-Heavy-Forwarder for more information.  Basically, this is a perpetual 0-byte license, so you can perform your usual HF/DS work. Just open a case via https://www.splunk.com/support and they should issue it pretty quickly.
Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing
Well... if I remove the table, I see the entire event with the real structure, but I want to see only the testlogs.log part. How can I do it? Using | fields does not help.
1. OK. You're searching by full JSON paths, which probably means that you're using indexed extractions. This is generally Not Good (tm).
2. You're using the table command at the end. It creates a summary table which does not do any additional formatting. You might try

| fields logs | fields - _raw _time | rename logs as _raw

instead of the table command, and use an event list widget instead of a table, but I'm not sure it will look good.
"AdditionalData":{"time":"2025-06-19T11:52:37","testName":"CheckLiveRatesTest","testClass":"Automation.TestsFolder","fullName":"Automation.TestsFolder","repoUrl":"***","pipelineName":"***","buildId":"291","platform":"Backend","buildUrl":"https://github.com/","domain":"***","team":"***","env":"PreProd","status":"Failed","testDuration":"00:00:51.763","retry":1,"maxRetries":1,"isFinalResult":true,"errorMessage":" Verify live rates color\nAssert.That(market.VerifyLiveRatesColor(), is equal to 'true')\n Expected: True\n But was: False\n","stackTrace":" ***","triggeredManually":true,"hidden":false,"testLog":{"artifacts":{"Snapshot below: ":"http://www.dummyurl.com"},"logs":["[06/19/2025 11:51:45] Initializing BaseTestUI",["EndTime: 06/19/2025 11:51:47","Duration: 00:00:01.7646422","[06/19/2025 11:51:45] Driver configurations:\r\nIs local run: False\r\n
Please provide the raw event (not the formatted version e.g. {"AdditionalData": { "buildId":291,
AdditionalData: { [-]
  buildId: 291
  buildUrl: https://github.com
  domain: ***
  env: PreProd
  errorMessage: Verify live rates color
    Assert.That(market.VerifyLiveRatesColor(), is equal to 'true')
    Expected: True
    But was: False
  fullName: Automation.TestsFolder
  hidden: false
  isFinalResult: true
  maxRetries: 1
  pipelineName: ***
  platform: Backend
  repoUrl: ***
  retry: 1
  stackTrace: at ***
  status: Failed
  team: ***
  testCategories: [ [+] ]
  testClass: Automation.TestsFolder
  testDuration: 00:00:51.763
  testLog: { [-]
    artifacts: { [+] }
    logs: [ [-]
      [06/19/2025 11:51:45] Initializing BaseTestUI
      [ [+] ]
      [06/19/2025 11:51:47] Initializing EtoroWorkFlows
      [ [+] ]

So if I'm using the query in my post, I don't see the [+] entries inside logs: I see it flat as one event.
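One common way to pull a nested array like this out of JSON events at search time, instead of relying on indexed field paths, is spath followed by mvexpand. A hedged sketch, where the index and sourcetype are assumptions and the output field name log_line is arbitrary:

| spath input=_raw path=AdditionalData.testLog.logs{} output=log_line
| mvexpand log_line
| table log_line

This extracts each element of the logs array into its own result row, which avoids the flat single-event rendering described above.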
Please provide some anonymised sample events which demonstrate the issue you are facing. Ideally, place these in a code block (using the </> formatting option).