All Posts

Hi @hendriks, this is an old post, but can you remember the steps to add the indexserver role? Thanks.
Hello, can someone confirm whether this is an official app by Microsoft or a third-party app? I want to integrate Azure WAF logs into my Splunk indexer. Thanks and regards, satyam
Hi @tanjil

As you are a Splunk Cloud customer you are entitled to a "0-byte" license which allows you to use non-indexing components without restriction (e.g. auth/kvstore/forwarding/accessing previously indexed data etc.) - check out https://splunk.my.site.com/customer/s/article/0-byte-license-for-Deployment-Server-or-Heavy-Forwarder for more information.

Basically this is a perpetual 0-byte license, so you can perform your usual HF/DS work. Just open a case via https://www.splunk.com/support and they should issue it pretty quickly.

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification

Your feedback encourages the volunteers in this community to continue contributing.
Well... if I remove the table command I see the entire event with its real structure, but I want to see only the testlogs.log part. How can I do that? Using | fields does not help.
1. OK. You're searching by full JSON paths, which probably means that you're using indexed extractions. This is generally Not Good (tm).
2. You're using the table command at the end. It creates a summary table which does not do any additional formatting. You might try

| fields logs | fields - _raw _time | rename logs as _raw

instead of the table command, and use an event list widget instead of a table, but I'm not sure it will look good.
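Putting that advice together with the search from the original question, a minimal sketch of the full pipeline (index, sourcetype, and field names are taken from the question; this is untested against the actual data):

```
index="stg_observability_s" sourcetype=SplunkQuality AdditionalData.testName=*
| spath path="AdditionalData.testLog.logs{}" output=logs
| fields logs
| fields - _raw _time
| rename logs as _raw
```

The idea is that the multivalue logs field becomes the event body, which an event list widget can render line by line instead of squashing it into a single table cell.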
"AdditionalData":{"time":"2025-06-19T11:52:37","testName":"CheckLiveRatesTest","testClass":"Automation.TestsFolder","fullName":"Automation.TestsFolder","repoUrl":"***","pipelineName":"***","buildId":"291","platform":"Backend","buildUrl":"https://github.com/","domain":"***","team":"***","env":"PreProd","status":"Failed","testDuration":"00:00:51.763","retry":1,"maxRetries":1,"isFinalResult":true,"errorMessage":" Verify live rates color\nAssert.That(market.VerifyLiveRatesColor(), is equal to 'true')\n Expected: True\n But was: False\n","stackTrace":" ***","triggeredManually":true,"hidden":false,"testLog":{"artifacts":{"Snapshot below: ":"http://www.dummyurl.com"},"logs":["[06/19/2025 11:51:45] Initializing BaseTestUI",["EndTime: 06/19/2025 11:51:47","Duration: 00:00:01.7646422","[06/19/2025 11:51:45] Driver configurations:\r\nIs local run: False\r\n
Please provide the raw event (not the formatted version e.g. {"AdditionalData": { "buildId":291,
AdditionalData: { [-]
  buildId: 291
  buildUrl: https://github.com
  domain: ***
  env: PreProd
  errorMessage: Verify live rates color Assert.That(market.VerifyLiveRatesColor(), is equal to 'true') Expected: True But was: False
  fullName: Automation.TestsFolder
  hidden: false
  isFinalResult: true
  maxRetries: 1
  pipelineName: ***
  platform: Backend
  repoUrl: ***
  retry: 1
  stackTrace: at ***
  status: Failed
  team: ***
  testCategories: [ [+] ]
  testClass: Automation.TestsFolder
  testDuration: 00:00:51.763
  testLog: { [-]
    artifacts: { [+] }
    logs: [ [-]
      [06/19/2025 11:51:45] Initializing BaseTestUI
      [ [+] ]
      [06/19/2025 11:51:47] Initializing EtoroWorkFlows
      [ [+] ]

So if I'm using the query in my post, I don't see the [+] inside logs: I see it flat as one event.
Please provide some anonymised sample events which demonstrate the issue you are facing. Ideally, place these in a code block (using the </> formatting option).
Thank you very much @PrewinThomas. With what you suggested, along with @bowesmana's input, I was able to get exactly what I needed.
Applying this suggestion worked for me... I've tested it with more data, and so far there have been no inconsistencies. I really appreciate the input!
Hello,
I have a table in Dashboard Studio and I want to show a part of a JSON field which contains sub-objects. When running this query:

index="stg_observability_s" AdditionalData.testName=* sourcetype=SplunkQuality AdditionalData.domain="*" AdditionalData.pipelineName="*" AdditionalData.buildId="15757128291" AdditionalData.team="*" testCategories="*" AdditionalData.status="*" AdditionalData.isFinalResult="*" AdditionalData.fullName="***"
| search AdditionalData.testLog.logs{}=*
| spath path="AdditionalData.testLog.logs{}" output=logs
| table logs

the JSON looks flattened; I don't see the sub-objects inside. Is there a way to fix it? Thanks!
@tanjil I recommend raising a Splunk Support ticket to request the 0 MB license file. Please ensure that the support case is submitted under your valid entitlement. Recently, one of our customers submitted a similar request, and Splunk provided the 0 MB license file for their heavy forwarder.
First thing to do would be to reach out to your friendly local Splunk Partner or any other sales channel you might have used before. If you are a current Cloud customer you should be entitled to a 0-byte license. It's typically used for a forwarder, but might also be used for accessing previously indexed data.
Hi everyone, We already have a Splunk Cloud environment, and on-premises we have a Splunk deployment server. However, the on-prem deployment server currently has no license — it's only used to manage forwarders and isn’t indexing any data. We now have some legacy logs stored locally that we’d like to search through without ingesting new data. For this, we’re looking to get a Splunk 0 MB license (search-only) on the deployment server. Is there any way to request or generate a 0 MB license for this use case? Thanks in advance for your help!
Mine did the same thing. It would seem it is your version of Splunk. Set up a test environment on a laptop or spare VM, run a newer version of Splunk, and see if the problem resolves itself.
The OP is quite old. It is possible that there was a bug in 9.2.3 that caused selectFirstSearchResult to not take effect. I can confirm @tej57's observation that the sample code behaves exactly as you asked in 9.4, too.
Hi @chrisboy68,

There are lots of options presented, but combining @yuanliu's response with a conversion from bill_date to year and month gives the output closest to "ID Cost by month":

| makeresults format=csv data="bill_date,ID,Cost,_time
6/1/25,1,1.24,2025-06-16T12:42:41.282-04:00
6/1/25,1,1.4,2025-06-16T12:00:41.282-04:00
5/1/25,1,2.5,2025-06-15T12:42:41.282-04:00
5/1/25,1,2.2,2025-06-14T12:00:41.282-04:00
5/1/25,2,3.2,2025-06-14T12:42:41.282-04:00
5/1/25,2,3.3,2025-06-14T12:00:41.282-04:00
3/1/25,1,4.4,2025-06-13T12:42:41.282-04:00
3/1/25,1,5,2025-06-13T12:00:41.282-04:00
3/1/25,2,6,2025-06-13T12:42:41.282-04:00
3/1/25,2,6.3,2025-06-13T12:00:41.282-04:00"
| eval _time=strptime(_time, "%FT%T.%N%z")
``` end test data ```
``` assuming month/day/year for bill_date ```
| eval Month=strftime(strptime(bill_date, "%m/%e/%y"), "%Y-%m")
| stats latest(Cost) as Cost by Month ID

Month    ID  Cost
-------  --  ----
2025-03  1   4.4
2025-03  2   6
2025-05  1   2.5
2025-05  2   3.2
2025-06  1   1.24

You can alternatively use chart, xyseries, etc. to pivot the results:

| chart latest(Cost) over ID by Month

ID  2025-03  2025-05  2025-06
--  -------  -------  -------
1   4.4      2.5      1.24
2   6        3.2
Hi @Namo,

Make sure $SPLUNK_HOME/etc/auth/cacert.pem contains all certificates in the trust chain. If you're using a self-signed certificate, add this certificate to cacert.pem. If you've changed the name or location of the file, update the new file.

If you're also attempting a KV store upgrade, check the prerequisites at https://help.splunk.com/en/splunk-enterprise/administer/admin-manual/9.4/administer-the-app-key-value-store/upgrade-the-kv-store-server-version#ariaid-title2 as others have recommended. Also note that your private key must be encrypted with the correct sslPassword value in server.conf for a KV store upgrade to succeed. When using a blank/empty password, you'll see a message similar to the following in splunkd.log:

06-21-2025 00:00:00.000 -0000 WARN KVStoreUpgradeToolTLS [133719 KVStoreConfigurationThread] - Incomplete TLS settings detected, skipping creation of KVStore TLS credentials file!
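As a quick way to check whether you're hitting that blank-password case, a sketch of a search over Splunk's internal logs (this assumes the default setup where splunkd.log is indexed into _internal with the standard component and log_level field extractions):

```
index=_internal sourcetype=splunkd log_level=WARN component=KVStoreUpgradeToolTLS
| table _time host _raw
```

If this returns the "Incomplete TLS settings detected" warning, revisit the sslPassword setting before retrying the KV store upgrade.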
That is perfect. Exactly what I needed. This was the most helpful reply to any question I think I have ever posted to a forum.