All Posts


Hi @elend , yes, you have to rebuild the DataModel, otherwise the change is applied only to new events. Ciao. Giuseppe
Hi @mahesh27, try adding INDEXED_EXTRACTIONS = json to your props.conf. Ciao. Giuseppe
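For self-describing JSON, indexed extractions are configured in props.conf where parsing happens (universal forwarder or indexer). A minimal stanza sketch, assuming a sourcetype name of your own; note that when INDEXED_EXTRACTIONS is set, Splunk's docs suggest KV_MODE = none in the search-time props to avoid duplicate field extraction:

```
[your:json:sourcetype]
INDEXED_EXTRACTIONS = json
# On the search head side, to avoid double extraction:
# KV_MODE = none
```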
Hi, every now and again we get an extremely high system load average on the Search Head. I can't figure out why it is happening, and I have to do a kill -9 -1 and restart to fix it. This means we can't log into the Splunk GUI. When I kill Splunk I see a lot of processes, and after it is dead I can still see a splunkd process on the box and the load average is still high. Regards, Robert
Please try:

oneshotsearch_results = service.jobs.oneshot(searchquery_oneshot, **kwargs_oneshot)
reader = results.JSONResultsReader(oneshotsearch_results)
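The reason json.loads fails is that the oneshot call returns a streaming ResponseReader, not a str; splunklib's JSONResultsReader wraps that stream and yields result rows as dicts, interleaved with Message objects for server diagnostics. A minimal pure-Python sketch of that iteration pattern, using a stand-in reader so it runs without a Splunk connection (the real names come from splunklib.results):

```python
import json

# Stand-in for splunklib's Message: a server diagnostic, not a result row.
class Message:
    def __init__(self, type_, text):
        self.type, self.text = type_, text

# Stand-in for splunklib.results.JSONResultsReader: yields dicts (result
# rows) and Message objects (diagnostics) as it consumes the stream.
def fake_results_reader(raw_json_lines):
    for line in raw_json_lines:
        obj = json.loads(line)
        if "msg_type" in obj:
            yield Message(obj["msg_type"], obj["text"])
        else:
            yield obj

# Simulated oneshot output: one diagnostic message, two result rows.
stream = [
    '{"msg_type": "INFO", "text": "search completed"}',
    '{"host": "web01", "count": "42"}',
    '{"host": "web02", "count": "7"}',
]

# Iterate the reader and keep only the dict rows -- this replaces the
# failing json.loads(...) call on the raw stream.
rows = [item for item in fake_results_reader(stream) if isinstance(item, dict)]
print(rows)  # a plain list of dicts, ready to send outside Splunk
```

With the real SDK the pattern is the same: iterate `results.JSONResultsReader(oneshotsearch_results)` and collect the dict items instead of calling json.loads on the reader.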
Hi All, we have JSON logs where a few events are not parsing properly. When I check the internal logs, they show that the events exceed the default truncate value of 10000 bytes, so I tried increasing the truncate value to 40000, but the logs still do not parse correctly. The event length is around 26000 bytes. props used:

[app:json:logs]
SHOULD_LINEMERGE=true
LINE_BREAKER=([\r\n]+)
CHARSET=UTF-8
TIMEPREFIX=\{\"timestamp"\:\"
KV_MODE=json
TRUNCATE=40000
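Two things worth checking here: TRUNCATE is applied at parse time, so the raised value must be deployed to the indexers or heavy forwarders that parse the data, not to the search head; and the stanza as posted uses TIMEPREFIX, while the documented setting name is TIME_PREFIX. A hedged corrected sketch, assuming one JSON event per line (the regex is an illustrative guess at the posted timestamp layout):

```
[app:json:logs]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
CHARSET = UTF-8
TIME_PREFIX = \{\"timestamp\":\s*\"
KV_MODE = json
TRUNCATE = 40000
```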
This question has been answered here: Solved: Re: Unanswered question about duplicate forwarders... - Splunk Community
Appreciate it, but... have you actually used this? I can't get it to work (it's in beta now, with zero reviews or ratings). Even its own demos and samples throw errors. Running on RHEL8, Splunk 9.2.2.
| inputlookup dmc_forwarder_assets.csv
| sort - last_connected hostname
| streamstats count by hostname
| search status=active OR (status=missing AND count=1)
| fields - count
| outputlookup dmc_forwarder_assets.csv
S3SPL Add-On for Splunk enables your data stored in S3 for immediate insight using custom Splunk commands. The source of the data does not matter, as long as it is stored in S3 and can be queried using S3 Select. This includes JSON, CSV, Parquet, and even files written by Splunk Ingest Actions. S3SPL provides the following functionality to Splunk users:
- Query S3 using S3 Select in an ad-hoc fashion using WHERE statements
- Save queries and share them with other users
- Configure queries to manage timestamps automatically based on defined field names
- Configure queries with replacements to adapt queries to the current requirement on the fly
- Create queries and preview results using an interactive workbench
In addition, S3SPL provides an admin section that allows the management of multiple buckets and saved queries. Finally, a comprehensive access control system based on Splunk capabilities and roles allows for granular access control from Splunk to buckets and prefixes within them.
Hello, I have this:

results = service.jobs.oneshot(searchquery_oneshot, **kwargs_oneshot)
reader = results.JSONResultsReader(oneshotsearch_results)
dict = json.loads(oneshotsearch_results)  # to get a dict to send data outside Splunk selectively

Error: TypeError: the JSON object must be str, bytes or bytearray, not ResponseReader

How do I fix this? Thanks
Use this query to find out which indexes are used by a data model:

| tstats count from datamodel=foo by index
But can you give me a bit more on rebuilding the Forwarder Asset table in the DMC? And do you maybe have an example of how that search would look? I have basically only searched for specific users in Search & Reporting, so any more pointing in the right direction would help. In the interim I will start looking into this as a solution and work towards it. Appreciate it.
We have configured a health rule in AppDynamics to monitor storage usage across all servers (Hardware Resources|Volumes|/|Used (%)). The rule is set to trigger a Slack notification when root storage exceeds the 80% warning and 90% critical thresholds. The rule violation is correctly detected for all nodes, and two of the VMs are above 90%, but alerts are sent for only one VM. We need assistance in ensuring that alerts are triggered and sent for all affected nodes. Please also see the attached screenshots.
You have two options:

1. Rebuild the Forwarder Asset table in the DMC.
2. Create a custom search to identify duplicate hostnames and remove the entries for missing forwarders in the lookup file dmc_forwarder_assets.csv, which is located in the splunk_monitoring_console app.
Hi @bowesmana, I mean to ask which part of the js file triggers the JS error in the UI. I have other files with different functionality that do not have the util/console part but still throw the same error. How do I identify those parts in the JS file? Regards, Pravin
Here is an old post from 2019 that was unanswered: https://community.splunk.com/t5/Deployment-Architecture/Remove-missing-duplicate-forwarders-from-forwarder-managment/m-p/492211 I am running into the same issue on Splunk Enterprise 9.2.2. Basically we had maybe 400+ machines on version 9.0.10. After upgrading to a newer splunkforwarder (9.2.2), there are duplicate instances of the computers under Forwarder Management, pushing our client count above 800. How can you remove the duplicates without going through each duplicate and clicking Delete Record? Thanks
Hi All, what licenses and subscriptions are required for Lambda monitoring in AppDynamics? Our requirement is to monitor microservices in Lambda; the technology used is Node.js. As per the community answer below, this does not require an APM license, only AppDynamics Serverless APM for AWS Lambda: https://community.appdynamics.com/t5/Licensing-including-Trial/How-does-licensing-work-when-instrumenting-AppD-and-lambda/m-p/38605#M545 But I also found the following in the documentation (https://docs.appdynamics.com/appd/23.x/latest/en/application-monitoring/install-app-server-agents/serverless-apm-for-aws-lambda/subscribe-to-serverless-apm-for-aws-lambda): "An AppDynamics Premium or Enterprise license, using either the Agent-based Licensing model or the Infrastructure-based Licensing model." Please clarify whether an APM license is required or not. Thanks, Fadil
Actually I already eval all the fields and applied fillnull with the string "Unknown" to all of them. However, some queries show the same number of events, but some fields are filled with "Unknown" even though they actually have values. Or is rebuilding the datamodel needed?
Have you checked the searches behind your dashboard panels? Have you verified that all fields used in those searches still exist?
Hi @thellmann, we have our apps hosted on Splunk Enterprise, and vetting is completed and passed successfully. How can I unit test the app on Splunk Cloud before release, without a license or using a Dev license? Is there any workaround for this?