All Posts

Splunk cannot natively query Elasticsearch or use it as an external data source.  There is, however, an add-on to make some of that possible.  See https://splunkbase.splunk.com/app/6477
Hi @kn450  You might want to look at the ElasticSPL Add-on for Splunk app on Splunkbase, which allows this. The main features include:
- Query Elasticsearch in an ad-hoc fashion using DSL, Lucene or ES|QL search statements for time-series data, using elasticadhoc and elasticquery
- Query Elasticsearch in an ad-hoc fashion using DSL search statements for aggregated data, using elasticadhocstats and elasticquerystats
For more info and docs please see https://docs.datapunctum.com/elasticspl
The app contains custom commands that allow you to search Elastic from Splunk without having to use a modular input to ingest the data into Splunk.
Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing
Hello @Nawab @dania_abujuma , Integrating Azure Web Application Firewall (WAF) logs into Splunk can be achieved using the Splunk Add-on for Microsoft Cloud Services. This add-on allows you to collect activity logs, service status, operational messages, Azure audit logs, and Azure resource data from various Microsoft cloud services, including Azure WAF, using Event Hubs, Azure Service Management APIs, and Azure Storage APIs. It provides the necessary inputs and CIM-compatible knowledge to integrate with other Splunk apps, such as Splunk Enterprise Security and the Splunk App for PCI Compliance. For more detailed information and installation instructions, you can refer to the official documentation.
Hello @danielbb , There are two add-ons for Cisco Meraki:
Cisco Meraki Add-on for Splunk: https://splunkbase.splunk.com/app/5580
CCX Add-on for Cisco Meraki: https://splunkbase.splunk.com/app/7365
The Cisco Meraki Add-on for Splunk provides comprehensive network observability and security monitoring across your Meraki organizations by collecting data via Cisco Meraki REST APIs and webhooks. It delivers insights into network performance, security, and device health, and includes sample visualizations to help explore the data and create custom dashboards. The add-on also provides Common Information Model (CIM) compatible knowledge to integrate with other Splunk solutions, such as Splunk Enterprise Security and the Splunk App for PCI Compliance.
For a more streamlined and CIM-compliant experience, you might consider the CCX Add-on for Cisco Meraki. It offers enhanced field extraction and CIM compliance for Meraki logs, supporting sourcetypes such as meraki:securityappliances, meraki:devicesavailabilitieschangehistory, and others, making it a comprehensive field extraction bundle for Cisco Meraki logs.
Both add-ons are compatible with Splunk Enterprise and Splunk Cloud. The Cisco Meraki Add-on for Splunk offers a broader set of features and visualizations, while the CCX Add-on focuses on enhanced CIM compliance and field extraction. Depending on your organization's needs, you can choose the add-on that best fits your requirements.
Hi Splunk Community, I’m working on a use case where data is stored in Elasticsearch, and I’d like to use Splunk solely as an interface for visualizing and querying the data using SPL (Search Processing Language) — without ingesting or storing the data again in Splunk, to avoid duplication and unnecessary storage costs. My main questions are:
- Is there a way to connect Splunk directly to Elasticsearch as an external data source?
- Can Splunk query external data (like from Elasticsearch) using SPL, without indexing it?
- Are there any available add-ons, modular inputs, or scripted solutions that allow this type of integration?
- Is this approach officially supported by Splunk, or would it require a custom integration?
I’m aware that tools like Logstash or Kafka can be used to bring data into Splunk, but that’s exactly what I’m trying to avoid — I don’t want to duplicate the data storage. If anyone has experience with a similar setup, or any recommendations, I’d greatly appreciate your input. Thanks in advance!
You could try:
source="mobilepro-test"
| stats first(UserInfo.UserId) as UserInfo.UserId first(Location.Site) as Location.Site first(Session.StartTime) as Session.StartTime by Session.SessionId
| strcat UserInfo.UserId " " Location.Site " " Session.StartTime label
| table Session.SessionId, label
@hv64  Please review this older solution for reference https://community.splunk.com/t5/All-Apps-and-Add-ons/Splunk-DB-Connect-connection-to-Hana/m-p/311647 
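For reference, once ngdbc.jar is in the drivers directory, registering HANA as a connection type in db_connection_types.conf usually looks something like the sketch below. The stanza name, key names and default port here are assumptions based on the usual DB Connect pattern, so verify them against your DB Connect version and the linked solution:
# $SPLUNK_HOME/etc/apps/splunk_app_db_connect/local/db_connection_types.conf
[sap_hana]
displayName = SAP HANA
serviceClass = com.splunk.dbx2.DefaultDBX2JDBC
jdbcDriverClass = com.sap.db.jdbc.Driver
jdbcUrlFormat = jdbc:sap://<host>:<port>
port = 30015
After restarting DB Connect you should then be able to pick the new connection type when creating a connection in the DB Connect UI.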
Hello, I have a search that is taking 5 min to complete when looking at only the last 24 hrs. If possible, could someone help me figure out how I can improve this search? I am in need of deduping by SessionId and combining 3 fields into a single field.
source="mobilepro-test"
| dedup Session.SessionId
| strcat UserInfo.UserId " " Location.Site " " Session.StartTime label
| table Session.SessionId, label
It looks like it's the dedup that is causing the slowness, but I have no idea how to improve that. Thanks for any help on this one, Tom
Hi, We want to connect Splunk to the SAP HANA database. Do you have any ideas? Do we use ngdbc.jar and put that driver in: $SPLUNK_HOME/etc/apps/splunk_app_db_connect/drivers ? Regards.
Hi @vtamas  Just to check, how many licenses do you have listed in the license page on your License Manager? As far as I know you should have 3: the Core/Enterprise license, the ITSI entitlement license and the ITSI "Internal" license (not to be confused with the other ITSI license - this allows for ITSI sourcetypes to be ingested without impacting your main core license).
Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing
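If it helps, a quick way to list what the License Manager actually has installed is something like the search below (a sketch run on the License Manager itself; the exact field names may differ slightly between versions):
| rest /services/licenser/licenses splunk_server=local
| table label type status quota expiration_time
You should see the Enterprise license, the ITSI entitlement and the ITSI internal license as separate entries.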
@Mahendra_Penuma  I’m not familiar with the process for generating a diag file for the Edge Processor, but you may want to refer to this resource to see if it helps https://docs.splunk.com/Documentation/SplunkCloud/9.3.2408/EdgeProcessor/Troubleshooting#Generate_a_diagnostic_report_for_an_Edge_Processor_instance 
Need assistance to create a diag file on Splunk Edge Processor.
OK. A stupid question since I don't know ITSI. But ES has this nasty role configurator in the WebUI and you cannot just add capabilities to a role using the standard Splunk role settings screen; you have to do it in ES and let the ES "modular input" managing capabilities do its magic. Doesn't ITSI have its equivalent of that? We had similar errors when trying to manage ES capabilities directly, instead of via ES internal mechanisms.
We have a lab Splunk deployment with the following specification:
- 3 indexers in an indexer cluster
- 1 SH for normal searches
- 1 SH with ITSI installed
- 1 SH with Enterprise Security installed
- 1 server that acts as the Cluster manager for the indexers and as the License manager
We have NFR licenses (Enterprise, ITSI) installed on the License manager and all the other servers are configured as license peers. With the above setup the problem is that the ITSI license doesn't work, and we only get IT Essentials Work. When the ITSI license is installed directly on the ITSI server, ITSI works correctly (but the other licenses don't apply in this case, because those are installed on the License manager). We installed the required applications (SA-ITSI-Licensechecker and SA-UserAccess) on the License manager as per the official documentation. Did anyone encounter a similar problem and if so, what was the solution?
He has them, but there is still an error. Is there anything in the conf files? These are the role's capabilities: accelerate_search bulk_import_service_or_entity change_own_password configure_mltk_container configure_perms control_mltk_container delete_drift_detection_results delete_itsi_correlation_search delete_itsi_custom_threshold_windows delete_itsi_data_integration delete_itsi_deep_dive delete_itsi_deep_dive_context delete_itsi_drift_detection_template delete_itsi_event_management_export delete_itsi_event_management_state delete_itsi_glass_table delete_itsi_homeview delete_itsi_kpi_at_info delete_itsi_kpi_base_search delete_itsi_kpi_entity_threshold delete_itsi_kpi_state_cache delete_itsi_kpi_threshold_template delete_itsi_notable_aggregation_policy delete_itsi_notable_event_email_template delete_itsi_refresh_queue_job delete_itsi_sandbox_service delete_itsi_service delete_itsi_temporary_kpi delete_maintenance_calendar delete_module_interface delete_notable_event edit_log_alert_event edit_own_objects edit_search_schedule_window edit_sourcetypes edit_statsd_transforms edit_token_http embed_report entities_at_configurations_get execute-notable_event_action execute_notable_event_action export_results_is_visible get_drift_detection_kpis get_drift_detection_results get_metadata get_typeahead input_file interact_with_itsi_correlation_search interact_with_itsi_deep_dive interact_with_itsi_deep_dive_context interact_with_itsi_event_management_state interact_with_itsi_glass_table interact_with_itsi_homeview interact_with_itsi_notable_aggregation_policy kpis_at_configurations_get list_accelerate_search list_all_objects list_health list_inputs list_metrics_catalog list_mltk_container list_search_head_clustering list_settings list_storage_passwords list_tokens_own metric_alerts output_file pattern_detect read-notable_event read-notable_event_action read_itsi_backup_restore read_itsi_base_service_template read_itsi_correlation_search read_itsi_custom_threshold_windows read_itsi_data_integration read_itsi_deep_dive read_itsi_deep_dive_context read_itsi_drift_detection_template read_itsi_entity_discovery_searches read_itsi_entity_management_policies read_itsi_event_management_export read_itsi_event_management_state read_itsi_glass_table read_itsi_homeview read_itsi_kpi_at_info read_itsi_kpi_base_search read_itsi_kpi_entity_threshold read_itsi_kpi_state_cache read_itsi_kpi_threshold_template read_itsi_notable_aggregation_policy read_itsi_notable_event_email_template read_itsi_refresh_queue_job read_itsi_sandbox read_itsi_sandbox_service read_itsi_sandbox_sync_log read_itsi_service read_itsi_team read_itsi_temporary_kpi read_maintenance_calendar read_metric_ad read_module_interface read_notable_event read_notable_event_action request_remote_tok rest_access_server_endpoints rest_apps_view rest_properties_get rest_properties_set rtsearch run_collect run_custom_command run_dump run_mcollect run_msearch run_sendalert schedule_rtsearch schedule_search search search_process_config_refresh upload_lookup_files upload_onnx_model_file write-notable_event write_itsi_correlation_search write_itsi_custom_threshold_windows write_itsi_data_integration write_itsi_deep_dive write_itsi_deep_dive_context write_itsi_drift_detection_template write_itsi_event_management_export write_itsi_event_management_state write_itsi_glass_table write_itsi_homeview write_itsi_kpi_at_info write_itsi_kpi_base_search write_itsi_kpi_entity_threshold write_itsi_kpi_state_cache write_itsi_kpi_threshold_template write_itsi_notable_aggregation_policy
write_itsi_notable_event_email_template write_itsi_refresh_queue_job write_itsi_sandbox write_itsi_sandbox_service write_itsi_sandbox_sync_log write_itsi_service write_itsi_temporary_kpi write_maintenance_calendar write_metric_ad write_module_interface write_notable_event    
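As a sanity check against a list pasted like this, the capabilities a role actually grants (including inherited ones) can be pulled with something like the sketch below; the role name is a placeholder and the field names assume the standard authorization REST endpoint:
| rest /services/authorization/roles splunk_server=local
| search title="your_itsi_role"
| table title capabilities imported_capabilities
Comparing that output with the error message usually shows which specific capability is missing.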
OK. This is very, very strange. I've had logs with < and > signs many times over my years of Splunk experience and never noticed such behaviour. It is possible that you're triggering some obscure bug, so it's important to narrow down its scope (as I wrote earlier - try to pinpoint the exact moment when this issue appears - whether it's the transaction command, the table command after transaction, or whether it happens with the table command without transaction as well). And it's most probably support case material.
Anyhow, I strongly recommend you use that last option, as @PickleRick also presented!
@whitefang1726  The error does not indicate the total size of the KV Store. Instead, it means the data returned by a specific query is too large (exceeds 50 MB). Your query is likely retrieving too many records or large documents from the KV Store, exceeding the 50 MB limit per result set.
@whitefang1726  In the Splunk KV store, max_size_per_result_mb controls the maximum size of a result set (in MB) that can be returned from a single query to a collection. The default value is 50 MB, but it can be increased if you need to retrieve larger results from the KV store.
https://docs.splunk.com/Documentation/Splunk/latest/admin/Limitsconf
max_size_per_result_mb = <unsigned integer>
* The maximum size, in megabytes (MB), of the result that will be returned for a single query to a collection.
* Default: 50
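If you do need larger result sets, the setting lives under the [kvstore] stanza of limits.conf. A minimal sketch (the value 100 is just an example; raise it only as far as you need, since bigger result sets mean more memory use on the search head):
# $SPLUNK_HOME/etc/system/local/limits.conf (or the relevant app's local directory)
[kvstore]
max_size_per_result_mb = 100
A Splunk restart is typically needed for the change to take effect.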
Hi @a1bg503461  Please can you share the capabilities listed when the user runs:
| rest /services/authentication/current-context
If they are unable to run this then they are missing the rest_properties_get capability.
Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing
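For reference, to pull just the relevant fields out of that endpoint, something like this should work (a sketch; the field names assume the standard current-context output, and splunk_server=local simply keeps the REST call on the search head):
| rest /services/authentication/current-context splunk_server=local
| table username roles capabilities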