All Posts



There are clear and simple steps for migrating users (as well as apps) from an old SHC or a standalone search head to a new SHC via the deployer: https://docs.splunk.com/Documentation/Splunk/9.4.1/DistSearch/Migratefromstandalonesearchheads I have done this a couple of times in several environments, for both apps and users. And don't migrate anything from Splunk's own system apps, like search! Also remember that when you deploy those apps to the SHC members via the deployer, everything is pushed into the default directories. This means that if users had previously created those objects (e.g. alerts) through the GUI, they cannot remove the copies you deployed with the deployer. They can change them, but an admin must remove them from the deployer and then push again to the members. That seems to be quite a common question from the users' side in cases like this!
@PickleRick @isoutamo @livehybrid @kiran_panchavat  The test found that when some nodes are isolated, the isolated nodes will not elect a new captain, because election requires more than half of the total number of nodes. And the captain cannot be manually specified. The following is the returned message: "This node is not the captain of the search head cluster, and we could not determine the current captain. The cluster is either in the process of electing a new captain, or this member hasn't joined the pool." https://docs.splunk.com/Documentation/Splunk/9.4.2/DistSearch/SHCarchitecture#Captain_election
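The majority requirement described above can be illustrated with a quick sketch (plain Python, not Splunk code — the function names are mine, just modeling the documented "more than half of all members" rule):

```python
def majority(total_members: int) -> int:
    """Minimum votes needed to elect a captain: a strict majority of ALL members."""
    return total_members // 2 + 1

def can_elect_captain(reachable: int, total_members: int) -> bool:
    """A network partition can elect a captain only if it contains a majority."""
    return reachable >= majority(total_members)

# A 5-member SHC split into a 3-node and a 2-node partition:
print(can_elect_captain(3, 5))  # majority side -> True
print(can_elect_captain(2, 5))  # isolated minority -> False
```

This is why the isolated minority in the test above stays captainless: the majority is counted against the cluster's total membership, not against the nodes a partition can see.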
As a side note, completely unrelated to the original problem - I'm wondering whether there will be any noticeable performance difference between first(something) and latest(something) when a default base search returns results in reverse chronological order.
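The semantic difference behind that question can be sketched in plain Python (not Splunk code, just modeling the two functions): first() takes the value from the first row in search order, while latest() picks the value from the event with the greatest timestamp.

```python
# Events as (_time, value) pairs, listed in reverse chronological order,
# the way a default Splunk search returns them.
events = [(300, "c"), (200, "b"), (100, "a")]

def first_fn(rows):
    """Like first(): the value from the first row in search order."""
    return rows[0][1]

def latest_fn(rows):
    """Like latest(): the value from the event with the greatest _time."""
    return max(rows, key=lambda r: r[0])[1]

# With reverse-chronological input the two agree on the result...
assert first_fn(events) == latest_fn(events) == "c"

# ...but if the ordering is not guaranteed, they can diverge:
shuffled = [(200, "b"), (300, "c"), (100, "a")]
print(first_fn(shuffled), latest_fn(shuffled))  # b c
```

So with a strictly reverse-chronological base search the results match; any performance gap would come from latest() having to consult timestamps while first() can simply take the first value it meets.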
Can you add how you did this update? Did you just upgrade node by node, or did you install new nodes and migrate Splunk to them somehow?
@Nraj87 Please follow this: https://docs.splunk.com/Documentation/ES/7.3.3/User/Howurgencyisassigned#Modify_the_urgency_lookup_directly

Modify the urgency lookup directly

You can change which severity and priority values result in which calculated urgency values for notable events in Splunk Enterprise Security. Only specific values are valid for severity or priority; use only those values when modifying the lookup, and do not modify the names of the notable event urgency values.

Valid severity values: unknown, informational, low, medium, high, critical.
Valid priority values: unknown, low, medium, high, critical.
Valid urgency values: informational, low, medium, high, critical.

1. On the Enterprise Security menu bar, select Configure > Content > Content Management.
2. Choose the Urgency Levels lookup. An editable, color-coded table representing the urgency lookup file displays.
3. In any row where the priority or severity is listed as unknown, review the assigned urgency.
4. (Optional) Edit the table and change the urgency to another one of the accepted values. All urgency values must be lower case.
5. Click Save.

If this helps, please upvote!
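Under the hood the Urgency Levels lookup is a simple CSV mapping severity and priority to urgency. A minimal sketch of what edited rows might look like (illustrative rows only, assuming the standard severity/priority/urgency columns — not the shipped file):

```
severity,priority,urgency
unknown,unknown,low
low,medium,low
medium,medium,medium
high,high,high
critical,critical,critical
```

Whatever you change, every urgency value must be one of the accepted lower-case values listed above.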
As others weighed in, I cannot reproduce this, either.  Here is test code based on your description: index=_internal "<" sourcetype=splunkd | transaction thread_id | table _raw Here, I'm including a transaction command followed by a table command.  Please see if you get any weird artifact.  For me, nothing unusual.
Hi @tdavison76  I would recommend using stats for this instead, see below: source="mobilepro-test" | strcat UserInfo.UserId " " Location.Site " " Session.StartTime label | stats latest(label) as label by Session.SessionId You could also switch the order, running stats first so that strcat only has to process one row per session: source="mobilepro-test" | stats latest(UserInfo.UserId) as UserInfo_UserId, latest(Location.Site) as Location_Site, latest(Session.StartTime) AS Session_StartTime by Session.SessionId | strcat UserInfo_UserId " " Location_Site " " Session_StartTime label | table Session.SessionId, label Note: We are using "latest" here, which keeps the most recent event.   Did this answer help you? If so, please consider: Adding karma to show it was useful Marking it as the solution if it resolved your issue Commenting if you need any clarification Your feedback encourages the volunteers in this community to continue contributing
Splunk cannot natively query Elasticsearch or use it as an external data source.  There is, however, an add-on to make some of that possible.  See https://splunkbase.splunk.com/app/6477
Hi @kn450  You might want to look at the ElasticSPL Add-on for Splunk on Splunkbase, which allows this. The main features include: Query Elasticsearch in an ad-hoc fashion using DSL, Lucene or ES|QL search statements for time-series data using elasticadhoc and elasticquery Query Elasticsearch in an ad-hoc fashion using DSL search statements for aggregated data using elasticadhocstats and elasticquerystats For more info and docs please see https://docs.datapunctum.com/elasticspl The app contains custom commands that allow you to search Elastic from Splunk without having to use a modular input to ingest the data into Splunk.  Did this answer help you? If so, please consider: Adding karma to show it was useful Marking it as the solution if it resolved your issue Commenting if you need any clarification Your feedback encourages the volunteers in this community to continue contributing
Hello @Nawab @dania_abujuma , Integrating Azure Web Application Firewall (WAF) logs into Splunk can be achieved using the Splunk Add-on for Microsoft Cloud Services. This add-on allows you to collect activity logs, service status, operational messages, Azure audit logs, and Azure resource data from various Microsoft cloud services, including Azure WAF, using Event Hubs, Azure Service Management APIs, and Azure Storage APIs. It provides the necessary inputs and CIM-compatible knowledge to integrate with other Splunk apps, such as Splunk Enterprise Security and the Splunk App for PCI Compliance. For more detailed information and installation instructions, you can refer to the official documentation.
Hello @danielbb , There are two add-ons for Cisco Meraki: Cisco Meraki Add-on for Splunk : https://splunkbase.splunk.com/app/5580  CCX Add-on for Cisco Meraki: https://splunkbase.splunk.com/app/7365 The Cisco Meraki Add-on for Splunk provides comprehensive network observability and security monitoring across your Meraki organizations by collecting data via Cisco Meraki REST APIs and webhooks. It delivers insights into network performance, security, and device health, and includes sample visualizations to help explore the data and create custom dashboards. The add-on also provides Common Information Model (CIM) compatible knowledge to integrate with other Splunk solutions, such as Splunk Enterprise Security and the Splunk App for PCI Compliance. For a more streamlined and CIM-compliant experience, you might consider the CCX Add-on for Cisco Meraki. This add-on offers enhanced field extraction and CIM compliance for Meraki logs, supporting sourcetypes such as meraki:securityappliances, meraki:devicesavailabilitieschangehistory, and others, making it a comprehensive field extraction bundle for Cisco Meraki logs. Both add-ons are compatible with Splunk Enterprise and Splunk Cloud. The Cisco Meraki Add-on for Splunk offers a broader set of features and visualizations, while the CCX Add-on focuses on enhanced CIM compliance and field extraction. Depending on your organization's needs, you can choose the add-on that best fits your requirements.
Hi Splunk Community, I’m working on a use case where data is stored in Elasticsearch, and I’d like to use Splunk solely as an interface for visualizing and querying the data using SPL (Search Processing Language) — without ingesting or storing the data again in Splunk, to avoid duplication and unnecessary storage costs. My main questions are: Is there a way to connect Splunk directly to Elasticsearch as an external data source? Can Splunk query external data (like from Elasticsearch) using SPL, without indexing it? Are there any available add-ons, modular inputs, or scripted solutions that allow this type of integration? Is this approach officially supported by Splunk, or would it require a custom integration? I’m aware that tools like Logstash or Kafka can be used to bring data into Splunk, but that’s exactly what I’m trying to avoid — I don’t want to duplicate the data storage. If anyone has experience with a similar setup, or any recommendations, I’d greatly appreciate your input. Thanks in advance!  
You could try stats source="mobilepro-test" | stats first(UserInfo.UserId) as UserInfo.UserId first(Location.Site) as Location.Site first(Session.StartTime) as Session.StartTime by Session.SessionId | strcat UserInfo.UserId " " Location.Site " " Session.StartTime label | table Session.SessionId, label
@hv64  Please review this older solution for reference https://community.splunk.com/t5/All-Apps-and-Add-ons/Splunk-DB-Connect-connection-to-Hana/m-p/311647 
Hello, I have a Search that is taking 5 min to complete when looking at only the last 24 hrs.  If possible, could someone help me figure out how I can improve this Search?  I need to dedup by SessionId and combine 3 fields into a single field. source="mobilepro-test" | dedup Session.SessionId | strcat UserInfo.UserId " " Location.Site " " Session.StartTime label | table Session.SessionId, label It looks like it's the dedup that is causing the slowness, but I have no idea how to improve that. Thanks for any help on this one, Tom
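As the answers in this thread suggest, dedup can usually be replaced by stats by Session.SessionId. Conceptually both keep one row per session key, which a plain Python sketch (not Splunk code, with made-up sample events) can illustrate:

```python
# Made-up sample events, newest first, mimicking the fields in the question.
events = [
    {"SessionId": "s1", "UserId": "u1", "Site": "A", "StartTime": "10:00"},
    {"SessionId": "s1", "UserId": "u1", "Site": "A", "StartTime": "10:00"},
    {"SessionId": "s2", "UserId": "u2", "Site": "B", "StartTime": "11:00"},
]

def first_per_key(rows, key):
    """Keep the first row seen for each key, like dedup / stats first()."""
    seen = {}
    for row in rows:
        seen.setdefault(row[key], row)
    return list(seen.values())

deduped = first_per_key(events, "SessionId")

# Combine three fields into one label per session, like strcat:
labels = {r["SessionId"]: f'{r["UserId"]} {r["Site"]} {r["StartTime"]}'
          for r in deduped}
print(labels)  # {'s1': 'u1 A 10:00', 's2': 'u2 B 11:00'}
```

The results are equivalent, but in Splunk the stats version is typically faster: stats can be computed in a distributed fashion on the indexers, whereas dedup has to carry whole events through to the search head before discarding the duplicates.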
Hi, We want to connect Splunk to an SAP HANA database. Does anyone have any ideas? Should we use ngdbc.jar and put that driver in $SPLUNK_HOME/etc/apps/splunk_app_db_connect/drivers? Regards.
Hi @vtamas  Just to check, how many licenses do you have listed on the license page of your License Manager?  As far as I know you should have 3: the Core/Enterprise license, the ITSI entitlement license, and the ITSI "internal" license (not to be confused with the other ITSI license - this one allows ITSI sourcetypes to be ingested without impacting your main core license).  Did this answer help you? If so, please consider: Adding karma to show it was useful Marking it as the solution if it resolved your issue Commenting if you need any clarification Your feedback encourages the volunteers in this community to continue contributing
@Mahendra_Penuma  I’m not familiar with the process for generating a diag file for the Edge Processor, but you may want to refer to this resource to see if it helps https://docs.splunk.com/Documentation/SplunkCloud/9.3.2408/EdgeProcessor/Troubleshooting#Generate_a_diagnostic_report_for_an_Edge_Processor_instance 
Need assistance creating a diag file on Splunk Edge Processor.
OK. A stupid question since I don't know ITSI. But ES has this nasty role configurator in the WebUI, and you cannot just add capabilities to a role using the standard Splunk role settings screen; you have to do it in ES and let the ES "modular input" managing capabilities do its magic. Doesn't ITSI have an equivalent of that? We had similar errors when trying to manage ES capabilities directly instead of via ES internal mechanisms.