All Posts



Hi @Shane.Tembo, In case you don't hear back, I found this documentation that may help.  https://docs.appdynamics.com/observability/cisco-cloud-observability/en/cloud-and-infrastructure-monitoring/aws-cloud-observability/configure-aws-cloud-connections/set-up-the-cisco-appdynamics-infrastructure-collector-to-monitor-aws/deploy-the-cisco-appdynamics-infrastructure-collector-in-ecs-fargate
By default, Splunk installs self-signed certificates that can be used out of the box, but these should be replaced following your organisation's TLS certificate process and the TLS requirements in the Splunk docs. The default certificates are only intended for testing, POCs, etc. The table in this document shows the various components, the TLS certificate scenarios for Splunk Enterprise and Splunk Cloud, and their default status: https://docs.splunk.com/Documentation/Splunk/9.2.1/Security/AboutsecuringyourSplunkconfigurationwithSSL So, in your case: Between Search Heads and Indexers - you would need to create TLS certificates rather than use the default ones (unless you are just testing); this covers config and search data. Note the search heads should be connected to the indexers via the Cluster Manager. Between Indexers (Index Clustering) - if you configure indexers in a cluster, they should also use TLS certificates for the replication data; again, it is best to create your own TLS certificates for all the indexers. Side note: you would typically also use the indexers' certificates on the UFs for UF-to-indexer TLS data encryption. Here's a good link for understanding the Splunk TLS certificate process: https://lantern.splunk.com/Splunk_Platform/Product_Tips/Administration/Securing_the_Splunk_platform_with_TLS
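To make the above concrete, here is a minimal sketch of what the indexer-side and forwarder-side config might look like. The paths, port, hostnames, and output group name are hypothetical placeholders, not a definitive setup; adjust them to your own certificate locations and environment:

```ini
# indexer inputs.conf - receive S2S data over TLS (9997 is a common choice)
[splunktcp-ssl:9997]
disabled = 0

[SSL]
serverCert = $SPLUNK_HOME/etc/auth/mycerts/indexer.pem
sslPassword = <your-key-password>

# forwarder outputs.conf - send to the indexers and verify their certs
[tcpout:primary_indexers]
server = idx1.example.com:9997, idx2.example.com:9997
clientCert = $SPLUNK_HOME/etc/auth/mycerts/client.pem
sslVerifyServerCert = true
```

Restart Splunk after the change and check splunkd.log for SSL handshake errors to confirm the certs are being picked up.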
I am having an issue with the Advanced Hunting for Defender app in Splunk: https://splunkbase.splunk.com/app/5518 My original KQL query in Azure contains | join kind=inner. Is such syntax also possible in SPL?
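For what it's worth, SPL does have a join command, so KQL's | join kind=inner usually has a direct translation. A rough sketch (the index, sourcetypes, and the DeviceId field are hypothetical placeholders for whatever your data actually uses):

```
index=defender sourcetype=device_events
| join type=inner DeviceId
    [ search index=defender sourcetype=device_network ]
```

Note that SPL's join relies on a subsearch, which has result and time limits, so for large result sets a stats-based correlation on the shared field is often recommended instead of join.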
Hi @Naruto7431, what do you mean? You can normally search raw logs using Splunk and get the desired output. Could you describe your request in more detail? Ciao. Giuseppe
Hi, I'll try to be more specific. I have text files (one data type, for example); those *.txt files contain geographic data that I would like to query using Splunk. The size and format of those files vary - they could be txt, xml, json, etc., anywhere from 1 KB to 10 MB.
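Ingesting a directory of mixed files like that is typically done with a monitor input plus per-format sourcetypes. A minimal sketch, assuming a hypothetical /data/geo directory and made-up sourcetype names:

```ini
# inputs.conf - watch the directory (monitoring is recursive by default)
[monitor:///data/geo]
whitelist = \.(txt|xml|json)$
sourcetype = geo_data

# props.conf - example for the JSON variant, so fields extract automatically
[geo_json]
KV_MODE = json
```

In practice you would likely route each extension to its own sourcetype so the xml and json files get format-appropriate parsing.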
Hello Splunkers!! After resetting my admin password, the users' accounts are gone and are no longer visible in the UI. How can I restore all those user login accounts? In the picture above, from the backend, I can see the users, but from the UI no user account is visible.
From a normal Splunk search, can I also search inside the "show source" raw log and get the desired output?
Could you please confirm whether Splunk uses TLS/SSL for the following communications by default, or whether it must be configured manually: between Search Heads and Indexers, and between Indexers (Index Clustering)?
I am trying to create a triggered alert but it is not working. Is there any video or doc that walks through it from zero?
Did you continue to the next step "Map SAML groups to Splunk Enterprise roles "?
Not saying it is, but this could be an indication of the CPU being bottlenecked by disk I/O operations. It might be worth checking whether your disks are SSDs, or whether they align with what is recommended in the link below. See the sections "Notes about optimizing Splunk software and storage usage" and "What storage type should I use for a role?": https://docs.splunk.com/Documentation/Splunk/9.2.1/Capacity/Referencehardware If your disks are good, then it's something else, such as the current setup being overwhelmed with data and searches, in which case you need to expand - but that all depends on your current design, ingest volumes, and use case. The disk requirements would be more or less the same across your Splunk versions. Tip: the iostat command can be used to help check disk iowait (I've had to install it in the past for the Splunk Add-on for Unix and Linux and collected the stats that way).
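To make the iowait tip concrete: on Linux, iowait is the fifth counter on the "cpu" line of /proc/stat, and tools like iostat derive the percentage from the change between two samples. Here is a small, self-contained Python sketch of that calculation (not Splunk-specific; the sample lines are made up for illustration):

```python
def iowait_pct(sample1: str, sample2: str) -> float:
    """Percent of CPU time spent in iowait between two /proc/stat 'cpu' lines.

    Field order on the cpu line: user nice system idle iowait irq softirq ...
    """
    def counters(line):
        return [int(x) for x in line.split()[1:]]

    deltas = [b - a for a, b in zip(counters(sample1), counters(sample2))]
    total = sum(deltas)
    return 100.0 * deltas[4] / total  # index 4 (0-based) is iowait

# Two hypothetical samples taken a few seconds apart:
before = "cpu  1000 0 500 8000 500 0 0 0"
after  = "cpu  1100 0 550 8600 750 0 0 0"
print(round(iowait_pct(before, after), 1))  # prints 25.0
```

A sustained iowait percentage in the double digits during indexing or heavy search load is the kind of signal that points at the storage layer rather than raw CPU.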
Hi, we are integrating Splunk with our Microsoft Azure SSO and followed the instructions from https://learn.microsoft.com/en-us/entra/identity/saas-apps/splunkenterpriseandsplunkcloud-tutorial#configure-microsoft-entra-sso But after all the configuration, we are hitting "No valid Splunk role found in local mapping". We also checked "Configure SSO with Microsoft Azure AD or AD FS as your Identity Provider" in the Splunk documentation to remove the alias, but were not able to make it work.
Hello, we are ingesting Check Point logs through an Edge Processor to our SCP. We have deployed the Splunk Add-on for Check Point Log Exporter in SCP, but events are not parsing properly. I show it in a screenshot: we can only use these fields, related to the EP. Could someone help us? Thanks in advance.
Hi @jbv , good for you, see next time! Ciao and happy splunking Giuseppe P.S.: Karma Points are appreciated by all the contributors
I created a DVWA application and used the Splunk Universal Forwarder to forward the logs to port 9997, containing events such as SQL injection and brute-force cracking, but incident_review and ess_security_posture are not showing any events.
To integrate with SQL data, you need to use the DB Connect app, as it is designed for this purpose. You then have to configure it to communicate with the SQL server; this requires various services and other components, and yes, there are lots of small steps, but work through them slowly. The Change Data Capture table sounds like any other table, so you should be able to query it within the DB Connect app and send that data to Splunk once you have it configured. #Start here - follow these steps carefully. This is really good documentation - ensure you configure it for your environment's SQL server: https://lantern.splunk.com/Splunk_Platform/Product_Tips/Extending_the_Platform/Configuring_Splunk_DB_Connect #Install DB Connect - this is typically installed onto a Heavy Forwarder (a Splunk instance), or for small environments you can install it on a Search Head or an all-in-one, but you may have performance issues if you are running lots of searches, other Splunk apps, and other functions. The DB Connect app can't be installed onto a UF. https://splunkbase.splunk.com/app/2686 #Docs https://docs.splunk.com/Documentation/DBX/3.17.1/DeployDBX/AboutSplunkDBConnect
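Once the connection is configured in DB Connect, querying a CDC change table from the Splunk search bar looks roughly like this (the connection name and table name below are hypothetical placeholders for whatever you set up):

```
| dbxquery connection="my_mssql_conn" query="SELECT TOP 100 * FROM cdc.dbo_MyTable_CT"
```

If the ad-hoc dbxquery works, you can then turn the same SQL into a scheduled DB Connect input so the CDC rows are indexed into Splunk continuously.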
Hi Team, we use a MongoDB Python script to get the logs into Splunk. We could see historical logs being ingested, so we changed the checkpoint value to 2024-05-03 08:46:13.327000. After that it worked fine for some time, but then we again saw historical data being ingested at 2 am. How do we fix it? What should the checkpoint file value be?
Current setup: Indexers --> F5 VIP --> CM. The CM sees the requests coming from the F5 VIP rather than the actual source IPs of the indexers. The first indexer connects successfully through the F5 VIP, but later requests from the other indexers are dropped because the CM sees the connections coming from the same indexer (the same F5 VIP). How did you configure the load balancer so that the CM sees the actual source IP rather than the F5 VIP? Note: the CM and indexers sit on the same network. I do not have much knowledge on the LB side. Any assistance is much appreciated.
After pulling cases from ES into Phantom, a certain label is assigned to the event; later it is automatically promoted to a case. I have created a playbook that assigns labels to the promoted cases (based on the triggered Splunk rule), and it works 99% of the time, but sometimes I get two identical cases with different labels (the newly assigned one and the one configured in the Splunk app). Has anyone encountered this issue before?