All Posts

@splunklearner Please check the prerequisites: https://techdocs.akamai.com/siem-integration/docs/siem-splunk-connector
@Nrsch If you're using Splunk ES version 8.x, navigate to the Splunk ES App, then go to Mission Control, where you'll find the "Analyst Queue." This serves the same function as "Incident Review."
@Nrsch In Splunk Enterprise Security (ES), when a saved search like "Malware - Total Infection Count" is triggered, the results typically manifest as notable events. These notable events are designed to alert security analysts to potential issues and are centralized in specific dashboards within ES.

Incident Review Dashboard: the main place to view triggered notable events from security saved searches, including something like "Malware - Total Infection Count."

How to access:
1. Log into Splunk ES.
2. Navigate to Security > Incident Review in the ES menu.
3. Look for notable events tied to the "Malware - Total Infection Count" search. You can filter by search name, urgency (e.g., critical, high), or time range to locate the specific event.

Security Posture Dashboard: provides a high-level overview of notable event activity across your environment.

https://docs.splunk.com/Documentation/ES/7.3.3/User/IncidentReviewdashboard
https://docs.splunk.com/Documentation/ES/7.3.3/User/IncidentReviewdashboard#How_Splunk_Enterprise_Security_identifies_notable_events
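If you prefer to query the notable events directly instead of using the dashboard, a minimal SPL sketch (this assumes the default `notable` index and the standard `search_name` and `urgency` fields ES writes to notables; adjust to your environment):

```
index=notable search_name="Malware - Total Infection Count"
| table _time, search_name, urgency, src, dest
```

This is roughly what the Incident Review dashboard surfaces for you, so the dashboard is usually the easier route.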
@splunklearner  Go to Settings > Data Inputs, where you will find the Akamai data input.
Thanks, and have the fields already been extracted from these events? For 1, do you just want a count of these events? For 2, do you just want the total response time for all the events?
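Assuming the fields are already extracted, both could be answered with a single stats search; a sketch (the index, sourcetype, and `response_time` field name are placeholders, not taken from your data):

```
index=your_index sourcetype=your_sourcetype
| stats count AS event_count, sum(response_time) AS total_response_time
```

If the fields aren't extracted yet, that would need to be sorted out first.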
Hi, there are some security saved searches and key indicators in ES. If I activate these searches and they trigger, in which dashboard in ES can I see the results? For example, if the search "Malware - Total Infection Count" triggers, in which dashboard in ES can I see the result? # ES # enterprise security
I am getting the data extracted and published to a dashboard, but the problem is that the "Count" is published on separate rows, not merged in with the other rows. I want the count (from which the percentage is calculated) to end up as an additional column together with the Percentage, Route and Method. This is the SignalFlow I currently use:

B = data('http_requests_total', filter=filter('k8s.namespace.name', 'customer-service-pages-prd')).count()
A = data('http_requests_total', filter=filter('k8s.namespace.name', 'customer-service-pages-prd')).count(by=['route', 'method'])
Percentage = (A/B * 100)
Percentage.publish(label='Percentage')
A.publish('Count')

Any ideas on how to merge the data so that Count also ends up on the same rows as the Percentage?
I am stuck at this point -- Click the Akamai Security Incident Event Manager API. I can't find this in data inputs after installing add-on.
@splunklearner If you don't have a heavy forwarder and need to install the add-on, you can install it on the search head cluster. Please refer to the documentation below for more details and installation instructions. To deploy an add-on to the search head cluster members, use the deployer.

Install an add-on in a distributed Splunk Enterprise deployment - Splunk Documentation
https://docs.splunk.com/Documentation/Splunk/9.4.1/DistSearch/PropagateSHCconfigurationchanges
@splunklearner I recommend using the add-on. Akamai SIEM Integration | Splunkbase
@splunklearner Install the add-on on your heavy forwarder and configure it. You have two options for sending logs to Splunk:

1. Install the add-on on your heavy forwarder and use it to send logs to Splunk.
2. If Akamai supports syslog, you can send logs to your syslog server, which will then forward them to Splunk. In this case, configure syslog-ng or rsyslog to capture Akamai logs in a specific directory and create the necessary inputs to onboard the logs into Splunk. Configure the UF on your syslog server to monitor the log files. Update the inputs.conf file to specify the log file paths and the outputs.conf file to forward the data to your indexers.

Example inputs.conf:

[monitor:///var/log/akamai/*.log]
index = akamai
sourcetype = akamaisiem
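For option 2, the outputs.conf side mentioned above might look like the following minimal sketch on the syslog-server UF (the group name, indexer hostnames, and port 9997 are placeholder assumptions; substitute your own):

```
[tcpout]
defaultGroup = primary_indexers

[tcpout:primary_indexers]
server = idx1.example.com:9997, idx2.example.com:9997
```

In a clustered environment you would typically list all indexers (or use indexer discovery) so the UF load-balances across them.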
@splunklearner  Please follow this SIEM Splunk connector
Can anyone please help me get Akamai logs into Splunk? We have a clustered environment with a syslog server (UF installed on it) that forwards data; apps are initially staged on our Deployment Server and then deployed via the Cluster Manager and Deployer. We have 6 indexers, with 2 indexers in each site (3-site multisite cluster), and 3 search heads, one in each site. How should I proceed with this?
Hi @sol69 Please find the following instructions for configuring the add-on.

Prerequisites

Wireshark Installation: Download and install Wireshark. During the installation process, deselect all components except for tshark (this is the command-line tool needed for packet capture), unless you have other reasons for installing the full package.

TA-tshark app Installation: Install the TA-tshark add-on on your Universal Forwarder (UF). After installation, ensure you configure the add-on to forward the necessary data.

Configuration Steps

1. Modify Configuration Files
- inputs.conf: Locate the file (often included in the app package). If needed, modify the configuration. By default, it is set up for Windows to capture traffic on port 53 (DNS) on the first interface. The input is defined with the name tshark:port53 and a specified sourcetype.
- bin/tcpdump.path: Adjust this file if your environment requires a different tcpdump/tshark path than what is provided.

2. Enable Packet Capture: In the inputs.conf file, find the stanza corresponding to the capture input and set disabled = 0 to enable the capture feature.

3. Restart the Universal Forwarder (UF): After making all changes, restart the UF to apply the new configuration settings.

Optional: Additional Apps for Enhanced Functionality

For further insights and to extend the functionality of the installed app, consider installing the following complementary Splunk apps:
- DNS Insight: DNS Insight on Splunkbase
- DHCP Insight: DHCP Insight on Splunkbase

These apps provide additional analysis and visualization capabilities related to DNS and DHCP traffic.

Note - How you install the app on your UF may depend on your architecture - are you using a Deployment Server to distribute apps to your UF(s)?

Please let me know how you get on and consider adding karma to this or any other answer if it has helped.

Regards
Will
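As a rough illustration of the inputs.conf stanza described in step 2 (the script path, index, and interval here are assumptions for illustration, not the add-on's exact defaults - verify against the app's own default/inputs.conf):

```
[script://./bin/tcpdump.path]
disabled = 0
index = your_index
sourcetype = tshark:port53
```

The key change is flipping disabled from 1 to 0 in a local/inputs.conf copy rather than editing the default file.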
Thanks for the reply, @yuanliu. Sadly I don't know whether it's actually json; it might be. It's a college assignment, and we just know it's a bunch of data/logs in tar.gz. "src_ip" and the other one have never appeared automatically in interesting fields so far. Would you expect them to appear as their "natural names" if it was json or would I need to do something proactive? Either way, why doesn't the extracted field appear?
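One quick way to check whether the events are actually JSON is to run spath explicitly over a handful of events; a sketch (index and sourcetype are placeholders for whatever the assignment data landed in):

```
index=your_index sourcetype=your_sourcetype
| head 10
| spath
| table _time, src_ip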
@Rakzskull  Splunk manages the archival storage in DDAA, and you don’t have direct access to the underlying S3 buckets. To export archived data: Open a support ticket with Splunk.
@sol69  I recommend exploring an alternative method for forwarding the data, as this add-on or app does not appear to be CIM-compliant. It would be best to review this documentation for more details. https://community.splunk.com/t5/Splunk-Enterprise/Monitoring-Wireshark-usage-with-splunk/m-p/690530  https://community.splunk.com/t5/Monitoring-Splunk/Splunk-monitoring-a-wireshark-file/td-p/14218 
@sol69 To configure the inputs.conf for the TA_tshark (Network Input for Windows) on Splunk, follow these steps:

1. Install TA_tshark: Install the TA_tshark on your Universal Forwarder (UF) and configure forwarding.
2. Modify inputs.conf: Open the inputs.conf file located in $SPLUNK_HOME/etc/apps/TA_tshark/local/ (create the file if it doesn't exist). Add the following configuration to capture DNS traffic on port 53:

[script://<give your path>]
disabled = 0
index = your_index
sourcetype = tshark:port53

Ensure the disabled attribute is set to 0 to enable the input.
3. Modify tcpdump.path: If needed, update the bin/tcpdump.path file to point to the correct path of tshark.
4. Restart the Universal Forwarder: After making these changes, restart the Universal Forwarder to apply the new configuration.

inputs.conf - Splunk Documentation
@shabamichae In the Splunk Architect practical lab exam, configuring TLS/SSL for Universal Forwarder (UF) to Indexer (IDX) communication is not strictly required unless explicitly mentioned in the exam requirements. If the exam explicitly states that secure communication must be configured, then failing to implement SSL/TLS for UF-IDX traffic could result in deductions. Since time is limited, focus on core configurations (indexing, forwarding, clustering, search head deployment) first, then handle TLS if necessary.
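If the exam does call for it, the UF-to-indexer TLS piece usually comes down to a few stanzas; a minimal sketch, assuming port 9997 and placeholder certificate paths (the cert files and hostnames here are illustrative, not exam values):

```
# inputs.conf on the indexer
[splunktcp-ssl:9997]

[SSL]
serverCert = $SPLUNK_HOME/etc/auth/mycerts/server.pem

# outputs.conf on the UF
[tcpout:ssl_indexers]
server = idx.example.com:9997
clientCert = $SPLUNK_HOME/etc/auth/mycerts/client.pem
```

Knowing this shape cold makes it a quick add at the end if time allows, rather than something to attempt first.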
In the practical lab environment, how important is it to configure TLS on Splunk servers during the practical lab? Do I get penalized for not securing UF-IDX traffic using SSL/TLS?