We are planning to onboard Akamai platform logs into Splunk, following this guide - SIEM Splunk connector
In the process we have installed this Akamai add-on - Akamai SIEM Integration | Splunkbase
When we go to Settings > Data Inputs, as described in the SIEM Splunk connector guide, we are unable to find the data input "Akamai Security Incident Event Manager API".
We are also getting the following errors in Splunk after installing the add-on.
Deployer
Search head
Can you help us with this? We are stuck at the "data inputs" step.
I think we need to complete these prerequisites to get this Akamai add-on (modular input) working:
Please help us install Java on our Splunk instance, and check whether the KV Store is installed and working correctly.
@kiran_panchavat in the Akamai docs, it is stated that the Akamai Splunk Connector requires Java 8 (JRE 1.8) or above.
But here you have given JDK commands. Is it fine to install the JDK instead of the JRE? Is it the same?
For Error 1 (Modular Input):
Verify the script exists and is executable.
Install Java if missing and ensure it’s in the PATH.
Test the script manually and adjust permissions or dependencies as needed.
For Error 2 (KV Store):
Check mongod.log and splunkd.log for details.
Validate and renew server.pem if expired.
Fix permissions or reinitialize KV Store if necessary.
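To check whether server.pem has expired, openssl can report the certificate's expiry date. The sketch below generates a throwaway self-signed certificate so it is self-contained; on a real host you would point `-in` at `$SPLUNK_HOME/etc/auth/server.pem` instead.

```shell
# Generate a throwaway self-signed cert so this sketch is self-contained;
# on a real host, point -in at $SPLUNK_HOME/etc/auth/server.pem instead.
openssl req -x509 -newkey rsa:2048 -nodes -keyout /tmp/kv_key.pem \
  -out /tmp/kv_cert.pem -days 30 -subj "/CN=kvstore-test" 2>/dev/null

# Print the expiry (notAfter) date of the certificate
openssl x509 -in /tmp/kv_cert.pem -noout -enddate

# -checkend 0 exits 0 if the cert is still valid right now
openssl x509 -in /tmp/kv_cert.pem -noout -checkend 0 && echo "certificate still valid"
```

If the expiry date is in the past, the KV Store will refuse to start until the certificate is renewed.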
For Error 1 (Modular Input):
Verify the script exists and is executable.
Install Java if missing and ensure it’s in the PATH.
Test the script manually and adjust permissions or dependencies as needed.
How do I do this? Can you please guide me on how to install Java on my AWS-hosted Splunk instance?
Step 1: Run java -version. It should show Java 8 (JRE 1.8) or above.
Expected output: something like java version "1.8.0_351" or openjdk 11.0.2. This confirms Java is installed.
Step 2: Restart Splunk; you should then be able to see the data input option.
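A small sketch of the version check in step 1. The version string below is a hard-coded sample so the example is self-contained; on a real host you would capture it with `java -version 2>&1 | head -n1`.

```shell
# Sample version string; on a real host use: ver=$(java -version 2>&1 | head -n1)
ver='openjdk version "11.0.2" 2019-01-15'

# Extract the major version, handling both "1.8.x" and "11.x" formats
major=$(echo "$ver" | sed -E 's/[^"]*"([0-9]+)\.([0-9]+).*/\1.\2/')
case "$major" in
  1.*) major=${major#1.} ;;   # legacy "1.8" style -> 8
  *)   major=${major%%.*} ;;  # modern "11.0" style -> 11
esac

if [ "$major" -ge 8 ]; then
  echo "Java $major meets the connector's Java 8+ requirement"
else
  echo "Java $major is too old; install Java 8 or newer"
fi
```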
How do I install Java on my Splunk instance, which is hosted on AWS? Please guide me.
Please have a look:
https://www.youtube.com/watch?v=njniDvVqWik
https://www.youtube.com/watch?v=YY_Qk8EqzQw
@Karthikeya Please check this https://stackoverflow.com/questions/77418759/how-do-i-install-java-in-an-ec2-instance
To install Java on your Splunk instances running in AWS, follow these steps based on your instance’s OS:
For Amazon Linux 2 / RHEL / CentOS (yum-based):
Update the package manager:
sudo yum update -y
Install OpenJDK (recommended) or Oracle JDK:
sudo yum install -y java-11-openjdk
sudo yum install -y java-1.8.0-openjdk
Verify installation:
java -version
For Ubuntu / Debian (apt-based):
Update the package manager:
sudo apt update && sudo apt upgrade -y
Install OpenJDK (recommended) or Oracle JDK:
sudo apt install -y openjdk-11-jdk
sudo apt install -y openjdk-8-jdk
Verify installation:
java -version
Amazon Linux 2023 uses dnf instead of yum:
sudo dnf install -y java-11-amazon-corretto
Find the Java installation path:
sudo update-alternatives --config java
or
readlink -f $(which java)
Add the JAVA_HOME path to /etc/environment:
echo 'export JAVA_HOME=/usr/lib/jvm/java-11-openjdk' | sudo tee -a /etc/environment
source /etc/environment
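After that, a quick sanity check. The path below is a sample; substitute the JVM directory that readlink reported on your host.

```shell
# Sample path; substitute the JVM directory readlink reported on your host
export JAVA_HOME=/usr/lib/jvm/java-11-openjdk
echo "JAVA_HOME=$JAVA_HOME"
# On a real host, also confirm the binary works:
#   "$JAVA_HOME/bin/java" -version
```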
We have a deployment server which receives data from UFs. We also have a cluster manager, a deployer, and SHs. Where should we install and configure this add-on - on the DS, the deployer, or the SHs? Please confirm, I am confused. We don't have an HF at the moment. Where do we normally need to configure data inputs?
@Karthikeya If a Heavy Forwarder (HF) is not available, install the add-on on the search head.
@kiran_panchavat where should I create the new index so that all these Akamai logs flow into it? Assume we are configuring the data inputs on an HF...
You need to create a new index on the indexers. If you have a cluster master, you can create the index there and push it to the indexers. Additionally, if you create an index on the Heavy Forwarder (HF), you just need to add the index name in the data input configuration within the add-on.
Note: When you create an index on the HF, it does not store the data unless explicitly configured in the backend to do so. The HF will only collect the data and forward it to the indexers.
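As a sketch, the index stanza you would push from the cluster manager could look like the following. The app name `akamai_indexes` and the index name `akamai` are placeholders; use your own naming convention.

```ini
# $SPLUNK_HOME/etc/master-apps/akamai_indexes/local/indexes.conf on the
# cluster manager; push it to the indexers as a cluster bundle.
# "akamai" is a placeholder index name.
[akamai]
homePath   = $SPLUNK_DB/akamai/db
coldPath   = $SPLUNK_DB/akamai/colddb
thawedPath = $SPLUNK_DB/akamai/thaweddb
# repFactor = auto is required for the index to be replicated in a cluster
repFactor  = auto
```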
OK, I will create the new index on the CM and push it to the indexers. How do I tell the HF to send all Akamai logs to this new index? Where do I configure this? I am confused.
On the deployer, right, and then push it to the SHs? And where can I configure this?
@kiran_panchavat thank you. On the EC2 instance, from which path do I need to run all these commands?
Also, were you able to fix your KV Store issue, or do you still need help with that? Please refer to the previous response about checking the mongod.log / splunkd.log files to investigate this issue too.
Thanks
Will
@livehybrid the KV Store issue was resolved once I installed Java. I am now stuck on how to assign the newly created index to all Akamai logs.
Hi @Karthikeya
How have you configured the Data collection? Have you done this in the UI on the HF or did you deploy the inputs.conf from your Deployment Server?
If you are pushing an inputs.conf then you can specify index=<yourIndex> in the stanza for your input in your inputs.conf
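As an illustrative sketch only: the stanza name depends on the modular input scheme the Akamai add-on registers, so copy the real stanza name from the add-on's own local/inputs.conf; `akamai` is a placeholder index name.

```ini
# local/inputs.conf in the Akamai add-on's app directory on the HF.
# The scheme/stanza name below is illustrative; use the stanza the
# add-on actually created when you configured the input.
[akamai_siem://akamai_security_events]
index = akamai
disabled = 0
```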
Feel free to share some examples of your configuration so we can create a more relevant response!
Please let me know how you get on and consider adding karma to this or any other answer if it has helped.
Regards
Will
I did it in the HF UI by configuring the data input, but nowhere did it ask about an index. Where do I configure the index now? I have already created the new index on the CM and pushed it to the indexers. How do I map these logs to the new index?
Hi @Karthikeya
I think it would be worth focussing on the KV Store issue first as that might (although might not!) rectify your other issue if the app relies on the KV Store.
Have you made any other recent changes to the KV Store or Splunk version?
Are there any logs in splunkd.log ($SPLUNK_HOME/var/log/splunk/splunkd.log) which might indicate what the issue with KV Store is?
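As a sketch, you can filter KV Store-related lines out of splunkd.log like this. The here-doc stands in for the real log so the example is self-contained, and the sample ERROR line is invented for illustration; on a real host you would run grep directly against `$SPLUNK_HOME/var/log/splunk/splunkd.log`.

```shell
# Count KV Store related lines. The here-doc is a stand-in for the real
# splunkd.log; the sample ERROR line below is invented for illustration.
matches=$(grep -ciE "kvstore|mongod" <<'EOF'
01-01-2025 12:00:00.000 ERROR KVStoreConfigurationProvider - Could not get ping from mongod
01-01-2025 12:00:01.000 INFO  TailingProcessor - Parsing configuration stanza
EOF
)
echo "KV Store related lines found: $matches"
```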
Please let me know how you get on and consider adding karma to this or any other answer if it has helped.
Regards
Will