All Apps and Add-ons

Akamai logging to Splunk

Karthikeya
Communicator

We are planning to onboard Akamai platform logs into Splunk. We are following this guide to implement it - SIEM Splunk connector

 

As part of this, we have installed the Akamai add-on - Akamai SIEM Integration | Splunkbase

 

When we go to Settings > Data Inputs as described in the SIEM Splunk connector guide, we are unable to find the expected data input: Akamai Security Incident Event Manager API.

 

We are also getting the following errors in Splunk after installing the add-on.

 

Deployer

Karthikeya_0-1741101765486.png

 

 

 

Search head

Karthikeya_1-1741101765489.png

 

Can you help us with this? We are stuck at the "data inputs" step.

I think we need to complete these prerequisites to get this Akamai add-on (modular input) to work:

Karthikeya_2-1741101810654.png

 

 

Please help us install Java on our Splunk instance, and also confirm whether KV Store is installed and working correctly.


Karthikeya
Communicator

@kiran_panchavat the Akamai docs say the Akamai Splunk Connector requires Java 8 (JRE 1.8) or above.

But here you have suggested the JDK. Is it fine to install the JDK instead of the JRE? Are they the same?


kiran_panchavat
SplunkTrust
SplunkTrust

@Karthikeya


For Error 1 (Modular Input):

Verify the script exists and is executable.
Install Java if missing and ensure it’s in the PATH.
Test the script manually and adjust permissions or dependencies as needed. 
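A quick shell sketch of these checks (the add-on directory name `TA-Akamai_SIEM` and the default `SPLUNK_HOME` below are assumptions; adjust them to your installation):

```shell
# Assumed paths -- adjust SPLUNK_HOME and the add-on folder name to your setup.
SPLUNK_HOME=${SPLUNK_HOME:-/opt/splunk}
APP_BIN="$SPLUNK_HOME/etc/apps/TA-Akamai_SIEM/bin"   # hypothetical folder name

# 1. Is the modular-input script present and executable?
if [ -d "$APP_BIN" ]; then
  ls -l "$APP_BIN"
else
  echo "Add-on bin directory not found: $APP_BIN"
fi

# 2. Is Java installed and on the PATH of the user running splunkd?
if command -v java >/dev/null 2>&1; then
  java -version
else
  echo "java not found in PATH"
fi
```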


For Error 2 (KV Store):


Check mongod.log and splunkd.log for details.
Validate and renew server.pem if expired.
Fix permissions or reinitialize KV Store if necessary.
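These KV Store checks can be sketched like this (the default `SPLUNK_HOME` path is an assumption):

```shell
# Assumed default install path; adjust to your environment.
SPLUNK_HOME=${SPLUNK_HOME:-/opt/splunk}

# 1. Recent KV Store errors in splunkd.log and mongod.log
grep -i "kvstore" "$SPLUNK_HOME/var/log/splunk/splunkd.log" 2>/dev/null | tail -20
tail -20 "$SPLUNK_HOME/var/log/splunk/mongod.log" 2>/dev/null

# 2. Check whether server.pem has expired (renew it if the date is in the past)
openssl x509 -enddate -noout -in "$SPLUNK_HOME/etc/auth/server.pem" 2>/dev/null

# 3. Ask Splunk for the current KV Store status
"$SPLUNK_HOME/bin/splunk" show kvstore-status 2>/dev/null || true
```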

 
Did this help? If yes, please consider giving kudos, marking it as the solution, or commenting for clarification — your feedback keeps the community going!

Karthikeya
Communicator

For Error 1 (Modular Input):

Verify the script exists and is executable.
Install Java if missing and ensure it’s in the PATH.
Test the script manually and adjust permissions or dependencies as needed. 

How to do this? can you please guide me... how to install java on my AWS Splunk instance?

 


kiran_panchavat
SplunkTrust
SplunkTrust

@Karthikeya 

Step 1: Run java -version. It should show Java 8 (JRE 1.8) or above.

Expected output: something like java version "1.8.0_351" or openjdk version "11.0.2". This confirms Java is installed.

Step 2: Restart Splunk; you should then be able to see the data input option.
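The two steps above can be sketched as a small guarded script (the default `SPLUNK_HOME` path is an assumption):

```shell
SPLUNK_HOME=${SPLUNK_HOME:-/opt/splunk}

# Step 1: check the Java version (the first line shows e.g. 1.8.x or 11.x)
if command -v java >/dev/null 2>&1; then
  java -version 2>&1 | head -1
  # Step 2: restart Splunk so the add-on's modular input gets registered
  "$SPLUNK_HOME/bin/splunk" restart 2>/dev/null || echo "splunk binary not found under $SPLUNK_HOME/bin"
else
  echo "Java is not installed or not on the PATH"
fi
```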


Karthikeya
Communicator

How do I install Java on my Splunk instance hosted on AWS? Please guide me.


kiran_panchavat
SplunkTrust
SplunkTrust

@Karthikeya 

Please have a look at these:

https://www.youtube.com/watch?v=njniDvVqWik 

https://www.youtube.com/watch?v=YY_Qk8EqzQw 


kiran_panchavat
SplunkTrust
SplunkTrust

@Karthikeya  Please check this https://stackoverflow.com/questions/77418759/how-do-i-install-java-in-an-ec2-instance 


kiran_panchavat
SplunkTrust
SplunkTrust

@Karthikeya 

To install Java on your Splunk instances running in AWS, follow these steps based on your instance’s OS:

For Amazon Linux / RHEL / CentOS

  1. Update the package manager:

    sudo yum update -y
  2. Install OpenJDK (recommended) or Oracle JDK:

    • For OpenJDK 11 (recommended for Splunk):
      sudo yum install -y java-11-openjdk
    • If you need Java 8:
      sudo yum install -y java-1.8.0-openjdk
  3. Verify installation:

    java -version

For Ubuntu/Debian

  1. Update the package manager:

    sudo apt update && sudo apt upgrade -y
  2. Install OpenJDK (recommended) or Oracle JDK:

    • For OpenJDK 11 (recommended for Splunk):
      sudo apt install -y openjdk-11-jdk
    • If you need Java 8:
      sudo apt install -y openjdk-8-jdk
  3. Verify installation:

    java -version

For Amazon Linux 2023

Amazon Linux 2023 uses dnf instead of yum:

sudo dnf install -y java-11-amazon-corretto

Setting JAVA_HOME (if required)

  1. Find the Java installation path:

    sudo update-alternatives --config java

    or

    readlink -f $(which java)
  2. Set JAVA_HOME persistently. Note that /etc/environment is read by PAM, not by a shell, so export statements do not belong there; a profile.d script is more reliable:

    echo 'export JAVA_HOME=/usr/lib/jvm/java-11-openjdk' | sudo tee /etc/profile.d/java_home.sh
    source /etc/profile.d/java_home.sh
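After setting it, a quick way to verify that JAVA_HOME points at a working JVM (the exact path varies by distro and Java version):

```shell
# Verify JAVA_HOME; prints the Java version if the path is valid
JH=${JAVA_HOME:-}
if [ -n "$JH" ] && [ -x "$JH/bin/java" ]; then
  "$JH/bin/java" -version
else
  echo "JAVA_HOME is unset or does not point at a Java installation"
fi
```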

 


Karthikeya
Communicator

We have a deployment server which receives data from UFs, and we also have a cluster manager, a deployer, and search heads. Where should we install and configure this add-on: on the DS, the deployer, or the SHs? Please confirm, I am confused. We don't have an HF at the moment. Where are data inputs normally configured?


kiran_panchavat
SplunkTrust
SplunkTrust

@Karthikeya If a Heavy Forwarder (HF) is not available, install the add-on on the search head.


Karthikeya
Communicator

@kiran_panchavat where should I create the new index so that all the Akamai logs flow into it? Assume we are configuring the data inputs on an HF...


kiran_panchavat
SplunkTrust
SplunkTrust

@Karthikeya

You need to create the new index on the indexers. If you have a cluster manager, you can create the index there and push it to the indexers. Then, in the add-on's data input configuration on the HF, you just need to add the index name.

Note: even if you create an index on the HF, it does not store the data unless explicitly configured in the backend to do so. The HF only collects the data and forwards it to the indexers.
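As a sketch, the index definition on the cluster manager could look like the following. The index name akamai_siem is a hypothetical choice, and the path is master-apps on older Splunk versions (manager-apps on newer ones):

```ini
# $SPLUNK_HOME/etc/master-apps/_cluster/local/indexes.conf  (on the cluster manager)
[akamai_siem]
homePath   = $SPLUNK_DB/akamai_siem/db
coldPath   = $SPLUNK_DB/akamai_siem/colddb
thawedPath = $SPLUNK_DB/akamai_siem/thaweddb
```

Push it to the peers with splunk apply cluster-bundle, then refer to that index name in the add-on's data input on the HF.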


Karthikeya
Communicator

OK, I will create the new index on the CM and push it to the indexers. How do I tell the HF to forward all the Akamai logs into this new index? Where do I configure this? Please help, I am confused.


Karthikeya
Communicator

On the deployer, right, and then push it to the SHs? And where can I configure this?


Karthikeya
Communicator

@kiran_panchavat thank you. On the EC2 instance, from which path do I need to run all these commands?


livehybrid
SplunkTrust
SplunkTrust

Also, were you able to fix your KV Store issue, or do you still need help with it? Please refer to the previous response about checking the mongod.log / splunkd.log logs to look into this issue too.

Thanks

Will

Karthikeya
Communicator

@livehybrid the KV Store issue was resolved once I installed Java. I am now stuck on how to assign the newly created index to all the Akamai logs.


livehybrid
SplunkTrust
SplunkTrust

Hi @Karthikeya 

How have you configured the data collection? Did you do it in the UI on the HF, or did you deploy the inputs.conf from your deployment server?

If you are pushing an inputs.conf, you can specify index = <yourIndex> in the stanza for your input in your inputs.conf
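For example, a deployed stanza might look like the following. The scheme and input names here are guesses for illustration; check the real stanza the UI created with $SPLUNK_HOME/bin/splunk btool inputs list --debug on the HF:

```ini
# inputs.conf -- "akamai_siem" scheme and input name are hypothetical
[akamai_siem://my_akamai_input]
index = akamai_siem
interval = 60
disabled = 0
```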

Feel free to share some examples of your configuration so we can create a more relevant response!

Please let me know how you get on and consider adding karma to this or any other answer if it has helped.
Regards

Will

Karthikeya
Communicator

I did it in the HF UI by configuring the data input, but it never asked about an index. Where do I configure the index now? I have already created the new index on the CM and pushed it to the indexers. How do I map these logs to the new index?


livehybrid
SplunkTrust
SplunkTrust

Hi @Karthikeya 

I think it would be worth focusing on the KV Store issue first, as fixing it might (although might not!) rectify your other issue if the app relies on the KV Store.

Have you made any other recent changes to the KV Store or Splunk version?

Are there any logs in splunkd.log ($SPLUNK_HOME/var/log/splunk/splunkd.log) which might indicate what the issue with KV Store is?

Regards

Will
