All Apps and Add-ons

How to configure the Splunk Add-on for Check Point OPSEC LEA to grab older logs named by date, not just FW.log?

srbitlog
Engager

My Check Point firewall is set to rotate logs monthly, which makes it easier to delete older, unwanted logs. The issue is that the Splunk Add-on for Check Point OPSEC LEA loggrabber is programmed to grab only FW.log, which in my case is just the current month's log. How do I get Splunk to grab the older logs, which are named by date, as well?

1 Solution

jamesarmitage
Path Finder

It sounds like this is a one-time import of historical data, since once you're up and running you'll always be monitoring the most current log file.

I'm not sure the app supports what you're trying to do, but as a possible workaround:

  1. cd to /opt/splunk/etc/apps/Splunk_TA_checkpoint-opseclea/bin/
  2. Manually run lea_loggrabber to determine the logfiles you need
  3. Retrieve the data you need to temporary text files
  4. Index those files with the proper sourcetype

You'll probably want to test this before applying to a production environment.

To establish the names of the logfiles you want, I'd suggest sending the output to a pager (I use less), so you can scroll through it.

For example, to get your non-audit data, try the following. You'll have to substitute your own values for appname, lea_server_ip, and so on:

./lea_loggrabber --data non_audit --debug_level 2 \
  --appname Splunk_TA_checkpoint-opseclealea_loggrabber \
  --lea_server_ip 10.1.2.3 --lea_server_auth_port 18184 \
  --lea_server_auth_type sslca \
  --opsec_sslca_file /opt/splunk/etc/apps/Splunk_TA_checkpoint-opseclea/certs/checkpoint.p12 \
  --opsec_sic_name CN=opsec_splunk_hf,O=your.institution.name.7ag9h5 \
  --opsec_entity_sic_name CN=cp_mgmt_yourmanagementserver,O=your.institution.name.7ag9h5 \
  --no_online --no_resolve 2>&1 | less

You'll find a line that reads `log_level=2 file:lea_loggrabber.cpp func_name:get_fw1_logfiles_dict code_line_no:2414 :Available FW-1 Logfiles`, followed by a long list of the available file names.
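To turn that listing into the text file the loop further down reads from, one option is to save the grabber's debug output to a file and extract the names. This is just a sketch: it assumes the dated logfiles follow the `YYYY-MM-DD_HHMMSS.log` pattern shown in the example below, and the `/tmp` paths are placeholders.

```shell
# Assumes the debug output from the command above was saved first, e.g.:
#   ./lea_loggrabber ... 2>&1 > /tmp/lea-debug.txt
# Extract anything that looks like a dated logfile name, deduplicated:
grep -oE '[0-9]{4}-[0-9]{2}-[0-9]{2}_[0-9]{6}\.log' /tmp/lea-debug.txt | sort -u > your-text-file.txt
```

That gives you one logfile name per line, ready to feed to a loop.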

Once you have a specific file in mind, you can retrieve the specific data you need by adding the --logfile flag to the command above. For example, my historical logs were daily, so to pull the July 17, 2015 data I would use something like:

./lea_loggrabber --data non_audit --debug_level 2 \
  --appname Splunk_TA_checkpoint-opseclealea_loggrabber \
  --lea_server_ip 10.1.2.3 --lea_server_auth_port 18184 \
  --lea_server_auth_type sslca \
  --opsec_sslca_file /opt/splunk/etc/apps/Splunk_TA_checkpoint-opseclea/certs/checkpoint.p12 \
  --opsec_sic_name CN=opsec_splunk_hf,O=your.institution.name.7ag9h5 \
  --opsec_entity_sic_name CN=cp_mgmt_yourmanagementserver,O=your.institution.name.7ag9h5 \
  --no_online --no_resolve --logfile 2015-07-17_235900.log > /var/log/checkpoint-2015-07-17.log

At this point you should be able to index the logfile with the proper sourcetype and have the props and transforms tag up your data as intended.
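One way to have Splunk sweep those temporary files up is a batch input stanza in inputs.conf. This is a sketch, not the add-on's documented method: the sourcetype name (`opsec`) and index here are assumptions, so use whatever your TA's props and transforms are actually keyed to. Note that `move_policy = sinkhole` deletes each file once it has been indexed, so point it only at the temporary copies.

```
[batch:///var/log/checkpoint-*.log]
move_policy = sinkhole
sourcetype = opsec
index = main
disabled = false
```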

If that works for you, it's not a big step to write a bash loop to pull all the remaining files into a temporary location prior to indexing. If you can get all the desired logfiles into a text file, then something like this should pull them in sequence for you:

while IFS= read -r line; do
  /opt/splunk/etc/apps/Splunk_TA_checkpoint-opseclea/bin/lea_loggrabber \
    --data non_audit --debug_level 2 \
    --appname Splunk_TA_checkpoint-opseclealea_loggrabber \
    --lea_server_ip 10.1.2.3 --lea_server_auth_port 18184 \
    --lea_server_auth_type sslca \
    --opsec_sslca_file /opt/splunk/etc/apps/Splunk_TA_checkpoint-opseclea/certs/checkpoint.p12 \
    --opsec_sic_name CN=opsec_splunk_hf,O=your.institution.name.7ag9h5 \
    --opsec_entity_sic_name CN=cp_mgmt_yourmanagementserver,O=your.institution.name.7ag9h5 \
    --no_online --no_resolve --logfile "$line" > "/var/log/checkpoint-$line"
done < your-text-file.txt
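A quick sanity pass afterwards can catch pulls that failed silently, since a mistyped logfile name still creates an empty output file via the redirection. This is a sketch using the /var/log/checkpoint- naming from the loop above:

```shell
# Remove zero-byte pulls so they aren't indexed later.
dir=/var/log   # where the loop above wrote its output
for f in "$dir"/checkpoint-*; do
  [ -e "$f" ] || continue                                   # glob matched nothing
  [ -s "$f" ] || { echo "empty pull, removing: $f" >&2; rm -f "$f"; }
done
```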


srbitlog
Engager

Thanks that worked well for me!
