Splunk Enterprise

Storage for DB Connect Checkpoint values

altink
Builder

Hi

Where are the Checkpoint values for enabled DB Connect Inputs stored?

I checked the folder:
/opt/splunk/var/lib/splunk/modinputs/server/splunk_app_db_connect

There are only files named after our disabled DB Inputs, but none for our enabled DB Inputs.

Splunk Enterprise Version: 9.0.4.1
Splunk DB Connect Version: 3.6.0

P.S. Our three enabled DB Inputs work correctly, and I can see their checkpoint values in the web UI.
I just cannot find where they are stored on the OS.

best regards
Altin


altink
Builder

Upgrading is not an option for now. Furthermore, everything works fine with DB Connect; it will be upgraded together with the whole system.

I asked this question precisely because I could not find the checkpoint persistence files for our enabled DB Inputs in the folder:
/opt/splunk/var/lib/splunk/modinputs/server/splunk_app_db_connect


altink
Builder

Thank You @Richfez 

My first need (and I should have said this in the opening post) was not to access them at all. I just want to know where they are, so I can back them up just like everything else in the /etc/apps folder.

But editing them could also become a need, e.g. in case of DB Input info loss of any kind. If so, I guess it would be best to edit them (at one's own risk) directly on the OS with a text editor, since they are JSON.

regards
Altin



Richfez
SplunkTrust

Ah, backups.  Splunk has this documented, so I'll just point you to their docs on "Backup and restore Splunk DB Connect version 3.10.0 or higher"

Hope that helps!

-Rich


altink
Builder

The documentation link given is for DB Connect version 3.15.0 and refers to "Splunk DB Connect version 3.10.0 or higher".

Our DB Connect version is 3.6.0.

When I switch the version selector on that doc page to 3.6.0, I get this:

"The topic you've asked to see does not apply to the version you selected."

So it looks like the documentation says nothing about backup/restore for DB Connect 3.6.0.

Or am I missing something?

best regards
Altin


Richfez
SplunkTrust

3.6 is old and you should update; then the current documentation would apply.  🙂

But 3.6 also had a simpler checkpoint system, storing the values in files on disk.

I cannot remember exactly where those were, but maybe something like $SPLUNK_HOME/var/lib/splunk/modinputs/dbx_input...?  I don't know, I have that directory and it seems familiar, but it's empty on my system because I try to keep up to date.  (Hehehe, you knew I was going to say that, didn't you...?  🙂)
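
If you want to hunt for them, something like this might turn them up. Just a sketch: the modinputs path is an educated guess about where a 3.x install keeps its file-based checkpoints, and the two-hour window is arbitrary; the point is that an enabled input should be touching its checkpoint file fairly often.

find "$SPLUNK_HOME/var/lib/splunk/modinputs" -type f -mmin -120 2>/dev/null

Anything that shows up there while your three inputs are running is a good candidate.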

 


Richfez
SplunkTrust

BTW I don't know if it's clear, but

a) you should be able to find the checkpoint files on disk, but ..

b) even if you don't, if you back up $SPLUNK_HOME/var/lib/splunk/modinputs I think you effectively back up your checkpoint files. 

A bit of interwebs searching ought to confirm this.
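
For example, something along these lines ought to grab them (just a sketch; adjust the destination to wherever your backups go):

tar -czf /tmp/dbx_checkpoints_$(date +%F).tar.gz "$SPLUNK_HOME/var/lib/splunk/modinputs"

That tarball plus your usual etc/apps backup should cover the file-based checkpoints.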

Also note that the checkpoint files are useless if you are trying to back them up pre-upgrade (at least if you cross the magical version around 3.10 where DB Connect switches from checkpoint files to KV store entries), because you can't just slap them back into place and expect it to find/use them any more.  It should migrate them during the upgrade, but I'm not sure it will ever "re-migrate" later if you have to restore files into a KV store based system.  YMMV, etc.


Richfez
SplunkTrust

It's terrible; they're not easily accessible except through the UI.  It's a big ... sore spot for some of us who need to use these in a more programmatic way.

But, there is a way using the REST interface from cURL.

curl -k -u <username>:<password> https://localhost:8089/servicesNS/nobody/splunk_app_db_connect/storage/collections/data/dbx_db_input

Obviously replace the username and password with an admin account's, and the hostname if it's not running on localhost.

 You might want to pipe that through jq to 'pretty print' it if you have jq installed because otherwise it's all smashed together and hard to read:

curl -k -u <username>:<password> https://localhost:8089/servicesNS/nobody/splunk_app_db_connect/storage/collections/data/dbx_db_input | jq .

 You can also see only an individual one if you append the _key's value for the one you want to the end.  (The _key comes from the output of one of the earlier commands.)

curl -k -u <username>:<password> https://localhost:8089/servicesNS/nobody/splunk_app_db_connect/storage/collections/data/dbx_db_input/6452ce6e55102d0ad735ec31 | jq .

You can also delete them or edit them, though ... obviously be careful and do this in a test environment at first!

curl -k -u <username>:<password> https://localhost:8089/servicesNS/nobody/splunk_app_db_connect/storage/collections/data/dbx_db_input/6452ce6e55102d0ad735ec31 -X DELETE

And I've not found a good way to "edit" them, but it's pretty trivial to just edit the JSON you get from an individual entry, and load that back in wholesale.

curl -k -u <username>:<password> https://localhost:8089/servicesNS/nobody/splunk_app_db_connect/storage/collections/data/dbx_db_input -d '{ "inputName" : "newEntryforMyDB", "value" : "200", "appVersion" : "3.16.0", "columnType" : 4, "timestamp" : "2024-03-21T13:11:41.633-05:00", "_user" : "nobody", "_key" : "65fc6ce1764e95450b0d98e1" }' -H "Content-Type: application/json"

Which would overwrite entry 65fc6... with that new information.
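
If you do end up scripting that, one way is to round-trip a single entry through jq. This is just a sketch: it reuses the placeholder key from above, strips the internal _key/_user fields, sets "value" to 300 as an example, and posts the result back to that entry's URL, which updates the record in place.

curl -sk -u <username>:<password> https://localhost:8089/servicesNS/nobody/splunk_app_db_connect/storage/collections/data/dbx_db_input/65fc6ce1764e95450b0d98e1 | jq 'del(._key, ._user) | .value = "300"' | curl -k -u <username>:<password> https://localhost:8089/servicesNS/nobody/splunk_app_db_connect/storage/collections/data/dbx_db_input/65fc6ce1764e95450b0d98e1 -H "Content-Type: application/json" -d @-

Same caveat as the DELETE: try it in a test environment first.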

Happy Splunking,

Rich

 

Bazza_12
Path Finder

Sorry, as per usual, late to the party.

Yes, I have to agree that the checkpoints are painful to amend. We use Ansible to replicate checkpoints between nodes; excuse the terrible pasting of the YAML config. You will also note we do the DELETE before the POST. I'm sure the new version does all this, but this was created before the later releases and had to be amended when the updates occurred. 😄 

- hosts: "{{ splunk_node_primary }}"
  gather_facts: no
  become: no

  tasks:

    - name: Enumerate db connect primary kvstore names
      ansible.builtin.uri:
        url: "https://localhost:8089/servicesNS/nobody/splunk_app_db_connect/storage/collections/data/dbx_db_input"
        user: "{{ ansible_user }}"
        password: "{{ ansible_password }}"
        validate_certs: no
        method: GET
        return_content: yes
      register: db_connect_input_primary_name
      failed_when: db_connect_input_primary_name.status == "404"

    - name: Set fact for names
      ansible.builtin.set_fact:
        db_connect_prim_name: "{{ db_connect_prim_name | default([]) + [ item ] }}"
      with_items: "{{ db_connect_input_primary_name.json | json_query(inputName_key) }}"
      vars:
        inputName_key: "[*].{inputName: inputName}"

    - name: Set fact last
      ansible.builtin.set_fact:
        db_connect_prim_name_unique: "{{ db_connect_prim_name | unique }}"

    - name: Repeat block DB Connect
      ansible.builtin.include_tasks: db_connect_repeat_block.yml
      loop: "{{ db_connect_prim_name_unique | default([]) }}"

Then the repeat block 

---
- name: Enumerate db connect primary inputs
  ansible.builtin.uri:
    url: "https://localhost:8089/servicesNS/nobody/splunk_app_db_connect/db_connect/dbxproxy/inputs/{{ item.inputName }}"
    user: "{{ ansible_user }}"
    password: "{{ ansible_password }}"
    validate_certs: no
    method: GET
    return_content: yes
  register: db_connect_primary_list
  failed_when: db_connect_primary_list.status == "404"

- name: Enumerate db connect primary kvstore values
  ansible.builtin.uri:
    url: "https://localhost:8089/servicesNS/nobody/splunk_app_db_connect/storage/collections/data/dbx_db_input"
    user: "{{ ansible_user }}"
    password: "{{ ansible_password }}"
    validate_certs: no
    method: GET
    return_content: yes
  register: db_connect_input_primary_value
  failed_when: db_connect_input_primary_value.status == "404"

- name: Set fact
  ansible.builtin.set_fact:
    db_connect_input_chkpt_value: "{{ db_connect_input_chkpt_value | default([]) + [ inp_chkpt_var ] }}"
  with_items: "{{ db_connect_input_primary_value.json | json_query(inputName_value) }}"
  vars:
    inputName_value: "[?inputName=='{{ item.inputName }}'].{inputName: inputName, value: value, appVersion: appVersion, columnType: columnType, timestamp: timestamp}"
  loop_control:
    label: "{{ inp_chkpt_var }}"
    loop_var: inp_chkpt_var

- name: Set fact last
  ansible.builtin.set_fact:
    db_connect_input_chkpt_val: "{{ db_connect_input_chkpt_value | list | last }}"

- name: Set fact for new Chkpt
  ansible.builtin.set_fact:
    init_chkpt_value: "{{ db_connect_primary_list.json | regex_replace('.checkpoint.: None,', \"'checkpoint': %s,\" % db_connect_input_chkpt_val , multiline=True, ignorecase=True) }}"

- name: Set fact for disabled
  ansible.builtin.set_fact:
    init_chkpt_value_disabled: "{{ init_chkpt_value | regex_replace('.disabled.: false,', \"'disabled': true,\", multiline=True, ignorecase=True) }}"

- name: Enumerate db connect secondary kvstore values
  ansible.builtin.uri:
    url: "https://localhost:8089/servicesNS/nobody/splunk_app_db_connect/storage/collections/data/dbx_db_input"
    user: "{{ ansible_user }}"
    password: "{{ ansible_password }}"
    validate_certs: no
    method: GET
    return_content: yes
  register: db_connect_input_secondary_value
  failed_when: db_connect_input_secondary_value.status == "404"
  delegate_to: "{{ splunk_node_secondary }}"

- name: Set fact for secondary keys
  ansible.builtin.set_fact:
    db_connect_second_chkpt_key: "{{ db_connect_second_chkpt_key | default([]) + [ item ] }}"
  with_items: "{{ db_connect_input_secondary_value.json | json_query(inputName_key) }}"
  vars:
    inputName_key: "[?inputName=='{{ item.inputName }}'].{_key: _key}"

- name: Show secondary keys
  ansible.builtin.debug:
    msg: "{{ [ inp_second_key ] }}"
  loop: "{{ db_connect_second_chkpt_key | default([]) }}"
  loop_control:
    label: "{{ inp_second_key }}"
    loop_var: inp_second_key
  when: db_connect_second_chkpt_key is defined

- name: Delete db connect secondary kvstore values
  ansible.builtin.uri:
    url: "https://localhost:8089/servicesNS/nobody/splunk_app_db_connect/storage/collections/data/dbx_db_input...{{ inp_second_key._key }}"
    user: "{{ ansible_user }}"
    password: "{{ ansible_password }}"
    validate_certs: no
    method: DELETE
    return_content: yes
  delegate_to: "{{ splunk_node_secondary }}"
  loop: "{{ db_connect_second_chkpt_key | default([]) }}"
  loop_control:
    label: "{{ inp_second_key }}"
    loop_var: inp_second_key
  when: db_connect_second_chkpt_key is defined

- name: Enumerate db connect secondary inputs
  ansible.builtin.uri:
    url: "https://localhost:8089/servicesNS/nobody/splunk_app_db_connect/db_connect/dbxproxy/inputs/{{ item.inputName }}"
    user: "{{ ansible_user }}"
    password: "{{ ansible_password }}"
    validate_certs: no
    method: GET
    return_content: yes
    status_code:
      - 404
      - 200
      - 500
  delegate_to: "{{ splunk_node_secondary }}"
  register: db_connect_primary_check

- name: Set fact for secondary keys blank
  ansible.builtin.set_fact:
    db_connect_second_chkpt_key: []

- name: Delete db connect secondary inputs
  ansible.builtin.uri:
    url: "https://localhost:8089/servicesNS/nobody/splunk_app_db_connect/db_connect/dbxproxy/inputs/{{ item.inputName }}"
    user: "{{ ansible_user }}"
    password: "{{ ansible_password }}"
    validate_certs: no
    method: DELETE
    return_content: yes
    status_code:
      - 204
  delegate_to: "{{ splunk_node_secondary }}"
  when: '"errors" not in db_connect_primary_check.content'

- name: Post db connect secondary inputs
  ansible.builtin.uri:
    url: "https://localhost:8089/servicesNS/nobody/splunk_app_db_connect/db_connect/dbxproxy/inputs"
    user: "{{ ansible_user }}"
    password: "{{ ansible_password }}"
    validate_certs: no
    method: POST
    body: "{{ init_chkpt_value_disabled }}"
    return_content: yes
    body_format: json
  register: db_connect_secondary_post
  retries: 3
  delay: 10
  until: "db_connect_secondary_post.status == 200"
  delegate_to: "{{ splunk_node_secondary }}"
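
For reference, we kick the whole thing off with something along these lines (just a sketch: the playbook and inventory file names are made up for illustration, and in reality the credentials come from vault rather than the command line):

ansible-playbook -i inventory.ini replicate_db_connect_checkpoints.yml -e "splunk_node_primary=splunk01 splunk_node_secondary=splunk02" -e "ansible_user=admin ansible_password=changeme"

The main playbook loops over the unique input names and includes db_connect_repeat_block.yml once per input.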


PickleRick
SplunkTrust

You can edit your message and insert the ansible part in either a preformatted paragraph or a code box. Then it will not get butchered (most importantly - the indents will be preserved).
