
API State, High Availability and SH Clustering using the Qualys app(s) and add-on for Splunk

SplunkTrust

I am implementing a Qualys Cloud to Splunk Cloud integration at the moment. I have 3 non-clustered search heads (SHs) in the cloud that can forward data to the indexing layer.

At present, I have the Qualys apps and the Qualys Technology Add-on (TA) for Splunk installed on a dedicated SH, and my questions are:

If a SH with the Qualys apps installed goes down, and I configure the app on another SH using the same Qualys credentials and API site address, will the API site recognize that only a delta from the last pull is required or will it initiate a full data set pull?

Is this app usable in a SH Cluster or due to the HA limitations possibly stemming from above, does this app need a dedicated SH provisioned for it?

TIA


Communicator

If a SH with the Qualys apps installed goes down, and I configure the app on another SH using the same Qualys credentials and API site address, will the API site recognize that only a delta from the last pull is required or will it initiate a full data set pull?

The Qualys TA, which pulls the VM detection/WAS findings API data, should be installed on a forwarder or indexer -- not on a SH.
Even if you have the TA installed on a SH, it should not be configured to pull VM detection/WAS findings data. (You may, however, create an input for the knowledgebase to keep your local KB copy up to date.)

The VM/WAS apps are only for reporting purposes; they aren't going to pull data for you. So even if the SH where the apps were installed crashes and you have to configure those apps on another SH, it will not create any problem. That's because it is the TA that pulls the data and indexes it on the indexers. The apps on the SH only run SPL queries and show you the results.

Finally, for delta pulls, checkpoints are maintained by the TA, so after the first full pull the Qualys API receives only delta pull requests.
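To illustrate the idea, here is a minimal, hypothetical sketch of how a modular input can persist a checkpoint so that subsequent runs request only a delta. The file name, default date, and function names are illustrative assumptions, not the actual Qualys TA code:

```python
import os
import tempfile

# Hypothetical per-input checkpoint file (the real TA keeps one file per
# enabled input; this demo uses a temp dir so it runs anywhere).
ckpt_dir = tempfile.mkdtemp()
CHECKPOINT = os.path.join(ckpt_dir, "host_detection")

def read_checkpoint(path, default="1999-01-01T00:00:00Z"):
    """Return the last pull timestamp, or a default that triggers a full pull."""
    if os.path.exists(path):
        with open(path) as f:
            return f.read().strip()
    return default

def write_checkpoint(path, timestamp):
    """Persist the newest timestamp so the next run requests only a delta."""
    with open(path, "w") as f:
        f.write(timestamp)

# First run: no checkpoint exists, so the input would do a full pull
# of everything updated after the default date.
since = read_checkpoint(CHECKPOINT)
# ... the input would now request records updated after `since` ...
write_checkpoint(CHECKPOINT, "2024-01-15T00:00:00Z")

# Second run: the checkpoint exists, so only a delta is requested.
print(read_checkpoint(CHECKPOINT))
```

The key point is that the checkpoint lives on whichever host runs the input, which is why recovery hinges on preserving that file.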

Is this app usable in a SH Cluster or due to the HA limitations possibly stemming from above, does this app need a dedicated SH provisioned for it?

I have tried setting up Qualys TA and apps in SHC, and it worked fine. My deployment was like this:
I had 1 forwarder, 1 indexer, 3 search heads and 1 deployer.
TA/App installations were as follows:
- TA installed on the forwarder. Data inputs created for VM detections and WAS findings. Data is then sent to the indexer; nothing is stored on the forwarder.
- No TA/app installed on the indexer.
- VM and WAS apps installed on all search heads.
- TA installed on all search heads, because the knowledgebase is a CSV lookup and each SH needs a local copy of it. A data input is created ONLY for the knowledgebase, so that each SH's local lookup copy is updated periodically.
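The layout above can be sketched as an inputs.conf fragment. Note the stanza and input names here are illustrative assumptions -- check the TA's own inputs.conf.spec/README for the real ones:

```
# Forwarder: pulls the event data and forwards it to the indexer
[qualys://host_detection]
disabled = false

[qualys://was_findings]
disabled = false

# Search heads (pushed via the deployer): ONLY the knowledgebase input,
# so each SH keeps its local CSV lookup up to date
[qualys://knowledge_base]
disabled = false
```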


Explorer

Hello prabhasgupte,
Could you please explain how you installed and configured the TA on the forwarders and search heads? I have a similar installation: forwarders, 3 indexers, 3 search heads, and a deployer.


SplunkTrust

Thanks for the response.

My setup is in Splunk Cloud and the data will be used in our ES deployment. We have the VM data coming straight from the Qualys API to the ES SH (configured to forward to the cloud indexing layer) and are receiving the VM feed with no issues. However, we are not getting the KB data, so this may be a flaw in the setup; we have an open case with Qualys to determine whether it is possible to get this data with our configuration.

My question was mainly about how the API manages the checkpoint for the data. In the case of a failure, if we were to configure the app on another SH using the same account and API URL, would it recognise where the account last left off and pull down only the delta data?

I have since had a support response from Qualys, who advised that the checkpoint is kept in a file within the app, so my proposed solution would not work as-is. However, I am still waiting to hear whether this file can be backed up and then restored onto the new SH in the event of a failure.

Thanks again for your response.


Communicator

My question was mainly about how the API manages the checkpoint for the data. In the case of a failure, if we were to configure the app on another SH using the same account and API URL, would it recognise where the account last left off and pull down only the delta data?

The checkpoint files are maintained at $SPLUNK_HOME/var/lib/splunk/modinputs/qualys/, one file for each input that is created and enabled. So, in the case where you need to start using another forwarder, you can move those files into the same location with the correct permissions, and that should do the job -- the TA will pull data from the date recorded in the checkpoint file (and will not redo the entire pull for earlier data).
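As a concrete sketch of that relocation (temp dirs stand in for the real checkpoint path so the demo runs anywhere; the file name "host_detection" is an illustrative assumption):

```shell
# Sketch of relocating Qualys TA checkpoint files to a replacement host.
# On real hosts the directory is $SPLUNK_HOME/var/lib/splunk/modinputs/qualys/.
OLD_CKPT_DIR=$(mktemp -d)        # stands in for the failed forwarder's dir
NEW_SPLUNK_HOME=$(mktemp -d)     # stands in for the replacement forwarder
echo "2024-01-15T00:00:00Z" > "$OLD_CKPT_DIR/host_detection"

# Recreate the modinputs path and copy the checkpoints, preserving timestamps.
mkdir -p "$NEW_SPLUNK_HOME/var/lib/splunk/modinputs/qualys"
cp -p "$OLD_CKPT_DIR"/* "$NEW_SPLUNK_HOME/var/lib/splunk/modinputs/qualys/"

# On a real host, also match ownership to the user Splunk runs as, e.g.:
#   chown -R splunk:splunk "$NEW_SPLUNK_HOME/var/lib/splunk/modinputs/qualys"
cat "$NEW_SPLUNK_HOME/var/lib/splunk/modinputs/qualys/host_detection"
```

After the copy, restarting Splunk on the new host should let the TA resume from the date in the restored checkpoint.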

Neither the TA nor the Qualys apps actually pull data from the SH, except for the knowledgebase input. In the KB case, the CSV lookup file is simply overwritten each time to keep the KB up to date.

SplunkTrust

Thank you for your response prabhasgupte!


Splunk Employee

Typically the checkpoint files are flat text files that just record a counter/timestamp pointing into the remote DB. I can't confirm this for the latest version of the Qualys TA, but if you ask their support they should know for sure.
