Activity Feed
- Got Karma for When will the TA be available for Wazuh?. 10-07-2020 09:17 AM
- Karma Re: When will the TA be available for Wazuh? for changux. 06-05-2020 12:49 AM
- Karma Re: Threatlist Error after clean install of Palo Alto App and Add-on 6.0 for btorresgil. 06-05-2020 12:49 AM
- Karma Re: Threatlist Error after clean install of Palo Alto App and Add-on 6.0 for panguy. 06-05-2020 12:49 AM
- Got Karma for Re: When will the TA be available for Wazuh?. 06-05-2020 12:49 AM
- Karma Lookup File Editor App for Splunk Enterprise: Why do other users get error "Client is not authorized" trying to open my CSV Lookup? for wweiland. 06-05-2020 12:48 AM
- Karma Re: Lookup File Editor App for Splunk Enterprise: Why do other users get error "Client is not authorized" trying to open my CSV Lookup? for wweiland. 06-05-2020 12:48 AM
- Karma Re: Splunk Add-on for ServiceNow vs Splunk Add-on for ServiceNow for niketn. 06-05-2020 12:48 AM
- Karma How to add a description and caller name fields in ServiceNow incident integration for Splunk Add-on for ServiceNow? for vigneshmadesh. 06-05-2020 12:48 AM
- Got Karma for Qualys Technology Add-on (TA) for Splunk 1.0.3: Why am I getting "Error -5 while decompressing data: incomplete or truncated stream"?. 06-05-2020 12:48 AM
- Got Karma for Qualys Technology Add-on (TA) for Splunk 1.0.3: Why am I getting "Error -5 while decompressing data: incomplete or truncated stream"?. 06-05-2020 12:48 AM
- Got Karma for Re: Problems pulling in incident data. 06-05-2020 12:48 AM
- Got Karma for Re: Problems pulling in incident data. 06-05-2020 12:48 AM
- Got Karma for Re: Lookup File Editor App for Splunk Enterprise: Why do other users get error "Client is not authorized" trying to open my CSV Lookup?. 06-05-2020 12:48 AM
- Got Karma for Re: How to add a description and caller name fields in ServiceNow incident integration for Splunk Add-on for ServiceNow?. 06-05-2020 12:48 AM
- Got Karma for Re: How to add a description and caller name fields in ServiceNow incident integration for Splunk Add-on for ServiceNow?. 06-05-2020 12:48 AM
- Karma How to configure the Splunk Add-on for Netflow or indexer to capture the correct time stamp for Netflow log data? for jeffrey2015. 06-05-2020 12:47 AM
- Karma Re: How to install and configure Splunk DB Connect 2.0.5 in a Splunk 6.3.0 environment with indexer clustering, but no search head clustering? for jcoates_splunk. 06-05-2020 12:47 AM
- Karma Re: How to install and configure Splunk DB Connect 2.0.5 in a Splunk 6.3.0 environment with indexer clustering, but no search head clustering? for napomokoetle. 06-05-2020 12:47 AM
- Karma Re: [Indexer_Name] Streamed search execute failed because: Error in 'script': Getinfo probe failed for external search command 'dbxquery' for jcoates_splunk. 06-05-2020 12:47 AM
11-15-2017
02:57 AM
I've reached out to our Customer Success Manager to see if he can get some additional details as well.
11-14-2017
06:08 AM
It appears that the data extracts continue to work; however, when testing the integration by creating tickets, I keep getting the following error:
command="snowincidentstream", Failed to create ticket. Return code is 400. Reason is Bad Request
11-14-2017
05:30 AM
Nothing yet, unfortunately.
11-07-2017
05:01 AM
So there was a bug that was listed as fixed in the latest version of the TA. It wasn't until the TA was upgraded that I was able to pull the results fields without any additional editing.
Unfortunately, prabhasgutpe's comments about multi-line results fields ring very true. While the add-on is able to pull the data, it's absolutely useless because of the incorrect handling of the multi-line fields. It'd be great if either the data source were cleaned up so it presents the data in a usable fashion, or the add-on were updated to accommodate the multi-line fields.
10-19-2017
05:54 AM
When can we anticipate support for the ServiceNow Jakarta release? Do we anticipate any issues with the Add-on prior to official Jakarta support?
08-29-2017
01:01 PM
1 Karma
Thanks Changux. Works great!
08-29-2017
05:55 AM
This is fantastic.
Any help you can provide for pulling back the RESULTS field would be appreciated!
08-29-2017
04:46 AM
1 Karma
As stated in the subject, when will the TA be available? It appears the GitHub repo for it no longer exists.
- Tags:
- Wazuh
05-22-2017
01:48 PM
Hah, yes. Thanks for circling back, I completely forgot.
Silly error I might add. After doing some digging I found I needed to bypass the Cyphort appliance in the proxy settings. Was right as rain after that.
05-08-2017
08:00 AM
2 Karma
There is a lot of data that we would like to populate via the SNOW integration with Splunk as we create tickets. As it stands, the tickets are quite sparse. We would love to be able to add additional information to minimize the amount of work that a human has to put into the ticket.
The value of the integration is lacking as it stands.
03-28-2017
03:51 AM
I'm leaning more towards this being an issue with Splunk and not an issue with the Cyphort app. My gut is telling me this is perhaps an issue with the Python SSL libraries. I'm getting the exact same error with another API-based app pulling data over SSL.
03-23-2017
03:25 AM
Configured this app in a 6.5.2 Distributed environment with the IA running on a standalone heavy forwarder. Currently receiving the following error at every polling cycle:
EOF occurred in violation of protocol (_ssl.c:603)
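For anyone hitting the same thing, a quick way to check whether the TLS handshake itself is failing from the heavy forwarder (outside of Splunk's bundled Python) is a sketch like the one below; the hostname is a placeholder for whatever endpoint the app polls:
import socket
import ssl

# Placeholder hostname -- point this at the appliance/API endpoint the app polls.
host = "<cyphort-appliance>"

ctx = ssl.create_default_context()
with socket.create_connection((host, 443), timeout=10) as sock:
    with ctx.wrap_socket(sock, server_hostname=host) as tls:
        # If the handshake succeeds, print the negotiated protocol and cipher.
        print(tls.version(), tls.cipher())
If this fails with a similar EOF/handshake error, the problem is between the forwarder and the appliance (proxy, interception, certificates) rather than the app itself.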
01-31-2017
10:25 AM
Any luck on this? We're still seeing this behavior. I tried to counteract it by setting the line breaker to the following:
LINE_BREAKER = \{\s+\"product\"\:
but I'm still seeing similar behavior.
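For reference, a sketch of the kind of props.conf stanza that should do it (the sourcetype name and break pattern are assumptions based on the samples in this thread). LINE_BREAKER must contain a capturing group, and the text matched by that group is consumed as the event boundary; the regex above has no capturing group, which is likely why it isn't taking effect:
[fe_json]
SHOULD_LINEMERGE = false
# The capturing group ([\r\n]+) is required and is discarded at the boundary;
# each new event then starts at {"product": ...
LINE_BREAKER = ([\r\n]+)\{\s*"product"\s*:
# 0 disables the default 10000-byte truncation for long pretty-printed events
TRUNCATE = 0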
12-22-2016
04:05 AM
These events are the ones being sent via the RESTful API, so they're being parsed with a sourcetype of "fe_json".
If I had to guess, the sample above is actually one event that is being split at the "occurred" field.
12-16-2016
03:47 AM
Thanks for getting back to me, Tony.
The troubleshooting piece seemed to help and I figured out how to format my RESTful query appropriately. The challenge that I'm seeing now is that it appears the JSON stream that is being sent to the Splunk server is incomplete. I'm seeing truncation occur at the beginning and the end of the events. So the JSON parser doesn't seem to see it as a complete message. Samples follow. I'm using the latest version of the TA in a distributed environment running Splunk 6.5.1.
Truncation at the end:
{
  "product": "Web MPS",
  "appliance-id": "<obfuscated>",
  "appliance": "<obfuscated>",
  "alert": {
    "src": {
      "ip": "<obfuscated>",
      "mac": "<obfuscated>",
      "vlan": "0",
      "port": "<obfuscated>"
    },
    "severity": "crit",
    "alert-url": "<obfuscated>",
    "explanation": {
      "malware-detected": {
        "malware": {
          "name": "<obfuscated>",
          "stype": "<obfuscated>",
          "sid": "<obfuscated>"
        }
      },
      "cnc-services": {
        "cnc-service": {
          "location": "<obfuscated>",
          "protocol": "tcp",
          "port": "80",
          "channel": "<obfuscated>",
          "address": "<obfuscated>"
        }
      },
      "protocol": "tcp",
      "analysis": "content"
    },
    "locations": "<obfuscated>",
    "id": "<obfuscated>",
    "action": "notified",
Truncation at the beginning:
"occurred": "2016-12-16 04:31:05+00",
"interface": {
"interface": "pether3",
"mode": "tap",
"label": "A1"
},
"dst": {
"ip": "<obfuscated>",
"mac": "<obfuscated>",
"port": "80"
},
"name": "<obfuscated>"
},
"version": "<obfuscated>",
"msg": "extended"
}
12-15-2016
08:07 AM
Having all sorts of problems with syslog-based reception of JSON-type events. Would like to be able to capture these events using the HTTP Event Collector (HEC) as I can't seem to get the HTTP RESTful API working. Any advice would be helpful.
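For reference, sending to HEC is just a POST of a small JSON envelope with the token in an Authorization header; a minimal sketch follows (host, token, and payload are placeholders):
import json
import requests

# Placeholders -- HEC listens on port 8088 by default.
resp = requests.post(
    "https://<splunk-host>:8088/services/collector/event",
    headers={"Authorization": "Splunk <hec-token>"},
    data=json.dumps({"sourcetype": "fe_json", "event": {"product": "Web MPS", "msg": "extended"}}),
    verify=False,  # only if HEC is still on the default self-signed certificate
)
print(resp.status_code, resp.text)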
06-16-2016
08:17 AM
Thanks Mate.
Huge help!
06-16-2016
07:58 AM
1 Karma
How'd you go about doing that?
06-07-2016
09:28 AM
1 Karma
See my response to /u/markdflip
06-07-2016
09:27 AM
1 Karma
Thanks for bumping this, Mark. Yes, I actually did.
So the first thing I did was try to pull a smaller dataset (i.e., start only a month back rather than all time). That seemed to work, but in actuality it did not.
What actually fixed it was modifying the Splunk_TA_snow/bin/snow_data_loader.py script to use the sysparm_limit parameter, as shown below:
def collect_data(self, table, timefield, count=5000):
    assert table and timefield
    objs = []
    with self._lock:
        last_timestamp = self._read_last_collection_time(table, timefield)
        # the added sysparm_limit caps the number of records returned per request
        params = "{0}>={1}^ORDERBY{0}&sysparm_limit={2}".format(
            timefield, last_timestamp, count)
        _, content = self._do_collect(table, params)
        if not content:
            return
We also got the SNOW folks to change our REST quota value from 60s to 120s.
It seemed to help us but YMMV.
05-25-2016
04:21 AM
It appears that I'm having this issue as well. Anyone in the admin group works fine, but users with new roles can't view or edit files, despite being granted explicit read/write access to them. Any thoughts?
05-12-2016
04:25 AM
Did you try just extracting the .tgz version over your existing installation?
05-11-2016
06:22 AM
So the good news is that I don't appear to be seeing any errors anymore.
The bad news is the following:
When "knowledge_base" and "host_detection" data inputs are enabled, only "knowledge_base" data seems to be downloaded
I see a tmp file created with "knowledge_base" data. Total file size is ~125M.
The timestamp on the qualys_kb.csv lookup file is updated but no additional data is added to it. Current file size is 5.3M. I also created an empty file and added the csv header information to it. It successful recreates a 5.3M file
When the "knowledge_base" data input is disabled, nothing is downloaded from the "host_detection" data input. I see the following in the logs.
5/11/16
9:14:07.000 AM
making https://qualysapi.qualys.com/msp/about.php request with params={}
host = x.x.com index = _internal source = qualys://host_detection sourcetype = qualys
5/11/16
9:14:07.000 AM
Start qualys TA
host = x.x.com index = _internal source = qualys://host_detection sourcetype = qualys
5/11/16
9:14:04.000 AM
End qualys TA
host = x.x.com index = _internal source = qualys://host_detection sourcetype = qualys
That's all we get.
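As a sanity check, the about.php call the TA logs above can be reproduced outside Splunk to rule out credentials or connectivity; a rough sketch (credentials are placeholders, and some Qualys API endpoints also expect an X-Requested-With header):
import requests

# Placeholders -- this is the same endpoint the TA logs above.
resp = requests.get(
    "https://qualysapi.qualys.com/msp/about.php",
    auth=("<qualys-user>", "<qualys-password>"),
    headers={"X-Requested-With": "python-requests"},
)
print(resp.status_code)
print(resp.text[:500])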
05-04-2016
10:58 AM
2 Karma
Since updating to version 1.0.3 of the Qualys Technology Add-on (TA) for Splunk (running on a dedicated "API forwarder", a standalone Splunk 6.4.0 instance that forwards data to my indexers), I can no longer ingest data. On version 1.0.2, I was only getting the scan data, not the KB data.
Here is the error I'm getting:
ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/TA-QualysCloudPlatform/bin/qualys.py" Traceback (most recent call last):
ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/TA-QualysCloudPlatform/bin/qualys.py" File "/opt/splunk/etc/apps/TA-QualysCloudPlatform/bin/qualys.py", line 274, in <module>
ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/TA-QualysCloudPlatform/bin/qualys.py" main()
ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/TA-QualysCloudPlatform/bin/qualys.py" File "/opt/splunk/etc/apps/TA-QualysCloudPlatform/bin/qualys.py", line 267, in main
ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/TA-QualysCloudPlatform/bin/qualys.py" run()
ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/TA-QualysCloudPlatform/bin/qualys.py" File "/opt/splunk/etc/apps/TA-QualysCloudPlatform/bin/qualys.py", line 144, in run
ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/TA-QualysCloudPlatform/bin/qualys.py" api_password = qualysModule.splunkpopulator.utils.decrypt(qualysConf['setupentity']['password'])
ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/TA-QualysCloudPlatform/bin/qualys.py" File "/opt/splunk/etc/apps/TA-QualysCloudPlatform/bin/qualysModule/splunkpopulator/utils.py", line 201, in decrypt
ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/TA-QualysCloudPlatform/bin/qualys.py" return zlib.decompress(base64.urlsafe_b64decode(text))
ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/TA-QualysCloudPlatform/bin/qualys.py" zlib.error: Error -5 while decompressing data: incomplete or truncated stream
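For context, the final frame is zlib reporting that the stored (base64-encoded, zlib-compressed) password value isn't a complete compressed stream; a minimal sketch (not from the TA) that reproduces the same message:
import zlib

# Hypothetical secret, just to show the failure mode.
blob = zlib.compress(b"example-password")
print(zlib.decompress(blob))        # b'example-password'
try:
    zlib.decompress(blob[:-4])      # cut short, as if the stored value were incomplete
except zlib.error as err:
    print(err)                      # Error -5 while decompressing data: incomplete or truncated stream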
Any additional insight would be appreciated!