
DomainTools TA for Splunk does not abide by the splunk-launch.conf configuration to use a proxy

eric_budke
Path Finder

Hi Mark (developer of the add-on)
https://splunkbase.splunk.com/app/3376/

I also sent this in via email directly, but it may help others here. The DomainTools TA for Splunk reaches out directly, ignoring the proxy configured in splunk-launch.conf. As a result, in my case it never reaches api.domaintools.com. I verified my keys etc. on an external Splunk instance, and tcpdump agrees with that assessment.


cschmidt_hurric
Path Finder

Thank you. That helped me nail down the bug. It appears that while all of the API code respects the proxy settings, the tldextract module does not when it makes its initial request to cache the list of TLDs. This will be fixed as soon as possible.
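
For illustration only (this is not the TA's code), tldextract's whole job is splitting a hostname against the public suffix list, and that list has to come from somewhere the first time it is used, either a live fetch or a cached snapshot:

    import tldextract

    # On first use tldextract needs the public suffix list: either a live fetch
    # (which currently ignores the proxy) or a cached/bundled snapshot.
    # Result field names vary slightly between tldextract versions.
    result = tldextract.extract("www.api.domaintools.com")
    print(result)  # e.g. ExtractResult(subdomain='www.api', domain='domaintools', suffix='com')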

In the meantime, you should be able to fix the problem by doing the following:

  1. Download this file: https://raw.githubusercontent.com/john-kurkowski/tldextract/master/tldextract/.tld_set_snapshot
  2. Place .tld_set_snapshot in /opt/splunk/etc/apps/TA-domaintools/bin/tldextract/
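
If the Splunk host can only reach the internet through the proxy, here is a rough Python sketch of those two steps (the proxy URL is a placeholder for your own, and it assumes the requests library is available; you can just as easily download the file on another machine and copy it over):

    import requests

    SNAPSHOT_URL = ("https://raw.githubusercontent.com/john-kurkowski/"
                    "tldextract/master/tldextract/.tld_set_snapshot")
    DEST = "/opt/splunk/etc/apps/TA-domaintools/bin/tldextract/.tld_set_snapshot"
    PROXIES = {"https": "http://proxy.example.com:8080"}  # hypothetical proxy

    # Fetch the snapshot through the proxy and drop it where the bundled
    # tldextract expects to find it.
    resp = requests.get(SNAPSHOT_URL, proxies=PROXIES, timeout=30)
    resp.raise_for_status()
    with open(DEST, "wb") as fh:
        fh.write(resp.content)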

Let me know if that fixes it. Thanks for your patience on this.


masonmorales
Influencer

Converted to answer.


eric_budke
Path Finder

On the app home page, whois and domain profile (everything I'm licensed for) work now.
On the DTSearch page the errors no longer show, but searches on google.com or my own domains now turn up nothing, with no errors on screen either.


cschmidt_hurric
Path Finder

Ah, good catch. A minor change switched the domaintools command to output a statistics table instead of events, and this dashboard expects events. This too will be fixed in the next version of the app. If you'd like to fix it yourself in the meantime, click "Edit" at the top right of the dashboard, then change the visualization type of that panel from "Events" to "Statistics Table". Here is a screenshot for reference:

[Screenshot: changing the panel's visualization type from Events to Statistics Table]


eric_budke
Path Finder

That worked well.


eric_budke
Path Finder

Probably the same underlying error for all of them, based on eyeballing it, but:
Whois info

Traceback (most recent call last):
  File "/opt/splunk/etc/apps/TA-domaintools/bin/domaintools-cmd.py", line 168, in <module>
    **kwargs)
  File "/opt/splunk/etc/apps/TA-domaintools/bin/domaintools-cmd.py", line 63, in generating_main
    extracted = tldextract.extract(domain_or_ip)
  File "/opt/splunk/etc/apps/TA-domaintools/bin/tldextract/tldextract.py", line 303, in extract
    return TLD_EXTRACTOR(url)
  File "/opt/splunk/etc/apps/TA-domaintools/bin/tldextract/tldextract.py", line 183, in __call__
    suffix_index = self._get_tld_extractor().suffix_index(translations)
  File "/opt/splunk/etc/apps/TA-domaintools/bin/tldextract/tldextract.py", line 230, in _get_tld_extractor
    tlds = self._get_snapshot_tld_extractor()
  File "/opt/splunk/etc/apps/TA-domaintools/bin/tldextract/tldextract.py", line 268, in _get_snapshot_tld_extractor
    snapshot_stream = pkg_resources.resource_stream(__name__, '.tld_set_snapshot')
  File "/opt/splunk/lib/python2.7/site-packages/pkg_resources.py", line 892, in resource_stream
    self, resource_name
  File "/opt/splunk/lib/python2.7/site-packages/pkg_resources.py", line 1289, in get_resource_stream
    return open(self._fn(self.module_path, resource_name), 'rb')
IOError: [Errno 2] No such file or directory: '/opt/splunk/etc/apps/TA-domaintools/bin/tldextract/.tld_set_snapshot'

Registrar Info
(identical traceback to the one above)

Reg Info
(identical traceback to the one above)

All the rest of the fields in the whois/domain profile page give this too.

The New domains dashboard seems to be working because ES is populating the whois index.

The DomainTools Search page gives the same traceback on a whois search or parsed whois search for google.com.

cschmidt_hurric
Path Finder

Just to clarify: you mean that you have HTTP_PROXY/HTTPS_PROXY set in splunk-launch.conf? The app should respect those environment variables. I will look into whether something is preventing that from happening.
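
While I dig into this, here is a quick diagnostic sketch (the file name is hypothetical, not part of the TA) that you can run with Splunk's bundled Python, e.g. $SPLUNK_HOME/bin/splunk cmd python check_proxy_env.py, to confirm the variables are actually visible to Splunk's environment:

    # check_proxy_env.py -- prints the proxy-related environment variables
    # exactly as Splunk's Python interpreter sees them.
    import os

    for name in ("HTTP_PROXY", "HTTPS_PROXY", "http_proxy", "https_proxy", "NO_PROXY"):
        print("%s = %r" % (name, os.environ.get(name)))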


eric_budke
Path Finder

It was ignoring the lowercase https_proxy. After adding the uppercase HTTPS_PROXY, it does go out for the domain profile but errors out on everything else (we don't have access to reputation score yet, but I get JSON errors on the rest). Again, with no proxy there are no issues.

The Splunk docs imply that HTTPS_PROXY only needs to be set if the proxy itself requires HTTPS.


cschmidt_hurric
Path Finder

Could you specify which pages/panels are giving JSON errors, as well as include the JSON errors themselves? I just set up a local proxy server and can confirm that each dashboard is working and going through the proxy to do it.

As for the Splunk documentation, this comes down to the behavior of the Python Requests module (python-requests.org) and how it handles proxies. From my tests, if the URL scheme is 'http' it looks at the HTTP_PROXY variable, and if it is 'https' it looks for HTTPS_PROXY. I don't think this is how the environment variables were intended to work, but that's how Requests handles them.
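
A minimal sketch of that behavior (the proxy address is a placeholder, and the endpoint is only there to show the scheme):

    import os
    import requests

    os.environ["HTTPS_PROXY"] = "http://proxy.example.com:8080"  # hypothetical proxy

    # The URL scheme is https, so Requests consults HTTPS_PROXY for this call;
    # HTTP_PROXY would be ignored.
    resp = requests.get("https://api.domaintools.com/v1/", timeout=10)
    print(resp.status_code)

    # Equivalent explicit form that bypasses the environment variables entirely:
    resp = requests.get("https://api.domaintools.com/v1/",
                        proxies={"https": "http://proxy.example.com:8080"},
                        timeout=10)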


cschmidt_hurric
Path Finder

From my testing, HTTPS_PROXY is used while HTTP_PROXY is ignored. Since the requests made are all using HTTPS, it looks like the HTTPS_PROXY variable needs to be set. Could you try setting that as well and see if it fixes your issue? Thanks.
