I installed the faup add-on after seeing it at .conf2013, but I have gotten inconsistent results.
index=wsa earliest=-60s | bucket x_wbrs_score | lookup faup url AS cs_url
works as expected: I get the appropriate url_* fields extracted.
index=wsa earliest=-60s | lookup faup url AS cs_url
does not work... the url_* fields are not created, although 100% of events have the cs_url field. I am perplexed.
Splunk 5.0.4 on the search head and the indexers.
solved...
The knowledge bundle sent from the search head to the indexers did not include the apps/faup/opt directory. When using bucket, the search results were returned to the search head before the lookup ran, so that search worked.
The following changes need to be made to the app:
Create apps/faup/default/distsearch.conf to distribute the opt directory to the search peers:
[replicationWhitelist]
faup-opt = apps/faup/opt/...
The faup.py script in bin has a hard-coded path to the faup app under $SPLUNK_HOME/etc/apps/. Edit apps/faup/bin/faup.py to use a relative path instead:
def where_is_faup():
    # Replace the next four lines with those below to use a relative path from this python script
    #if platform.system() == "Darwin":
    #    return os.environ['SPLUNK_HOME'] + "/etc/apps/faup/opt/faup-darwin"
    #if platform.system() == "Linux":
    #    return os.environ['SPLUNK_HOME'] + "/etc/apps/faup/opt/faup-linux"
    if platform.system() == "Darwin":
        return os.path.join(os.path.dirname(__file__), "..", "opt", "faup-darwin")
    if platform.system() == "Linux":
        return os.path.join(os.path.dirname(__file__), "..", "opt", "faup-linux")
    # I don't know, so let's trust the system
    return "faup"
These two changes allowed the search to run properly without installing the faup app on each indexer.
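As an optional extra hardening (not part of the fix above), where_is_faup() could also verify that the bundled binary actually exists before returning it, and otherwise fall back to whatever "faup" resolves to on the PATH. A rough sketch, assuming the same bin/../opt layout as the shipped app:

import os
import platform

def where_is_faup():
    # Resolve the bundled binary relative to this script (bin/ -> ../opt/)
    here = os.path.dirname(os.path.abspath(__file__))
    names = {"Darwin": "faup-darwin", "Linux": "faup-linux"}
    name = names.get(platform.system())
    if name:
        candidate = os.path.join(here, "..", "opt", name)
        if os.path.isfile(candidate):
            return candidate
    # Fall back to a system-installed faup on the PATH
    return "faup"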
If you want to get this app working, you need to update the two separate pieces of the faup.py file listed above.
The first is the platform check for Linux or Darwin; the second is the logging folder. I updated faup.py on Linux to the code listed below and it works fine now, except that uri_resource_path will sometimes have uri_query information included. Not sure why that is just yet.
faup.py:
def where_is_faup():
    #if platform.system() == "Darwin":
    #    return os.environ['SPLUNK_HOME'] + "/etc/apps/faup/opt/faup-darwin"
    #if platform.system() == "Linux":
    #    return os.environ['SPLUNK_HOME'] + "/etc/apps/faup/opt/faup-linux"
    #
    if platform.system() == "Darwin":
        return os.path.join(os.path.dirname(__file__), "..", "opt", "faup-darwin")
    if platform.system() == "Linux":
        return os.path.join(os.path.dirname(__file__), "..", "opt", "faup-linux")
    # I don't know, so let's trust the system
    return "faup"

faup_bin = where_is_faup()
And also
def setup_logger():
    """
    Setup a logger for our lookup
    """
    logger = logging.getLogger('faup')
    logger.setLevel(logging.DEBUG)
    file_handler = logging.handlers.RotatingFileHandler('/opt/splunk/var/log/splunk/faup.log')
    formatter = logging.Formatter('%(asctime)s %(levelname)s %(message)s')
    file_handler.setFormatter(formatter)
    logger.addHandler(file_handler)
    return logger
When running the command, use:
* | lookup faup url
If the URL is extracted as something different, say uri, then use:
* | lookup faup url AS uri
Hopefully they can just write this into Splunk Enterprise base so we can use urlparse just as we can use urldecode.
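For what it's worth, the parsing itself is roughly what Python's standard urlparse already does. A minimal sketch (the output field names here are just illustrative, not faup's url_* names):

try:
    from urllib.parse import urlparse  # Python 3
except ImportError:
    from urlparse import urlparse      # Python 2 (the Splunk 5.x interpreter)

def parse_url(url):
    # Rough idea of what a built-in urlparse-style command could emit per URL
    parts = urlparse(url)
    return {
        "scheme": parts.scheme,
        "host": parts.hostname,
        "port": parts.port,
        "path": parts.path,
        "query": parts.query,
        "fragment": parts.fragment,
    }

# Example:
# parse_url("http://www.example.com:8080/a/b?x=1#top")
# -> {'scheme': 'http', 'host': 'www.example.com', 'port': 8080,
#     'path': '/a/b', 'query': 'x=1', 'fragment': 'top'}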
IAN.
Folks, feedback is important. I really appreciate the effort here. I want faup to be usable without having to patch it :). I will take these patches and improvements into account for the next version, which I plan to release in early January. Thank you very much!
I just got it to work... I had to change the following line in the setup_logger definition as well:
file_handler = logging.handlers.RotatingFileHandler(os.environ['SPLUNK_HOME'] + '/var/log/splunk/faup.log')
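Put together, the whole setup_logger ends up looking something like this (assuming SPLUNK_HOME is set in the environment, as it is for scripts that Splunk launches; this is my consolidated version, not the shipped code):

import logging
import logging.handlers
import os

def setup_logger():
    """
    Setup a logger for our lookup
    """
    logger = logging.getLogger('faup')
    logger.setLevel(logging.DEBUG)
    # Log under $SPLUNK_HOME/var/log/splunk/ rather than a hard-coded /opt/splunk path
    file_handler = logging.handlers.RotatingFileHandler(
        os.environ['SPLUNK_HOME'] + '/var/log/splunk/faup.log')
    formatter = logging.Formatter('%(asctime)s %(levelname)s %(message)s')
    file_handler.setFormatter(formatter)
    logger.addHandler(file_handler)
    return logger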
What did you change it to? I would think keeping it in the default location would work best, since Splunk is monitoring that directory and adding it to the _internal index... Of course, I'm not getting any log data now; looking at the script, it appears to only log when it encounters an error...
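In other words, something like the pattern below. This is a hypothetical sketch of that error-only logging pattern, not the actual faup.py code (run_faup and the command line are made up for illustration):

import subprocess

logger = setup_logger()      # from the snippet above
faup_bin = where_is_faup()

def run_faup(url):
    try:
        # Hypothetical invocation; the real faup.py builds its own command line
        return subprocess.check_output([faup_bin, url])
    except Exception as e:
        # Nothing is written to faup.log unless this branch is hit
        logger.error("faup failed for %s: %s" % (url, e))
        return None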
@jeff Confirmed working. Thanks!
@kevenking- yep, I found it was working in my case because I had installed faup on the indexers once upon a time. After removing the app from the indexers, my edit did not work. I found a hard-coded path in the faup.py script; see my edited answer above for the solution. Hopefully @stricaud_splunk can take a look at these changes and incorporate them into the next faup release.
I did, and restarted to make sure.
The opt directory does exist under $SPLUNK_HOME/var/run/searchpeers/
Thanks jeff
@kevinking- did you restart your search head? On your indexers, check $SPLUNK_HOME/var/run/searchpeers/
The fix didn't work for me; I patched faup and installed it on the search head.
Does the fixed app maybe need to be deployed to the indexers too?
BTW, Thanks for the initial bucket tip.