Hi,
I would like to use pyAOS with my Splunk scripts. Has anyone tried this before? Has anyone imported non-native libraries into their Splunk arch? Thank you for any guidance that can be given.
Thanks,
Hello, I had the same problem with the MySQL module that I installed on my CentOS server.
Splunk didn't work with this library because Splunk ships its own Python. You can fix it by adding the OS Python library paths (including the CentOS site-packages) to your script. First find them:
[root@xxxx]#find / -name site-packages
/usr/lib/python2.7/site-packages
/usr/lib64/python2.7/site-packages
/opt/splunk/etc/apps/Splunk_SA_Scientific_Python_linux_x86_64/bin/linux_x86_64/lib/python2.7/site-packages
/opt/splunk/lib/python2.7/site-packages
[root@xxxx]# whereis python
python: /usr/bin/python2.7 /usr/bin/python /usr/lib/python2.7 /usr/lib64/python2.7 /etc/python /usr/include/python2.7 /opt/splunk/bin/python /opt/splunk/bin/python2.7 /usr/share/man/man1/python.1.gz
Then include this at the beginning of your script:
import sys
# Add the OS Python's site-packages directories so Splunk's Python can find the MySQL module
sys.path.append('/usr/lib/python2.7/site-packages')
sys.path.append('/usr/lib64/python2.7/site-packages')
And that's it; you can run the MySQL module without any problem and create your alerts with it:
import mysql.connector
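Putting it together, here is a minimal sketch of what a script run by Splunk might look like (the connection settings, table, and query are just placeholders for your own):

import sys
# Make the OS Python's site-packages visible to Splunk's bundled Python
sys.path.append('/usr/lib/python2.7/site-packages')
sys.path.append('/usr/lib64/python2.7/site-packages')

import mysql.connector

# Placeholder connection settings; replace with your own
conn = mysql.connector.connect(host='localhost', user='splunkuser',
                               password='changeme', database='mydb')
cursor = conn.cursor()
cursor.execute("SELECT id, status FROM alerts")  # placeholder query
for row in cursor.fetchall():
    print row
conn.close()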
I hope this fix helps you.
Joel Urtubia Ugarte
Yes, this is relatively easy with the use of Python egg files, though you must be careful, as you will have to build all the eggs and dependencies per arch type. Looking at the pyAOS module, it depends on NumPy/SciPy, which are a pain to build eggs for. You will need to build them on the same OS as your Splunk instance. If you are using a Windows server, you will need to download Intel's Math Kernel Library (Intel MKL) to build some of the eggs.
You will need to do the following:
1. Make sure you have setuptools installed with a Python 2.7.x install.
2. Download pyAOS source.
3. Untar the source, then run python setup.py bdist_egg. Note: you will need to do this on the same OS arch type as your production Splunk.
Now you will probably get an error. Review the error and find the missing module, then download that module's source and repeat step three. You will have to do this until you have built all the missing eggs. Once they are built, you can use the following code to load all the compiled eggs at your script's run time. It finds the running path of your script and then searches an eggs directory in the same path.
import os
import sys
from platform import system

platform = system().lower()

# Loading eggs into python execution path
if platform == 'darwin':
    platform = 'macosx'
running_dir = os.path.dirname(os.path.realpath(__file__))
egg_dir = os.path.join(running_dir, 'eggs')
for filename in os.listdir(egg_dir):
    file_segments = filename.split('-')
    if filename.endswith('.egg'):
        filename = os.path.join(egg_dir, filename)
        if len(file_segments) <= 3:
            sys.path.append(filename)
        else:
            if platform in filename:
                sys.path.append(filename)

import foo, bar, spam

#####################
# Alternative

import os, sys
import subprocess
import splunk.Intersplunk

# Remove problematic environmental variables if they exist.
for envvar in ("PYTHONPATH", "LD_LIBRARY_PATH"):
    if envvar in os.environ:
        del os.environ[envvar]

# OS interpreter
python_executable = "/usr/bin/python"

try:
    realscript = os.path.join(os.path.dirname(__file__), "realscript.py")
    p = subprocess.Popen([python_executable, realscript],
                         stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    out, err = p.communicate()
    print out
    splunk.Intersplunk.outputResults(out)
except:
    # log something
    pass
This is the same approach I use in my app called Compuware GPN, which loads SUDS. I have done this with the NLTK module, but it's a pain and took about six hours of tinkering.
Awesome answer. I'll give it a try!
Hi @wweiland
Just following up with this post, but did @bmacias84's answer solve your question?
I had a problem with the egg. It looks like it would have worked if I could have worked out my issues. I was going to try calling a native-Python-based script from within the Splunk-based script using os.popen.
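Something like this is what I was planning (just a sketch; the path and realscript.py name are placeholders):

import os

# Run the real pyAOS script with the OS Python instead of Splunk's bundled interpreter
# (/opt/scripts/realscript.py is a placeholder path)
output = os.popen("/usr/bin/python /opt/scripts/realscript.py").read()
print output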
I've updated the post with an alternative, but I don't recommend it, as it can be problematic sometimes.
Yeah, that's kind of what I set up. It works fine for a day or so of data, but with a couple of thousand records the processing time goes up: 49 secs for 432 events. Seems very inefficient. I tried python setup.py bdist_egg, but it appears that isn't an option in the setup file.
[root@watson aoslib-master]# python setup.py bdist_egg
usage: setup.py [global_opts] cmd1 [cmd1_opts] [cmd2 [cmd2_opts] ...]
or: setup.py --help [cmd1 cmd2 ...]
or: setup.py --help-commands
or: setup.py cmd --help
error: invalid command 'bdist_egg'
You could try python -c 'import setuptools; execfile("setup.py")' bdist_egg. Importing setuptools first registers the bdist_egg command, which a plain distutils setup script doesn't have.
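If that doesn't work, another option is to edit setup.py so it imports setuptools itself. A minimal sketch (the name and version are placeholders; aoslib's real setup.py has its own metadata and package list):

from setuptools import setup, find_packages

setup(
    name='aoslib',      # placeholder metadata
    version='0.1',
    packages=find_packages(),
)

With a setuptools-based setup.py, python setup.py bdist_egg should then produce the egg.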
I also have a bunch of eggs built for mac and gnu/linux which may work for you.
ipython, nltk, numpy, pandas, scipy, six, statsmodels, and sympy. Would you be interested?
Would I be doing this with the native Python, or the Splunk cmd Python?
Native python.