Splunk Search

External Lookup python script

New Member

Hi,

I have a Python script that queries IP2Location.
The script works something like this (based on the IP2Location sample code):

    import IP2Location

    IP2LocObj = IP2Location.IP2Location()
    IP2LocObj.open("data/IP-COUNTRY-REGION-CITY-LATITUDE-LONGITUDE-ZIPCODE-TIMEZONE-ISP-DOMAIN-NETSPEED-AREACODE-WEATHER-MOBILE-ELEVATION-USAGETYPE-SAMPLE.BIN")
    rec = IP2LocObj.get_all("19.5.10.1")
    # Write CSV result back to Splunk...

When I run the external lookup with:

    index=myindex | lookup ip2location ip as ip_field

it takes around 50-60 seconds (for about 500k records).
My question is:
What can I do, other than changing my Python code, to improve the execution time?
* I heard something about loading the script into Splunk application memory (MEMORY_CACHE?)
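For context, Splunk calls an external lookup script once per search and streams all rows through stdin/stdout as CSV, so the expensive steps (importing the module and opening the .BIN database) only need to happen once, and duplicate IPs can be served from an in-process cache instead of hitting the database again. A minimal sketch of that pattern; the `stream_lookup` helper and the `ip`/`country` field names are my own assumptions about how the lookup is defined in transforms.conf, not Splunk's API:

```python
import csv
import sys


def stream_lookup(infile, outfile, lookup_fn, in_field="ip", out_field="country"):
    """Read CSV rows from infile, fill out_field via lookup_fn, write to outfile.

    Duplicate values of in_field are served from an in-memory cache, so the
    (slow) lookup_fn runs only once per distinct IP.
    """
    reader = csv.DictReader(infile)
    writer = csv.DictWriter(outfile, fieldnames=reader.fieldnames)
    writer.writeheader()
    cache = {}  # ip -> looked-up value
    for row in reader:
        key = row.get(in_field, "")
        if key not in cache:
            cache[key] = lookup_fn(key)
        row[out_field] = cache[key]
        writer.writerow(row)


if __name__ == "__main__":
    # In the real lookup script: open the database once, then stream every
    # row through it. Module and .BIN filename are the ones from the
    # question; country_long is one of the fields IP2Location returns.
    import IP2Location

    db = IP2Location.IP2Location()
    db.open("data/IP-COUNTRY-REGION-CITY-LATITUDE-LONGITUDE-ZIPCODE-TIMEZONE-ISP-DOMAIN-NETSPEED-AREACODE-WEATHER-MOBILE-ELEVATION-USAGETYPE-SAMPLE.BIN")
    stream_lookup(sys.stdin, sys.stdout,
                  lambda ip: db.get_all(ip).country_long)
```

If many of your 500k events share the same source IPs, the cache alone can cut the number of database calls dramatically.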

Thanks

0 Karma

Communicator

How many events do you get back from "index=myindex"?
Is your Python script called for every single event?
How long does it take to run this script outside of Splunk (one time, ten times, eventcount times, with eventcount being the answer to the first question)?
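To answer that last question concretely, you can time the hot path outside of Splunk. A small sketch; the `benchmark` helper is hypothetical, while the module, .BIN filename, and test IP are the ones from the question:

```python
import time


def benchmark(fn, n):
    """Call fn n times and return the total elapsed wall-clock seconds."""
    start = time.perf_counter()
    for _ in range(n):
        fn()
    return time.perf_counter() - start


if __name__ == "__main__":
    import IP2Location

    db = IP2Location.IP2Location()
    db.open("data/IP-COUNTRY-REGION-CITY-LATITUDE-LONGITUDE-ZIPCODE-TIMEZONE-ISP-DOMAIN-NETSPEED-AREACODE-WEATHER-MOBILE-ELEVATION-USAGETYPE-SAMPLE.BIN")

    # Time n lookups against the database; set n to your event count.
    n = 500_000
    secs = benchmark(lambda: db.get_all("19.5.10.1"), n)
    print(f"{n} lookups took {secs:.1f}s ({n / secs:.0f} per second)")
```

Comparing that number against the 50-60 seconds you see in Splunk tells you whether the time goes into the Python lookups themselves or into Splunk's handling of the external lookup.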

0 Karma

Champion

Hmm, interesting case. Now, what is actually causing the performance issue: the execution inside the Python script, or the execution in Splunk?

See a similar thread here - https://answers.splunk.com/answers/103717/cache-indexes-in-memory.html.
I suspect that the execution in Python is taking time as well...

0 Karma