Splunk Search

External Lookup Python script

yko84108
New Member

Hi,

I have a Python script that makes queries to IP2Location.
The script works something like this (based on the IP2Location sample code):

    import IP2Location

    IP2LocObj = IP2Location.IP2Location()
    IP2LocObj.open("data/IP-COUNTRY-REGION-CITY-LATITUDE-LONGITUDE-ZIPCODE-TIMEZONE-ISP-DOMAIN-NETSPEED-AREACODE-WEATHER-MOBILE-ELEVATION-USAGETYPE-SAMPLE.BIN")
    rec = IP2LocObj.get_all("19.5.10.1")
    # Write the CSV result back to Splunk...
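
The part behind that last comment follows the usual external lookup pattern - read the CSV that Splunk sends on stdin, fill in the lookup fields, and write the CSV back to stdout. Roughly like this (the "ip", "country" and "city" field names are just placeholders for whatever is declared in the lookup's fields_list):

    import csv
    import sys

    import IP2Location

    # Open the .BIN database once per invocation, not once per row.
    IP2LocObj = IP2Location.IP2Location()
    IP2LocObj.open("data/IP-COUNTRY-REGION-CITY-LATITUDE-LONGITUDE-ZIPCODE-TIMEZONE-ISP-DOMAIN-NETSPEED-AREACODE-WEATHER-MOBILE-ELEVATION-USAGETYPE-SAMPLE.BIN")

    def main():
        # Splunk sends a CSV with a header row; the output fields are
        # already present as empty columns and just need to be filled in.
        reader = csv.DictReader(sys.stdin)
        writer = csv.DictWriter(sys.stdout, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            rec = IP2LocObj.get_all(row["ip"])
            row["country"] = rec.country_short
            row["city"] = rec.city
            writer.writerow(row)

    if __name__ == "__main__":
        main()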

If I run the external lookup with:
index=myindex | lookup ip2location ip as ip_field

it takes around 50-60 seconds (for roughly 500k records).
My question is:
What can I do, other than changing my Python code, to improve the execution time?
* I heard something about loading the script into Splunk application memory (MEMORY_CACHE?)

Thanks

rvany
Communicator

How many events do you get back from "index=myindex"?
Is your Python script called for every single event?
How long does it take to run this script outside of Splunk - one time, ten times, eventcount times (with eventcount being the answer to the first question)?
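
Something quick like this outside Splunk would give you the raw per-lookup cost (the database path and sample IP are taken from your snippet, and the 500k iterations match your record count):

    import time

    import IP2Location

    IP2LocObj = IP2Location.IP2Location()
    IP2LocObj.open("data/IP-COUNTRY-REGION-CITY-LATITUDE-LONGITUDE-ZIPCODE-TIMEZONE-ISP-DOMAIN-NETSPEED-AREACODE-WEATHER-MOBILE-ELEVATION-USAGETYPE-SAMPLE.BIN")

    # 500k lookups of the sample IP approximate the pure IP2Location
    # cost for your event count, with Splunk taken out of the picture.
    start = time.time()
    for _ in range(500000):
        IP2LocObj.get_all("19.5.10.1")
    print("500k lookups took %.1f seconds" % (time.time() - start))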

Sukisen1981
Champion

Hmm, interesting case. Now, what is actually causing the performance issue - the execution in the Python script, or the execution in Splunk?

See a similar thread here - https://answers.splunk.com/answers/103717/cache-indexes-in-memory.html.
I do doubt that the execution in Python takes that much time as well...
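
And if a lot of those 500k events share the same IP, a small in-script cache is worth trying on the Python side before anything else - a rough sketch (the lookup helper is just illustrative):

    import IP2Location

    IP2LocObj = IP2Location.IP2Location()
    IP2LocObj.open("data/IP-COUNTRY-REGION-CITY-LATITUDE-LONGITUDE-ZIPCODE-TIMEZONE-ISP-DOMAIN-NETSPEED-AREACODE-WEATHER-MOBILE-ELEVATION-USAGETYPE-SAMPLE.BIN")

    _cache = {}

    def lookup(ip):
        # Resolve each distinct IP against the .BIN file only once per
        # invocation; repeated IPs come straight from the dict.
        if ip not in _cache:
            _cache[ip] = IP2LocObj.get_all(ip)
        return _cache[ip]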
