Dashboards & Visualizations

Is dynamic programming possible with Splunk?

SPulse
Explorer

Hello!

I'm interested in applying Dynamic Time Warping to my Splunk data. I currently have data in the form of time series, and I would like to find out how these time series are correlated with each other; that is, if some change happens in one time series, how long does it take before the effect of that change shows up in a second, related series? I was planning to use Dynamic Time Warping for that (http://www.phon.ox.ac.uk/jcoleman/old_SLP/Lecture_5/DTW_explanation.html), similar to the functionality provided by the fastdtw library in Python: https://pypi.python.org/pypi/fastdtw.

Dynamic Time Warping uses the dynamic programming paradigm, so I think that if dynamic programming is possible in Splunk, even in a simplified form, then this should be doable. If not, is there an easy way to hook into the Python library so the data is processed in real time? We'll be dealing with a lot of data, and it will need to be updated about once every 15-30 days.
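
For concreteness, the core of DTW is just a small dynamic-programming recurrence over a cost matrix. A minimal sketch in plain Python (no Splunk integration; the example series are made up, with the second lagging the first by two samples):

    def dtw_distance(a, b):
        # Classic O(n*m) DTW: cost[i][j] is the minimal warped distance
        # between the prefixes a[:i] and b[:j], using absolute difference
        # as the point-wise cost.
        n, m = len(a), len(b)
        INF = float("inf")
        cost = [[INF] * (m + 1) for _ in range(n + 1)]
        cost[0][0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                d = abs(a[i - 1] - b[j - 1])
                # Each step may advance a, advance b, or advance both.
                cost[i][j] = d + min(cost[i - 1][j],
                                     cost[i][j - 1],
                                     cost[i - 1][j - 1])
        return cost[n][m]

    # Made-up example: the test signal trails the reference by two samples.
    ref_signal  = [0, 0, 1, 3, 5, 3, 1, 0, 0, 0]
    test_signal = [0, 0, 0, 0, 1, 3, 5, 3, 1, 0]
    print(dtw_distance(ref_signal, test_signal))  # 0.0 -- the warp absorbs the lag

A point-wise difference of these two series would be large, but DTW reports zero because the warping path absorbs the lag; fastdtw implements the same idea with a near-linear approximation and also returns the warp path, via distance, path = fastdtw(x, y).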

I just want to know if this is possible and, if so, how I can get started. Thank you!

bmacias84
Champion

Looking at http://www.phon.ox.ac.uk/jcoleman/old_SLP/Lecture_5/DTW_explanation.html, this looks relatively simple to approximate using a lookup table, a subsearch, and a join.

If you only need to update this every 15 to 30 days, you could store your reference signal in a lookup table. Then join the subsearch on time and do the simple math with an eval.

| inputlookup reference.csv
| eval t=strftime(_time, "%d")
| join t [search ...
    | timechart span=1d count as test_signal
    | eval t=strftime(_time, "%d")]
| eval o=pow((test_signal - ref_signal), 2)
| chart values(o) over t

Your lookup would hopefully look something like this:

_time, ref_signal
1499985432, 10
...., ....
1499989383, 4

You can play around with the _time field to get the timestamps to line up. Hopefully this is what you are looking for.
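
If you end up needing true DTW rather than this fixed-alignment squared difference, one option would be a custom search command that wraps fastdtw. This is only a sketch, assuming the splunk-sdk Python package (splunklib) and fastdtw are available to the app; the command name dtw and the field names ref_signal and test_signal are my own choices to match the search above:

    #!/usr/bin/env python
    # dtw.py -- hypothetical custom search command wrapping fastdtw.
    import sys
    from fastdtw import fastdtw
    from splunklib.searchcommands import dispatch, EventingCommand, Configuration

    @Configuration()
    class DTWCommand(EventingCommand):
        def transform(self, records):
            # Buffer everything first: DTW needs the complete series.
            records = list(records)
            ref = [float(r["ref_signal"]) for r in records]
            test = [float(r["test_signal"]) for r in records]
            # fastdtw returns an approximate DTW distance and the warp path.
            distance, path = fastdtw(ref, test)
            # The mean alignment offset along the path is a rough estimate
            # of the lag between the two series.
            avg_lag = sum(j - i for i, j in path) / float(len(path))
            for r in records:
                r["dtw_distance"] = distance
                r["dtw_avg_lag"] = avg_lag
                yield r

    dispatch(DTWCommand, sys.argv, sys.stdin, sys.stdout, __name__)

Registered in the app's commands.conf, it would be appended to the joined search as ... | dtw, and refreshing the lookup every 15 to 30 days stays exactly as described above.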
