Splunk Search

How many custom shapes does the choropleth map support?

hylam
Contributor

http://blogs.splunk.com/2015/10/01/use-custom-polygons-in-your-choropleth-maps/
Use Custom Polygons in Your Choropleth Maps

I have read the article above. Suppose I am working on a chessboard-like visualization with 8x8x16x2 custom shapes: there are 16x2 shapes in each cell, with zero or one of them visible at any time. How well does it scale as the size of the chessboard or the number of pieces grows? What about 1000x1000x100?

The actual business problem is that I am visualizing moving objects with statuses on a custom indoor floor plan. The map is much bigger than the screen, so I would like to have pan & zoom. I am going to generate the KML file with 1000x1000x100 featureIds. Where is the first bottleneck going to be if I keep increasing the chessboard size? Is the choropleth map visualization heavy on the server, the browser, JavaScript, or SVG? Most of the chessboard is empty and most of the custom shapes are invisible. The data is backed by a real-time search.
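For reference, here is a minimal Python sketch of how I plan to generate the grid KML. The cell size, origin, and the cell_<row>_<col> featureId naming are just placeholders, and I am assuming the featureId is taken from each Placemark's name element.

# Minimal sketch: emit a KML grid of square cells, one Placemark per cell.
# Cell size, origin, and the cell_<row>_<col> naming are placeholders;
# assumes the featureId comes from each Placemark's <name> element.

def grid_kml(rows, cols, cell_size=0.01, origin=(0.0, 0.0)):
    lon0, lat0 = origin
    parts = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>']
    for r in range(rows):
        for c in range(cols):
            west = lon0 + c * cell_size
            south = lat0 + r * cell_size
            east, north = west + cell_size, south + cell_size
            # KML coordinates are "lon,lat" pairs; close the ring explicitly.
            coords = (f"{west},{south} {east},{south} {east},{north} "
                      f"{west},{north} {west},{south}")
            parts.append(
                f"<Placemark><name>cell_{r}_{c}</name>"
                f"<Polygon><outerBoundaryIs><LinearRing>"
                f"<coordinates>{coords}</coordinates>"
                f"</LinearRing></outerBoundaryIs></Polygon></Placemark>")
    parts.append("</Document></kml>")
    return "\n".join(parts)

if __name__ == "__main__":
    with open("chessboard.kml", "w") as f:
        f.write(grid_kml(8, 8))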

1 Solution

mporath_splunk
Splunk Employee

There isn't any inherent limit to how many polygons you can load on the server side. The algorithms that do the point-in-polygon operation (the lookup) scale as the log of the number of vertices.

When we return polygons to the browser, we use generalization and clipping to limit the amount of data returned while faithfully reproducing the shapes on the client. The biggest performance issue I can think of is the cost of building the lookup. This happens once, the first time you use the lookup. You might see a pause, since building the lookup scales with the number of vertices. Once that is done, however, the lookup index is saved to disk.

Note that right now there is a limit on the number of shapes you can show at any one time (~2000), due to how many elements SVG can handle in the DOM. That means that while you can run the lookup on millions of data points, you can't show more than ~2000 shapes with aggregated data at once.
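If it helps to picture what the lookup does per polygon, below is the textbook ray-casting point-in-polygon test. This is a generic illustration only, not the index we actually use; the point of the index is precisely that a lookup does not have to walk every vertex the way this linear scan does.

# Textbook ray-casting point-in-polygon test (illustration only, not the
# index Splunk uses). A plain scan like this is linear in the number of
# vertices; a spatial index is what brings lookups down to log time.

def point_in_polygon(x, y, polygon):
    """polygon: list of (x, y) vertices in order."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count edges crossed by a horizontal ray extending from (x, y).
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Example: the unit square contains (0.5, 0.5) but not (1.5, 0.5).
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
assert point_in_polygon(0.5, 0.5, square)
assert not point_in_polygon(1.5, 0.5, square)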


ghendrey_splunk
Splunk Employee

Unfortunately, I believe the UI limits the number of shapes by passing a maximum of 1000 back to the server. I will look into this when I get back in on Monday to see if there is a workaround. If you don't see a follow-up here on Monday, please ping me at ghendrey@splunk.com



gschmitz
Path Finder

Hi mporath,
even a year later this is the only decent post on the limits of the choropleth map 🙂
I just ran into the same problem the OP described, but I cannot quite confirm that dragging the map or zooming helps at all. I seem to get only the first 1,000 or 2,000 grid cells returned by the search, so I changed the sort order to display the most interesting ones first, but it would still be nice to see more cells. My browser seems to have no performance issues so far, so I'm wondering if there is any setting in limits.conf or a hidden parameter to geom we could fiddle with. Do you happen to know?
Best Regards
Gerrit


fk319
Builder

I was successful in generating a 300x300 rectangular grid.
I think this was excessive, and I would like to have a 100x100 grid, then 'zoom' into parts of my 300x300.


hylam
Contributor

300x300 tiles or polygons?


hylam
Contributor

"When we return polygons to the browser, we use generalization and clipping"
Where is the polygon reduction performed? Server side or client side?


mporath_splunk
Splunk Employee

Server side.


ghendrey_splunk
Splunk Employee

Server side. When you drag a choropleth map, you will clearly see the clipped edge. When you release the drag on the client, you will see the server re-clip and re-generalize the geometry, and the screen updates.
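As a mental model for what the server does on each pan, think of clipping every polygon against the current viewport rectangle. A standard Sutherland-Hodgman pass, as sketched below, captures the idea; this is illustrative only, not our actual implementation.

# Illustrative Sutherland-Hodgman clip of a polygon against an axis-aligned
# viewport rectangle (a mental model for server-side clipping, not Splunk's
# actual code). Each pass clips the polygon against one viewport edge.

def clip_to_viewport(polygon, min_x, min_y, max_x, max_y):
    def clip_edge(points, inside, intersect):
        out = []
        for i, cur in enumerate(points):
            prev = points[i - 1]
            if inside(cur):
                if not inside(prev):
                    out.append(intersect(prev, cur))  # entering the viewport
                out.append(cur)
            elif inside(prev):
                out.append(intersect(prev, cur))      # leaving the viewport
        return out

    def x_cross(p, q, x):
        t = (x - p[0]) / (q[0] - p[0])
        return (x, p[1] + t * (q[1] - p[1]))

    def y_cross(p, q, y):
        t = (y - p[1]) / (q[1] - p[1])
        return (p[0] + t * (q[0] - p[0]), y)

    pts = list(polygon)
    for inside, intersect in [
        (lambda p: p[0] >= min_x, lambda p, q: x_cross(p, q, min_x)),
        (lambda p: p[0] <= max_x, lambda p, q: x_cross(p, q, max_x)),
        (lambda p: p[1] >= min_y, lambda p, q: y_cross(p, q, min_y)),
        (lambda p: p[1] <= max_y, lambda p, q: y_cross(p, q, max_y)),
    ]:
        if not pts:
            break
        pts = clip_edge(pts, inside, intersect)
    return pts

# Example: a 2x2 square clipped to the viewport (0.5, 0.5)-(1.5, 1.5).
print(clip_to_viewport([(0, 0), (2, 0), (2, 2), (0, 2)], 0.5, 0.5, 1.5, 1.5))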


hylam
Contributor

"scale as the log of the number of vertices"

What data structures and algorithms do you use? Do you use a Graham scan, a priority search tree, or binary space partitioning? Explain the computational geometry if you please. Thx.


ghendrey_splunk
Splunk Employee

Smoothing is done via the Ramer-Douglas-Peucker algorithm. Point-in-polygon matching and clipping are performed via our proprietary index and algorithm. I intend to make a detailed post on the algorithms.
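For anyone curious before that post lands, a generic Ramer-Douglas-Peucker simplification looks roughly like the sketch below; it is illustrative only, not our implementation.

import math

# Generic Ramer-Douglas-Peucker line simplification (illustrative sketch,
# not Splunk's implementation). Drops vertices whose perpendicular distance
# from the chord between the endpoints is below the tolerance epsilon.

def rdp(points, epsilon):
    if len(points) < 3:
        return list(points)
    (x1, y1), (x2, y2) = points[0], points[-1]
    dx, dy = x2 - x1, y2 - y1
    length = math.hypot(dx, dy) or 1e-12

    # Find the interior vertex farthest from the chord.
    max_dist, index = 0.0, 0
    for i in range(1, len(points) - 1):
        px, py = points[i]
        dist = abs(dy * (px - x1) - dx * (py - y1)) / length
        if dist > max_dist:
            max_dist, index = dist, i

    if max_dist > epsilon:
        # Keep that vertex and recurse on both halves.
        left = rdp(points[:index + 1], epsilon)
        right = rdp(points[index:], epsilon)
        return left[:-1] + right
    return [points[0], points[-1]]

# Example: a nearly straight line collapses to its endpoints.
line = [(0, 0), (1, 0.01), (2, -0.01), (3, 0)]
print(rdp(line, 0.1))  # -> [(0, 0), (3, 0)]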
