All Apps and Add-ons

How to change Splunk Add-On for Microsoft Cloud Services 5000 blob limit

dennywebb
Path Finder

Hello. Since 2018 our application has been logging to Azure Storage in a single container, with "folders" broken down as:

/Year2018/Month04/Day12/Hour15/Minute20/Subscription123/User/456/logtype.log

 

My goal is to pull these logs (JSON) into Splunk, so I set up the Add-On and began ingesting data... but it kept stopping at 2018, never getting to 2019/20/21/22. Investigating why, after quite a bit of tinkering, I found internal logs indicating:

The number of blobs in container [containername] is 5000

Which, upon further research, is the maximum number of blobs the List Blobs API returns in a single call; because of the API's forced paging, anything beyond that is only returned if you follow the continuation (next) marker into the next page.
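For context, here's a rough sketch of that listing behavior using the current azure-storage-blob (v12) SDK. This is not the add-on's code, and the connection string and container name below are placeholders:

from azure.storage.blob import ContainerClient

# Placeholders: swap in the real connection string / container name.
container = ContainerClient.from_connection_string(
    "<storage-account-connection-string>", container_name="containername"
)

# The List Blobs call returns at most 5000 results per page plus a continuation
# marker; the SDK's iterator follows that marker automatically, so this loop
# walks the whole container instead of stopping at the first 5000 blobs.
total = sum(1 for _ in container.list_blobs())
print(f"total blobs: {total}")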

So... I could go edit the Python script myself, but is there another or better way to do this, or is a fix already in the works? And if not, and I do make the change, is there a GitHub repo or somewhere I can submit it to?


dennywebb
Path Finder

Well, reading through the code... it DOES have a marker set, so I'm not sure why it would fail to return the next "batch" of items until it gets to ones that are valid...
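For reference, here's a rough sketch of what I'd expect the marker handling to look like, again written against the v12 azure-storage-blob SDK rather than the add-on's actual code (connection string and container name are placeholders):

from azure.storage.blob import ContainerClient

container = ContainerClient.from_connection_string(
    "<storage-account-connection-string>", container_name="containername"
)

continuation_token = None
while True:
    # Request one page (the service caps a page at 5000 blobs), resuming
    # from the continuation token returned by the previous page.
    pager = container.list_blobs(results_per_page=5000).by_page(
        continuation_token=continuation_token
    )
    page = next(pager)
    for blob in page:
        pass  # each blob would be handed off for ingestion here
    continuation_token = pager.continuation_token
    if not continuation_token:
        break  # no token means the last page has been read

If the add-on's loop ever drops that marker, or never passes it back on the next request, listing would stop at exactly 5000, which matches what I'm seeing.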

