Getting Data In

How to ingest large volume of data to Splunk via REST API call without pagination?

arjitg
Explorer

Hi Team! I need to make a REST API GET call to ingest a fairly large amount of data into Splunk, and unfortunately this REST API has no pagination logic, so it returns the entire set of roughly 50,000 records on every call. What is the best way to ingest only new data into Splunk? Are there any best practices for ingesting only the delta records (the ones that are new) into the system? Thanks.

1 Solution

isoutamo
SplunkTrust
SplunkTrust

Hi

Are you writing a modular input? If so, you should add a checkpoint mechanism to keep track of what you have already ingested and which record to process next.

If you are asking about something else, please clarify your question.
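
For example, a minimal checkpoint sketch for a Python scripted input could look like the one below. This is only an illustration under some assumptions: the API returns a JSON array of records with a unique "id" field, and a local JSON file is used as the checkpoint store. The URL, file path, and field name are placeholders, not values from your environment.

import json
import os
import urllib.request

# Placeholders - adjust to your environment.
API_URL = "https://example.com/api/records"
CHECKPOINT_FILE = "/opt/splunk/var/lib/splunk/modinputs/my_api_checkpoint.json"

def load_seen_ids():
    # Ids of records that were already written to Splunk on earlier runs.
    if os.path.exists(CHECKPOINT_FILE):
        with open(CHECKPOINT_FILE) as f:
            return set(json.load(f))
    return set()

def save_seen_ids(seen_ids):
    with open(CHECKPOINT_FILE, "w") as f:
        json.dump(list(seen_ids), f)

def main():
    seen_ids = load_seen_ids()
    with urllib.request.urlopen(API_URL) as resp:
        records = json.loads(resp.read())
    for record in records:
        if record["id"] not in seen_ids:
            # Only new (delta) records reach stdout, so only they get indexed.
            print(json.dumps(record))
            seen_ids.add(record["id"])
    save_seen_ids(seen_ids)

if __name__ == "__main__":
    main()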

r. Ismo


arjitg
Explorer

Thanks @isoutamo! Yes, this is a scripted input, but I can certainly implement the checkpoint logic in this solution.
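
For reference, a scripted input of this kind is usually wired up with a stanza in inputs.conf roughly like the one below; the script path, interval, index, and sourcetype shown here are placeholders, not from the original setup.

[script://$SPLUNK_HOME/bin/scripts/my_api_input.py]
interval = 300
index = main
sourcetype = my_api:json
disabled = 0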


