Splunk Add On for Google Cloud Platform - message="Not enough time to send data for indexing."

rijutha
Explorer

Hi,

Splunk version: 7.0.2 (build 03bbabbd5c0f), role: Heavy Forwarder

Splunk_TA_google-cloudplatform version: 1.2.0

I have configured Pub/Sub inputs to collect logs from Google Cloud Platform. Per the recommendation in the Splunk documentation below, I created five cloned Pub/Sub inputs to increase data throughput and performance (an example layout of these inputs is sketched after the quoted passage).

https://docs.splunk.com/Documentation/AddOns/released/GoogleCloud/Troubleshoot

Large pub/sub subscriptions
For large pub/sub subscriptions, we recommend cloning existing inputs that are ingesting the same subscriptions to increase data throughput and performance. These identical inputs can be in the same instance or in different instances.

To manage a large number of subscriptions to one Splunk instance, aggregate subscriptions belonging to the same Google Cloud Service account into one input to save resources.
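
For reference, here is roughly how my cloned inputs are laid out in the add-on's inputs.conf. I am reproducing the stanza and parameter names from memory, so treat the exact keys as approximate for version 1.2.0; the credential, project, and subscription names are placeholders:

    [google_cloud_pubsub://gcp_qa_pubsub_all_1]
    google_credentials_name = my_gcp_credential
    google_project = my-gcp-project
    google_subscriptions = my-subscription
    index = gcp
    sourcetype = google:gcp:pubsub:message

    [google_cloud_pubsub://gcp_qa_pubsub_all_2]
    google_credentials_name = my_gcp_credential
    google_project = my-gcp-project
    google_subscriptions = my-subscription
    index = gcp
    sourcetype = google:gcp:pubsub:message

    # ...and likewise for gcp_qa_pubsub_all_3 through _5,
    # all reading the same subscription, per the docs above.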

I see that data is intermittently not being indexed.
Checking the add-on's Pub/Sub logs, I found this error:

xxxx-xx-xx xx:xx:xx,xxx level=ERROR pid=2383 tid=MainThread logger=splunk_ta_gcp.modinputs.pubsub pos=pubsub.py:_try_send_data:201 | datainput="gcp_qa_pubsub_all_2" start_time=xxxxxxxxxx| message="Not enough time to send data for indexing." lapse=8.34614610672 ttl=10
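
I have not read the add-on's source, but from the fields in the log line my guess is that _try_send_data checks how much of the message TTL (here 10 seconds, presumably tied to the subscription's acknowledgement deadline) remains after pulling and decoding a batch, and refuses to send if too little is left. A purely hypothetical reconstruction in Python (the SEND_TIME_BUDGET threshold and the write_events callable are my own inventions, not the add-on's API):

    import logging
    import time

    logger = logging.getLogger("splunk_ta_gcp.modinputs.pubsub")

    SEND_TIME_BUDGET = 2.0  # hypothetical: seconds reserved for handing a batch to Splunk

    def try_send_data(write_events, messages, start_time, ttl=10):
        """Sketch of the guard suggested by the lapse/ttl fields in the error line."""
        lapse = time.time() - start_time  # time already spent pulling/decoding the batch
        if ttl - lapse < SEND_TIME_BUDGET:
            # Too little of the TTL left to index safely: the batch is skipped
            # un-acked, so Pub/Sub redelivers it later and indexing stalls on and off.
            logger.error('message="Not enough time to send data for indexing." '
                         "lapse=%s ttl=%s", lapse, ttl)
            return False
        write_events(messages)  # hand the events to Splunk for indexing
        return True

If that reading is right, a lapse of 8.35 seconds against a ttl of 10 would mean the pull/decode phase alone consumed most of the deadline.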

What causes this error, and how do we fix it?
