Getting Data In

How do I ship a JSON file using HTTP Event Collector?

eventcollector
Loves-to-Learn

I'm trying to ship a JSON data set. My code works fine for files smaller than 10 KB, but it fails for larger files.

import pandas as pd
import requests
import json

with open('empire_psremoting_stager_2020-09-20170827.json') as file:
    data = pd.read_json(file, lines=True)

# The URL and headers never change, so build them once outside the loop
url = "https://xxx-xxx-xxx.splunkcloud.com:8088/services/collector"
headers = {
    "Authorization": "Splunk xxx-xxx-xxx-xxx-xxx"
}

for count in range(data.shape[0]):
    # Round-trip through pandas' to_json so numpy types (int64, etc.)
    # become plain JSON values that json.dumps can serialize
    event = json.loads(data.loc[count].to_json())

    payload = {
        "event": event
    }

    response = requests.post(url, headers=headers, data=json.dumps(payload), verify=False)

    if response.status_code == 200:
        print("Data sent")
    else:
        print("Failed", response.status_code, response.reason)

Does anyone have an alternative approach, or can someone take a look at my code?


isoutamo
SplunkTrust

Hi

By default there is a max Content-Length for HEC inputs, and it's 10k. Here is more about it: https://www.splunk.com/en_us/blog/tips-and-tricks/handling-http-event-collector-hec-content-length-t.... Even though that post says to change the limit in .../etc/system/local/limits.conf, I prefer to create a separate app where you add this change.
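Packaged in its own app, the limits.conf change could look like the sketch below, so Splunk upgrades don't overwrite it. The app name `hec_limits` and the byte value are illustrative; the setting lives under the `[http_input]` stanza, and the right value depends on your payload sizes:

```
# $SPLUNK_HOME/etc/apps/hec_limits/local/limits.conf
[http_input]
# maximum HTTP request body size HEC will accept, in bytes (illustrative value)
max_content_length = 5000000
```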

r. Ismo
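As an alternative to raising the server-side limit, the events could be batched client-side so each request body stays under the cap. HEC accepts several JSON event objects concatenated into one request body, so a helper can group events until the next one would exceed the limit. A sketch, where `batch_events` and `max_bytes` are illustrative names:

```python
import json

def batch_events(events, max_bytes=10_000):
    """Group events into HEC request bodies, each at most max_bytes long.

    Each yielded string is a concatenation of {"event": ...} JSON objects,
    which is the batched format the HEC collector endpoint accepts.
    """
    body, size = [], 0
    for event in events:
        chunk = json.dumps({"event": event})
        # Flush the current batch if adding this chunk would exceed the cap
        if body and size + len(chunk) > max_bytes:
            yield "".join(body)
            body, size = [], 0
        body.append(chunk)
        size += len(chunk)
    if body:
        yield "".join(body)
```

Each yielded body can then be posted with the same `requests.post(url, headers=headers, data=body, verify=False)` call from the question, one request per batch instead of one per event.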
