Getting Data In

How do I ship a JSON file using HTTP Event Collector?

eventcollector
Loves-to-Learn

I'm trying to ship a JSON data set. My code works fine for files smaller than 10 KB, but fails for larger files.

import pandas as pd
import requests
import json

with open('empire_psremoting_stager_2020-09-20170827.json') as file:
    data = pd.read_json(file, lines=True)

url = "https://xxx-xxx-xxx.splunkcloud.com:8088/services/collector"
headers = {
    "Authorization": "Splunk xxx-xxx-xxx-xxx-xxx"
}

index_count = data.shape[0]

for count in range(index_count):
    # Convert the DataFrame row to a plain dict with JSON-safe values
    logs = json.loads(data.loc[count].to_json())

    payload = {
        "event": logs
    }

    response = requests.post(url, headers=headers, data=json.dumps(payload), verify=False)

    if response.status_code == 200:
        print("Data sent")
    else:
        print("Failed", response.status_code, response.reason)

Does anyone have other alternatives, or can someone take a look at my code?


isoutamo
SplunkTrust

Hi

By default there is a maximum Content-Length for HEC inputs, and it's 10 KB. Here is more about it: https://www.splunk.com/en_us/blog/tips-and-tricks/handling-http-event-collector-hec-content-length-t.... Although that post says to change the limit in .../etc/system/local/limits.conf, I prefer to create a separate app where you add this change.
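A sketch of the separate-app layout described above. The app name hec_limits is a placeholder and the byte value is only an example; check the limits.conf spec for your Splunk version:

```ini
# $SPLUNK_HOME/etc/apps/hec_limits/local/limits.conf  (app name is a placeholder)
[http_input]
# Maximum allowed HEC request body size, in bytes (example value)
max_content_length = 1000000
```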

r. Ismo
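Beyond raising the limit, one workaround is to batch events so each POST body stays under it. HEC accepts multiple events per request as concatenated {"event": ...} JSON objects, so the rows can be grouped before sending. A minimal sketch, where batch_events and MAX_BYTES are illustrative names and the 10 KB value mirrors the limit mentioned above:

```python
import json

MAX_BYTES = 10_000  # illustrative limit, matching the 10 KB mentioned above

def batch_events(events, max_bytes=MAX_BYTES):
    """Group events into HEC request bodies that each stay under max_bytes.

    HEC accepts several events per POST as concatenated JSON objects,
    so each batch is a string of back-to-back {"event": ...} payloads.
    """
    batches, current, size = [], [], 0
    for event in events:
        chunk = json.dumps({"event": event})
        # Start a new batch when adding this chunk would exceed the limit
        if current and size + len(chunk) > max_bytes:
            batches.append("".join(current))
            current, size = [], 0
        current.append(chunk)
        size += len(chunk)
    if current:
        batches.append("".join(current))
    return batches
```

Each returned string can then be POSTed to the same /services/collector URL with the same headers, cutting the number of requests while keeping every body under the limit.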
