I'm trying to ship a JSON data set. My code works fine for files smaller than 10 KB, but it fails for larger files.
import pandas as pd
import requests
import json

url = "https://xxx-xxx-xxx.splunkcloud.com:8088/services/collector"
headers = {
    "Authorization": "Splunk xxx-xxx-xxx-xxx-xxx"
}

# Read the newline-delimited JSON file into a DataFrame
with open('empire_psremoting_stager_2020-09-20170827.json') as file:
    data = pd.read_json(file, lines=True)

for count in range(data.shape[0]):
    # Convert the row to a dict so it serializes as a JSON object;
    # default=str handles non-JSON types such as timestamps
    payload = {"event": data.loc[count].to_dict()}
    response = requests.post(url, headers=headers,
                             data=json.dumps(payload, default=str),
                             verify=False)
    if response.status_code == 200:
        print("Data sent")
    else:
        print("Failed", response.status_code, response.reason)
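One way to stay under a per-request size limit while reducing the number of POSTs is to group events into batches whose serialized size stays below the limit (HEC accepts multiple concatenated event objects in one request body). A minimal sketch of the batching helper, with a hypothetical 10 KB limit:

```python
import json

def batch_events(events, max_bytes=10_000):
    """Group event dicts into request bodies whose serialized size
    stays under max_bytes (placeholder limit; match your HEC config)."""
    batches, current, size = [], [], 0
    for event in events:
        chunk = json.dumps({"event": event})
        # Start a new batch if adding this event would exceed the limit
        if current and size + len(chunk) > max_bytes:
            batches.append("".join(current))
            current, size = [], 0
        current.append(chunk)
        size += len(chunk)
    if current:
        batches.append("".join(current))
    return batches
```

Each returned string could then be sent as one `requests.post(..., data=batch)` call instead of one request per row.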
Does anyone have alternatives, or can someone take a look at my code?
Hi
By default there is a max Content-Length for HEC inputs, and it's 10k. Here is more about it: https://www.splunk.com/en_us/blog/tips-and-tricks/handling-http-event-collector-hec-content-length-t.... Even though that post says to change the limit in .../etc/system/local/limits.conf, I prefer to create a separate app where you add this change.
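For reference, a minimal sketch of what such an app's limits.conf could look like (the app name and the limit value are placeholders; check the limits.conf spec for your Splunk version):

```
# $SPLUNK_HOME/etc/apps/hec_limits_app/local/limits.conf  (app name is a placeholder)
[http_input]
# maximum allowed HEC request body size, in bytes
max_content_length = 5000000
```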
r. Ismo