In HTTP Event Collector, is it possible to send multiple events in one API call? I tried setting line break properties in props.conf, but unfortunately that did not help. Here's what my props.conf looks like now:
[host::localhost]
SHOULD_LINEMERGE = false
LINE_BREAKER = "event
#TIME_PREFIX = sstime"\:
#MAX_TIMESTAMP_LOOKAHEAD = 10
#TIME_FORMAT = %s
Here's my curl call as an example:
curl http://example.com:8088/services/collector/raw?channel=HIDDENC0-FCH1-46HE-96HA-HIDDENFBC4AB -H 'Authorization: Splunk HIDDEN88-C2GC-4FE6-5982-B245881A8847' -d '{"event":{"host":"localhost","sourcetype":"txt","index":"b","SeqID":1,"TypeID":1,"Name":"test1","Detail":"test event1","Session":"1","Time":"2016/09/19 00:00:00"},"event":{"host":"localhost","sourcetype":"txt","index":"b","SeqID":2,"TypeID":2,"Name":"test2","Detail":"test event 2","Session":"2","Time":"2016/09/20 00:00:00"}}'
What am I doing wrong? How can I create 2 event entries in Splunk with one API call?
You can see example Python code that uses batch mode here: http://blogs.splunk.com/2015/12/11/http-event-collect-a-python-class/
I also just updated the Python code in the repo to add raw input support for Splunk v6.4+.
The link to the Python code is broken. Would you happen to have an updated link?
http://dev.splunk.com/view/event-collector/SP-CAAAE6P covers the format of batch events within the HTTP event collector.
If you follow the examples below, you won't need to worry about settings like LINE_BREAKER in props.conf.
Just ensure fields like time, host, and source are broken out from the event itself (so your original JSON will need some tweaking). This applies when using the https://example.com:8088/services/collector/event endpoint, not the http://example.com:8088/services/collector/raw endpoint.
From the link:
{
    "time": 1437522387,
    "host": "dataserver992.example.com",
    "source": "testapp",
    "event": {
        "message": "Something happened",
        "severity": "INFO"
    }
}
To batch (from the link):
{
    "event": "event 1",
    "time": 1447828325
}
{
    "event": "event 2",
    "time": 1447828326
}
Combining these, you would have something like this:
{
    "time": 1437522387,
    "host": "dataserver992.example.com",
    "source": "testapp",
    "event": {
        "message": "Something happened",
        "severity": "INFO"
    }
}
{
    "time": 1437522388,
    "host": "dataserver993.example.com",
    "source": "testapp",
    "event": {
        "message": "Something else happened",
        "severity": "DEBUG"
    }
}
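To actually send that combined payload in one call, here is a minimal Python sketch using the requests library. The URL and token below are the redacted placeholder values from the original question, and verify=False is only for a self-signed test certificate; this is an illustration, not code from the linked repo.

import json
import requests  # assumes the requests library is installed

# Placeholder URL and token, matching the (redacted) values from the question
url = "https://example.com:8088/services/collector/event"
headers = {"Authorization": "Splunk HIDDEN88-C2GC-4FE6-5982-B245881A8847"}

events = [
    {"time": 1437522387, "host": "dataserver992.example.com", "source": "testapp",
     "event": {"message": "Something happened", "severity": "INFO"}},
    {"time": 1437522388, "host": "dataserver993.example.com", "source": "testapp",
     "event": {"message": "Something else happened", "severity": "DEBUG"}},
]

# Batch format: JSON objects concatenated back to back (not a JSON array)
payload = "".join(json.dumps(e) for e in events)

resp = requests.post(url, headers=headers, data=payload, verify=False)
print(resp.status_code, resp.text)

Note that the payload is just the JSON objects concatenated back to back; no array brackets and no separating commas are needed.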
Hi!
Please, how can we construct stacked (or batched) JSON object events from a file's rows?
I've reimplemented my own Python forwarder to batch up the events like this, and things seem to be faster. Is there, however, a server-side limit on the length of the event list?
max_content_length looks like the setting you are after as defined in limits.conf - https://docs.splunk.com/Documentation/Splunk/latest/Admin/Limitsconf#.5Bhttp_input.5D
You might want to consider making your Python code configurable to handle this in case admins have changed this limit on the Splunk server.
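As a rough sketch of what that could look like (chunk_events and its max_bytes default are my own names, not part of any Splunk library), the client-side batching can be made size-aware so that each POST body stays under the configured limit:

import json

def chunk_events(events, max_bytes=1000000):
    """Group HEC-format event dicts into payload strings no larger than max_bytes.

    max_bytes is a client-side cap you would set to match the server's
    max_content_length from limits.conf.
    """
    batch, size = [], 0
    for event in events:
        encoded = json.dumps(event)
        if batch and size + len(encoded) > max_bytes:
            yield "".join(batch)
            batch, size = [], 0
        batch.append(encoded)
        size += len(encoded)
    if batch:
        yield "".join(batch)

Each string yielded here would then be sent as its own POST to the collector endpoint.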
I'm curious why a JSON array can't be used to pass multiple events to the HTTP Event Collector. "Stacking" events isn't valid JSON, which means I have to deal with that on the sending side.
Please @dave_maclean, how did you handle constructing the stacked JSON objects?
I have to construct them from the rows of my file, but I've run into Python errors such as "EOL ..." and "Can't concat string with dict..."
Thanks in advance to anyone who can help me resolve this issue.
You can submit batch events in a post. See example code in my answer above.
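For what it's worth, here is a minimal sketch of that construction from a file's rows (the events.txt filename and the per-row wrapping are assumptions, not something from this thread). The key point is calling json.dumps on each dict before concatenating, which is what avoids the "can't concat string with dict" error:

import json

stacked = ""
with open("events.txt") as f:               # hypothetical input file, one event per row
    for row in f:
        row = row.strip()
        if not row:
            continue
        event = {"sourcetype": "txt", "event": row}   # wrap the row as an HEC event
        stacked += json.dumps(event)        # serialize the dict to a str before concatenating
# stacked now holds back-to-back JSON objects ready to POST to /services/collector/event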
Upvoting the question and downvoting the answer, since it does not answer dave.maclean's question. Why is it required to do the "stacking" hack on the sending side instead of following the standard?