
HTTP Event Collector truncates events to 10,000 characters

sloh_splunk
Splunk Employee

I am sending data to my Splunk instance as shown in https://docs.splunk.com/Documentation/Splunk/8.0.4/Data/HECExamples:

curl -k "https://mysplunkserver.example.com:8088/services/collector" \
  -H "Authorization: Splunk CF179AE4-3C99-45F5-A7CC-3284AA91CF67" \
  -d '{"sourcetype": "_json", "event": { ...over 10,000 characters of JSON... }}'

I get a success response:

{"text":"Success","code":0}

 

When I query for the event and do a character count on it, I see that my events are always truncated to 10,000 characters.
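One way to confirm the stored length is to check len(_raw) in a search. A rough sketch from the CLI; the index name is an assumption, so adjust it to wherever the HEC events are written:

# Report the character count of the indexed events (index name is an assumption).
/opt/splunk/bin/splunk search 'index=main sourcetype=_json | eval event_length=len(_raw) | table _time event_length'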

On my server, looking at

./etc/system/default/limits.conf

it contains:

[http_input]
# The max number of tokens reported by logging input metrics.
max_number_of_tokens = 10000
# The interval (in seconds) of logging input metrics report.
metrics_report_interval = 60
# The max request content length (800MB, to match HTTP server).
max_content_length = 838860800
# The max number of ACK channels.
max_number_of_ack_channel = 1000000
# The max number of acked requests pending query.
max_number_of_acked_requests_pending_query = 10000000
# The max number of acked requests pending query per ACK channel.
max_number_of_acked_requests_pending_query_per_ack_channel = 1000000

Everything seems fine there.
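As a side note, rather than reading the default file directly, the effective settings (and the file each one comes from) can be checked with btool; a general sketch:

# Show the effective [http_input] settings in limits.conf and their source files.
/opt/splunk/bin/splunk btool limits list http_input --debug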

 

And in

./etc/system/local/limits.conf

it contains:

[search]
allow_batch_mode = 1
allow_inexact_metasearch = 0
always_include_indexedfield_lispy = 0
default_allow_queue = 1
disabled = 0
enable_conditional_expansion = 1
enable_cumulative_quota = 0
enable_datamodel_meval = 1
enable_history = 1
enable_memory_tracker = 0
force_saved_search_dispatch_as_user = 0
load_remote_bundles = 0
record_search_telemetry = 1
remote_timeline = 1
search_retry = 0
timeline_events_preview = 0
track_indextime_range = 1
track_matching_sourcetypes = 1
truncate_report = 0
unified_search = 0
use_bloomfilter = 1
use_metadata_elimination = 1
write_multifile_results_out = 1

 

Why is Splunk truncating the events I send to 10,000 characters? If my JSON is less than 10,000 characters, I can see all the data, and it is JSON-formatted when I run a Splunk search.


livehybrid
Builder

It feels like the limit you are hitting here is the TRUNCATE setting in props.conf.

Can you confirm that it is arriving in Splunk with the "_json" sourcetype? If so, try running this on the host receiving the HEC data:

/opt/splunk/bin/splunk btool props list _json --debug | grep -i truncate

If it returns 10000, then that is where the limit is being applied!
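If it is unclear which sourcetype the HEC events are actually landing under, a quick check from the CLI might look like this (the index wildcard and time range are assumptions):

# List sourcetypes seen in the last hour (index and time range are assumptions).
/opt/splunk/bin/splunk search 'index=* earliest=-1h | stats count by sourcetype'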

sloh_splunk
Splunk Employee

Thanks @livehybrid, you were correct!

$ /opt/splunk/bin/splunk btool props list _json --debug | grep -i truncate
/opt/splunk/etc/system/default/props.conf TRUNCATE = 10000

 


sloh_splunk
Splunk Employee

I updated /opt/splunk/etc/system/local/props.conf and put "TRUNCATE = 52428800" under [_json], so local/props.conf now looks like:

[_json]
TRUNCATE = 52428800

and now the btool check returns:

$ /opt/splunk/bin/splunk btool props list _json --debug | grep -i truncate
/opt/splunk/etc/system/local/props.conf   TRUNCATE = 52428800
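Worth noting: TRUNCATE is a parse-time setting, so a props.conf change like this generally does not take effect until splunkd is restarted on the instance that parses the HEC data, for example:

# Restart splunkd so the new parse-time TRUNCATE value is picked up.
/opt/splunk/bin/splunk restart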

 
