JSON file is getting truncated while uploading
Hi all,
When I upload a JSON file to Splunk, the data gets truncated and the full data is not indexed. Because of this I am not getting correct results for my queries. Can anyone please suggest how to avoid this problem?

Splunk has a default maximum event size of 10,000 bytes, so if your JSON object is bigger than this it will be truncated before it is parsed.
If your JSON blocks exceed that limit, consider raising the TRUNCATE setting in props.conf so it accounts for your data size.
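As a rough sketch, the override could look like the stanza below. The sourcetype name is just a placeholder for whatever sourcetype your JSON file is ingested as, and the setting belongs on the instance that actually parses the data (indexer or heavy forwarder), e.g. in $SPLUNK_HOME/etc/system/local/props.conf or an app's local directory.

# props.conf (hypothetical sourcetype name - adjust to your input)
[my_json_sourcetype]
# Raise the per-event limit from the 10,000 byte default
TRUNCATE = 500000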
[default]
CHARSET = AUTO
LINE_BREAKER_LOOKBEHIND = 100
TRUNCATE = 10000
DATETIME_CONFIG = /etc/datetime.xml
ADD_EXTRA_TIME_FIELDS = True
ANNOTATE_PUNCT = True
HEADER_MODE =
MATCH_LIMIT = 100000
DEPTH_LIMIT = 1000
MAX_DAYS_HENCE=2
MAX_DAYS_AGO=2000
MAX_DIFF_SECS_AGO=3600
MAX_DIFF_SECS_HENCE=604800
MAX_TIMESTAMP_LOOKAHEAD = 128
SHOULD_LINEMERGE = True
BREAK_ONLY_BEFORE =
BREAK_ONLY_BEFORE_DATE = True
MAX_EVENTS = 256
MUST_BREAK_AFTER =
MUST_NOT_BREAK_AFTER =
MUST_NOT_BREAK_BEFORE =
TRANSFORMS =
SEGMENTATION = indexing
SEGMENTATION-all = full
SEGMENTATION-inner = inner
SEGMENTATION-outer = outer
SEGMENTATION-raw = none
SEGMENTATION-standard = standard
LEARN_SOURCETYPE = true
LEARN_MODEL = true
maxDist = 100
AUTO_KV_JSON = true
detect_trailing_nulls = auto
sourcetype =
priority =
Hi @nickhills,
I have changed TRUNCATE to 999999 and MAX_EVENTS to 500, but the data is still getting truncated. What could be the problem?

Hi @anooshac,
For large JSON events, I tend to use TRUNCATE = 0 to disable truncation. Try that out.
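For example, a minimal sketch for a hypothetical sourcetype (note that 0 means no limit at all, so a runaway event will never be cut off either):

# props.conf (hypothetical sourcetype name)
[my_json_sourcetype]
# 0 disables event truncation completely
TRUNCATE = 0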
~John
Hi @jscraig2006, thank you for the response. I changed TRUNCATE to 999999 and restarted Splunk, and the problem is solved now!
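In case it helps anyone else, the change amounts to roughly the following (the sourcetype name here is only an example; the setting goes wherever the data is parsed):

# props.conf
[my_json_sourcetype]
TRUNCATE = 999999

Then restart Splunk from the CLI so the new limit takes effect:

$SPLUNK_HOME/bin/splunk restart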

Can you post what you have in your props.conf for the sourcetype?
[default]
CHARSET = AUTO
LINE_BREAKER_LOOKBEHIND = 100
TRUNCATE = 10000
DATETIME_CONFIG = /etc/datetime.xml
ADD_EXTRA_TIME_FIELDS = True
ANNOTATE_PUNCT = True
HEADER_MODE =
MATCH_LIMIT = 100000
DEPTH_LIMIT = 1000
MAX_DAYS_HENCE=2
MAX_DAYS_AGO=2000
MAX_DIFF_SECS_AGO=3600
MAX_DIFF_SECS_HENCE=604800
MAX_TIMESTAMP_LOOKAHEAD = 128
SHOULD_LINEMERGE = True
BREAK_ONLY_BEFORE =
BREAK_ONLY_BEFORE_DATE = True
MAX_EVENTS = 256
MUST_BREAK_AFTER =
MUST_NOT_BREAK_AFTER =
MUST_NOT_BREAK_BEFORE =
TRANSFORMS =
SEGMENTATION = indexing
SEGMENTATION-all = full
SEGMENTATION-inner = inner
SEGMENTATION-outer = outer
SEGMENTATION-raw = none
SEGMENTATION-standard = standard
LEARN_SOURCETYPE = true
LEARN_MODEL = true
maxDist = 100
AUTO_KV_JSON = true
detect_trailing_nulls = auto
sourcetype =
priority =
Hello @jscraig2006, thanks for answering. This is the first part of the props.conf file.
