Getting Data In

Splunk did not compress my 500 GB of data. Why?

clyde772
Communicator

Hey Gurus,

I have a situation where the data stored in my indexers is bigger than the original data. What happened? How is this possible? We didn't touch much of the configuration; all the critical conf files should still be at their initial settings.

Anybody have any ideas?

Thanks!


Drainy
Champion

Do you have multiple data sources feeding into the indexer? Also, have you set up any index-time field extractions?

If you are just forwarding the data across with nothing extra, then either additional data is being added at some point or you have excessive index-time extractions. Roughly speaking, you get about a 50% compression ratio with Splunk (though this depends entirely on your data). Splunk will also create metadata files associated with your indexes, containing metafields to speed indexing, as well as bloom filters and other files.
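To see how index-time extractions can flip the usual ~50% figure into growth, here is a minimal back-of-the-envelope sketch. All the ratios are illustrative assumptions, not Splunk internals: the compressed rawdata journal, the tsidx files, and a per-indexed-field overhead are just example numbers to show the arithmetic.

```python
# Rough sketch of how on-disk index size can exceed the raw data size.
# The ratios below are assumptions for illustration only; real values
# depend entirely on your data and configuration.

def estimated_index_size(raw_gb, rawdata_ratio=0.15, tsidx_ratio=0.35,
                         extra_per_indexed_field=0.10, indexed_fields=0):
    """Estimate on-disk size (GB) of an index for raw_gb of original data."""
    rawdata = raw_gb * rawdata_ratio  # compressed rawdata journal
    # tsidx and metadata grow with each index-time extracted field
    tsidx = raw_gb * (tsidx_ratio + extra_per_indexed_field * indexed_fields)
    return rawdata + tsidx

# With the default assumptions, 500 GB of raw data lands around 250 GB
# on disk -- the familiar ~50% figure.
print(estimated_index_size(500))
# But several index-time extracted fields can push the total past the
# original 500 GB.
print(estimated_index_size(500, indexed_fields=6))
```

The point of the sketch: compression only applies to the rawdata journal, while every index-time field adds tsidx/metadata on top, so "compressed" storage can still end up larger than the source.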
