Deployment Architecture

Indexing 20 GB of data per day

eholz1
Path Finder

Hello Members,
We have a requirement to come up with a hardware solution for indexing a relatively small amount of data:
20 GB per day. I have seen considerable documentation on the Splunk forum and site. Most information is oriented toward
larger amounts of data - I would assume that our disk subsystem could run at less than 800 IOPS.

For reference, we would have fewer than 4 users, so we could run a combined instance. Memory would be from 32 GB to maybe 128 GB, and we should have 8 to 12 cores, 64-bit, at > 2 GHz.
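
For a first cut at disk capacity, a back-of-the-envelope sketch in Python (the 50% disk ratio and 90-day retention below are illustrative assumptions, not numbers from this thread; plug in your own retention policy):

# Back-of-the-envelope disk sizing for a small Splunk indexer.
# Assumed (not from this thread): ingested data lands on disk at
# roughly half its original size (compressed rawdata plus index
# files), and 90 days of data stay searchable.
DAILY_INGEST_GB = 20
DISK_RATIO = 0.5      # assumption: compression + index overhead
RETENTION_DAYS = 90   # assumption: retention policy

needed_gb = DAILY_INGEST_GB * DISK_RATIO * RETENTION_DAYS
print(f"~{needed_gb:.0f} GB of index storage for "
      f"{RETENTION_DAYS} days at {DAILY_INGEST_GB} GB/day")
# -> ~900 GB of index storage for 90 days at 20 GB/day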

(yes we could virtualize this deployment).

I am open for tips and suggestions,

Eholz1

woodcock
Esteemed Legend

Splunk guidance is clear here, and it doesn't really depend on data velocity: a MINIMUM of 800 IOPS:
https://docs.splunk.com/Documentation/Splunk/latest/Capacity/Referencehardware
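
For a sense of why velocity barely matters at this scale, a quick arithmetic sketch (plain Python, nothing Splunk-specific), which suggests the 800 IOPS floor is about search's many small random reads rather than raw ingest throughput:

# 20 GB/day is a tiny sustained write rate.
DAILY_INGEST_BYTES = 20 * 1024**3
SECONDS_PER_DAY = 86_400
avg_mb_per_s = DAILY_INGEST_BYTES / SECONDS_PER_DAY / 1024**2
print(f"average ingest: ~{avg_mb_per_s:.2f} MB/s")
# -> average ingest: ~0.24 MB/s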

eholz1
Path Finder

Hello Woodcock,
Thanks for the link; I will check it out - the info there will be helpful.

eholz1

eholz1
Path Finder

Hello Woodcock,
Another good answer; thanks for replying.
I really appreciate the responses.

eholz1
Path Finder

Excellent link - I would like more guidance on the disk subsystem for 20 GB/day of data, IOPS, etc.
Thanks again

eholz1
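
As a rough way to see what a candidate disk subsystem actually delivers, a minimal random-read probe in Python might look like the sketch below (a sketch only: the scratch path is hypothetical, it is Linux-only, and a dedicated benchmarking tool such as fio is the right way to get numbers you can trust):

import os, random, time

PATH = "/splunkdata/ioprobe.tmp"   # hypothetical scratch file on the target volume
FILE_SIZE = 1024**3                # 1 GiB test file
BLOCK = 4096                       # 4 KiB reads
READS = 2000

# Write real data once; a sparse file would satisfy reads without touching disk.
if not os.path.exists(PATH):
    with open(PATH, "wb") as f:
        chunk = os.urandom(4 * 1024**2)
        for _ in range(FILE_SIZE // len(chunk)):
            f.write(chunk)
        f.flush()
        os.fsync(f.fileno())       # ensure the data is on disk, not just in cache

fd = os.open(PATH, os.O_RDONLY)
os.posix_fadvise(fd, 0, FILE_SIZE, os.POSIX_FADV_DONTNEED)  # start with a cold cache

n_blocks = FILE_SIZE // BLOCK
start = time.perf_counter()
for _ in range(READS):
    offset = random.randrange(n_blocks) * BLOCK
    os.pread(fd, BLOCK, offset)
    # Evict the page we just read so later hits still go to disk.
    os.posix_fadvise(fd, offset, BLOCK, os.POSIX_FADV_DONTNEED)
elapsed = time.perf_counter() - start
os.close(fd)

print(f"~{READS / elapsed:.0f} random-read IOPS (very rough)")

Compare the printed figure against the 800 IOPS minimum from the reference-hardware page linked above.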
