I have set up Splunk to receive syslog from our network. This works fine and I can view the logs in Splunk.
The next step I want to take is to send those logs to logz.io to keep them in the cloud. The problem I have is that I don't know where the data collected by Splunk is stored. Where can I find the logs collected by Splunk, and what format are they in?
Splunk stores indexed data in buckets (hot, warm, cold, frozen, thawed), and depending on the index settings the data rolls from hot -> warm -> cold. There is an entire lifecycle behind this, and the Splunk documentation on how the indexer stores indexes covers it in detail.
One note here: hot buckets (the most recent data) are in an active writing stage, so you can't back up or move that data until it rolls to warm.
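As a rough illustration, the roll and freeze behavior is controlled per index in indexes.conf. The stanza below is a sketch with illustrative values, not your actual configuration; the index name and paths are assumptions:

```ini
# indexes.conf -- example retention settings (values are illustrative)
[yourindex]
homePath   = $SPLUNK_DB/yourindex/db        ; hot + warm buckets
coldPath   = $SPLUNK_DB/yourindex/colddb    ; cold buckets
thawedPath = $SPLUNK_DB/yourindex/thaweddb  ; restored (thawed) buckets

; roll warm -> cold once this many warm buckets exist
maxWarmDBCount = 300

; freeze (delete or archive) data older than ~90 days
frozenTimePeriodInSecs = 7776000

; if set, frozen buckets are copied here instead of being deleted
coldToFrozenDir = /archive/splunk/yourindex
```

If `coldToFrozenDir` is not set, frozen data is deleted, so for an archiving workflow you would typically set it.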
If you want to send those logs to the cloud for archiving purposes, you can copy the relevant buckets to the cloud, and later bring them back into the thawed directory and search them. To find the buckets you need to back up, you can use | dbinspect in search
to identify them. The location is usually $SPLUNK_HOME/var/lib/splunk/yourindex/db/, but cold data can be moved to external storage (another volume).
| dbinspect index=yourindex
| table bucketId, startEpoch, endEpoch, id, index, modTime, path, sizeOnDiskMB, splunkserver, state
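A minimal sketch of the archive-and-restore round trip. The aws CLI, the S3 bucket name, and the example bucket directory name are assumptions you would replace with your own; the thaweddb location and the `splunk rebuild` step follow Splunk's documented thawing procedure:

```shell
#!/bin/sh
# Assumptions: $SPLUNK_HOME is set, the index is "yourindex", and
# "s3://my-log-archive" is a hypothetical cloud destination.
SPLUNK_HOME=${SPLUNK_HOME:-/opt/splunk}
IDX_DB="$SPLUNK_HOME/var/lib/splunk/yourindex/db"
BUCKET="db_1577000000_1576000000_12"   # example name from dbinspect's path column

# 1. Copy a warm/cold bucket (never a hot_* one) to the cloud
aws s3 cp --recursive "$IDX_DB/$BUCKET" "s3://my-log-archive/yourindex/$BUCKET"

# 2. Later: bring it back into the index's thawed directory...
aws s3 cp --recursive "s3://my-log-archive/yourindex/$BUCKET" \
  "$SPLUNK_HOME/var/lib/splunk/yourindex/thaweddb/$BUCKET"

# 3. ...rebuild its index files; after a restart it becomes searchable
"$SPLUNK_HOME/bin/splunk" rebuild \
  "$SPLUNK_HOME/var/lib/splunk/yourindex/thaweddb/$BUCKET"
```

The dbinspect output above gives you the `path` and `state` of each bucket, so you can script step 1 over exactly the buckets you want to archive.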
Thank you for your answer, Marian.
Is it possible to send the logs in a human readable format to cloud? Or would I need to use Splunk every time I want to read them again?
The bucket files (Splunk's tsidx index files and rawdata journal) are not human readable; they are in a Splunk proprietary format. You would have to bring them back into Splunk to be able to query them.
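If you only need a human-readable copy rather than a restorable archive, one option (a sketch, not the only way) is to export search results as plain text with the Splunk CLI before shipping them to the cloud. The index name, time range, and output path here are assumptions:

```shell
# Export the last 24 hours of "yourindex" as CSV, readable without Splunk.
# -maxout 0 lifts the default cap on the number of returned events.
"$SPLUNK_HOME/bin/splunk" search 'index=yourindex earliest=-24h' \
  -output csv -maxout 0 > /tmp/yourindex_export.csv
```

The trade-off is that a CSV export is a snapshot of search results, not the original buckets, so you lose the ability to thaw and re-search it inside Splunk.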