
Eventgen: Unable to generate multiple output files with defined size

conaku
New Member

I want to generate multiple output files of a defined size using the "fileMaxBytes" and "fileBackupFiles" parameters, but these parameters are not working for me. Please suggest.
Here is the configuration file I have developed:

[global]
generatorWorkers = 10
threading = process
#threading = thread


[wls_day-01_1v.sample]
mode = sample
sampleDir = /root/ashish/cybersecurity/splunk_eventgen/eventgen/lanl_datagen
sampletype = csv
hourOfDayRate = { "0": 0.30, "1": 0.10, "2": 0.05, "3": 0.10, "4": 0.15, "5": 0.25, "6": 0.35, "7": 0.50, "8": 0.60, "9": 0.65, "10": 0.70, "11": 0.75, "12": 0.77, "13": 0.80, "14": 0.82, "15": 0.85, "16": 0.87, "17": 0.90, "18": 0.95, "19": 1.0, "20": 0.85, "21": 0.70, "22": 0.60, "23": 0.45 }
dayOfWeekRate = { "0": 0.55, "1": 0.97, "2": 0.95, "3": 0.90, "4": 0.97, "5": 1.0, "6": 0.99 }
randomizeCount = 0.2

interval = 3
#earliest = -1s
#latest = now

outputMode = file
fileName = /nvme_data5/Event1V.log
fileMaxBytes = 10485760
fileBackupFiles = 5
count = 1000

token.0.token = \d{2}/\d{2}/\d{4} \d{2}:\d{2}:\d{2},
token.0.replacementType = timestamp
token.0.replacement = %m/%d/%Y %H:%M:%S,

token.1.token = (12345)
token.1.replacementType = integerid
token.1.replacement = 1000

token.2.token = (4688)
token.2.replacementType = file
token.2.replacement = /root/ashish/cybersecurity/splunk_eventgen/eventgen/lanl_datagen/eventId.sample
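
For reference, the behaviour I expect from these two settings is plain size-based rollover: once Event1V.log reaches 10485760 bytes it should be renamed to Event1V.log.1 and a fresh file started, keeping at most 5 backups. That is the same scheme Python's logging.handlers.RotatingFileHandler provides (Eventgen is written in Python, though I don't know whether it uses this handler internally), so here is a minimal standalone sketch of the rollover I am after, using the same 10 MB / 5-backup values. The local file name and the padded demo messages are just for illustration, not part of my Eventgen setup:

import logging
import logging.handlers

# Sketch only: size-based rotation like fileMaxBytes / fileBackupFiles.
# Uses a local file name so it runs anywhere; on my box the target would
# be /nvme_data5/Event1V.log as in fileName above.
handler = logging.handlers.RotatingFileHandler(
    "Event1V.log",
    maxBytes=10485760,   # fileMaxBytes = 10485760 (10 MB)
    backupCount=5,       # fileBackupFiles = 5
)
logger = logging.getLogger("rotation_demo")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

# Write enough padded events (~30 MB total) to force several rollovers,
# producing Event1V.log plus Event1V.log.1 .. Event1V.log.3.
for i in range(300000):
    logger.info("sample event %06d %s", i, "x" * 80)

Running that sketch leaves a set of capped files (Event1V.log, .1, .2, ...), which is the output layout I was hoping Eventgen would produce with the stanza above.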