All Apps and Add-ons

Splunk Eventgen | Count of events generated

VivekVenudasan
Engager

Hello fellow Splunkers,

I'm using Splunk Eventgen to simulate data records needed to test certain queries. I want to generate 1000 events every 5 minutes, where each event corresponds to a unique service ID (represented by the field svcId). So I expect 1000 svcIds to be generated every 5 minutes, with one and only one event per svcId in each 5-minute interval. However, when I implemented this in a sample app with the eventgen.conf and sample record below, I see 3 events generated per svcId within every 5-minute interval.

Based on the logs from the Eventgen code, I think the modinput code is spawning three threads by default, and each thread is generating data independently from the eventgen.conf settings. I have played around with some of the other settings in eventgen.conf, like maxIntervalsBeforeFlush, maxQueueSize, and delay, but so far without success. I'm not sure what I'm doing wrong here.

I'd appreciate help from the gurus here in understanding what's going wrong. Thanks.

Below are the configurations I use for my test app:

eventgen.conf

[seo]
sampletype = csv
interval = 300
count = 1000
outputMode = splunkstream

token.0.token = (timeRecorded=\d+)000,
token.0.replacementType = timestamp
token.0.replacement = %s

token.1.token = (svcId=\d+)
token.1.replacementType = integerid
token.1.replacement = 1000

token.2.token = lag-105:355.(\d+)
token.2.replacementType = integerid
token.2.replacement = 1000

token.3.token = (policerId=2)
token.3.replacementType = static
token.3.replacement = 2

token.4.token = (timeCaptured=\d+)000,
token.4.replacementType = timestamp
token.4.replacement = %s

token.5.token = (allOctetsDropped=\d+)
token.5.replacementType = static
token.5.replacement = 0

token.6.token = (allOctetsForwarded=\d+),
token.6.replacementType = random
token.6.replacement = integer[1000000:9999999]

token.7.token = (allOctetsOffered=\d+),
token.7.replacementType = static
token.7.replacement = 0

 

Sample file (seo)

index,host,source,sourcetype,"_raw"
"main","test_host2","test_source","test_src_type","timeRecorded=1611533616000,svcId=13088157,0,lag-105:355.1513,policerId=2,timeCaptured=1611535424000,,,,,allOctetsDropped=0,allOctetsForwarded=2924133555698,allOctetsOffered=292713155698,,,,,minimal"
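
To see the duplication, a search along these lines can count events per svcId in each 5-minute bucket (a sketch; the index and sourcetype values are taken from the sample file above):

```
index=main sourcetype=test_src_type
| bin _time span=5m
| stats count BY _time, svcId
| where count > 1
```

With one event per svcId per interval, this should return no rows; in my case every svcId shows count=3.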

1 Solution

VivekVenudasan
Engager

I have found the reason behind this. My Splunk instance has search heads in a cluster: there are 3 search head members. When Eventgen and the custom bundle app were deployed, they went to all 3 search head members. By default the custom bundle app is enabled on all 3 search heads, so events were generated from all 3 of them. Hence the 3 duplicate records!

It wasn't obvious to me until I changed the host to "localhost" instead of the custom value I had before (test_host2)!

To resolve the issue, once the app was deployed to the SH members, I manually logged into 2 of the search head servers, disabled the custom app locally on them, left it enabled on only one, and then did a debug refresh. Happy days afterwards!
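
For reference, disabling the app locally on a search head can also be done with a small config stanza rather than through the UI (a sketch; the app directory name my_eventgen_bundle is an assumption, substitute your bundle's actual name):

```
# $SPLUNK_HOME/etc/apps/my_eventgen_bundle/local/app.conf
# Assumed app name; place this on the search heads that should NOT generate events.
[install]
state = disabled
```

After adding it, a debug refresh (or a Splunk restart) on those members picks up the change.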

