Getting Data In

Index time not getting captured correctly

surekhasplunk
Communicator
[some_alarms]
DATETIME_CONFIG =
NO_BINARY_CHECK = true
SHOULD_LINEMERGE = false
TIME_PREFIX = 0x11f4e\"\, \"\$\"\:\ "
category = Custom
disabled = false
pulldown_type = true

My data looks like the following:

{"alarm": {"attribute": [{"@id": "0x10000", "$": "abc"}, {"@id": "0x11d42", "$": "xyz"}, {"@id": "0x12022", "$": "0"}, {"@id": "0x12c05", "$": "Directly Managed"}, {"@id": "0x13345", "$": "0"}, {"@id": "0x12a07", "$": "0"}, {"@id": "0x12a06", "$": "0"}, {"@id": "0x1000a", "$": "0"}, {"@id": "0x11f4d", "$": "false"}, {"@id": "0x12b4c", "$": "TRAP LINK DOWN IS RECEIVED"}, {"@id": "0x11f4e", "$": "1568205019"}, {"@id": "0x1006e", "$": "abcdfe"}, {"@id": "0x11f50", "$": "2228225"}, {"@id": "0x11f57", "$": "true"}, {"@id": "0x11f56", "$": "1"}, {"@id": "0x11f9b", "$": "true"}, {"@id": "0x129fa", "$": "0x1b30db"}, {"@id": "0x11f9c", "$": "sfdfsfssd"}, {"@id": "0x12d7f", "$": "adddff"}], "@id": "sfdff"}, "@preexisting": false}, {"alarm": {"attribute": [{"@id": "0x10000", "$": "ferewr"}, {"@id": "0x11d42", "$": "rerwe"}, {"@id": "0x12022", "$": "0"}, {"@id": "0x12c05", "$": "Directly Managed"}, {"@id": "0x13345", "$": "0"}, {"@id": "0x12a07", "$": "0"}, {"@id": "0x12a06", "$": "0"}, {"@id": "0x1000a", "$": "2"}, {"@id": "0x11f4d", "$": "false"}, {"@id": "0x12b4c", "$": "LINK UP/DOWN TRAPS RECEIVED IN THE LAST 5 MINUTES EXCEEDS THRESHOLD"}, {"@id": "0x11f4e", "$": "1568205042"}, {"@id": "0x1006e", "$": "RCLI1BS0113"}, {"@id": "0x11f50", "$": "2228231"}, {"@id": "0x11f57", "$": "true"}, {"@id": "0x11f56", "$": "2"}, {"@id": "0x11f9b", "$": "true"}, {"@id": "0x129fa", "$": "0x6ec376"}, {"@id": "0x11f9c", "$": "erwer"}, {"@id": "0x12d7f", "$": "rwre"}], "@id": "tyeyy"}, "@preexisting": false}, 

The problem is that my time field is not being captured correctly.
My script runs every 5 minutes, and the indexer is using the creation time of the alarms.log file instead of the value of attribute 0x11f4e.
Please help ASAP.
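For reference, the value of attribute 0x11f4e is a Unix epoch timestamp in seconds (which is why TIME_FORMAT = %s appears later in this thread). A quick Python check outside Splunk confirms the sample value decodes to a plausible date:

```python
from datetime import datetime, timezone

# The value of attribute 0x11f4e in the sample event is epoch seconds.
epoch = int("1568205019")
print(datetime.fromtimestamp(epoch, tz=timezone.utc).isoformat())
# 2019-09-11T12:30:19+00:00
```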


surekhasplunk
Communicator

Hi @rmjharris and @somesoni2,

Can you please help me? I now have the following in my props.conf file:
[spectrum_alarms]
SHOULD_LINEMERGE = false
NO_BINARY_CHECK = true
CHARSET = UTF-8
disabled = false
TIME_PREFIX = {\"\@id\":\ \0x11f4e\"\, \"\$\":\ "
TIME_FORMAT = %s
INDEXED_EXTRACTIONS = JSON
KV_MODE = none
category = Custom
pulldown_type = 1
LINE_BREAKER = (},){
SEDCMD-add_closing_bracket = s/\"$/"}/g
TRUNCATE=0
MAX_TIMESTAMP_LOOKAHEAD=900

It seems to be working well now with properly formatted JSON data.
The only remaining problem is the time format.
It is assigning every event the file creation time, not the time field we specified in props.conf. Please help here.


harsmarvania57
SplunkTrust

Hi,

Please use the configuration below in props.conf on the Heavy Forwarder or Indexer, whichever the data reaches first from the Universal Forwarder.

props.conf

[yourSourcetype]
TIME_FORMAT=%s
TIME_PREFIX=0x11f4e\"\, \"\$\"\:\ "
MAX_TIMESTAMP_LOOKAHEAD=10
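Splunk treats TIME_PREFIX as a regular expression and then reads at most MAX_TIMESTAMP_LOOKAHEAD characters after the match. A rough Python sketch (not Splunk's actual parser) of how this configuration isolates the epoch from the sample event:

```python
import re

# Fragment of the sample event containing the timestamp attribute.
sample = '{"@id": "0x11f4e", "$": "1568205019"}, {"@id": "0x1006e", "$": "abcdfe"}'

# TIME_PREFIX from props.conf matches the literal text:  0x11f4e", "$": "
time_prefix = re.compile(r'0x11f4e", "\$": "')

m = time_prefix.search(sample)
# MAX_TIMESTAMP_LOOKAHEAD = 10: only 10 characters after the prefix are examined.
timestamp = sample[m.end():m.end() + 10]
print(timestamp)  # 1568205019
```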

If the above configuration does not work, can you please confirm whether you are using INDEXED_EXTRACTIONS = JSON on the UF?

rmjharris
Path Finder

This is the right answer as far as the timestamp is concerned. To correct the line-breaking problem, you could try either:

LINE_BREAKER = (},\s)({"alarm")

or

BREAK_ONLY_BEFORE = {"alarm"


surekhasplunk
Communicator

Hi @harsmarvania57
That seems to be working now, but the line breaking isn't happening properly; everything is coming in as a single event. 😞


harsmarvania57
SplunkTrust

Can you please let us know where you want the events to break?


harsmarvania57
SplunkTrust

@surekhasplunk The sample data you have provided is not made up of proper JSON events, which is why everything is being merged into a single event. Try the configuration below.

On Heavy Forwarder/Indexer

props.conf

[yourSourcetype]
SHOULD_LINEMERGE=false
LINE_BREAKER=([\r\n\s]+)\{\"alarm\"
SEDCMD-test=s/\,\s?$//g
TIME_FORMAT=%s
TIME_PREFIX=0x11f4e\"\, \"\$\"\:\ "
MAX_TIMESTAMP_LOOKAHEAD=10
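As a sanity check, here is a rough Python approximation (Splunk's pipeline differs internally) of what this LINE_BREAKER and SEDCMD pair do to the comma-separated sample data:

```python
import json
import re

# Abridged version of the sample data: JSON objects separated by ", ".
raw = ('{"alarm": {"@id": "sfdff"}, "@preexisting": false}, '
       '{"alarm": {"@id": "tyeyy"}, "@preexisting": false}, ')

# LINE_BREAKER=([\r\n\s]+)\{\"alarm\": break on whitespace preceding {"alarm".
events = re.split(r'[\r\n\s]+(?=\{"alarm")', raw.strip())

# SEDCMD-test=s/\,\s?$//g: strip the trailing comma left at the end of each event.
events = [re.sub(r',\s?$', '', e) for e in events]

for e in events:
    print(json.loads(e)["alarm"]["@id"])
# sfdff
# tyeyy
```

Each piece then parses as standalone JSON, which is what lets KV_MODE = json on the search head extract the fields.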

On SearchHead

props.conf

[yourSourcetype]
KV_MODE = json

EDIT: Updated Indexer props.conf LINE_BREAKER regex.


skalliger
SplunkTrust

You need to raise MAX_TIMESTAMP_LOOKAHEAD to something greater than 128, because that is the default, which is probably why your timestamp extraction is failing.

Skalli


surekhasplunk
Communicator

Hi,
I tried adding MAX_TIMESTAMP_LOOKAHEAD = 130, but still no luck. It is still using the file creation time. In my inputs.conf I have the following:
[monitor:///xys/abc/erer/alarms.log]
disabled = 0
host = myhost
sourcetype = some_alarms
index = abc

Even though I have specified that the time should be taken from the TIME_PREFIX field, it is still using the creation time of the alarms.log file.


skalliger
SplunkTrust

Edit: Ignore my comment, I was confusing things.

Skalli


rmjharris
Path Finder

This is a common mistake:

MAX_TIMESTAMP_LOOKAHEAD is not a count of how far into the event to look for the date; it is the number of characters after the TIME_PREFIX match.

MAX_TIMESTAMP_LOOKAHEAD = 10 would work here.
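A small illustration of that counting rule, using a hypothetical abridged event (in the real data the 0x11f4e attribute sits well past character 128, which is why a lookahead counted from the event start would miss it):

```python
# Hypothetical abridged event for illustration only.
event = '{"attribute": [{"@id": "0x11f4e", "$": "1568205019"}]}'

prefix = '0x11f4e", "$": "'
start = event.index(prefix) + len(prefix)

# MAX_TIMESTAMP_LOOKAHEAD counts characters AFTER the TIME_PREFIX match,
# and the epoch string is exactly 10 characters long.
print(event[start:start + 10])  # 1568205019
```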

0 Karma

skalliger
SplunkTrust

Oh, right. Thanks for the correction. My mistake. I'll change my comment to prevent confusion.
