Hi,
I am trying to upload an Elastic log file to Splunk.
This is an example of one entry in a long log:
{"_index":"index-00","_type":"_doc","_id":"TyC0RIkBQC0jFzdXd-XG","_score":1,"_source":"{"something_long":"long json"}\n","stream":"stderr","docker":{"container_id":"d48887cdb80442f483a876b9f2cd351ae02a8712ec20960a9dc66559b8ccce87"},"kubernetes":{"container_name":"container","namespace_name":"namespace","pod_name":"service-576c4bcccf-75gzq","container_image":"art.com:6500/3rdparties/something/something-agent:1.6.0","container_image_id":"docker-pullable://art.com:6500/3rdparties/something/something-agent@sha256:02b855e32321c55ffb1b8fefc68b3beb6","pod_id":"3c90db56-3013a73e5","host":"worker-3","labels":{"app":"image-service","pod-template-hash":"576c4bcccf","role":"image-ervice"}},"level":"info","ts":1689074778.913063,"caller":"peermgr/peer_mgr.go:157","msg":"Not enough connected peers","connected":0,"required":1,"@timestamp":"2023-07-11T11:26:19.133326179+00:00"}}
As you can see, the timestamp is at the end, so I have set up my props.conf as follows:
[elastic_logs]
DATETIME_CONFIG =
INDEXED_EXTRACTIONS = json
LINE_BREAKER = ([\r\n]+)
NO_BINARY_CHECK = true
category = Custom
description = make sure timestamp is taken
pulldown_type = 1
TIME_PREFIX = "@timestamp":\s*"
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%6N%z
MAX_TIMESTAMP_LOOKAHEAD = 1000
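As a side note, you can sanity-check the TIME_PREFIX regex against the sample event outside Splunk. A minimal Python sketch (the event string below is an abbreviated stand-in for the full log line, and Python's re is only an approximation of Splunk's regex engine):

```python
import re

# Abbreviated stand-in for the full sample event above.
event = ('{"_index":"index-00","level":"info","msg":"Not enough connected peers",'
         '"@timestamp":"2023-07-11T11:26:19.133326179+00:00"}}')

# The configured TIME_PREFIX from props.conf.
m = re.search(r'"@timestamp":\s*"', event)
print(m is not None)        # True: the prefix regex does match the event
print(event[m.end():])      # what Splunk would try to parse as the timestamp
```

If the prefix matches here, the timestamp location itself is probably not the problem.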
I can see the timestamp in the Splunk entries, but that is all I can see now; all the other fields are no longer displayed.
What am I doing wrong?
Well, the time comes in OK, so it obviously found the correct timestamp. Without the configuration I get some of the fields from the JSON but not the timestamp; with the configuration I only get the timestamp.
Of course, if I move the timestamp to the beginning, then I get the correct mappings... but I don't want to do that.
Perhaps you could try changing the line breaking? Try something like this:
LINE_BREAKER = timestamp\":\"[^\"]+\"}}([\r\n]+)
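To see how that breaker would split a stream of events, here is a rough Python sketch of LINE_BREAKER semantics. Splunk breaks events at the first capture group, so text matched before the group stays with the preceding event; this is an approximation for experimenting with the regex, not Splunk's actual implementation:

```python
import re

# Proposed LINE_BREAKER; Splunk treats the first capture group as the break.
BREAKER = re.compile(r'timestamp":"[^"]+"}}([\r\n]+)')

def split_events(buf):
    """Rough emulation of LINE_BREAKER: event ends where the group starts,
    the next event starts after the group ends."""
    events, pos = [], 0
    for m in BREAKER.finditer(buf):
        events.append(buf[pos:m.start(1)])
        pos = m.end(1)
    if pos < len(buf):
        events.append(buf[pos:])
    return events

# Two simplified events in one buffer (stand-ins for the real log lines).
raw = ('{"msg":"a","@timestamp":"2023-07-11T11:26:19.133326179+00:00"}}\n'
       '{"msg":"b","@timestamp":"2023-07-11T11:26:20.000000000+00:00"}}')

for e in split_events(raw):
    print(e)
```

If the breaker only fires at the end of a complete event, each JSON object should come out intact.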
This might be irrelevant, but you appear to have 9 decimal places in your timestamp, not 6 (%6N), and your timezone variable looks like it should be "%:z" rather than just "%z". Your sample JSON is also not valid (although this could just be a copy/paste/anonymisation artefact). Also, are you sure 1000 is enough of a lookahead?
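A quick check of the sample timestamp bears this out. Using Python's strptime as a stand-in (Splunk's parser differs in the fractional-seconds directive, %N vs Python's %f, which only accepts up to 6 digits):

```python
from datetime import datetime

# The timestamp copied from the sample event.
ts = "2023-07-11T11:26:19.133326179+00:00"

# Count the fractional-second digits.
frac = ts.split(".")[1].split("+")[0]
print(len(frac))  # 9 digits, so Splunk's TIME_FORMAT would want %9N, not %6N

# Trim to 6 digits (microseconds) so Python's %f can confirm the rest of
# the format string parses, including the "+00:00" offset:
trimmed = ts[:26] + ts[29:]  # "2023-07-11T11:26:19.133326+00:00"
dt = datetime.strptime(trimmed, "%Y-%m-%dT%H:%M:%S.%f%z")
print(dt.isoformat())  # 2023-07-11T11:26:19.133326+00:00
```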