Splunk Search

What is the correct time format for "raw" messages?

metylkinandrey
Communicator

Good afternoon!

I'm noticing that the time format in the messages I send to /services/collector/raw isn't being parsed, or, worse, the field isn't displayed in Splunk at all.
My field is: "eventTime": "2022-10-13T18:08:30",
Please tell me the correct format.


ITWhisperer
SplunkTrust
%Y-%m-%dT%T

Time format variables are described in the Splunk documentation on date and time format variables.
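
If the timestamp still isn't recognised at index time, it can usually be configured per sourcetype in props.conf on the Splunk instance receiving the HEC data. A minimal sketch, assuming a hypothetical sourcetype name rabis_heartbeat assigned to the HEC token:

# props.conf on the indexer or heavy forwarder handling the HEC input
[rabis_heartbeat]
# Regex for the text immediately before the timestamp, wherever the field sits in the event
TIME_PREFIX = "eventTime":\s*"
# Matches values like 2022-10-13T18:08:30
TIME_FORMAT = %Y-%m-%dT%H:%M:%S
# 19 characters is the length of the timestamp string
MAX_TIMESTAMP_LOOKAHEAD = 19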


metylkinandrey
Communicator

Yes, I understand how to process it, but what if the field is not displayed at all?


ITWhisperer
SplunkTrust

I am not sure I understand - are you saying the field does not appear in your events?

Please can you share some anonymised _raw events, some where it is working and some where it is not?

Please use a code block using the </> format option.


metylkinandrey
Communicator

"Please use a code block using the </> format option."

Where is the manual on how to properly mark up code with tags?


metylkinandrey
Communicator

Yes, you got it right.
I send a message like this:

'{
"messageId": "ED280816-E404-444A-A2D9-FFD2D171F323",
"messageType": "RABIS-HeartBeat",
"eventTime": "2022-10-13T18:08:00",
}'

But I only see the first two fields, as I indicated in the screenshot.
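
One observation, not something confirmed in the thread: the payload above has a trailing comma after the eventTime value, which makes it invalid JSON, and invalid JSON can prevent Splunk's automatic JSON field extraction from working. A corrected payload would look like this:

{
"messageId": "ED280816-E404-444A-A2D9-FFD2D171F323",
"messageType": "RABIS-HeartBeat",
"eventTime": "2022-10-13T18:08:00"
}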


ITWhisperer
SplunkTrust

How are you extracting the fields?


metylkinandrey
Communicator

Yesterday I tested a lot and realized that this message format only works every other time:

curl --location --request POST 'http://10.10.10.10:8088/services/collector/raw' --header 'Authorization: Splunk a2-a2-a2' --header 'Content-Type: text/plain' --data-raw '{
"messageId": "ED280816-E404-444A-A2D9-FFD2D171F111",
"messageType": "RABIS-HeartBeat",
"eventTime": "1985-04-12T23:21:15"
}'

I just noticed that the correct messages have 23 spaces and the problematic ones have 22. That's not the point, though; I simply copy the correct messages, so for this test we can assume I've figured that part out.

The problem remains with messages where the "eventTime" field (for example "eventTime": "1985-04-12T23:21:15") appears in the middle. I have no guarantee that it will be any different in production.
Here is an example:
curl --location --request POST 'http://10.10.10.10:8088/services/collector/raw' --header 'Authorization: Splunk a24-a24-a24-a24' --header 'Content-Type: text/plain' --data-raw '{
"messageId": "ED280816-E404-444A-A2D9-FFD2D171F136",
"eventTime": "2022-11-07T17:06:15",
"messageType": "RABIS-HeartBeat"
}'

In this case, I can't find the messages in the Splunk index at all, even though the bash console shows they were sent successfully.
Splunk doesn't like our format. This is the format it seems to like: 2022-11-0717:06:15
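
For what it's worth, one way to sidestep timestamp recognition entirely (a sketch, not something from the thread) is to post to the /services/collector/event endpoint instead and set the event time explicitly as epoch seconds in the HEC envelope. The host, token, and field values below are the placeholder ones already used above; 1667840775 is the epoch equivalent of 2022-11-07T17:06:15, assuming UTC:

curl --location --request POST 'http://10.10.10.10:8088/services/collector/event' --header 'Authorization: Splunk a24-a24-a24-a24' --header 'Content-Type: application/json' --data-raw '{
"time": 1667840775,
"event": {
"messageId": "ED280816-E404-444A-A2D9-FFD2D171F136",
"eventTime": "2022-11-07T17:06:15",
"messageType": "RABIS-HeartBeat"
}
}'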


metylkinandrey
Communicator

In this case, there is no way, since the field isn't there anyway.
As an alternative for the remaining two fields, I use this query:

index="external_system" messageType="RABIS-HeartBeat"
| eval timeValue='eventTime'
| eval time=strptime(timeValue,"%Y-%m-%dT%H:%M:%S.")
| sort -eventTime
| eval timeValue='eventTime'
| eval time=strptime(timeValue,"%Y-%m-%dT%H:%M:%S.")
| sort -eventTime
| streamstats values(time) current=f window=1 as STERAM_RESULT global=false by messageType
| eval diff=STERAM_RESULT-time
| stats list(eventTime) as EventTime list(messageType) as MessageType list(messageId) as MessageId by messageType
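
If eventTime isn't auto-extracted at search time, a possible workaround (a sketch, assuming the raw event body is the JSON shown earlier) is to extract it explicitly with spath before converting it:

index="external_system" messageType="RABIS-HeartBeat"
| spath input=_raw path=eventTime output=eventTime
| eval time=strptime(eventTime, "%Y-%m-%dT%H:%M:%S")
| table _time eventTime messageId messageType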
