Splunk Search

Suggestions on Minimum message type (json)?

metylkinandrey
Communicator

Good afternoon!
We have a problem in our workflow: a part of the customer's system, which was not developed by us, cannot populate the following fields of the business messages that are sent to Splunk:
"event": "Test1",
"sourcetype": "testsystem-script707",
"fields":
At the moment this system can generate the content intended for the "fields" field ("fields": {content_data}), but it does not wrap it in that field itself; the payload looks like this:

--data-raw '{
"messageId": "<ED280816-E404-444A-A2D9-FFD2D171F928>",
"srcMsgId": "<rwfsdfsfqwe121432gsgsfgdg>",
"correlationMsgId": "<rwfsdfsfqwe135432gsgsfgdg>",
"baseSystemId": "<SDS-IN>",
"routeInstanceId": "<TPKSABS-SMEV>",
"routepointID": "<1.SABS-GIS.TO.KBR.SEND>",
"eventTime": "<1985-04-12T23:20:50>",
"messageType": "<ED123>",
"GISGMPResponseID": "<PS000BA780816-E404-444A-A2D9-FFD2D1712345>",
"GISGMPRequestID": "<PS000BA780816-E404-444A-A2D9-FFD2D1712344>",
"tid": "<ED280816-E404-444A-A2D9-FFD2D171F900>",
"PacketGISGMPId": "<7642341379_20220512_123456789>",
"result.code": "<400>",
"result.desc": "<Ошибка: абвгд>"
}'

Unfortunately, this subsystem is part of the overall system, and while it is being finished we still need to set up monitoring for it.

Tell me, can we configure Splunk to accept and process such minimal messages?
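For context, here is a sketch of what the request would look like if the subsystem could build the envelope itself: the three keys mentioned above wrap the content it already produces. The host and token are placeholders, and only a few of the business fields are shown.

# Sketch only: the envelope the HEC event endpoint normally expects,
# using the field names from the question; host and token are placeholders.
curl -s "http://<splunk-host>:8088/services/collector/event" \
  -H "Authorization: Splunk <hec-token>" \
  -H "Content-Type: application/json" \
  --data-raw '{
  "event": "Test1",
  "sourcetype": "testsystem-script707",
  "fields": {
    "messageId": "<ED280816-E404-444A-A2D9-FFD2D171F928>",
    "messageType": "<ED123>",
    "result.code": "<400>"
  }
}'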

 


ITWhisperer
SplunkTrust
| rex "--data-raw '(?<fields>\{.*\})'"
| spath input=fields
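This assumes the whole curl command line ends up in the raw event: rex captures the JSON object that follows --data-raw into a field named fields, and spath then parses that JSON so each key (messageId, srcMsgId, result.code, and so on) becomes a searchable field.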

yuanliu
SplunkTrust

Small correction:

| rex "--data-raw '(?<fields>(.*\s)*})"
| spath input=fields
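The correction matters when the payload spans multiple lines: in rex, the dot does not match line breaks by default, so \{.*\} only captures a single-line body, while (.*\s)*} also consumes the newlines up to the closing brace.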

metylkinandrey
Communicator

It seems that this is not quite what I need. :) Tell me, can I send a message in this format:

curl --location --request POST 'http://170.25.25.25:8088/services/collector/event' --header 'Authorization: Splunk ееееее-еееееееее-ееееее-e6fc' --header 'Content-Type: text/plain' --data-raw '{
"messageId": "<ED280816-E404-444A-A2D9-FFD2D171F928>",
"srcMsgId": "<rwfsdfsfqwe121432gsgsfgdg>",
"correlationMsgId": "<rwfsdfsfqwe135432gsgsfgdg>",
"baseSystemId": "<SDS-IN>",
"routeInstanceId": "<TPKSABS-SMEV>",
"routepointID": "<1.SABS-GIS.TO.KBR.SEND>",
"eventTime": "<1985-04-12T23:20:50>",
"messageType": "<ED123>",
"GISGMPResponseID": "<PS000BA780816-E404-444A-A2D9-FFD2D1712345>",
"GISGMPRequestID": "<PS000BA780816-E404-444A-A2D9-FFD2D1712344>",
"tid": "<ED280816-E404-444A-A2D9-FFD2D171F900>",
"PacketGISGMPId": "<7642341379_20220512_123456789>",
"result.code": "<400>",
"result.desc": "<Ошибка: абвгд>"
}'

And not get the error {"text":"No data","code":5}?
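One likely cause is that the /services/collector/event endpoint expects the business content to be wrapped in an "event" key; a body without that key is not accepted as event data. A sketch of a workaround while the subsystem is being finished, using the same endpoint with a placeholder token, is to wrap the unmodified message on the sending side:

# Sketch only: wrap the existing business message under "event";
# the token is a placeholder.
curl --location --request POST 'http://170.25.25.25:8088/services/collector/event' \
  --header 'Authorization: Splunk <hec-token>' \
  --header 'Content-Type: application/json' \
  --data-raw '{
  "sourcetype": "testsystem-script707",
  "event": {
    "messageId": "<ED280816-E404-444A-A2D9-FFD2D171F928>",
    "srcMsgId": "<rwfsdfsfqwe121432gsgsfgdg>",
    "result.code": "<400>",
    "result.desc": "<Ошибка: абвгд>"
  }
}'

With the message under "event", the indexed _raw is just the JSON object, so spath can pull out messageId, result.code and the rest at search time without the rex step.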


metylkinandrey
Communicator

As I understand it, these are the fields that get extracted when searching the messages?
But tell me, how can I avoid the error
{"text":"No data","code":5}

when sending a message like this:

curl -s http://monitor.prod.dev.pis.cbr.ru:8088/services/collector/event -H "Authorization: Splunk 02a4c778-93c2-486c-963c-2711dfc1cda7" -H "Content-Type: application/json" --data-raw '{
"messageId": "<ED280816-E404-444A-A2D9-FFD2D171F928>",
"srcMsgId": "<rwfsdfsfqwe121432gsgsfgdg>",
"correlationMsgId": "<rwfsdfsfqwe135432gsgsfgdg>",
"baseSystemId": "<SDS-IN>",
"routeInstanceId": "<TPKSABS-SMEV>",
"routepointID": "<1.SABS-GIS.TO.KBR.SEND>",
"eventTime": "<1985-04-12T23:20:50>",
"messageType": "<ED123>",
"GISGMPResponseID": "<PS000BA780816-E404-444A-A2D9-FFD2D1712345>",
"GISGMPRequestID": "<PS000BA780816-E404-444A-A2D9-FFD2D1712344>",
"tid": "<ED280816-E404-444A-A2D9-FFD2D171F900>",
"PacketGISGMPId": "<7642341379_20220512_123456789>",
"result.code": "<400>",
"result.desc": "<Ошибка: абвгд>"
}'
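An alternative, if the payload cannot be changed at all, might be the raw HEC endpoint, which accepts an arbitrary body; sourcetype can be passed as a query parameter. This is only a sketch, assuming a channel identifier is supplied (the GUID and token below are placeholders you generate yourself), since the raw endpoint expects one:

# Sketch only: send the unmodified message to the raw endpoint instead;
# channel GUID and token are placeholders, sourcetype is set via the URL.
curl -s "http://monitor.prod.dev.pis.cbr.ru:8088/services/collector/raw?channel=<some-guid>&sourcetype=testsystem-script707" \
  -H "Authorization: Splunk <hec-token>" \
  -H "Content-Type: application/json" \
  --data-raw '{
  "messageId": "<ED280816-E404-444A-A2D9-FFD2D171F928>",
  "srcMsgId": "<rwfsdfsfqwe121432gsgsfgdg>",
  "result.code": "<400>",
  "result.desc": "<Ошибка: абвгд>"
}'

Note that the raw endpoint applies the sourcetype's normal line-breaking rules, so a multi-line JSON body may need event-breaking settings on that sourcetype.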
