I am using the Splunk universal forwarder to forward events from the Windows event log to Splunk.
Each event contains JSON data, which shows up in Splunk as the value of the Message field, e.g.:
LogName=CustomLog
SourceName=ECEventLogProvider
EventCode=256
EventType=4
Type=Information
ComputerName=CHECHI12
TaskCategory=Network Events
OpCode=None
RecordNumber=133
Keywords=Classic
Message={
"description" : "SSample text",
"event_id" : "47",
"id" : "22",
"logtype" : "Error",
"msgnum" : "0",
"severity" : "Reserved",
"source" : "Sample source",
"status" : "New",
"system_state" : "S4/S5",
"timestamp" : "00-01-01 00:00:00",
"timestamp_accuracy" : "Approximate"
}
I am thinking of parsing the JSON in the Message field at search time using spath:
"logname=customlog" | spath input=Message | timechart count by event_id
The above search returns duplicate event_id values (e.g. "\"35\",", 35, "\"47\",", 47 etc.) because Splunk's automatic search-time field extraction parses the data once and spath then parses it again.
How can I discard the automatically extracted values and get stats only for the fields extracted by spath in the search?
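For reference, a minimal search-time sketch, assuming the duplicates come from the automatically extracted event_id field: dropping that field before spath re-creates it should leave only the value parsed from the JSON.

"logname=customlog"
| fields - event_id
| spath input=Message
| timechart count by event_id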
but this causes the problem I mentioned above (duplicate events).
I ended up doing the following:

"logname=customlog"
| fields host, SourceName, EventCode, EventType, Type, ComputerName, TaskCategory, OpCode, RecordNumber, Keywords, Message
| spath input=Message output=EventMessage path=Message
| spath input=Message output=Description path=description
| spath input=Message output=event_id path=event_id
| spath input=Message output=timestamp path=timestamp
| spath input=Message output=source path=source
| eval NewTime=strptime(timestamp,"%Y-%m-%d %H:%M:%S")
| eval _time=NewTime
| addinfo
| where _time>=info_min_time AND (_time<=info_max_time OR info_max_time="+Infinity")
| timechart count by event_id
OK, come on back and click Accept on your answer to close the question.
Like this:
index="YouShouldAlwaysSpecifyAnIndex" AND sourcetype="AndSourcetypeToo" AND "logname=customlog"
| spath input=Message
| timechart count(SomeFieldNameThatIsOnlyCreatedBySpathHere) BY event_id
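A concrete variant of the same idea, as a sketch only (json_event_id is a made-up field name, and the index/sourcetype values are still placeholders): giving spath an output= name guarantees the field can only have been created by spath, so the automatically extracted event_id never reaches the chart.

index="YouShouldAlwaysSpecifyAnIndex" AND sourcetype="AndSourcetypeToo" AND "logname=customlog"
| spath input=Message output=json_event_id path=event_id
| timechart count(json_event_id) BY json_event_id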