Windows Event Log Format and JSON

davidts
Path Finder

Hi,

I have developers who are trying to create a framework for Windows event error handling that can be used by any in-house developed application. They have decided that all errors will be logged to a custom Windows event log; however, in the Message field of each event they have decided to use JSON to describe the event's details.

07/22/2013 02:26:45 PM
LogName=aaaa
SourceName=bbbb
EventCode=1000
EventType=2
Type=Error
ComputerName=cccc
User=dddd
Sid=eeee
SidType=1
TaskCategory=None
OpCode=Info
RecordNumber=66
Keywords=Classic
Message={"EventDateTime" : "2013-07-22T04:26:43Z",
"Message" : "A turbo integrator error has occurred",
"User" : "ffff",
"AdminHost" : "gggg",
"Server" : "hhhh",
"DataSource" : "iiii",
"DataSourceType" : "ODBC",
"ProcessStartDateTime" : "2013-07-22T14:26:42",
"LogFileURL" : "\\jjjjdev\Data$\kkkk_dev\Logging\ProcessError_20130722042643.log"}

How do I extract the Message value and parse it as JSON, or write the whole event as XML? There is also the issue of working within the Windows event log schema, which is not flexible enough to provide custom fields.

Thanks.

1 Solution

michael_sanchez
Path Finder

Hi,

Did you try the spath command on the Message field? http://docs.splunk.com/Documentation/Splunk/5.0.3/SearchReference/Spath

It should solve your problem.
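For example, a sketch against the sample event above (the sourcetype and field names are taken from the poster's data, not tested against a live instance):

```
sourcetype="WinEventLog:aaaa" EventCode=1000
| spath input=Message
| table EventDateTime, User, Server, DataSource, LogFileURL
```

spath reads the JSON in the Message field at search time and expands each key into its own field, so no index-time changes are needed.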


michael_sanchez
Path Finder

Cool!
Creating new fields at index time is generally not recommended; the gain is real in only a few cases. The reasons are well explained here: http://docs.splunk.com/Documentation/Splunk/5.0.3/Indexer/Indextimeversussearchtime
If you really want to create custom indexed fields, read this: http://docs.splunk.com/Documentation/Splunk/5.0.3/Data/Configureindex-timefieldextraction
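If you do go the index-time route despite the caveats, the second link above describes pairing a transforms.conf stanza with props.conf and fields.conf. A minimal sketch that pulls one key out of the Message JSON (the stanza names, sourcetype, and regex here are illustrative assumptions, not a tested configuration):

```
# transforms.conf -- capture the "Server" value from the Message JSON at index time
[msg_server]
REGEX = "Server"\s*:\s*"([^"]+)"
FORMAT = msg_server::$1
WRITE_META = true

# props.conf -- apply the transform to your custom sourcetype
[my_custom_sourcetype]
TRANSFORMS-msgfields = msg_server

# fields.conf -- mark the new field as indexed so it is searchable
[msg_server]
INDEXED = true
```

You would need one transform per JSON key you want indexed, which is part of why search-time extraction with spath is usually the better trade-off.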


angshul
Path Finder

Hi Michael,

I have similar data, with a JSON Message field in a Windows event. I am using spath to search the Message JSON, but the problem is that Splunk by default parses the Message field as key-value pairs, so I end up with duplicate values. E.g.:
Message={
"description" : "Sample text",
"event_id" : "47",
"id" : "22",
"logtype" : "Error",
"msgnum" : "0",
"severity" : "Reserved",
"source" : "Sample source",
"status" : "New",
"system_state" : "S4/S5",
"timestamp" : "00-01-01 00:00:00",
"timestamp_accuracy" : "Approximate"
}
For the above Message field, Splunk has already parsed event_id with the value "\"47\",". When I use spath and count by event_id, Splunk adds 47 as well, so I end up with duplicate event_ids for each event_id: (1, "1",), (2, "2",), etc.
Is there a way to explicitly turn off Splunk's automatic parsing so that I can parse Message in the search (| spath input=Message | stats count by event_id)?
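One way to do that, assuming the events arrive under their own sourcetype, is to disable automatic key-value extraction at search time in props.conf (a sketch; the sourcetype name is an assumption):

```
# props.conf (search time) -- turn off automatic KV extraction for this sourcetype
[my_wineventlog_sourcetype]
KV_MODE = none
```

With KV_MODE = none, only the fields that spath pulls out of Message remain. Alternatively, purely within the search, you could drop the automatically extracted field before re-extracting it, e.g. | fields - event_id | spath input=Message | stats count by event_id.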


davidts
Path Finder

Hi Michael. Thanks, that works. It has extracted fields from the Message field using ":" as the delimiter. Do you think I could do the same thing at index time using props.conf? Is it worth doing at index time?
