Getting Data In

Is it possible to process a raw log file and produce the output in a tabular structure? Please help me out. My logs look like the sample below.

Sayan_nag
New Member

Hi, I am new to Splunk. A sample of my raw log file is given below:

2011-01-01T00:00:50.665998Z : Starting eth11
2011-01-01T00:00:50.689468Z : <<<<< Iniializing IM XN >>>>>>>>
2011-01-01T00:00:50.714001Z : <<<<<<<XN_IM_Init - Registered the Events >>>>>>>>
2001-01-01T00:00:50.714407Z : XN IM init success
2001-01-01T00:00:50.718647Z : Started eth1
2001-01-01T00:00:52.219925Z : Found device
2001-01-01T00:00:52.223080Z : gateway device serial number is uty200866625
2001-01-01T00:00:52.223705Z : gateway device receiver id is u0115613936
2001-01-01T00:00:52.380948Z : Host mac address is p0:n5:02:1d:k8:56
2001-01-01T00:00:52.381392Z : isgateway is true
2001-01-01T00:00:52.391026Z : Item U200866625 not found in the list
2001-01-01T00:00:52.395788Z : Inserted device uty200866625 in the list
2001-01-01T00:00:52.397047Z : Calling gateway script /lib/rdk/gwSetup.sh "169.254.197.230" "search hsd1.ga.comcast.net.;nameserver 75.75.75.75;nameserver 75.75.76.76;" "73.207.244.96 PaceXG1v3;73.207.244.96 PaceXG1v3;" "eth1" "50" &
2001-01-01T00:00:52.413514Z : Output is
{
"xme":
[
{
"a":"uty200866625",
"b":"yes",
"c":"1.5.9.3",
"d":"a00:b5:c62:1d:e8:f56",
"e":"US/Eastern",
"f":"-18000",
"g":"3600",
"h":"0",
"i":"yes",
"j":"http://fakeaddress",
"k":"true",
"l":"ws://1.2.3.4:5",
"m":"http://nextfakeaddress",
"n":"search hsd1.name2.;nameserver 5.5.5.5;nameserver 5.5.6.6;",
"o":"7.2.2.9 Pa;7.2.2.9 Pa;",
"p":"Id1:1030;Id2:19004;Id3:08;Id4:1028801",
"q":"09uop0115613936"
}
]
}

2011-01-01T00:00:52.414854Z : <<<<< Iniializing IM XN >>>>>>>>
2011-01-01T00:00:54.048400Z : <<<<<<<IM XN API Call received>>>>>>>>
2011-01-01T00:00:54.048791Z : <<<<<<<IM XN Call Reached API>>>>>>>>
2011-01-01T00:00:54.049162Z : Output string is

2001-01-01T00:00:52.413514Z : Output is
{
"xme":
[
{
"a":"uty200866600",
"b":"yes",
"c":"1.2.1.2",
"d":"an0:cb5:s62:1de:ef8:gh56",
"e":"US/Eastern",
"f":"-18000000",
"g":"3600000",
"h":"0",
"i":"yes",
"j":"http://fakeaddress",
"k":"true",
"l":"ws://1.2.3.4:5",
"m":"http://nextfakeaddress",
"n":"search hsd1.name2.;nameserver 5.5.5.5;nameserver 5.5.6.6;",
"o":"7.2.2.9 Pa;7.2.2.9 Pa;",
"p":"Id1:1030;Id2:19004;Id3:08;Id4:1028801",
"q":"09uop0115613936"
}
]
}

My output table should look like:

timestamp | device_serial_number | receiver_id | mac_address | a | b | c | d | e | f | g | h | i | j | m | q
2001-01-01 00:00:52 | uty200866625 | u0115613936 | p0:n5:02:1d:k8:56 | uty200866625 | yes | 1.5.9.3 | a00:b5:c62:1d:e8:f56 | US/Eastern | -18000 | 3600 | 0 | yes | http://fakeaddress | http://nextfakeaddress | 09uop0115613936

In the log file, everything up to the end of the JSON data should be treated as one output row (one record), so my first row of output would be as shown above. My question is: can we achieve this using Splunk?
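
For reference, one way to make each timestamped line, together with any JSON block that follows it, land in Splunk as a single event is an index-time props.conf stanza along these lines. This is only a sketch: the sourcetype name gateway:log is a placeholder, and the break and timestamp settings assume the ISO-8601 prefix shown in the sample above.

[gateway:log]
SHOULD_LINEMERGE = true
# Start a new event only at lines that begin with an ISO-8601 timestamp,
# so the JSON payload stays attached to the "Output is" line above it.
BREAK_ONLY_BEFORE = ^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d+Z
TIME_PREFIX = ^
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%6NZ
MAX_TIMESTAMP_LOOKAHEAD = 32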


woodcock
Esteemed Legend

This should get you most of the way there:

... 
| extract pairdelim=",[\r\n]" kvdelim=":" 
| rex "gateway device serial number is (?<device_serial_number>.*)" 
| rex "gateway device receiver id is (?<receiver_id>.*)" 
| rex "Host mac address is (?<mac_address>.*)" 
| transaction maxpause=5s 
| fieldformat _time=strftime(_time, "%Y-%m-%d %H:%M:%S") 
| table _time device_serial_number receiver_id mac_address a b c d e f g h i j m q
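
If the generic key/value extract does not pull the JSON keys out cleanly, a variation on the same idea is to capture the JSON block with rex and parse it with spath. This is only a sketch built from the sample above; the json_payload field name, the wildcard rename, and the 5-second transaction pause are assumptions, not something tested against real data:

... 
| rex "gateway device serial number is (?<device_serial_number>\S+)" 
| rex "gateway device receiver id is (?<receiver_id>\S+)" 
| rex "Host mac address is (?<mac_address>\S+)" 
| rex "(?s)Output is\s+(?<json_payload>\{.*\})" 
| spath input=json_payload 
| rename "xme{}.*" AS * 
| transaction maxpause=5s 
| fieldformat _time=strftime(_time, "%Y-%m-%d %H:%M:%S") 
| table _time device_serial_number receiver_id mac_address a b c d e f g h i j m q

The transaction groups the serial-number, receiver-id, MAC-address, and JSON events that belong to the same sequence into one record, which table then lays out as the requested row.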