Getting Data In

If my log file does not have a header, how can I divide my raw data into 2 fields at index-time with the 1st space as the delimiter on each line?

nitesh218ss
Communicator

Hi

20140902191418.351 TrxManagerFactory.CreateTrxManager Done
20140902191418.351 TransactionBaseMgr.Init
20140902191418.352 TransactionBaseMgr.Init Done
20140902191418.352 TransactionBaseMgr.ProcessTransaction
20140902191418.352 TransactionBaseMgr.InitTransaction
20140902191418.353 CardPaymentTrxMgr.IsNewTransaction
20140902191418.353 CardPaymentTrxMgr.IsNewTransaction Done
20140902191418.354 TransactionBaseMgr.GetTransaction
20140902191418.354 DL_TransactionsMgr.GetTransaction
20140902191418.371 DL_TransactionsMgr.GetTransaction Done

That is my log file. There is no header present in it, so I want this raw data divided into 2 fields named time and Msg.
The Msg (2nd field) starts after the first space.

Example:

time                    Msg 
20140902191418.352      TransactionBaseMgr.Init Done
20140902191418.371      DL_TransactionsMgr.GetTransaction Done

I am not able to use space as the delimiter because the 2nd field, Msg, contains spaces in its values. Is it possible to do this at index-time?

0 Karma
1 Solution

nitesh218ss
Communicator

In props.conf you add:

FIELD_NAMES = datetime,Msg
FIELD_DELIMITER = tab

Then Splunk divides your event (line) based on the tab and names the fields datetime and Msg.
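
For reference, a fuller props.conf stanza might look like the sketch below. The sourcetype name (trxlog) and the INDEXED_EXTRACTIONS, TIMESTAMP_FIELDS and TIME_FORMAT lines are illustrative assumptions, not part of the answer above; FIELD_DELIMITER = tab only applies if the file really uses a tab between the two fields.

# sketch only: trxlog is a placeholder sourcetype name
[trxlog]
INDEXED_EXTRACTIONS = tsv
FIELD_NAMES = datetime,Msg
FIELD_DELIMITER = tab
# assumed timestamp handling for values like 20140902191418.351
TIMESTAMP_FIELDS = datetime
TIME_FORMAT = %Y%m%d%H%M%S.%3N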

0 Karma

stephane_cyrill
Builder

Hi, I think this can help you:

index=????? sourcetype=???? | rex field=_raw "^(?P<TIME>[^ ]+)\s+(?P<MESSAGE>.+)" | table TIME MESSAGE
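
If you do not want to type the rex into every search, the same regex can be set up as an automatic search-time extraction in props.conf. A minimal sketch, assuming a sourcetype named yoursourcetype (note this happens at search time, not index time):

# sketch: EXTRACT creates search-time fields for this sourcetype
[yoursourcetype]
EXTRACT-time_msg = ^(?P<TIME>[^ ]+)\s+(?P<MESSAGE>.+)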
0 Karma

ngatchasandra
Builder

Hi,
In inputs.conf, put a stanza like the following (create an inputs.conf in $SPLUNK_HOME/etc/system/local/ if it doesn't exist):

[monitor://........./your.csv]
sourcetype=yoursourcetype

In props.conf, put a stanza like the following (create a props.conf in $SPLUNK_HOME/etc/system/local/ if it doesn't exist):

[yoursourcetype]
FIELD_DELIMITER=\s
TIMESTAMP_FIELDS=time,Msg
HEADER_FIELD_DELIMITER=\s
0 Karma

nitesh218ss
Communicator

FIELD_DELIMITER=\s does not work, because Msg contains spaces, so its value gets split into several separate fields.

Here is one line of the log:
20140902191418.213 CardPaymentServices.Authorize Username and password supplied starting the Customer not Present Request

In this line, 20140902191418.213 is the time, then there is one space, and everything after that is the Msg (CardPaymentServices.Authorize Username and password supplied starting the Customer not Present Request).

0 Karma

ngatchasandra
Builder

Try with

FIELD_DELIMITER=\d+\s
0 Karma

Tanefo
Path Finder

Hi,
Could you please send me your files? I would like to see their initial structure. My mail: tiwa.romuald@yahoo.fr

0 Karma

juvetm
Communicator

What exactly do you want to do? Can you explain it to me in more detail?

0 Karma

nitesh218ss
Communicator

I want, at index time (when I add the data), to set 2 fields: the 1st is timlog and the 2nd is Msg.
If you look at my log data in the question, 20140902191418.351 is the time, then there is one space, and after that comes the Msg (TrxManagerFactory.CreateTrxManager Done).
So after indexing (adding the data) I want each event to come in with the fields timlog and Msg, along with the default fields (_time, host, sourcetype).

0 Karma

juvetm
Communicator

What you ask is impossible to do at index time.
Note that when you add data you are uploading the file, so during that moment you cannot give it a header name, because you are only uploading a file.

0 Karma

nitesh218ss
Communicator

When we reach the Set Sourcetype step after uploading, is there any possibility to configure the sourcetype based on our requirement?

0 Karma

juvetm
Communicator

What you ask me, sir, I am still thinking about; I am not sure it is possible at index time.

0 Karma

nitesh218ss
Communicator

I created regexes for msg and time, but how do I assign the field names at index time?

reg for msg: ^[^ \n]* (?P<msg>(\w+.+|.+))
reg for time: ^(?P<time>[^ ]+)
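
For reference, regexes like these can be wired up as indexed fields through props.conf, transforms.conf and fields.conf. The following is only a sketch under assumed names (mylog for the sourcetype, extract_timlog_msg for the transform, timlog/Msg as in the posts above); indexed fields grow the index, so a search-time extraction is usually preferred:

props.conf:
# sketch: mylog is a placeholder sourcetype name
[mylog]
TRANSFORMS-timlog_msg = extract_timlog_msg

transforms.conf:
# capture everything up to the first space as timlog, the rest as Msg
[extract_timlog_msg]
REGEX = ^([^ ]+)\s(.+)$
FORMAT = timlog::$1 Msg::$2
WRITE_META = true

fields.conf:
# mark both extracted fields as indexed so they can be searched directly
[timlog]
INDEXED = true
[Msg]
INDEXED = true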

0 Karma