Splunk Search

Extract fields from a delimited log file

New Member

Hi,

I am trying to extract some custom fields from a log file which is delimited by ::, and I made the following setup in props.conf and transforms.conf:

props.conf :

[strongmail_failed]
TRANSFORMS-strongf=parse_strongmail_failed

transforms.conf:

[parse_strongmail_failed]
DELIMS = "::"
FIELDS = "Date", "Serial-Number", "mailing-ID", "Database-ID", "Message-ID", "User-ID", "DB-RN", "DB-NAME", "Msg-SN", "Email-Address", "Bounce-Reason", "VSG-Name", "Outbound-IP", "Reciever-IP", "Category"

How can I configure props.conf and transforms.conf to do this, and where should I put these files?

Thanks !!!


New Member

Just one more question :

I can see the settings I configured above in Manager --> Field extractions, but the fields are not available when I use the Search app.
How can I make the newly extracted fields available?

Thanks,


Communicator

Sorry, I missed something.
I saw you use "TRANSFORMS" to extract fields; that is index-time field extraction, so you would need to reindex (clean and index again) your log files. I suggest you change to "REPORT", which is search-time field extraction.

For more information about the difference between REPORT and TRANSFORMS, please refer to props.conf.spec, line 413.
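As a sketch, converting the setup from the question to search-time extraction only changes the props.conf stanza; the [parse_strongmail_failed] stanza in transforms.conf stays as it is:

```
# props.conf -- REPORT instead of TRANSFORMS makes this a
# search-time extraction (no reindexing required)
[strongmail_failed]
REPORT-strongf = parse_strongmail_failed
```

After saving, restarting Splunk (or refreshing the search-time configuration) should make the fields appear in search results.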


New Member

Hi, thanks for the answer, but what do you mean by putting this config on the search head?

Thanks Again !!!



Communicator

Hi,
First, I assume you have already made sure your sourcetype is "strongmail_failed".

Second, you can put those two conf files under $SPLUNK_HOME/etc/system/local if you haven't created your own app. If you have created your own app, say "strongmail", you can put the conf files under $SPLUNK_HOME/etc/apps/strongmail/local/.

And because this is a search-time configuration, you should put the files on the search head.
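To illustrate the two placement options (paths assume a default install; "strongmail" is just the example app name from above):

```
# Option 1: no custom app
$SPLUNK_HOME/etc/system/local/props.conf
$SPLUNK_HOME/etc/system/local/transforms.conf

# Option 2: inside your own app
$SPLUNK_HOME/etc/apps/strongmail/local/props.conf
$SPLUNK_HOME/etc/apps/strongmail/local/transforms.conf
```

Either way, a restart of the search head ($SPLUNK_HOME/bin/splunk restart) is the simplest way to make sure the new configuration is loaded.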
