Getting Data In

Why is my field transform using DELIMS not working?

tkwaller
Builder

Hello

I have a field transform set up that doesn't seem to be working:

transforms.conf

 [coldfusionapplication]
 DELIMS = ","
 FIELDS = "status","message_id","message_delivered_date","message_delivered_time","service","payload"

props.conf

 [cfj:applog]
 REPORT-cfjapplog = coldfusionapplication
 EVAL-app= "Coldfusion"
 DATETIME_CONFIG = CURRENT
 LINE_BREAKER = ([\r\n]+)
 SHOULD_LINEMERGE = false

I have this set up on my Search Head Cluster, but I'm not seeing the fields from the DELIMS. I DO, however, see the calculated field "app" from EVAL-app = "Coldfusion", so I know at least PART of this is working.
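
Since REPORT- extractions are applied at search time, one quick check is whether this configuration actually reached the search head cluster members. A minimal sketch, assuming CLI access to a cluster member and a default $SPLUNK_HOME:

 # Show every file that contributes to the sourcetype and to the transform
 $SPLUNK_HOME/bin/splunk btool props list cfj:applog --debug
 $SPLUNK_HOME/bin/splunk btool transforms list coldfusionapplication --debug

If REPORT-cfjapplog or the [coldfusionapplication] stanza is missing from that output, the configuration never made it to the member.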

Fields Available
host
source
sourcetype
FileContent
StatusCode
app
app_pool
datacenter
date_hour
date_mday
date_minute
date_month
date_second
date_wday
date_year
date_zone
environ
eventtype
fieldList
hidden
index
linecount
locale
name
punct
qa_env
rows
sourceId
splunk_server
status
tag
tag::eventtype
target_host
timeendpos
timestartpos
units

are all the available fields.

Any ideas on what I'm doing incorrectly?
Thanks for the help!


sshelly_splunk
Splunk Employee

Try these (sorry, I changed names, etc.). You can insert TZ into props.conf if you have systems in disparate time zones.
Props.conf:
[cfapplog]
SHOULD_LINEMERGE = False
pulldown_type = 1
REPORT-getfields = cfapp_fields
Transforms.conf:
[cfapp_fields]
DELIMS=","
FIELDS = "status","message_id","message_delivered_date","message_delivered_time","service","payload"
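
If those stanzas are in place on the search head, a quick way to confirm the extraction is to look for the delimited fields directly in search; a minimal sketch, assuming the sourcetype from the original post (cfj:applog):

 sourcetype=cfj:applog
 | head 5
 | table status message_id message_delivered_date message_delivered_time service payload

If the table comes back empty while events exist, the REPORT- transform is not being applied.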


tkwaller
Builder

I've found the error log:
11-29-2016 20:00:44.305 +0000 WARN SearchOperator:kv - Invalid key-value parser, ignoring it, transform_name='coldfusionapplication'

Just not sure what's wrong with it.


tkwaller
Builder

Nope, not working. The transforms don't even appear in Splunk Web (GUI); it's like they don't exist.


sshelly_splunk
Splunk Employee

I just typed this all out, and now it seems to have disappeared, so apologies if it all appears twice:
I used the props and transforms I posted above and it all worked for me. A few things:
1) Check that you have referred to the correct transforms stanza in your props.conf (coldfusionapplication in your original post), or replace the contents of your stanzas in your props and transforms with the contents of mine above. Make sure to rename REPORT-getfields to the correct value.
2) Check that your sourcetype is not defined in multiple locations (mine are in $SPLUNK_HOME/etc/apps/search/local/: props.conf and transforms.conf).
3) You can go to http://localhost:8000/en-US/debug/refresh (if on your laptop; otherwise use your Splunk server name) and click the refresh button to refresh props and transforms without having to restart Splunk (a REST check is sketched after this list).
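
As a complement to the debug/refresh page, you can also confirm over REST that a given search head has actually loaded the transform. A minimal sketch with placeholder host and credentials, assuming the default management port 8089:

 # Returns the stanza if transforms.conf was loaded on this search head
 curl -k -u admin:<password> https://<search_head>:8089/services/data/transforms/extractions/coldfusionapplication

A 404 here means the stanza is not visible to that instance.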


tkwaller
Builder

So:
1. Yes, I referenced them correctly:
props.conf
[cfj:applog]
REPORT-coldfusionapplog = coldfusionapplication
pulldown_type = 1
EVAL-app= "Coldfusion"
DATETIME_CONFIG = CURRENT
LINE_BREAKER = ([\r\n]+)
SHOULD_LINEMERGE = false

transforms.conf
[coldfusionapplication]
DELIMS=","
FIELDS = status,message_id,message_delivered_date,message_delivered_time,service,payload

2. Mine are in $SPLUNK_HOME/etc/apps/SA-coldfusion/local/: props.conf and transforms.conf.

3. I'm building this into my app, so I push the changes from my deployer to the search head cluster (a sketch of that push follows this list). I also refresh as well.

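For reference, the deployer push that distributes this looks roughly like the following; a minimal sketch with placeholder host and credentials, assuming the app is staged under $SPLUNK_HOME/etc/shcluster/apps/ on the deployer:

 # On the deployer: stage the app, then push the bundle to the search head cluster
 cp -r SA-coldfusion $SPLUNK_HOME/etc/shcluster/apps/
 $SPLUNK_HOME/bin/splunk apply shcluster-bundle -target https://<shc_member>:8089 -auth admin:<password>

Note that apps pushed by the deployer are delivered to the members with local/ merged into default/, so check the merged result with btool rather than looking for local/ files on the members.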


sshelly_splunk
Splunk Employee

The FIELDS setting needs to be a quoted string list, as in:
FIELDS = "status","message_id","message_delivered_date","message_delivered_time","service","payload"
I see that difference, and I do not have DATETIME_CONFIG = CURRENT. I would remove that setting anyway, as the docs say this about it:
DATETIME_CONFIG =
* Specifies which file configures the timestamp extractor, which identifies
timestamps from the event text.
* This configuration may also be set to "NONE" to prevent the timestamp
extractor from running or "CURRENT" to assign the current system time to
each event.
* "CURRENT" will set the time of the event to the time that the event was
merged from lines, or worded differently, the time it passed through the
aggregator processor.
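
Applied to the stanza from the original post, only the FIELDS quoting changes; a sketch of the resulting transforms.conf, with DATETIME_CONFIG = CURRENT dropped from props.conf as suggested above:

 [coldfusionapplication]
 DELIMS = ","
 FIELDS = "status","message_id","message_delivered_date","message_delivered_time","service","payload"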


sshelly_splunk
Splunk Employee

Can you paste a bit of source data by any chance? Maybe 2 or 3 lines, with IPs changed?


tkwaller
Builder

"Information","a0-0.1.0.0-4010-2","11/29/16","02:25:16","INTRANET","inside of autoComplete method with string=fort"
"Information","a0-0.2.0.0-4010-2","11/29/16","02:23:42","INTRANET","inside of autoComplete method with string=at&t"
"Information","a0-0.4.0.0-4010-1","11/29/16","02:05:36","INTRANET","inside of autoComplete method with string=oracle"
"Error","a0-0.0.3.0-4010-1","11/29/16","02:05:36","intranet","Exception returned from api call. StatusCode=503 Service Unavailable FileContent=<p>Site is not available since below pool is down :</p> <p>Pool Name: XXXX</p> The specific sequence of files included or processed is: STUFF, line: 358 "

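As a quick sanity check that the six field names line up with events like these, one sample line can be replayed in search and split with rex instead of the DELIMS transform; a minimal sketch (the regex is an assumption that matches six double-quoted, comma-separated values):

 | makeresults
 | eval _raw="\"Information\",\"a0-0.1.0.0-4010-2\",\"11/29/16\",\"02:25:16\",\"INTRANET\",\"inside of autoComplete method with string=fort\""
 | rex field=_raw "^\"(?<status>[^\"]*)\",\"(?<message_id>[^\"]*)\",\"(?<message_delivered_date>[^\"]*)\",\"(?<message_delivered_time>[^\"]*)\",\"(?<service>[^\"]*)\",\"(?<payload>[^\"]*)\""
 | table status message_id message_delivered_date message_delivered_time service payload

If this table populates but the DELIMS-based extraction still does not, the problem is in how the props and transforms are being picked up rather than in the field names.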