Please save my forehead from slamming into my keyboard over this one, because it's eluding me and following the docs at this link isn't getting me anywhere... http://docs.splunk.com/Documentation/Splunk/6.1.2/Knowledge/Addfieldsfromexternaldatasources
Here's my setup:
Snippet of $SPLUNK_HOME/etc/apps/search/local/transforms.conf
[extra.auth_status]
filename = auth_data.csv
Snippet of $SPLUNK_HOME/etc/apps/search/local/props.conf
[inv-json]
LOOKUP-auth = extra.auth_status extra.user_id OUTPUT Create_Date, Cohort, Twitter_Auth, FB_Auth, YT_Auth
Snippet of the auth_data.csv file located in $SPLUNK_HOME/etc/apps/search/lookups
extra.user_id,Create_Date,Cohort,Twitter_Auth,FB_Auth,YT_Auth
XXXXXXXXXXXXXXX,0,DEP,0,0,0
YYYYYYYYYYYYYYY,0,RES,0,0,0
ZZZZZZZZZZZZZZZ,0,INT,0,0,0
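(For what it's worth, the lookup definition should also be queryable straight from the search bar, e.g.
| inputlookup extra.auth_status | head 5
which should show whether Splunk can read the file and see the column names at all.)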
The CSV file is also readable by everyone so that should not be an issue...
-rw-rw-r-- 1 ubuntu ubuntu 69593 Apr 22 17:34 auth_data.csv
What I'm trying to do is add the lookup fields to events that have a matching extra.user_id value. So, when a log event comes in with XXXXXXXXXXXXXXX as its extra.user_id, the values of Create_Date, Cohort, Twitter_Auth, FB_Auth, and YT_Auth from the lookup table should get added to the event.
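If I understand the docs right, that automatic lookup should behave the same as running it explicitly, something like this (assuming inv-json is the sourcetype, per the props.conf stanza above):
sourcetype=inv-json | lookup extra.auth_status extra.user_id OUTPUT Create_Date, Cohort, Twitter_Auth, FB_Auth, YT_Auth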
Like I said, I've got two other lookups that work fine; the only difference is that those aren't adding multiple fields. According to the docs, adding multiple fields is supported. I'm sure I'm overlooking something simple, but it's eluding me.
I already checked with strings that the csv file contains no hidden funky characters. It looks clean.
Thoughts?
The answer is: the Create_Date and Cohort fields in the lookup table had spaces in their values, which Splunk didn't like. Wrapping those values in quotes made Splunk happy. And me, but just a little.
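For anyone who hits the same thing: using the sample row from the comments below, the quoted version of the line looks like this.
ZZZZZZZZZZZZZZZ,"2014-04-19 23:31:55","Free App Default",0,0,0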
Derf, solved it for now. If I include the single tick in the Create_Date>'2013-12-31 statement, it works as intended and I can now specify ranges.
Now on to the next part, which will likely be a separate question, so I'm going to mark this one as answered. Thanks for the help, Martin!
Well, I figured out how to do it with single quotes, but I'm struggling to get the values out with double quotes thanks to Python. Single quotes work and Splunk does not barf, but it does not see Create_Date as a date field, so I can't run a search with something like Create_Date>2013-12-31, for example 😕
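(One possible way around the string comparison, assuming the timestamp format shown in the example below, would be to convert the value to epoch time at search time, e.g.
... | eval create_epoch=strptime(Create_Date, "%Y-%m-%d %H:%M:%S") | where create_epoch>strptime("2013-12-31", "%Y-%m-%d")
but it would be nicer if the field just behaved as a date.)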
I'd wrap that in "double quotes".
Yeah, I think I've found the problem and am working on proving it out. For some records in the table, the Create_Date and Cohort fields have a space in them, which I think is making Splunk barf. Seems strange to me since it's between the commas.
Assuming that this is the issue, what does the hive suggest is the best way to deal with it?
Example:
ZZZZZZZZZZZZZZZ,2014-04-19 23:31:55,Free App Default,0,0,0
Copying over those settings works fine for me.