Getting Data In

How do I ignore and stop indexing of timestamps from CSV events (sent from a forwarder)?

Contributor

Hi,

I have been stuck on this for almost three days now. I am unable to stop Splunk from indexing the timestamp from the events. But when I set

DATETIME_CONFIG = NONE or DATETIME_CONFIG = CURRENT, I am unable to see the fields of the CSV file, even though I explicitly specified DELIMS = "," and FIELD_NAMES = "field1","field2","field3".

Below are the details of my configuration and a sample event. (The commented-out options are ones I have tested, which still do not work.)

This is my props.conf (http://docs.splunk.com/Documentation/Splunk/6.3.1511/Admin/Propsconf)

[custom_csv]
DATETIME_CONFIG = NONE
MAX_TIMESTAMP_LOOKAHEAD = 0
SHOULD_LINEMERGE = False
#pulldown_type = true
#INDEXED_EXTRACTIONS = csv
#FIELD_DELIMITER=,
#HEADER_FIELD_DELIMITER=,
#KV_MODE = none
#category = Structured
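As a side note, DELIMS and a field-name list are transforms.conf settings rather than props.conf ones. If INDEXED_EXTRACTIONS stays disabled, search-time CSV extraction can be wired up roughly along these lines (the stanza name custom_csv_fields is hypothetical, and the FIELDS list is abbreviated):

```
# props.conf
[custom_csv]
DATETIME_CONFIG = CURRENT
SHOULD_LINEMERGE = false
REPORT-csv = custom_csv_fields

# transforms.conf
[custom_csv_fields]
DELIMS = ","
FIELDS = "User ID","First Name","Last Name","Account Enabled"
```

With this layout the fields are extracted at search time, so the indexers (or heavy forwarders) need the files, not the universal forwarder.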

Sample events:

User ID,First Name,Last Name,Account Enabled,User Locked,Serial Number,Token Type,Token Lost,Token Expiration Date,PIN Type,Token Enabled,Date Last Logged In,Days Since Last Log In

xy111111,Firstname,lastname,Yes,FALSE,xxxxx,myID 200,FALSE,9/30/2016 4:00,code,Yes,11/28/2015 9:13,0
xz000000,first Name,last Name,Yes,FALSE,xxxxxx,myID 700,FALSE,10/31/2016 4:00,code,Yes,7/4/2014 1:37,513
yz222222,firstname,Last Name,Yes,FALSE,xxxxxx,myID 300,FALSE,5/31/2019 4:00,code,Yes,9/9/2014 8:34,445

The main problem is caused by the Token Expiration Date field, which lies in the future, and its 4:00 component is taken as the time for the events.

Can anyone shed some light on whether I am missing something? Or is it a bug in 6.3.1? We are running the latest version.

Thanks,


Esteemed Legend

The problem may be where you are deploying the configuration files, whether you have restarted Splunk, and what your expectations are regarding already-indexed events.

Did you restart the Splunk instances on the Forwarders wherever you changed inputs.conf (e.g. the sourcetype may not have updated, so the new value does not reference into props.conf correctly)? Did you restart the Splunk instances on the Indexers where you put props.conf and transforms.conf (or on your Forwarders, if using Heavy Forwarders or INDEXED_EXTRACTIONS)? Double-check this list:

* The sourcetype in inputs.conf matches the stanza name in props.conf (here, custom_csv) exactly (casing, punctuation, etc.).
* The props.conf and transforms.conf configuration files are deployed to the Indexers or Heavy Forwarders (or Universal Forwarders in some cases, such as INDEXED_EXTRACTIONS = CSV).
* The inputs.conf configuration file is deployed to the Forwarder.
* You must restart/bounce all Splunk instances on the servers where you deploy these files.
* There are no configuration errors during restart (watch the response text during startup on one server of each type).
* You are verifying proper current function by looking at NEW data (post-deploy/post-bounce), not previously indexed data (which is immutable).
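One way to verify which settings actually won on a given instance is btool (a sketch; custom_csv is the sourcetype from this thread):

```
$SPLUNK_HOME/bin/splunk btool props list custom_csv --debug
$SPLUNK_HOME/bin/splunk btool check
```

The --debug flag shows which file each effective setting came from, and btool check flags malformed stanzas and misspelled settings in the .conf files.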

Motivator

Hello. Choose one or more fields in your event, and add this attribute to your props.conf file to tell Splunk which fields constitute the timestamp:

TIMESTAMP_FIELDS = field1,field2,...,fieldn

eg:

TIMESTAMP_FIELDS = "Date Last Logged In"
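Note that TIMESTAMP_FIELDS applies to structured inputs, i.e. it takes effect together with INDEXED_EXTRACTIONS. A minimal sketch for this thread's sourcetype, quoting the field name as in the example above:

```
[custom_csv]
INDEXED_EXTRACTIONS = csv
TIMESTAMP_FIELDS = "Date Last Logged In"
```

Splunk then builds the event timestamp from only the named header field(s) instead of scanning the whole event.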

Thanks


Contributor

Hi Stephan,

But my question was that I do not want any timestamps to be present in the date_* fields; the timestamp should not be parsed at all.


Explorer

Hi,
I've tried it with your sample events. It worked for me with the following props.conf content:

[custom_csv]
DATETIME_CONFIG = CURRENT
INDEXED_EXTRACTIONS = csv
KV_MODE = none
NO_BINARY_CHECK = true
SHOULD_LINEMERGE = false
category = Structured
disabled = false
pulldown_type = true

The event time is then the current timestamp, and all your fields are recognized.
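To test a stanza like this quickly against the sample events, a one-shot upload can be used (the file path is hypothetical):

```
$SPLUNK_HOME/bin/splunk add oneshot /tmp/sample_events.csv -sourcetype custom_csv
```

This indexes the file once, so you can inspect the resulting events and fields without setting up a monitor input.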

Contributor

Hi, can you let me know if you are seeing any fields such as date_hour, date_minute, or date_mday?

I was getting these date_* fields, because of which my events were distributed everywhere.


Explorer

Hi,

No, I don't have date_* fields, because I use the current time.
Splunk only extracts the date_* fields if it needs to parse the event for a timestamp.

I understood it in the way that you don't want to use any timestamp from the raw event data.
If you want to use the current time but also need the date_* fields, I've found a possibly helpful post here on Splunk Answers:
link text

I hope this helps.


Contributor

Hi, I do not need the date_* fields. Even if I set DATETIME_CONFIG to CURRENT, I still get date_* fields from the event values...

Can you please let me know how I can eliminate those date_* values?


Explorer

Hey ho,

with my props.conf content you should not get any date_* fields; it worked for me.
Did you try it with my props.conf? With newly indexed data it should work, but it has no effect on data that is already indexed.

For data Splunk has already indexed, I don't know a way to delete the date_* fields. You have to reindex the data, which means you need to clean the fishbucket.

I would first check whether it works with my props.conf. Just try it with your sample event data and a one-shot file input.
If it works, you can use either
1. the command "splunk clean eventdata" to reindex the file. ATTENTION: this will reindex all file monitors on the Splunk instance,
or
2. splunk cmd btprobe -d $SPLUNK_HOME/var/lib/splunk/fishbucket/splunk_private_db --file [file_to_reindex] --reset, which makes Splunk reindex only the data of the given file.


Contributor

My inputs.conf is simple. Below is the inputs.conf stanza:

[monitor:///opt/metrics_log_upload/rsa/*.csv]
sourcetype = custom_csv