Getting Data In

Why are my events logging the incorrect timestamp when onboarding CSV file data?

blbr123
Path Finder

Hi All,

I had a request to onboard a CSV file from a source path to our Splunk Cloud.

I have completed the following configurations:

inputs.conf

props.conf

transforms.conf

I can now see the data in Splunk, but all the events have the same timestamp: 11:00 AM, 25 March 2022.

I am not able to find out what the problem is.

Below are my configurations:

props.conf

[sns:CSV]
category = Structured
INDEXED_EXTRACTIONS = CSV
KV_MODE = none
FIELD_DELIMITER = ,
HEADER_FIELD_DELIMITER = ,
FIELDS_NAMES = field1,field2... and so on
TIMESTAMP_FIELDS = stattime
TRANSFORMS-eliminate_header = eliminate_header
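One thing worth checking in the stanza above: per the props.conf specification, the setting for supplying header names is FIELD_NAMES (not FIELDS_NAMES), and TIMESTAMP_FIELDS must exactly match one of those declared names (or a header column). A corrected sketch, with placeholder field names and the assumption that "stattime" really is a column in the file:

```ini
# props.conf -- corrected sketch; field names are placeholders from the post
[sns:CSV]
category = Structured
INDEXED_EXTRACTIONS = CSV
KV_MODE = none
FIELD_DELIMITER = ,
HEADER_FIELD_DELIMITER = ,
# The documented spec name is FIELD_NAMES; a misspelled key (FIELDS_NAMES) is ignored
FIELD_NAMES = field1,field2,stattime
# Must exactly match one of the names above
TIMESTAMP_FIELDS = stattime
TRANSFORMS-eliminate_header = eliminate_header
```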

 

 

transforms.conf

[eliminate_header]
REGEX = ^(?: _id)
DEST_KEY = queue
FORMAT = nullQueue

gcusello
SplunkTrust

Hi @blbr123,

only two questions:

  • are you sure about the name of the timestamp field (stattime)?
  • where did you put the props.conf and transforms.conf files? For indexed extractions they must also be on the forwarders.
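On the second point, one way to verify which configuration actually applies on a given instance is btool, run on the forwarder doing the parsing (sourcetype name taken from the thread):

```shell
# Shows the effective props.conf stanza and which file each setting comes from
$SPLUNK_HOME/bin/splunk btool props list sns:CSV --debug
```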

Ciao.

Giuseppe


blbr123
Path Finder

I am not sure about the timestamp field. I have kept the props and transforms on the forwarder, and have also kept props on the HF.


gcusello
SplunkTrust

Hi @blbr123,

ok, check the timestamp field name and let me know if that solved it.

Ciao.

Giuseppe


blbr123
Path Finder

In the CSV file provided to me, there is no timestamp field.


blbr123
Path Finder

So I am guessing I need to leave the timestamp field blank so that it takes the index-time timestamp?

@gcusello 


gcusello
SplunkTrust

Hi @blbr123,

if the index-time timestamp is OK for you, you can get it by inserting in props.conf

DATETIME_CONFIG = CURRENT

instead of TIMESTAMP_FIELDS = ...

but probably the real timestamp is the best solution: can you check the header of your file to be sure about the name of the timestamp field?
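For reference, a minimal sketch of that change (sourcetype name taken from the thread):

```ini
# props.conf -- use index time instead of parsing a timestamp field
[sns:CSV]
INDEXED_EXTRACTIONS = CSV
DATETIME_CONFIG = CURRENT
# ...with no TIMESTAMP_FIELDS setting
```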

Ciao.

Giuseppe


blbr123
Path Finder

@gcusello yes, I have checked the CSV file and there is no header field which carries the timestamp, so in that case how do I get the real time?

And is there any drawback or issue if I am OK with the index-time timestamp?


gcusello
SplunkTrust

Hi @blbr123,

did you test my previous answer?

Ciao.

Giuseppe


blbr123
Path Finder

My change-request time window is over, so I can only try it on Monday.


PickleRick
SplunkTrust

If the parsing component (in your case, the forwarder) cannot parse the timestamp from the event, it assumes the current timestamp at the moment of event processing. (You can also enable such timestamping explicitly in the config, but that's beside the point.)

So unless you define a proper field or set of fields that are supposed to carry the timestamp, and Splunk is able to parse the timestamp out of that field or set of fields (either automatically or by means of defined parsing rules), you'll get a "current" timestamp of the ingestion moment.
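A quick way to check whether events were stamped with the ingestion moment is to compare the parsed _time against the indexed _indextime; a near-zero difference for every event suggests current-time stamping. A sketch (the index name is an assumption):

```
index=your_index sourcetype=sns:CSV
| eval lag_seconds = _indextime - _time
| stats min(lag_seconds) AS min_lag, max(lag_seconds) AS max_lag, count
```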


blbr123
Path Finder

@PickleRick it's not assuming the current timestamp of event processing, because the time at my place now is 5:28 PM and the timestamp I see is the same for all the events: 11:05:29.000 AM.


PickleRick
SplunkTrust

The question is what the time is not at "your place" but on your forwarder 🙂

Oh, and sorry, I forgot about one other case. With indexed extractions, if Splunk is not able to guess the time, and the previously known time (which could also be incorrect) is not older than some period (configurable; I don't remember the default value), it will assume the event has the same timestamp as the previous one.

You should have entries in your splunkd.log saying that it did so, i.e. that it used the previous event's timestamp.
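Those messages come from the DateParserVerbose component, so you can search for them directly; a sketch using the standard fields of Splunk's internal logs:

```
index=_internal sourcetype=splunkd log_level=WARN component=DateParserVerbose
```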


blbr123
Path Finder

I tried this query:

index=_internal source=splunkd.log

It does not show anything.


PickleRick
SplunkTrust

Try

index=_internal source=*splunkd.log

As a general rule, using a wildcard at the beginning of a search term is bad, but in this case it's relatively harmless (especially since source is an indexed field).


blbr123
Path Finder

@PickleRick great, thank you. But from the search output, how do I check whether it's using the previous event's time?


PickleRick
SplunkTrust

In my case I get something like this:

03-25-2022 14:11:05.202 +0100 WARN DateParserVerbose [15176 merging] - Failed to parse timestamp in first MAX_TIMESTAMP_LOOKAHEAD (150) characters of event. Defaulting to timestamp of previous event (Thu Mar 24 13:57:00 2022).

After that you have the source context.


venkatasri
SplunkTrust

@blbr123 Here are the tested settings; they need to be split between the UF and HF if you have an intermediate forwarder (HF).

Copy these settings to both the UF and HF; if there is no intermediate forwarder, copy them to the indexer. Some settings don't need to be on the HF, but there is no harm; they are just ignored.

I would recommend testing the CSV with the 'Add Data' option in a dev environment, changing the props settings until you get the desired output.

Field names might be different in your case:

## props.conf
[set-csv-test]
INDEXED_EXTRACTIONS = CSV
TIMESTAMP_FIELDS = Datetime
FIELD_NAMES = Units,Variable_code,Variable_name,Variable_category, Datetime
HEADER_FIELD_LINE_NUMBER = 1
TIME_FORMAT = %d/%m/%Y %H:%M:%S
TRANSFORMS-remove-header = Testing-header-removal-csv

# transforms.conf
[Testing-header-removal-csv]
REGEX = ^Units\,
DEST_KEY = queue
FORMAT = nullQueue

CSV sample tested:

Units,Variable_code,Variable_name,Variable_category,Datetime
Dollars (millions),H01,Total income,Financial performance, 05/04/2022 11:00:00
Dollars (millions),H04,Sales, government funding grants and subsidies Financial performance, 05/04/2022 11:00:01
Dollars (millions),H05,Interest, dividends and donations Financial performance, 05/04/2022 11:00:02
Dollars (millions),H07,Non-operating income,Financial performance, 05/04/2022 11:00:03
Dollars (millions),H08,Total expenditure,Financial performance, 05/04/2022 11:00:04
Dollars (millions),H09,Interest and donations,Financial performance, 05/04/2022 11:00:05
Dollars (millions),H10,Indirect taxes,Financial performance, 05/04/2022 11:00:06
Dollars (millions),H11,Depreciation,Financial performance, 05/04/2022 11:00:07
Dollars (millions),H12,Salaries and wages paid,Financial performance, 05/04/2022 11:00:08
Dollars (millions),H13,Redundancy and severance,Financial performance, 05/04/2022 11:00:09
Dollars (millions),H14,Salaries and wages to self employed commission agents,Financial performance, 05/04/2022 11:00:10
Dollars (millions),H19,Purchases and other operating expenses,Financial performance, 05/04/2022 11:00:11
Dollars (millions),H20,Non-operating expenses,Financial performance, 05/04/2022 11:00:12

 

Tested final look: no header, the timestamp parsed, and the header extracted as field names.

[screenshot: venkatasri_0-1649121277402.png]

Hope this helps!

 
