Getting Data In

How to extract a date field from the filename in Splunk and assign it to the _time value at index time

snehalk
Communicator

Hello Everyone,

I have a text file named 20170701.txt, where 2017 is the year, 07 is the month, and 01 is the day.

This file is coming from a universal forwarder. Below is my inputs.conf (C:\Program Files\SplunkUniversalForwarder\etc\system\local):

[monitor://C:\sampletestfile\*]
index=test
sourcetype=largefile
crcSalt = <SOURCE>

On the heavy forwarder side I wrote the datetime.xml and props.conf below (C:\Program Files\Splunk\etc\apps\myapp\local).

datetime.xml

  <define name="_mydatetime" extract="year, month, day">
    <text><![CDATA[source::.*?_(\d{4})(\d{2})(\d{2}).txt]]></text>
 </define>
 <timePatterns>
    <use name="_mydatetime"/>
 </timePatterns>
 <datePatterns>
    <use name="_mydatetime"/>
 </datePatterns>
 </datetime> 

props.conf

[largefile] 
DATETIME_CONFIG = /etc/apps/myapp/local/datetime.xml

After making the changes, I restarted both the UF and the HF.

The problem is that I am not getting the "20170701" date as the index time in Splunk.

And in splunkd.log I am getting the error below:

07-05-2017 16:13:54.893 +0530 ERROR AggregatorMiningProcessor - Uncaught exception in Aggregator, skipping an event: Error parsing regex XML file: C:\Program Files\Splunk\etc\apps\myapp\local\datetime.xml - Couldn't find 'timePatterns' in config data for AggregatorProcessor. - data_source="C:\sampletestfile\20170701.txt"

Can anyone please guide me on where I am going wrong?

Thanks in advance.

jplumsdaine22
Influencer

There's no opening <datetime> tag at the start of your XML file, which is probably causing the whole thing to be skipped.
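
For reference, a minimal sketch of what the complete datetime.xml could look like once the opening <datetime> element is added. The regex is copied unchanged from the original post and may still need adjusting so it actually matches the source path (the posted filename 20170701.txt has no underscore before the date):

<datetime>
 <!-- hypothetical sketch: opening <datetime> element added; pattern kept as posted -->
 <define name="_mydatetime" extract="year, month, day">
    <text><![CDATA[source::.*?_(\d{4})(\d{2})(\d{2}).txt]]></text>
 </define>
 <timePatterns>
    <use name="_mydatetime"/>
 </timePatterns>
 <datePatterns>
    <use name="_mydatetime"/>
 </datePatterns>
</datetime>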


snehalk
Communicator

Hello jplumsdaine22,
Thanks for the reply. After adding the <datetime> tag to the file, I am still getting the same error. Could you please help me here?


gregbo
Communicator

Did you ever get help on this?


woodcock
Esteemed Legend

Why are you using a heavy forwarder here? It seems to add no benefit and therefore makes no sense.


snehalk
Communicator

Hello woodcock,

Yes, in the current scenario there is no need for it, but this is our existing environment, and we also have some files where we rely on parsing, just not for this source.
