Getting Data In

Why am I unable to capture an accurate timestamp from a CSV field in the middle of a log message and recognize it as GMT?

aculveruwo
Explorer

We're collecting logs that have the timestamp in the middle of the log message, in GMT. I'm trying to define the pattern for the timestamp and tell Splunk to treat it as GMT.

I've defined the following in props.conf:

[splunk@ziva local]$ pwd
/opt/splunk/etc/deployment-apps/DS-its-o365-audit/local
[splunk@ziva local]$ cat props.conf 
[o365-audit-smtp]
TIME_FORMAT = "%m/%d/%y %I:%M:%S %p"
TZ = GMT

And I also have the following transform to handle the CSV fields:

[splunk@ziva local]$ pwd
/opt/splunk/etc/deployment-apps/DS-transform/local
[splunk@ziva local]$ head -10 transforms.conf | tail -5
# o365-audit CSV
[o365-audit-smtp]
DELIMS = ","
FIELDS = "PSComputerName","RunspaceId","PSShowComputerName","Organization","MessageId","Received","SenderAddress","RecipientAddress","Subject","Status","ToIP","FromIP","Size","MessageTraceId","StartDate","EndDate","Index"

The Received field holds the timestamp that should become the event's _time. Received is extracted correctly, but _time is often off by anywhere from a few seconds to several minutes, as in this event:

The event as Splunk displays it (indexed time 17/12/2015 12:14:28.000):

"ps.outlook.com","2d2269bb-7461-4cb2-b528-8cc6fb965d4b","False","uwoca.onmicrosoft.com","<a2e367c288f033fa7e4ccff7269cab78.squirrel@www.stats.uwo.ca>","12/17/2015 12:22:07 PM","sender@stats.uwo.ca","recipient@uwoca.onmicrosoft.com","Re: AS 2053 -- A Quick Question","Resolved","","129.100.1.9","19497","f06db2f9-9d22-470d-b27b-08d306dcae20","12/17/2015 6:00:00 AM","12/17/2015 12:00:00 PM","81130"

Extracted fields for this event:

FromIP = 129.100.1.9
RecipientAddress = recipient@uwoca.onmicrosoft.com
SenderAddress = sender@stats.uwo.ca
Subject = Re: AS 2053 -- A Quick Question
host = O365-Audit
EndDate = 12/17/2015 12:00:00 PM
Index = 81130
MessageId = <a2e367c288f033fa7e4ccff7269cab78.squirrel@www.stats.uwo.ca>
MessageTraceId = f06db2f9-9d22-470d-b27b-08d306dcae20
Organization = uwoca.onmicrosoft.com
PSComputerName = ps.outlook.com
PSShowComputerName = False
Received = 12/17/2015 12:22:07 PM
RunspaceId = 2d2269bb-7461-4cb2-b528-8cc6fb965d4b
SenderUsername = sender
Size = 19497
StartDate = 12/17/2015 6:00:00 AM
Status = Resolved
index = its-o365-audit
linecount = 1
splunk_server = ducky.its.uwo.pri
user = sender
user_combined = sender
_time = 2015-12-17T12:14:28.000-05:00
punct = "..","----","","..","<.@...>","//_::_","@..","@.."
source = C:\Logs\SMTP\SMTP_Logs_6HRS_12-17-2015_12-00-00.csv
sourcetype = o365-audit-smtp

Splunk also isn't treating the time as GMT, if it's even using that timestamp at all. What am I doing wrong?


woodcock
Esteemed Legend

Add this to your props.conf:

TIME_PREFIX = ^(?:"[^"]*",){5}"

Also change GMT to UTC.
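To sanity-check that regex outside Splunk, here is a quick sketch in plain Python (using the sample event from the question, truncated after the Received field) showing that the prefix lands immediately before the Received value:

```python
import re

# TIME_PREFIX from the answer: skip five quoted CSV fields, then consume the
# opening quote of the sixth field (Received), so timestamp parsing starts there.
time_prefix = re.compile(r'^(?:"[^"]*",){5}"')

# Sample event from the question, truncated after the Received field.
event = (
    '"ps.outlook.com","2d2269bb-7461-4cb2-b528-8cc6fb965d4b","False",'
    '"uwoca.onmicrosoft.com",'
    '"<a2e367c288f033fa7e4ccff7269cab78.squirrel@www.stats.uwo.ca>",'
    '"12/17/2015 12:22:07 PM","sender@stats.uwo.ca",...'
)

m = time_prefix.match(event)
print(event[m.end():m.end() + 22])  # -> 12/17/2015 12:22:07 PM
```

This is just a simulation of where Splunk would begin looking for the timestamp; Splunk applies the same PCRE-style prefix match internally.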


richgalloway
SplunkTrust

The timestamp is pretty far into the event - far enough to risk running past the default 128-character MAX_TIMESTAMP_LOOKAHEAD window before Splunk finds it. Consider adding MAX_TIMESTAMP_LOOKAHEAD = 300 to props.conf.
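Putting the thread's suggestions together, the forwarder-side stanza might look like the following sketch (unverified; note that Received carries a four-digit year, so %Y rather than the %y in the original TIME_FORMAT, and that props.conf attribute values are not quoted - quotes would be taken literally):

```
[o365-audit-smtp]
TIME_PREFIX = ^(?:"[^"]*",){5}"
TIME_FORMAT = %m/%d/%Y %I:%M:%S %p
MAX_TIMESTAMP_LOOKAHEAD = 300
TZ = UTC
```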

---
If this reply helps you, an upvote would be appreciated.

sundareshr
Legend

Have you considered using INDEXED_EXTRACTIONS=csv and TIMESTAMP_FIELDS=Received?

http://docs.splunk.com/Documentation/Splunk/6.2.0/Data/Extractfieldsfromfileheadersatindextime
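That approach would replace both the timestamp regex and the search-time DELIMS/FIELDS transform. A sketch of the stanza (assuming the CSV files have no header row, so FIELD_NAMES supplies the column names from the existing transform) might look like:

```
[o365-audit-smtp]
INDEXED_EXTRACTIONS = csv
FIELD_NAMES = PSComputerName,RunspaceId,PSShowComputerName,Organization,MessageId,Received,SenderAddress,RecipientAddress,Subject,Status,ToIP,FromIP,Size,MessageTraceId,StartDate,EndDate,Index
TIMESTAMP_FIELDS = Received
TZ = UTC
```

Note that structured-data parsing happens where the file is monitored, so this would go in the app deployed to the forwarder reading the CSVs.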

aculveruwo
Explorer

I have the props.conf with the timestamp directives in the DS-its-o365-audit app, which runs on the forwarder where the CSV files are being read. The DS-transform app that handles the CSV format runs on the heavy forwarders and search heads.

Would the INDEXED_EXTRACTIONS and TIMESTAMP_FIELDS directives work in the app on the forwarder, before CSV field extraction has happened? Or would they need to run after DS-transform has handled the CSV field extraction?
