Getting Data In

Timestamp Issues

mawwx3
Explorer

I have used SEDCMD to remove an extra time that was added to the beginning of my logs, so that the timestamp would come from the second time (now the only time) shown in the event. However, the extracted timestamp still does not correspond to the time in the event.

Here is my props.conf:

[dns_data]
SEDCMD-dns = s/^\S+\s+\S+\s+\S+\s+//

Here is an event I am looking at (after the SEDCMD has taken effect):

foo.bar.com Mon Jun 28 11:42:02 2010: 123.123.123.12 -> 122.122.122.22: 60053 NOERR 'some.thing.com.' A IN (a#2) (n#6) (x#3) ANS some.thing.com. A IN 123.123.123.23
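For anyone checking that SEDCMD regex outside Splunk: it strips the first three whitespace-separated tokens. A quick Python sanity check (the pre-SEDCMD sample line here is hypothetical, reconstructed from the description of an extra leading time):

```python
import re

# Same substitution as: SEDCMD-dns = s/^\S+\s+\S+\s+\S+\s+//
# Applied to a hypothetical raw event with an extra leading timestamp.
raw = ("Jun 28 11:42:02 foo.bar.com Mon Jun 28 11:42:02 2010: "
       "123.123.123.12 -> 122.122.122.22: 60053 NOERR 'some.thing.com.' A IN")

# Strip the first three non-whitespace tokens and the whitespace after them
cleaned = re.sub(r"^\S+\s+\S+\s+\S+\s+", "", raw)
print(cleaned)  # event now starts with the hostname
```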

However, the timestamp Splunk assigned shows:

6/28/10
11:44:11.000 AM

This does not match the time in the event. What can I do to fix this?

Thanks.


Lowell
Super Champion

Timestamp recognition is done before any event transformations, so changes made to your events with SEDCMD do not take effect until after the timestamp has already been extracted.

You will have to use a different approach, such as setting a custom TIME_PREFIX regular expression so that Splunk selects the correct timestamp.
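As a sketch of that approach (settings are assumptions based on the event shown, not a tested config): if the event begins with a hostname followed by the timestamp Splunk should use, you can skip the hostname with TIME_PREFIX and pin the format with TIME_FORMAT:

```ini
[dns_data]
# Skip the leading hostname so the "Mon Jun 28 11:42:02 2010" time is read
TIME_PREFIX = ^\S+\s+
TIME_FORMAT = %a %b %d %H:%M:%S %Y
# Don't scan far past the prefix for timestamp characters
MAX_TIMESTAMP_LOOKAHEAD = 30
```

These are index-time settings, so they must go on the indexer (or heavy forwarder) and only affect newly indexed events.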

I could be mistaken, but it seems like this is the 3rd time you are reposting essentially the same question. That's got to be frustrating....

So I'm thinking that your problem isn't with timestamp recognition but with something else. Something in your config settings seems to be keeping them from being applied where you want them to be. One option you might want to consider is opening a case with Splunk support. You can run the "splunk diag" utility and send them the generated file; it includes all your configuration information, which should give them enough detail to track down the real issue.

That's my 2 cents.


grahampoulter
Path Finder

The fact that timestamping is done before SEDCMD is actually useful. I have logs that are pipe-separated key=value pairs, but with a leading timestamp before the key-value pairs.

The key-value extractor churns out lots of spurious field names starting with a timestamp, but I am going to try using SEDCMD to remove the timestamp before applying REPORT-fields = pipe-kv.
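A sketch of that setup (the sourcetype name, SEDCMD regex, and delimiters are assumptions about the log layout; only the `pipe-kv` report name comes from the post). SEDCMD runs at index time and REPORT at search time, so the timestamp is already gone by the time the key-value extraction runs:

props.conf:

```ini
[pipe_kv_data]
# Index-time: strip the leading timestamp up to the first pipe (assumed layout)
SEDCMD-striptime = s/^[^|]+\|//
# Search-time: extract the pipe-separated key=value pairs
REPORT-fields = pipe-kv
```

transforms.conf:

```ini
[pipe-kv]
DELIMS = "|", "="
```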


