Splunk Search

Timestamp is extracted as none

alvinsullivan01
Explorer

I have an issue transforming data and extracting field values. Here is my sample data.

2025-07-20T10:15:30+08:00 h1 test[123456]: {"data": {"a": 1, "b": 2, "c": 3}}

The data has a timestamp and other information at the beginning, and the data dictionary at the end. I want my data to go into Splunk in JSON format. Other than the data, what I need is the timestamp. So I created a transform to pick out only the data dictionary and move the timestamp into it. Here is my transforms.conf.

[test_add_timestamp]
DEST_KEY = _raw
REGEX = ([^\s]+)[^:]+:\s*(.+)}
FORMAT = $2, "timestamp": "$1"}
LOOKAHEAD = 32768

Here is my props to use the transforms.

[test_log]
SHOULD_LINEMERGE = false
TRUNCATE = 0
KV_MODE = json
LINE_BREAKER = ([\r\n]+)
NO_BINARY_CHECK = true
disabled = false
pulldown_type = true
TIME_PREFIX = timestamp
MAX_TIMESTAMP_LOOKAHEAD = 100
TIME_FORMAT = %Y-%m-%dT%H:%M:%S
TZ = UTC

TRANSFORMS-timestamp = test_add_timestamp

After the transform, the data looks like this:

{"data": {"a": 1, "b": 2, "c": 3}, "timestamp": "2025-07-20T10:15:30+08:00"}
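For anyone wanting to sanity-check this outside Splunk, the same REGEX and FORMAT rewrite can be simulated with Python's re module (a rough equivalent; Splunk uses PCRE and Python's regex dialect differs slightly, but this particular pattern behaves the same in both):

```python
import re

# The sample event as it arrives (syslog-style header + JSON payload)
line = '2025-07-20T10:15:30+08:00 h1 test[123456]: {"data": {"a": 1, "b": 2, "c": 3}}'

# Same pattern as REGEX in [test_add_timestamp]
m = re.search(r'([^\s]+)[^:]+:\s*(.+)}', line)

# FORMAT = $2, "timestamp": "$1"}  -- rebuild _raw the same way
new_raw = '%s, "timestamp": "%s"}' % (m.group(2), m.group(1))
print(new_raw)
# {"data": {"a": 1, "b": 2, "c": 3}, "timestamp": "2025-07-20T10:15:30+08:00"}
```

Group 1 captures the leading timestamp, group 2 captures the JSON payload minus its final brace (consumed by the literal `}` in the pattern), and FORMAT re-appends it with the timestamp key spliced in.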

But when I search the data in Splunk, why do I also see "none" as a value for timestamp?

Screenshot 2025-07-31 at 4.57.29 PM.png

Another thing I noticed: in my Splunk index, which has a lot of data, a few events have this timestamp extracted and most of them have no timestamp, which is fine. But when I click "timestamp" under interesting fields, why is it showing only "none"? I also noticed that some of the JSON keys are not available under "interesting fields". What is the logic behind this?

1 Solution

gcusello
SplunkTrust
SplunkTrust

Hi @alvinsullivan01 ,

you used a wrong TIME_PREFIX: the timestamp is calculated as the first operation, so please try:

TIME_PREFIX = ^

Ciao.

Giuseppe

View solution in original post

PrewinThomas
Motivator

@alvinsullivan01 

If the field does not exist at the moment Splunk attempts to extract the time, it might report it as "none".
Can you try with the below config?


props.conf

[test_log]
SHOULD_LINEMERGE = false
KV_MODE = json
LINE_BREAKER = ([\r\n]+)
NO_BINARY_CHECK = true
disabled = false
pulldown_type = true
TRUNCATE = 0

TRANSFORMS-addtimestamp = test_add_timestamp

TIME_PREFIX = "timestamp":"
TIME_FORMAT = %Y-%m-%dT%H:%M:%S%z
MAX_TIMESTAMP_LOOKAHEAD = 50
TZ = UTC


transforms.conf

[test_add_timestamp]
DEST_KEY = _raw
REGEX = ^([^\s]+).*?({.*})$
FORMAT = {"timestamp":"$1", "data":$2}
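A quick Python check of this pattern against the sample line (an illustrative simulation, not Splunk itself):

```python
import re

line = '2025-07-20T10:15:30+08:00 h1 test[123456]: {"data": {"a": 1, "b": 2, "c": 3}}'

# Same pattern as the REGEX above: group 1 = timestamp, group 2 = the JSON object
m = re.search(r'^([^\s]+).*?({.*})$', line)

# FORMAT = {"timestamp":"$1", "data":$2} rebuilt by hand
new_raw = '{"timestamp":"%s", "data":%s}' % (m.group(1), m.group(2))
print(new_raw)
# {"timestamp":"2025-07-20T10:15:30+08:00", "data":{"data": {"a": 1, "b": 2, "c": 3}}}
```

Note that because $2 already contains the original {"data": ...} wrapper, this FORMAT nests it under a second "data" key for the sample event; if the flat shape from the original post is wanted, the FORMAT would need adjusting.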

 

Regards,
Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a Karma. Thanks!


alvinsullivan01
Explorer

Thank you for your reply @PrewinThomas 

Unfortunately I'm still getting the same thing: both the "2025-07-20" and "none" values in one event.

Screenshot 2025-08-01 at 12.38.37 PM.png


PickleRick
SplunkTrust
SplunkTrust

If you're OK with the timestamp just being assigned to the event (no need to have it explicitly written in the event itself), just parse out the timestamp, cut the whole header, and leave the JSON part on its own.

Timestamp recognition takes place very early in the ingestion pipeline, so you can do it this way and not have to have the "timestamp" field in your JSON. You'll just have the _time field.
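As a sketch of that approach (the stanza name and regex here are illustrative assumptions, not tested against the full data set):

```ini
# transforms.conf -- hypothetical stanza, keeps only the JSON payload in _raw
[test_strip_header]
DEST_KEY = _raw
REGEX = ^[^\s]+[^:]+:\s*({.*})$
FORMAT = $1

# props.conf -- timestamp recognition runs before TRANSFORMS,
# so it still sees the original header at the start of the event
TIME_PREFIX = ^
TIME_FORMAT = %Y-%m-%dT%H:%M:%S%z
MAX_TIMESTAMP_LOOKAHEAD = 30
```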

alvinsullivan01
Explorer

Thank you for your reply @PickleRick 

Yes you are right, that is another possible solution. But if possible, the requirement I have now is to have the "timestamp" field included in the event JSON. Do you have any idea why my "timestamp" field has both value "2025-07-20" and "none"?


PickleRick
SplunkTrust
SplunkTrust

It's hard to say without knowing your full config.

Anyway, answering your earlier question: the "interesting fields" section shows only fields that have a value in at least a certain share of your events (I don't remember the threshold - 10%? 15%?). So if a field appears in just 3% of your search results, it will not be listed under interesting fields.

One more thing: if you have a value of "none" for your timestamp field, it means it has been explicitly assigned that value. If there is no value, there's just no value; "none" in this case is a string saying "none".

alvinsullivan01
Explorer

@PickleRick Thank you for the info! That may explain why some of the fields are not showing up in interesting fields. Though I can see, as in my screenshot, that there is only one "timestamp" field in my JSON and it has a value, so I'm still not sure where the "none" is coming from.


PickleRick
SplunkTrust
SplunkTrust

That's why I'm saying that without knowing your full config it's impossible to tell you where it comes from. The "none" value seems like some kind of safeguard value, calculated so that you have _some_ value assigned even if no value (or a wrong one) was extracted in the first place (a typical operation for some fields in CIM data models, for example). But normally I'd expect it to overwrite the original field. So for now we can only guess.


livehybrid
SplunkTrust
SplunkTrust

Hi @alvinsullivan01 

If this is a JSON event then you should be able to use:

TIMESTAMP_FIELDS = timestamp
#Also adjust TIME_FORMAT to include the timezone.
TIME_FORMAT = %Y-%m-%dT%H:%M:%S%z
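As far as I know, TIMESTAMP_FIELDS takes effect together with INDEXED_EXTRACTIONS = json, so it only applies when the event reaching the parser is pure JSON. The %z token itself does handle the +08:00 offset; here's a quick check with Python's strptime (Python's and Splunk's strptime implementations aren't identical, but both support %z):

```python
from datetime import datetime

# The timestamp string from the sample event
ts = "2025-07-20T10:15:30+08:00"

# Same format string as the suggested TIME_FORMAT
parsed = datetime.strptime(ts, "%Y-%m-%dT%H:%M:%S%z")
print(parsed.utcoffset())  # 8:00:00
```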

🌟 Did this answer help you? If so, please consider:

  • Adding karma to show it was useful
  • Marking it as the solution if it resolved your issue
  • Commenting if you need any clarification

Your feedback encourages the volunteers in this community to continue contributing


alvinsullivan01
Explorer

Thank you for your reply @livehybrid 

I am currently using TIME_FORMAT as you suggested, but correct me if I'm wrong: isn't that for _time, not the "timestamp" field?


gcusello
SplunkTrust
SplunkTrust

Hi @alvinsullivan01 ,

this seems to be JSON format. Raw data in JSON format is different from the visualized version, so to check your regex, open the raw data view rather than the JSON view: probably you have to add some backslashes to your regex.

Ciao.

Giuseppe


alvinsullivan01
Explorer

Thank you for your reply @gcusello 

Here is how it looks in the raw visualization.

Screenshot 2025-08-01 at 10.19.43 AM.png

It looks like valid JSON. Do you have any idea why I have both the "2025-07-20" and "none" values under timestamp?


gcusello
SplunkTrust
SplunkTrust

Hi @alvinsullivan01 ,

you used a wrong TIME_PREFIX: the timestamp is calculated as the first operation, so please try:

TIME_PREFIX = ^

Ciao.

Giuseppe

alvinsullivan01
Explorer

Thank you, that works!

Out of curiosity, do you know why Splunk extracted both "2025-07-20" and "none" when TIME_PREFIX was wrong?


PickleRick
SplunkTrust
SplunkTrust

Ahhh... I double-checked and it seems Splunk also populates the timestamp field by default (to be honest, I didn't know that; I thought it was only the _time field and those "partial" fields like date_hour, date_mday and so on).

So when you had your TIME_PREFIX wrong, Splunk - since, as I said, timestamp recognition happens very early in the ingestion pipeline - would attempt to parse the timestamp out of the message. Since the prefix was wrong, Splunk couldn't properly extract the time, so it would set _time using its default logic (either the timestamp of the previous event or the current time of the receiving host; the details of time assignment are pretty well documented, for example here - https://help.splunk.com/en/data-management/get-data-in/get-data-into-splunk-cloud-platform/10.0.2503... ) and it would set the timestamp field to "none".

And then at search time your already-transformed JSON would get parsed into fields, so the value from the event would be added to the already-present value of "none", making timestamp a multivalue field.
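That two-stage behavior can be sketched in Python (a purely illustrative simulation, not Splunk internals; the dict and list shapes here are assumptions):

```python
import json

# Stage 1, index time: TIME_PREFIX didn't match, so an indexed
# field timestamp::none is written alongside the event.
indexed_fields = {"timestamp": ["none"]}

# Stage 2, search time: KV_MODE = json parses _raw and finds a timestamp key too.
raw = '{"data": {"a": 1, "b": 2, "c": 3}, "timestamp": "2025-07-20T10:15:30+08:00"}'
search_time_fields = json.loads(raw)

# Both extractions are merged, so the field ends up multivalued for the event.
merged = indexed_fields["timestamp"] + [search_time_fields["timestamp"]]
print(merged)  # ['none', '2025-07-20T10:15:30+08:00']
```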

 

alvinsullivan01
Explorer

I see, that's good to know. Thanks for the explanation!


PickleRick
SplunkTrust
SplunkTrust

Good catch. Timestamp recognition takes place right after line breaking - definitely way before all this field shuffling. So at that point the timestamp is still in the header of the event.
