Getting Data In

Json data issues with timestamp

Contributor

Hello,

I'm using Python scripts to get data into Splunk. The data arrives in JSON format. Here is an example:

{"urgency": "Medium", "first_authenticated_response_at": null, "created_at": "1546463747.48092", "ticket": "rrrrrrrrr", "pending_closure_at": "1546617654.71499", "customer": "ssssssss", "state": "Closed", "last_update_at": "1547224812.31054", "created_by": "aaaaaa", "impact": "Service Degraded", "configuration_item": ["ddddddd", "fffffffff"], "time_spent": "840", "priority": "ttttttttt", "authenticated_at": "1546463747", "publication_type": "Internal", "first_response_at": "1546617654.77845", "closed_at": "1547224812.01911", "cti_item": "Degraded Performance", "initial_team": "yyyyyyyyyyyyy"}

As you can see, the date fields are in epoch time. I would like Splunk to use the created_at field as the main timestamp field and derive the event time from it. Right now, when I'm indexing data from January, for example, it indexes it with the current time.
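As a quick sanity check (plain Python, using the created_at value from the sample event above), the epoch value does decode to early January 2019:

```python
from datetime import datetime, timezone

# created_at from the sample event: epoch seconds with subsecond digits
created_at = "1546463747.48092"

# Convert to a UTC datetime to confirm which month the event belongs to
ts = datetime.fromtimestamp(float(created_at), tz=timezone.utc)
print(ts.strftime("%Y-%m"))  # 2019-01
```

So the data itself is fine; the issue is getting Splunk to locate and parse that value at index time.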

This is the configuration I have in the "props.conf":

[tickets_test]
DATETIME_CONFIG = NONE
INDEXED_EXTRACTIONS = json
NO_BINARY_CHECK = true
category = Custom
pulldown_type = 1
TIMESTAMP_FIELDS = "created_at"
disabled = false
TIME_FORMAT = %s.%6N
TIME_PREFIX = "created_at":
sourcetype =  tickets_test

Any idea what I could change to get that data indexed recognizing it's from January?

Thank you in advance.


Re: Json data issues with timestamp

Contributor

Hi. You seem to mostly have the right settings, except for the width of the subseconds. The timestamp you want to use has only 5 subsecond digits, but you're using 6, which is probably why the pattern isn't being recognized. Also make sure your lookahead is at least 100.

EDIT: I also just noticed that your time prefix doesn't include the space and the double quote after the colon, which is probably another reason your current settings weren't locating the timestamp.

TIME_FORMAT=%s.%5N
TIME_PREFIX="created_at": "
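Both points can be double-checked against the raw event with a few lines of plain Python (using a fragment of the sample event from the original post):

```python
import re

# Fragment of the sample event from the original post
event = '"first_authenticated_response_at": null, "created_at": "1546463747.48092", "ticket": "rrrrrrrrr"'

# TIME_PREFIX must match the literal text immediately before the timestamp
prefix = '"created_at": "'
print(prefix in event)  # True

# Count the subsecond digits that follow the prefix: the sample has 5, not 6
match = re.search(re.escape(prefix) + r'(\d+)\.(\d+)', event)
print(len(match.group(2)))  # 5
```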

Re: Json data issues with timestamp

Contributor

Still not working.

I have realized that the field disappears when I add it from the web UI and save the changes, and it's not there if I add it directly through the file on the server either.

Now the file props.conf is showing this:

[tickets_test]
DATETIME_CONFIG =
INDEXED_EXTRACTIONS = json
NO_BINARY_CHECK = true
category = Custom
pulldown_type = 1
TIMESTAMP_FIELDS = created_at
disabled = false
TIME_FORMAT = %s.%5N
TIME_PREFIX = "created_at": "
MAX_TIMESTAMP_LOOKAHEAD = 500
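As a sanity check on the lookahead setting (plain Python against the start of the sample event; as far as I know, MAX_TIMESTAMP_LOOKAHEAD is counted in characters and applies from where TIME_PREFIX matches when a prefix is set):

```python
# Start of the sample event from the original post; created_at appears
# near the front of the event, well before any lookahead limit
event = '{"urgency": "Medium", "first_authenticated_response_at": null, "created_at": "1546463747.48092"'

# Character offset where the timestamp value itself begins
offset = event.index('"created_at": "') + len('"created_at": "')
print(offset < 500)  # True: well inside a 500-character lookahead
```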

Is anything else wrong?


Re: Json data issues with timestamp

Contributor

Here's what I used, along with a screenshot of the results. You can see the timestamp highlighted in red, and all the fields are there.
Are you doing this in a distributed environment?

[jasontest]
DATETIME_CONFIG =
LINE_BREAKER = ([\r\n]+)
NO_BINARY_CHECK = true
TIME_FORMAT = %s.%5N
TIME_PREFIX = created_at": "
category = Custom
pulldown_type = true
INDEXED_EXTRACTIONS = JSON



Re: Json data issues with timestamp

Contributor

Sorry, I was trying to upload a screenshot and didn't do it right.


Re: Json data issues with timestamp

Contributor

Non-distributed environment. I have changed the settings that differed, and now it works!
It is recognizing the events from the 1st of October.

Thanks a lot! 🙂


Re: Json data issues with timestamp

Contributor

Great! Glad to help.


Re: Json data issues with timestamp

Loves-to-Learn Lots

@oscar84x  

Hi ,

My timestamp in the data looks like 2020-07-02T18:00:18+02:00, in a field named last_modified_date, which I want to be extracted.

I have written the props.conf below:

[_json]
INDEXED_EXTRACTIONS = json
KV_MODE = none
AUTO_KV_JSON = false
NO_BINARY_CHECK = true
SHOULD_LINEMERGE = false
TIMESTAMP_FIELDS = last_modified_date
TIME_FORMAT = %Y-%m-%dT%H:%M:%S+%2N:%2N
MAX_TIMESTAMP_LOOKAHEAD = 25

and the time is being extracted as:

7/2/20
6:00:18.020 PM 


but I want the time field extracted the same way as it appears in the data, with the + value as well, like:

 7/2/20
6:00:18+02:00 PM

Please let me know what I am doing wrong, as I am not getting the expected output with the + value.

Note: this +02:00 value is fixed for every timestamp in the data.

Here’s my sample log data:

{"_timestamp":"2020-07-02 18:00:46","_ver":"2","asset_name":"","assigned_group":"Troubleshooting - Tier 2","assignee":"Buhle Mahlaba","ci":"","cause":"","city":"","client_type":"","closed_date":"","closure_source":"","company":"MTN BUSINESS","contact_phone":"","contact_site":"","country":"","created_from_template":"","customer_phone":"###","customer_site":"INTERNET SOLUTIONS(PTY) LTD","debtor_code":"MTN000","direct_contact_city":"","direct_contact_company":"","direct_contact_corporate_id":"","direct_contact_country":"","direct_contact_country_code":"","direct_contact_department":"","direct_contact_desk_location":"","direct_contact_extension":"","direct_contact_first_name":"","direct_contact_internet_email":"","direct_contact_last_name":"","direct_contact_local_number":"","direct_contact_location_details":"","direct_contact_middle_initial":"","direct_contact_organization":"","direct_contact_region":"","direct_contact_site_group":"","direct_contact_state_province":"","direct_contact_street":"","direct_contact_time_zone":"","direct_contact_zip_postal_code":"","first_name":"Melvern","impact":"2-Significant\/Large","incident_id":"MTNB00001289400","incident_type":"User Service Restoration","last_acknowledged_date":"","last_modified_by":"412877","last_modified_date":"2020-07-02T18:00:44+02:00","last_name":"Banoo","last_resolved_date":"","middle_name":"","notes":"HI Team\n\nThe mentioned link is down ,Please investigate and advise.\n\n\nRP\/0\/RSP0\/CPU0:mi-za-bry-mspe4#sho log | inc BVI906\nRP\/0\/RSP0\/CPU0:Jul  2 14:43:49.894 SAST: mpls_ldp[1204]: %ROUTING-LDP-5-HELLO_ADJ_CHANGE : VRF 'default' (0x60000000), Link hello adja...","operational_categorization_tier_1":"TES_Link","operational_categorization_tier_2":"Microwave PTP","operational_categorization_tier_3":"Link Down","owner_group":"General Support","priority":"Critical","product_categorization_tier_1":"TES_Managed Networks","product_categorization_tier_2":"Access Service","product_categorization_tier_3":"Cloud 
Connect","product_name":"","region":"","reported_date":"2020-07-02T16:36:04+02:00","reported_source":"Email","resolution":"","resolution_categorization_tier_1":"","resolution_categorization_tier_2":"","resolution_categorization_tier_3":"","resolution_product_categorization_tier_1":"","resolution_product_categorization_tier_2":"","resolution_product_categorization_tier_3":"","responded_date":"2020-07-02T18:00:43+02:00","slm_real_time_status":"Within the Service Target","satisfaction_rating":"","service_manager":"","service_request_id":"","site_group":"","state_province":"","status":"In Progress","status_reason_hidden":"","street":"","submit_date":"2020-07-02T16:36:04+02:00","submitter":"AR_ESCALATOR","summary":"INC000147465| me-za-gp80-hoedspru-bry-1 | | E2379","time_zone":"","urgency":"1-Critical","vendor_group":"","vendor_name":"","vendor_ticket_number":"","zip_postal_code":""}
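For what it's worth, the +02:00 offset itself is parseable with strptime's %z directive. A minimal check in plain Python (this only shows that the value parses cleanly, including the offset; how Splunk renders _time in the UI is a separate display question):

```python
from datetime import datetime

raw = "2020-07-02T18:00:18+02:00"

# %z accepts a ±HH:MM UTC offset (Python 3.7+ allows the colon form)
dt = datetime.strptime(raw, "%Y-%m-%dT%H:%M:%S%z")
print(dt.isoformat())   # 2020-07-02T18:00:18+02:00
print(dt.utcoffset())   # 2:00:00
```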
