Getting Data In

Splunk + Talend

smaiti
New Member

I am working with Talend Open Studio v5.2.
When a job fails in Talend, a log file is generated in a specified location with a predefined, pipe-delimited format.
e.g.:

moment|pid|project|job|language|origin|status|substatus|description
2013-01-21 18:44:29|Reek96 ; Process_Name : wf_Process_Name ; Process_sk : 121212 ; Process_Run_sk : 481 ; Batch_sk : 566556|TALEND|Job_Name|java||Failed|Job execution error|ORA-00904: "ENTITY_NAMED": invalid identifier

The above log is generated when a Talend job fails.

Please note that the second field (everything between the first and second pipes) is the pid.
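To make the format concrete, here is a minimal sketch of how the sample line above splits against the header row (nine pipe-delimited fields; note the empty `origin` field):

```python
# Sketch: splitting the pipe-delimited Talend failure log into named fields.
# The header row supplies the field names; each log line supplies the values.
header = "moment|pid|project|job|language|origin|status|substatus|description"
line = ('2013-01-21 18:44:29|Reek96 ; Process_Name : wf_Process_Name ; '
        'Process_sk : 121212 ; Process_Run_sk : 481 ; Batch_sk : 566556|'
        'TALEND|Job_Name|java||Failed|Job execution error|'
        'ORA-00904: "ENTITY_NAMED": invalid identifier')

fields = dict(zip(header.split("|"), line.split("|")))
print(fields["status"])       # -> Failed
print(fields["description"])  # -> ORA-00904: "ENTITY_NAMED": invalid identifier
```

This is exactly the header-plus-delimiter structure that Splunk's field extraction can work with.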

Now, moving one step forward, I want to integrate this with Splunk.

So, is this possible?

Thanks in advance.

Regards,
Sam


smaiti
New Member

Thanks a lot guys.

Currently I am checking the feasibility.
I will surely have a few more queries when I start implementing this, maybe in a couple of days.

Regards,
Sam


Damien_Dallimor
Ultra Champion

There are essentially two main steps to get the Talend log event data into Splunk:

1) Set up Splunk to monitor the directory where the log file gets written: http://docs.splunk.com/Documentation/Splunk/latest/Data/Monitorfilesanddirectories

2) Configure field extraction based on the header row (which you'll use as the field names) and the pipe-delimited fields (which will be the field values): http://docs.splunk.com/Documentation/Splunk/latest/Data/Extractfieldsfromfileheadersatindextime
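The two steps above could be sketched in configuration roughly like this (a sketch only; the monitored path, sourcetype name, and timestamp format are assumptions based on the sample log in the question, not known values, so adjust them to your environment):

```ini
# inputs.conf -- step 1: monitor the directory Talend writes its failure logs to
# (path and sourcetype name below are example values)
[monitor:///var/log/talend]
sourcetype = talend:joblog
disabled = false

# props.conf -- step 2: index-time extraction of pipe-delimited fields,
# using the first line of the file as the header (field names)
[talend:joblog]
INDEXED_EXTRACTIONS = psv
FIELD_DELIMITER = |
HEADER_FIELD_LINE_NUMBER = 1
TIMESTAMP_FIELDS = moment
TIME_FORMAT = %Y-%m-%d %H:%M:%S
```

`psv` (pipe-separated values) is one of Splunk's structured-data extraction modes; the linked docs cover the full set of options if your header or delimiter handling differs.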


Ayn
Legend

Yes, it is possible. What part of the integration are you unsure about?
