Knowledge Management

Combine the results of two tables into one and add a column displaying the name of the table for each row

man03359
Communicator

I have two queries that produce two tables, which I call Distributed and Mainframe, as shown below.

Distributed:

index=idx-esp source="*RUNINFO_HISTORY.*"
| rename STATUS as Status "COMP CODE" as CompCode APPL as ScheduleName NODE_GROUP as AgentName NODE_ID as HostName CPU_TIME as CPU-Time
| eval epoc_end_time=strptime(EndTime,"%Y-%m-%d %H:%M:%S")
| eval epoc_start_time=strptime(StartTime,"%Y-%m-%d %H:%M:%S")
| eval UpdatedTime=if(isnull(epoc_end_time),_indextime,epoc_end_time)
| eval DurationSecs=floor(UpdatedTime - epoc_start_time)
| eval Duration=tostring(DurationSecs,"duration")
| eval Source=if(like(source,"%RUNINFO_HISTORY%"),"Control-M","ESP")
| dedup Source ScheduleName ScheduleDate AgentName HostName JobName StartTime EndTime
| table ScheduleDate  JobName StartTime EndTime Duration
| sort 0 - Duration
| head 10

Mainframe:

index=idx-esp source="*RUNINFO_HISTORY.*" DATA_CENTER=CPUA JOB_ID="0*"
| eval epoc_end_time=strptime(EndTime,"%Y-%m-%d %H:%M:%S")
| eval epoc_start_time=strptime(StartTime,"%Y-%m-%d %H:%M:%S")
| eval UpdatedTime=if(isnull(epoc_end_time),_indextime,epoc_end_time)
| eval DurationSecs=floor(UpdatedTime - epoc_start_time)
| eval Duration=tostring(DurationSecs,"duration")
| table ScheduleDate  JOB_MEM_NAME StartTime EndTime Duration
| sort 0 - Duration
| head 10

I am trying to append both tables into one, using something like this:

index=idx-esp source="*RUNINFO_HISTORY.*"
| rename STATUS as Status "COMP CODE" as CompCode APPL as ScheduleName NODE_GROUP as AgentName NODE_ID as HostName CPU_TIME as CPU-Time
| eval epoc_end_time=strptime(EndTime,"%Y-%m-%d %H:%M:%S")
| eval epoc_start_time=strptime(StartTime,"%Y-%m-%d %H:%M:%S")
| eval UpdatedTime=if(isnull(epoc_end_time),_indextime,epoc_end_time)
| eval DurationSecs=floor(UpdatedTime - epoc_start_time)
| eval Duration=tostring(DurationSecs,"duration")
| eval Source=if(like(source,"%RUNINFO_HISTORY%"),"Control-M","ESP")
| dedup Source ScheduleName ScheduleDate AgentName HostName JobName StartTime EndTime
| table ScheduleDate  JobName StartTime EndTime Duration
| sort 0 - Duration
| head 50
| append
    [search index=idx-esp source="*RUNINFO_HISTORY.*" DATA_CENTER=CPUA JOB_ID="0*"
| eval epoc_end_time=strptime(EndTime,"%Y-%m-%d %H:%M:%S")
| eval epoc_start_time=strptime(StartTime,"%Y-%m-%d %H:%M:%S")
| eval UpdatedTime=if(isnull(epoc_end_time),_indextime,epoc_end_time)
| eval DurationSecs=floor(UpdatedTime - epoc_start_time)
| eval Duration=tostring(DurationSecs,"duration")
| table ScheduleDate  JOB_MEM_NAME StartTime EndTime Duration
| sort 0 - Duration
| head 50]

The issue is that I want to add a column named "Log_Source" at the start of the table that shows either Distributed or Mainframe for each row. I am not sure how to achieve it. Please help.

1 Solution

richgalloway
SplunkTrust

Use an eval command in each search (the main search and the appended subsearch) to set the Log_Source field.

index=idx-esp source="*RUNINFO_HISTORY.*"
| eval Log_Source = "Distributed"
| rename STATUS as Status "COMP CODE" as CompCode APPL as ScheduleName NODE_GROUP as AgentName NODE_ID as HostName CPU_TIME as CPU-Time
| eval epoc_end_time=strptime(EndTime,"%Y-%m-%d %H:%M:%S")
| eval epoc_start_time=strptime(StartTime,"%Y-%m-%d %H:%M:%S")
| eval UpdatedTime=if(isnull(epoc_end_time),_indextime,epoc_end_time)
| eval DurationSecs=floor(UpdatedTime - epoc_start_time)
| eval Duration=tostring(DurationSecs,"duration")
| eval Source=if(like(source,"%RUNINFO_HISTORY%"),"Control-M","ESP")
| dedup Source ScheduleName ScheduleDate AgentName HostName JobName StartTime EndTime
| table Log_Source ScheduleDate JobName StartTime EndTime Duration
| sort 0 - Duration
| head 50
| append
    [search index=idx-esp source="*RUNINFO_HISTORY.*" DATA_CENTER=CPUA JOB_ID="0*"
| eval Log_Source="Mainframe"
| eval epoc_end_time=strptime(EndTime,"%Y-%m-%d %H:%M:%S")
| eval epoc_start_time=strptime(StartTime,"%Y-%m-%d %H:%M:%S")
| eval UpdatedTime=if(isnull(epoc_end_time),_indextime,epoc_end_time)
| eval DurationSecs=floor(UpdatedTime - epoc_start_time)
| eval Duration=tostring(DurationSecs,"duration")
| table Log_Source ScheduleDate JOB_MEM_NAME StartTime EndTime Duration
| sort 0 - Duration
| head 50]
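
Note that the table command keeps only the fields you list, so Log_Source also has to appear in each table command or the new column is dropped before the results are appended; listing it first puts the column at the start, as requested. A minimal sketch of the pattern for the Distributed leg (field names taken from the searches above):

index=idx-esp source="*RUNINFO_HISTORY.*"
| eval Log_Source="Distributed"
| table Log_Source ScheduleDate JobName StartTime EndTime Duration

The Mainframe subsearch follows the same pattern with JOB_MEM_NAME in place of JobName.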


---
If this reply helps you, Karma would be appreciated.


man03359
Communicator

@richgalloway 

It worked 🙂 thanks a lot 🙂
