Splunk Search

How do I force today's date column to display when no data has been generated for today's date?

adnankhan5133
Communicator

I currently have an SPL query (included further below) that generates a table which appears as follows:

| Service ID | Resource Name | Transaction Name | Priority | Service Area | Consumer | 2020-08-10 | 2020-08-09 | 2020-08-08 |
|---|---|---|---|---|---|---|---|---|
| 100 | GET | Transaction A | 2 | ServiceArea2 | App1 | 25 | 37 | 31 |
| 200 | PUT | Transaction B | 1 | ServiceArea2 | App3 | 13 | 22 | 14 |

 

The date columns represent today's date plus the previous six days. I didn't include the columns for 8/4-8/7 since they would take up far too much space. The counts in the date columns are the per-day totals for each "Consumer" value, which captures the different application_name fields. Currently, today's date column is only generated once there is at least one entry for the application_name field in the logs.

I would like to modify the query logic so that today's date column is generated even when there is a zero count of transactions associated with the application_name field. That way, today's column would appear at 12 AM PST (midnight). Since there would likely be no transactions at that time, I would like Splunk to return the value "NULL" rather than a blank value or "0".

Based on the existing logic below, can someone provide guidance on how the query should be updated so that the 'today' column (i.e., 2020-08-10 in the example above) is generated at midnight, rather than only once at least one application_name (Consumer) transaction has been logged?

So, for example, at midnight tomorrow (2020-08-11), I would expect to see the following, assuming no application transactions have occurred yet:

 

| Service ID | Resource Name | Transaction Name | Priority | Service Area | Consumer | 2020-08-11 | 2020-08-10 | 2020-08-09 |
|---|---|---|---|---|---|---|---|---|
| 100 | GET | Transaction A | 2 | ServiceArea2 | App1 | NULL | 25 | 37 |
| 200 | PUT | Transaction B | 1 | ServiceArea2 | App3 | NULL | 13 | 22 |
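A side note on the "NULL" rendering: a blanket `| fillnull value="0"` fills every empty cell with 0. One untested alternative is to target just the date columns with `foreach`, which assumes the `*-*-*` wildcard matches only the date-named columns:

```spl
| foreach *-*-* [ eval "<<FIELD>>" = coalesce('<<FIELD>>', "NULL") ]
```

Here `<<FIELD>>` is `foreach`'s template token for each matched column name, and `coalesce` leaves existing counts untouched while substituting the literal string "NULL" for empty cells.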

 

Current SPL:

index=new_idx_1 sourcetype=file_level_data
| eval epoch_Timestamp=strptime(Timestamp, "%Y-%m-%dT%H:%M:%S.%3QZ")-14400
| rename "Transaction Name" as trans_name, "Application Name" as application_name, "File Status" as file_status
| eval service_id=case(Verb="GET" AND trans_name="Transaction A" AND application_name="App1", "ServiceID1", Verb="GET" AND trans_name="Transaction B" AND application_name="App3", "ServiceID2", Verb="PUT" AND trans_name="Transaction B" AND application_name="App3", "ServiceID3", 1=1, "Unqualified")
| where service_id!="Unqualified"
| eval Priority=case(Verb="GET" AND trans_name="Transaction A" AND application_name="App1", "2", Verb="GET" AND trans_name="Transaction B" AND application_name="App3", "2", Verb="PUT" AND trans_name="Transaction B" AND application_name="App3", "1", 1=1, "Unqualified")
| where Priority!="Unqualified"
| eval service_area=case(Verb="GET" AND trans_name="Transaction A" AND application_name="App1", "ServiceArea1", Verb="GET" AND trans_name="Transaction B" AND application_name="App3", "ServiceArea2", Verb="PUT" AND trans_name="Transaction B" AND application_name="App3", "ServiceArea2", 1=1, "Unqualified")
| where service_area!="Unqualified"
| eval date_reference=strftime(epoch_Timestamp, "%Y-%m-%d")
| stats count(eval(file_status)) as count by service_id, Verb, trans_name, Priority, service_area, application_name, date_reference
| eval combined=service_id."@".Verb."@".trans_name."@".Priority."@".service_area."@".application_name."@"
| xyseries combined date_reference count
| rex field=combined "^(?<service_id>[^\@]+)\@(?<Verb>[^\@]+)\@(?<trans_name>[^\@]+)\@(?<Priority>[^\@]+)\@(?<service_area>[^\@]+)\@(?<application_name>[^\@]+)\@$"
| fillnull value="0"
| table service_id, Verb, trans_name, Priority, service_area, application_name, *-*-31, *-*-30, *-*-29, *-*-28, *-*-27, *-*-26, *-*-25, *-*-24, *-*-23, *-*-22, *-*-21, *-*-20, *-*-19, *-*-18, *-*-17, *-*-16, *-*-15, *-*-14, *-*-13, *-*-12, *-*-11, *-*-10, *-*-09, *-*-08, *-*-07, *-*-06, *-*-05, *-*-04, *-*-03, *-*-02, *-*-01
| rename service_id as "Service ID", Verb as "Resource Name", trans_name as "Transaction Name", Priority as "Priority", service_area as "Service Area", application_name as "Consumer"
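One possible approach (a sketch only, not tested against this data): `xyseries` only creates a column for a `date_reference` value that appears in at least one row, so today's column can be forced by appending a single dummy row that carries today's date just before `xyseries`, then filtering it back out afterwards. The `__placeholder__` value below is an arbitrary marker chosen for illustration:

```spl
| eval combined=service_id."@".Verb."@".trans_name."@".Priority."@".service_area."@".application_name."@"
| append
    [| makeresults
     | eval combined="__placeholder__",
            date_reference=strftime(now(), "%Y-%m-%d"),
            count=0]
| xyseries combined date_reference count
| where combined!="__placeholder__"
```

Because the dummy row is dropped only after `xyseries` has run, today's column survives with null values in every real row, and those cells can then be filled with whatever string is wanted instead of 0.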


adnankhan5133
Communicator

Any ideas, anyone?
