I have a script that extracts table data from a database and loads the data to other tables in another database. Everything is written to a single log file, with all values reported for each table in the log file (e.g. table name, return code, row count, etc). I'm trying to write the results for each table into a Splunk table, but I'm finding that fields with a single value (e.g. ReturnCode) only print on the first row.
How can I modify this to print the value for every table?
Use the filldown command.
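Here's a quick illustration of what it does, using makeresults to fake a few rows (the field name is just an example):
| makeresults count=3
| streamstats count as row
| eval ReturnCode = if(row=1, 0, null())
| filldown ReturnCode
| table row ReturnCode
The ReturnCode from the first row is copied down into the later rows where it is null.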
@richgalloway Thank you for the response, but filldown didn't solve my problem.
@richgalloway Here's the code I'm running:
index=miscapps sourcetype="app:z-ingest"
| transaction source
| where ProcessType = "UseCaseIngestion" AND isnotnull(TableName)
| eval CurrDate = now()
| eval RunDate = strptime(TimeStamp, "%m/%d/%y")
| eval DaysSinceLoad = round((CurrDate - RunDate)/86400)
| fillnull value=0 DaysSinceLoad, ExtractRowCount, LandingRowCount, ProdRowCount, ReturnCode
| convert dur2sec(Duration) as Duration2
| eval Hour = floor(Duration2/3600)
| eval Duration2 = strftime(Duration2, "%M:%S")
| eval result = (Hour.":".Duration2)
| eval Status=if((ReturnCode = 0) AND (DaysSinceLoad < 8), "Successful", "Error")
| eval ErrMsg= case(ReturnCode !=0, ErrorMessage, DaysSinceLoad > 7, "Stale Data Ingestion", 1=1, "None")
| rename Duration2 as Duration
| dedup UseCase, DatabaseName, TableName
| filldown UseCase, DatabaseName, ReturnCode, DaysSinceLoad
| table UseCase, DatabaseName, TableName, Status, ReturnCode, SourceRowCount, ExtrRowCount, DaysSinceLoad, Duration
The output leaves the UseCase, DatabaseName, Status, ReturnCode, and DaysSinceLoad fields blank on the second row. Everything for this job is in a single log file, and every field is repeated in the log each time a new table is extracted.
Other than putting filldown before dedup, I don't know what to suggest.
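That is, the end of the search would look something like this (same field names as your query):
| filldown UseCase, DatabaseName, ReturnCode, DaysSinceLoad
| dedup UseCase, DatabaseName, TableName
| table UseCase, DatabaseName, TableName, Status, ReturnCode, SourceRowCount, ExtrRowCount, DaysSinceLoad, Duration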
@richgalloway I changed the order of the statements, but that didn't have any effect on my output. Thanks for your input though.
index=miscapps sourcetype="app:z-ingest"
| transaction source
| where ProcessType = "UseCaseIngestion" AND isnotnull(TableName)
| eval CurrDate = now()
| eval RunDate = strptime(TimeStamp, "%m/%d/%y")
| eval DaysSinceLoad = round((CurrDate - RunDate)/86400)
| fillnull value=0 DaysSinceLoad, ExtractRowCount, LandingRowCount, ProdRowCount, ReturnCode
| convert dur2sec(Duration) as Duration2
| eval Hour = floor(Duration2/3600)
| eval Duration2 = strftime(Duration2, "%M:%S")
| eval result = (Hour.":".Duration2)
| eval Status=if((ReturnCode = 0) AND (DaysSinceLoad < 8), "Successful", "Error")
| eval ErrMsg= case(ReturnCode !=0, ErrorMessage, DaysSinceLoad > 7, "Stale Data Ingestion", 1=1, "None")
| rename Duration2 as Duration
| table UseCase, DatabaseName, TableName, Status, ReturnCode, SourceRowCount, ExtrRowCount, DaysSinceLoad, Duration
| dedup UseCase, DatabaseName, TableName
| filldown UseCase, DatabaseName, ReturnCode, DaysSinceLoad
It's better to run the table command first, before filldown.
@to4kawa Thanks for the response, but unfortunately that didn't make a difference in the output.
Is it possible to "force" a new transaction in a log file? I have a similar Splunk report that works well, the only difference being that the log for each table is contained in a separate log file.
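I was wondering whether the startswith and endswith options of the transaction command could force a split within one file, something along these lines (the marker strings below are placeholders, not the actual text in my log):
index=miscapps sourcetype="app:z-ingest"
| transaction source startswith="Begin table extract" endswith="Table extract complete"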
I can't picture it without seeing the data. Please post a sample table of your events.