You could do a | fillnull before your stats so that the null values actually have a value; then, when stats runs, it can populate them correctly.
.... | fillnull value="NULL" X | stats latest(X) by Y
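(Note that fillnull takes the value option before the field list; listing X explicitly means only that field gets filled rather than every field in the event. A hedged alternative that does the same thing for a single field, using eval and coalesce, with X and Y being the field names from this thread:
.... | eval X=coalesce(X, "NULL") | stats latest(X) by Y
coalesce returns X when it is non-null and "NULL" otherwise.)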
Yes, as @somesoni2 pointed out. If you're running your search against time, then you will have null results. Are you trying to exclude the null values from your results?
I'm actually trying to include the null values in my results, not exclude them.
The latest function is based on _time, so yes, it is possible for the latest value to be null. Run the query in Verbose mode and check whether the latest event matching your criteria really does have a null value for that field.
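If you want to confirm which event latest() is picking, a minimal sketch that also surfaces its timestamp (X and Y are the fields from this thread; convert ctime just renders the epoch time readably):
.... | stats latest(_time) AS latest_time latest(X) AS latest_X by Y | convert ctime(latest_time)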
Thanks, MattZerfas. Your answer is working for me.
I do have a large set of events before | stats, though. Does anyone know whether there is any cost or performance concern when there are thousands or millions of events whose null values need to be converted?
Large can be a relative term in Splunk. There could definitely be a performance issue if you're processing more than 10 million events. If you see an impact on performance, you may want to consider optimizing your query or setting up a summary index (this is only needed on rare occasions when the data is massive).
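For example, a minimal sketch of the summary-index route, assuming you save this as a scheduled report with summary indexing enabled (the base search and field names are just the ones from this thread; sistats writes pre-summarized results for a later stats to consume):
.... | fillnull value="NULL" X | sistats latest(X) by Y
Your on-demand search then runs against the summary index with something like index=my_summary | stats latest(X) by Y, which is far cheaper than re-scanning millions of raw events (my_summary is a hypothetical index name).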