I have log data for a web service call. We log the response status (success OR failure) of the web service call as well as the payload that is sent as the request. This information (status and payload) is logged on different lines. The payload is huge XML data.
We would like to build a Splunk report based on the status, and when the user clicks on the status we would like to display the payload. This will help the user diagnose the issue in case of a failure response.
We built a "multi-line" event so that we can have the "status" and the "payload" in the same event. This helps us in building the queries. But the challenge we face is that, since the payload is huge, the extracted fields that we created on these events are not showing up. We observe that fields located in the lower part of the large event are not captured by Splunk, while a field extracted from the top part of the event does show up. Hence Splunk is NOT capturing fields in an event after a certain limit.
How can we resolve this issue?
Thanks for the answer, to4lawa. I'll try this out.
Meanwhile, I found a solution using regular-expression field extraction. The field extraction UI didn't fetch the field from the large event, but the "rex" command is able to fetch it.
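As an illustration of that approach (the index, sourcetype, field names, and patterns below are hypothetical, not taken from the original logs), a rex-based search might look like:

```
index=web_service sourcetype=ws_logs
| rex field=_raw "status=(?<status>SUCCESS|FAILURE)"
| rex field=_raw "(?s)(?<payload><Envelope>.*?</Envelope>)"
| table _time status payload
```

Note the `(?s)` flag, which lets `.` match newlines so the pattern can span a multi-line XML payload; unlike automatic search-time extraction, rex runs your pattern against the full _raw event.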
| makeresults count=20000 | streamstats count | stats values(count) as big_data by _time
This creates 20,000 lines of data, and there is no particular problem with it.
time1 status time1 payload time2 status time2 payload ....
In such a log, I think it is better to use the stats command instead of creating multi-line events.
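A sketch of the stats approach (the index, sourcetype, extraction patterns, and the requestId correlation key are all hypothetical; you would need some common key, such as a request ID, present on both the status line and the payload line):

```
index=web_service sourcetype=ws_logs
| rex field=_raw "status=(?<status>\w+)"
| rex field=_raw "(?s)payload=(?<payload>.+)"
| stats values(status) as status, values(payload) as payload by requestId
```

This keeps each log line as its own small event, so no single event exceeds the extraction limit, and stats reassembles the status/payload pair at search time.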