Our app runs inside a Docker container. We can access it only through standard web interfaces and APIs and have no access to the underlying operating system. So, through an API, we retrieve the logs and store them on a remote server. There we unzip them, place them in known paths, and the Splunk UF on that device forwards them to Splunk.
We retrieve the logs every hour, and each retrieval overwrites the previous files. To the Splunk UF they therefore look like brand-new logs, but in reality each one is the same file as before, just with another hour of data appended.
Could you please advise on how to deal with this seemingly duplicate log data? Is there a way to handle it in an SPL search pipeline? Or should we adjust our log collection process before the Splunk UF sends the data to the Splunk Cloud Platform?
Thank you.