Splunk Enterprise

OpenTelemetry Collector

pranay03
Observer

Hi, I have been trying to deploy the OpenTelemetry Collector in my AWS EKS cluster to send logs to Splunk Enterprise. I deployed it using the SignalFx Splunk OpenTelemetry Collector Helm chart: https://github.com/signalfx/splunk-otel-collector

But there seems to be an issue: the collector is not sending log entries with timestamps older than the start time of the OTel pods. It only starts sending once new entries are written to the log files; the older entries in those files are never ingested. This looks like a configuration issue, and I would appreciate help with it.


mattymo
Splunk Employee

Hi! Can you clarify what the actual source of these logs is?

Are these logs in the pod's stdout/stderr logs from /var/log/pods, or are they in some other location? What is the path to them?

*nix nodes or Windows nodes?

As mentioned already, OTel does fingerprint logs, and the filelog settings expect to be tailing live logs. If this is a somewhat different use case, we can add a custom receiver in the "extraFilelogs" section of the Helm chart.
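For reference, a custom receiver of the kind described above might look like the following in the chart's values.yaml. This is a sketch: the exact key (e.g. logsCollection.extraFileLogs in recent chart versions) should be verified against the chart, and the path and sourcetype below are hypothetical placeholders.

```yaml
logsCollection:
  extraFileLogs:
    filelog/app-history:
      # Hypothetical path to a log file that is not actively being tailed
      include: [/var/log/myapp/history.log]
      start_at: beginning          # read the file from the start, not only new lines
      include_file_path: true
      include_file_name: false
      resource:
        com.splunk.source: /var/log/myapp/history.log
        com.splunk.sourcetype: myapp:history
```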

- MattyMo

scelikok
SplunkTrust

Hi @pranay03,

Since your OTel Collector was installed without the start_at=beginning parameter, you should uninstall OTel, delete the checkpoint files on your nodes under the "/var/addon/splunk/otel_pos/" folder, and install OTel again. This should make OTel re-read all files.
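As a sketch of the setting being discussed: the chart lets you merge overrides into the generated agent configuration through values.yaml, so start_at can be set without editing the chart templates. This assumes the chart's agent.config merge behavior; verify the key path against your chart version.

```yaml
# values.yaml override, deep-merged into the agent's generated config
agent:
  config:
    receivers:
      filelog:
        start_at: beginning   # read pre-existing files from the start on first run
```

Note that even with this set, previously saved checkpoint offsets take precedence, which is why deleting the checkpoint files is part of the procedure above.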

If this reply helps you an upvote and "Accept as Solution" is appreciated.

pranay03
Observer

Hi @scelikok 

Hi, I have done everything you described in this thread, but it is still not sending the old logs. Could this be an issue with time or the cluster, or is there any other configuration we have to do in the Helm chart?

Your help will be appreciated
Thanks


pranay03
Observer

Hi @scelikok 

Below is my configuration of the filelog receiver inside the Helm chart:

{{- if and (eq (include "splunk-otel-collector.logsEnabled" .) "true") (eq .Values.logsEngine "otel") }}
{{- if .Values.logsCollection.containers.enabled }}
    filelog:
      {{- if .Values.isWindows }}
      include: ["C:\\var\\log\\pods\\*\\*\\*.log"]
      {{- else }}
      include: ["/var/log/pods/*/*/*.log"]
      {{- end }}
      # Exclude logs. The file format is
      # /var/log/pods/<namespace_name>_<pod_name>_<pod_uid>/<container_name>/<restart_count>.log
      exclude:
        {{- if .Values.logsCollection.containers.excludeAgentLogs }}
        {{- if .Values.isWindows }}
        - "C:\\var\\log\\pods\\{{ .Release.Namespace }}_{{ include "splunk-otel-collector.fullname" . }}*_*\\otel-collector\\*.log"
        {{- else }}
        - /var/log/pods/{{ .Release.Namespace }}_{{ include "splunk-otel-collector.fullname" . }}*_*/otel-collector/*.log
        {{- end }}
        {{- end }}
        {{- range $_, $excludePath := .Values.logsCollection.containers.excludePaths }}
        - {{ $excludePath }}
        {{- end }}
      start_at: beginning
      include_file_path: true
      include_file_name: false
      poll_interval: 200ms
      # Disable force flush until this issue is fixed:
      # https://github.com/open-telemetry/opentelemetry-log-collection/issues/292
      retry_on_failure:
        enabled: true
{{- end }}

Also, when I checked the ConfigMap, start_at is mapped to beginning. Still, the older logs are not showing up in Splunk.


scelikok
SplunkTrust

Hi @pranay03,

By default, the filelog receiver doesn't read logs from a file that is not actively being written to, because start_at defaults to end. This setting is ignored if previously read file offsets are retrieved from a persistence mechanism. So this behavior is not a problem during normal running; it is only on the first installation that OTel starts reading new files only.

If you want to read old files too, you can configure the start_at parameter as beginning.

Here is the related document;

https://docs.splunk.com/observability/en/gdi/opentelemetry/components/filelog-receiver.html#settings 
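To illustrate the persistence mechanism referred to above: the filelog receiver can store its read offsets through a storage extension, roughly as in the following minimal standalone collector config sketch. The checkpoint directory shown matches the path mentioned earlier in this thread; the debug exporter is just a placeholder for your real exporter.

```yaml
extensions:
  file_storage:
    directory: /var/addon/splunk/otel_pos   # checkpoint (offset) files live here

receivers:
  filelog:
    include: [/var/log/pods/*/*/*.log]
    start_at: beginning        # only honored when no saved offset exists for a file
    storage: file_storage      # persist offsets across collector restarts

service:
  extensions: [file_storage]
  pipelines:
    logs:
      receivers: [filelog]
      exporters: [debug]
```

Because the saved offsets survive pod restarts, changing start_at alone has no effect on files the collector has already seen; the stored checkpoints must be removed for the new setting to apply.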

 

If this reply helps you an upvote and "Accept as Solution" is appreciated.

Kamesh_Munusamy
Observer

@pranay03 Were you able to resolve this issue?

I am facing the same issue as well. The console log shows that it started watching the files, but I cannot see the logs in the web portal.

Please let me know your thoughts and suggestions.
