I have set up Elasticsearch and Kibana running outside the Kubernetes cluster, and I use Fluentd to collect Kubernetes logs and ship the data to Elasticsearch. Fluentd runs inside Kubernetes as a DaemonSet.
apiVersion: apps/v1
kind: DaemonSet
metadata:
  namespace: efk
  name: fluentd
  labels:
    app: fluentd
spec:
  selector:
    matchLabels:
      app: fluentd
  template:
    metadata:
      labels:
        app: fluentd
    spec:
      serviceAccount: fluentd
      serviceAccountName: fluentd
      containers:
      - name: fluentd
        image: fluent/fluentd-kubernetes-daemonset:v1.4.2-debian-elasticsearch-1.1
        env:
        - name: FLUENT_ELASTICSEARCH_HOST
          value: "HOST_IP"
        - name: FLUENT_ELASTICSEARCH_PORT
          value: "9200"
        - name: FLUENT_ELASTICSEARCH_SCHEME
          value: "http"
        - name: FLUENTD_SYSTEMD_CONF
          value: disable
        - name: FLUENT_CONTAINER_TAIL_PARSER_TYPE
          value: /^(?<time>.+) (?<stream>stdout|stderr) [^ ]* (?<log>.*)$/
        - name: FLUENT_CONTAINER_TAIL_EXCLUDE_PATH
          value: /var/log/containers/fluent*
        resources:
          limits:
            memory: 512Mi
          requests:
            cpu: 100m
            memory: 200Mi
        volumeMounts:
        - name: varlog
          mountPath: /var/log
        - name: varlibdockercontainers
          mountPath: /var/lib/docker/containers
          readOnly: true
      terminationGracePeriodSeconds: 30
      volumes:
      - name: varlog
        hostPath:
          path: /var/log
      - name: varlibdockercontainers
        hostPath:
          path: /var/lib/docker/containers
With this manifest, however, I only collect node-level Kubernetes logs. I also want to collect my application pods' logs from a specific path, e.g. /tmp/logs/*, which is where my application writes its logs.
To do this, is it enough to run Fluentd as a DaemonSet, or do we need to run Fluentd as a sidecar in every application pod?
Answer 1
You can customize the DaemonSet so that Fluentd also tails specific log paths on the host nodes. This involves extending the Fluentd configuration with the path patterns you want to monitor, and adding hostPath volume mounts for those paths in the DaemonSet YAML. Keep in mind that a DaemonSet can only read files that actually exist on the node's filesystem, so this approach covers /tmp/logs only if your application pods write their logs to a node-local (hostPath) volume; see the sketches after the snippet below.
containers:
- name: fluentd
  image: fluent/fluentd-kubernetes-daemonset:v1.4.2-debian-elasticsearch-1.1
  env:
    # FluentD configuration for log collection (unchanged from above)
  volumeMounts:
  - name: varlog
    mountPath: /var/log
  - name: varlibdockercontainers
    mountPath: /var/lib/docker/containers
  - name: tmplogs
    mountPath: /tmp/logs           # mount path for your application logs
    readOnly: true
volumes:
- name: varlog
  hostPath:
    path: /var/log
- name: varlibdockercontainers
  hostPath:
    path: /var/lib/docker/containers
- name: tmplogs
  hostPath:
    path: /path/to/your/application/logs
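For the Fluentd side, here is a minimal sketch of the extra tail configuration. It assumes the fluent/fluentd-kubernetes-daemonset image, whose default fluent.conf (at least in recent versions) includes any *.conf files under /fluentd/etc/conf.d, and assumes unstructured plain-text log lines. The ConfigMap name fluentd-extra-conf, the tag app.logs, and the pos_file location are illustrative choices, not required names:

apiVersion: v1
kind: ConfigMap
metadata:
  name: fluentd-extra-conf         # illustrative name
  namespace: efk
data:
  app-logs.conf: |
    # Tail application log files mounted into the Fluentd pod at /tmp/logs
    <source>
      @type tail
      path /tmp/logs/*
      pos_file /var/log/fluentd-app-logs.pos
      tag app.logs                 # illustrative tag
      read_from_head true
      <parse>
        @type none                 # assumes unstructured lines; use json/regexp for structured logs
      </parse>
    </source>

Then mount the ConfigMap into the Fluentd container, in addition to the mounts shown above:

  volumeMounts:
  - name: fluentd-extra-conf
    mountPath: /fluentd/etc/conf.d
volumes:
- name: fluentd-extra-conf
  configMap:
    name: fluentd-extra-conf

Records tagged app.logs should then be picked up by the image's default catch-all Elasticsearch output.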
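To make those files visible to the node-level DaemonSet without a sidecar, the application pod can write /tmp/logs to a hostPath volume, so the files land on the node where Fluentd can tail them. A minimal sketch, with a hypothetical app image and an illustrative node path /var/log/app-logs (whatever path you choose here must match the tmplogs hostPath in the Fluentd DaemonSet above):

apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-app                     # hypothetical application
spec:
  replicas: 1
  selector:
    matchLabels:
      app: my-app
  template:
    metadata:
      labels:
        app: my-app
    spec:
      containers:
      - name: my-app
        image: my-app:latest       # hypothetical image; writes its logs to /tmp/logs
        volumeMounts:
        - name: app-logs
          mountPath: /tmp/logs     # the app keeps writing to its usual path
      volumes:
      - name: app-logs
        hostPath:
          path: /var/log/app-logs  # illustrative node path, matching the DaemonSet's tmplogs hostPath
          type: DirectoryOrCreate

The trade-off: hostPath ties log files to individual nodes and needs care with permissions and rotation. If hostPath volumes are not acceptable in your environment, the usual alternative is indeed a logging sidecar in each application pod that tails /tmp/logs itself.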