How to get multiple logs from Filebeat?

I have a server running multiple services (nginx, mongodb, etc.). I want to collect the following logs from it: /var/log/nginx/access.log, /var/log/tomcat/catalina.out/, /var/log/audit/audit.log, and so on. My Filebeat configuration is as follows.

filebeat:
  prospectors:
    -
      paths:
        - /var/log/auth.log
        - /var/log/syslog
      document_type: syslog
      input_type: log

  prospectors:
    -
      paths:
        - /var/log/nginx/access.log
      document_type: nginx-access
      input_type: log

output:

  ### Logstash as output
  logstash:
    # The Logstash hosts
    hosts: ["logstashserver.pr:5044"]
    # default is 2048.

The Logstash configuration is:

filter {
  if [type] == "nginx-access" {
    grok {
       match => [ "message" , "%{COMBINEDAPACHELOG}+%{GREEDYDATA:extra_fields}"]
       overwrite => [ "message" ]
       add_field => [ "received_at", "%{@timestamp}" ]
       add_field => [ "received_from", "%{host}" ]
    }

  }
}
---------
filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}

But only the last one, nginx, shows up in Elasticsearch. I don't know how to ship and index both logs into Elasticsearch/Kibana.

Answer 1

You are using a different syntax on the grok `match` line. Change it to the hash form, { "message" => "..." }, like in the second filter.
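For reference, the nginx filter rewritten in the hash form would look like this (a sketch based on the filter shown in the question):

```conf
filter {
  if [type] == "nginx-access" {
    grok {
      # hash form of match, consistent with the syslog filter
      match => { "message" => "%{COMBINEDAPACHELOG}+%{GREEDYDATA:extra_fields}" }
      overwrite => [ "message" ]
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
  }
}
```

Note also that the Filebeat configuration in the question defines the `prospectors` key twice. YAML parsers typically keep only the last duplicate key, which by itself would explain why only the nginx log reaches Elasticsearch. All paths should go under a single `prospectors` list, something like:

```yaml
filebeat:
  prospectors:
    # one list, with one entry per log type
    -
      paths:
        - /var/log/auth.log
        - /var/log/syslog
      document_type: syslog
      input_type: log
    -
      paths:
        - /var/log/nginx/access.log
      document_type: nginx-access
      input_type: log
```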
