Using Logstash as a shipper?

We are shipping logs from our servers, running logstash on each server to do the shipping.

So we read the logs from the glob "/root/Desktop/Logstash-Input/**/*_log":

    input {
        file {
            path => "/root/Desktop/Logstash-Input/**/*_log"
            start_position => "beginning"
        }
    }

From the `path` of each file matched by this glob, we want to extract fields and add them to the event: for example, `server`, `logtype`, and so on from the directory path. We do this with grok:

    filter {
        grok {
            match => ["path", "/root/Desktop/Logstash-Input/(?<server>[^/]+)/(?<logtype>[^/]+)/(?<logdate>[\d]+.[\d]+.[\d]+)/(?<logfilename>.*)_log"]
        }
    }
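The grok pattern above is just a regular expression with named captures. As an illustration (not part of the original config), a small Python sketch of the equivalent regex shows which fields it yields for the sample path from the console output below:

```python
import re

# Python equivalent of the grok pattern: (?<name>...) becomes (?P<name>...)
PATTERN = re.compile(
    r"/root/Desktop/Logstash-Input/(?P<server>[^/]+)/(?P<logtype>[^/]+)/"
    r"(?P<logdate>[\d]+.[\d]+.[\d]+)/(?P<logfilename>.*)_log"
)

path = "/root/Desktop/Logstash-Input/Server2/CronLog/2014.05.31/cron_log"
m = PATTERN.match(path)
print(m.groupdict())
# {'server': 'Server2', 'logtype': 'CronLog',
#  'logdate': '2014.05.31', 'logfilename': 'cron'}
```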

Then we ship these logs to a central logstash server with the lumberjack output plugin:

    output {
        lumberjack {
            hosts => ["xx.xx.xx.xx"]
            port => 4545
            ssl_certificate => "./logstash.pub"
        }
        stdout { codec => rubydebug }
    }

The problem is that the logs arriving at the central server lose the fields added by grok: `server`, `logtype`, and the rest do not exist there. The client machine's console shows the added fields, but on the central logstash server only `message`, `@timestamp`, and `@version` are present.

Console on the client (where the logs are shipped from):

output received {:event=>{"message"=>"2014-05-26T00:00:01+05:30 host crond[268]: (root) CMD (2014/05/31/server2/cron/log)", "@version"=>"1", "@timestamp"=>"2014-07-16T06:07:21.927Z", "host"=>"host", "path"=>"/root/Desktop/Logstash-Input/Server2/CronLog/2014.05.31/cron_log", "server"=>"Server2", "logtype"=>"CronLog", "logdate"=>"2014.05.31", "logfilename"=>"cron"}, :level=>:debug, :file=>"(eval)", :line=>"37"}
    {
              "message" => "2014-05-26T00:00:01+05:30 bx920as1 crond[268]: (root) CMD (2014/05/31/server2/cron/log)",
             "@version" => "1",
           "@timestamp" => "2014-07-16T06:07:21.927Z",
                 "host" => "host",
                 "path" => "/root/Desktop/Logstash-Input/Server2/CronLog/2014.05.31/cron_log",
               "server" => "Server2",
              "logtype" => "CronLog",
              "logdate" => "2014.05.31",
          "logfilename" => "cron"
    }

Console on the central server (where the logs are shipped to):

{
       "message" => "2014-07-16T05:33:17.073+0000 host 2014-05-26T00:00:01+05:30 bx920as1 crond[288]: (root) CMD (2014/05/31/server2/cron/log)",
      "@version" => "1",
    "@timestamp" => "2014-07-16T05:34:02.370Z"
}

So the grokked fields are dropped in transit. Why does this happen?

How can I preserve these fields?

Answer 1

Solved:

I solved this by adding `codec => "json"` to both the lumberjack output and the lumberjack input.

Output:

    output {
        lumberjack {
            hosts => ["xx.xx.xx.xx"]
            port => 4545
            ssl_certificate => "./logstash.pub"
            codec => "json"
        }
    }

Input:

    input {
        lumberjack {
            port => 4545
            ssl_certificate => "/etc/ssl/logstash.pub"
            ssl_key => "/etc/ssl/logstash.key"
            codec => "json"
        }
    }
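Why this works: the symptom above (the central server's `message` starts with a timestamp and host prepended to the original line) suggests that without an explicit codec the output was serializing each event as a single formatted message string, so everything except `message` was lost on the wire. With `codec => "json"` the whole event, grokked fields included, is serialized and then restored on the receiving side. A rough Python sketch of that difference (illustrative only, using field values from the question's debug output):

```python
import json

# A stripped-down event like the one the client console printed.
event = {
    "message": "2014-05-26T00:00:01+05:30 host crond[268]: (root) CMD (...)",
    "@version": "1",
    "server": "Server2",
    "logtype": "CronLog",
}

# Plain, message-only serialization: the grokked fields never
# make it onto the wire, only a formatted line does.
plain_payload = event["message"]

# JSON serialization: the whole event crosses the wire, so the
# receiver can restore every field, including server and logtype.
json_payload = json.dumps(event)
restored = json.loads(json_payload)
print(sorted(restored))
# ['@version', 'logtype', 'message', 'server']
```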
