I have been trying to pass logs that are already formatted as JSON from a Windows application through NXLog to Logstash.
When I have NXLog send the file to Logstash, I get a large number of errors in logstash.log:
:message=>"An error occurred. Closing connection",
:client=>"10.xxx.xxx.147:61047",
:exception=>#<IndexError: string not matched>
Full text of the error:
{:timestamp=>"2015-04-25T15:15:37.084000-0900", :message=>"An error occurred. Closing connection", :client=>"10.xxx.xxx.147:61047", :exception=>#<IndexError: string not matched>, :backtrace=>["org/jruby/RubyString.java:3910:in `[]='", "/opt/logstash/lib/logstash/event.rb:62:in `initialize'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-json_lines-0.1.6/lib/logstash/codecs/json_lines.rb:37:in `decode'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-line-0.1.5/lib/logstash/codecs/line.rb:36:in `decode'", "org/jruby/RubyArray.java:1613:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-line-0.1.5/lib/logstash/codecs/line.rb:35:in `decode'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-json_lines-0.1.6/lib/logstash/codecs/json_lines.rb:35:in `decode'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-tcp-0.1.3/lib/logstash/inputs/tcp.rb:116:in `handle_socket'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-tcp-0.1.3/lib/logstash/inputs/tcp.rb:145:in `client_thread'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-tcp-0.1.3/lib/logstash/inputs/tcp.rb:143:in `client_thread'"], :level=>:error}
{:timestamp=>"2015-04-25T15:15:38.097000-0900", :message=>"JSON parse failure. Falling back to plain-text", :error=>#<LogStash::Json::ParserError: Unexpected end-of-input: expected close marker for OBJECT (from [Source: [B@26f64966; line: 1, column: 2])
at [Source: [B@26f64966; line: 2, column: 5]>, :data=>" {\r\n", :level=>:info}
Here is my NXLog configuration:
## Please set the ROOT to the folder your nxlog was installed into,
## otherwise it will not start.
define ROOT C:\Program Files (x86)\nxlog
Moduledir %ROOT%\modules
CacheDir %ROOT%\data
Pidfile %ROOT%\data\nxlog.pid
SpoolDir %ROOT%\data
LogFile %ROOT%\data\nxlog.log
<Extension json>
Module xm_json
</Extension>
# Nxlog internal logs
<Input internal>
Module im_internal
Exec $EventReceivedTime = integer($EventReceivedTime) / 1000000; to_json();
</Input>
# Windows Event Log
<Input eventlog>
Module im_msvistalog
Exec $EventReceivedTime = integer($EventReceivedTime) / 1000000; to_json();
</Input>
#Server Logs
<Input Selected_Directory>
Module im_file
File 'E:\\ELK\\logs\\*.json'
SavePos False
</Input>
#EventLog Out
<Output out>
Module om_tcp
Host 10.xxx.xxx.127
Port 3515
</Output>
#<output perf_out>
# Module om_tcp
# Host 10.xxx.xxx.127
# Port 3517
#</Output>
#JSON Out
<Output out2>
Module om_tcp
Host 10.xxx.xxx.127
Port 3516
</Output>
<Route 1>
Path internal, eventlog => out
</Route>
<Route 2>
Path Selected_Directory => out2
</Route>
Logstash configuration:
input {
tcp {
type => "eventlog"
port => 3515
codec => json_lines
}
tcp {
type => "log"
port => 3516
codec => json
}
}
output {
elasticsearch {
cluster => "MyElkCluster"
host => "127.0.0.1"
}
}
Example of the JSON file format produced by the application:
[
{
"timestamp":"19:54:01.117_0005",
"type":"N",
"calllevel":0,
"thread":772,
"topic":"ExmpleTopic",
"level":61,
"file":"//blah/blah/blah.cpp",
"function":"functiontext",
"line":312,
"message":"Example Message Text",
"attributes":
{
"ThreadName":"1234"
}
},
{
"timestamp":"20:07:54.038_0691",
"type":"N",
"calllevel":0,
"thread":2324,
"topic":"ExampleTopic",
"level":61,
"file":"//blah/blah/blah.cpp",
"function":"ExampleFunction",
"line":2962,
"message":"Example Message Text",
"attributes":
{
"ThreadName":"1234"
}
}
]
Beyond the obvious "help me track down this error", I have two other questions:
- On the Logstash input side, what is the difference between json_lines and json? My understanding is that json_lines is meant for streamed text, while json implies I will be sending the entire file at once (see the sketch just after these questions).
- Do I need to add "Exec to_json();" to the NXLog input named "Selected_Directory"?
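For reference, here is a minimal sketch of what I understand the json_lines codec to expect on the wire: one complete JSON object per line, each terminated by a newline (the values are copied from my sample file above):

{"timestamp":"19:54:01.117_0005","type":"N","calllevel":0,"thread":772,"topic":"ExmpleTopic","level":61,"file":"//blah/blah/blah.cpp","function":"functiontext","line":312,"message":"Example Message Text","attributes":{"ThreadName":"1234"}}
{"timestamp":"20:07:54.038_0691","type":"N","calllevel":0,"thread":2324,"topic":"ExampleTopic","level":61,"file":"//blah/blah/blah.cpp","function":"ExampleFunction","line":2962,"message":"Example Message Text","attributes":{"ThreadName":"1234"}}

My file, by contrast, is a single pretty-printed JSON array spread across many lines, which I suspect is why the per-line decode fails with the errors above.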
Answer 1
I would suggest trying the tcp input without defining a codec on the input; I have found that setting a codec on an input is usually a bad idea:
input {
tcp {
type => "eventlog"
port => 3515
}
}
filter {
multiline {
pattern => "^\s"
what => "previous"
}
json {
"source" => "message"
}
}
output {
elasticsearch {
cluster => "MyElkCluster"
host => "127.0.0.1"
}
}
So, with this configuration, it will accept traffic on the defined tcp port with no codec set, then hand the input to the multiline filter, which looks for lines beginning with whitespace and, when it finds one, joins that line onto the previous line. The new event produced by the multiline filter is then passed to the json filter, which should be able to parse the entry.
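If you want to verify the filter chain before wiring NXLog back in, a minimal local sketch like the following (stdin input and a rubydebug stdout output, both of which ship with Logstash) lets you paste a few lines of the file and watch what the multiline and json filters produce:

input {
  stdin { }
}
filter {
  # join any line that starts with whitespace onto the previous line
  multiline {
    pattern => "^\s"
    what => "previous"
  }
  # parse the assembled JSON text out of the message field
  json {
    source => "message"
  }
}
output {
  stdout { codec => rubydebug }
}

Save it to a file, run bin/logstash -f against it, paste a couple of objects from your JSON file, and the rubydebug output will show what the joined event looks like and whether the json filter can parse it.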
Answer 2
I also hit an error with the json_lines codec that said:
IndexError: string not matched
and resolved it by making sure that the JSON strings I generate contain no newline characters (i.e. "\n") except the one immediately following each JSON object, which serves as the delimiter.
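To make that concrete, the stream I send now looks roughly like this, with each object serialized onto a single line and a lone trailing newline as the only line break (the field names here are just placeholders for illustration):

{"message":"first event","level":"info"}
{"message":"second event","level":"warn"}

The same objects spread over multiple lines were what triggered the IndexError for me.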