I am trying to get Filebeat to consume messages from Kafka using the kafka input. For some reason I cannot authenticate with SASL, and I am not sure why. Both the Kafka and Filebeat documentation are somewhat lacking when it comes to SASL.
My Filebeat configuration is as follows:
filebeat.config:
  modules:
    path: ${path.config}/modules.d/*.yml
    reload.enabled: false

filebeat.inputs:
- type: kafka
  hosts: 'the.kafka.server.com:9092'
  topics: 'my_topic'
  group_id: 'my_group'
  ssl.enabled: yes
  username: "$ConnectionString"
  password: "org.apache.kafka.common.security.plain.PlainLoginModule required username='my_username' password='my_password';"

processors:
  - add_cloud_metadata: ~
  - add_docker_metadata: ~

output.console:
  pretty: true
The output shows:
INFO input/input.go:114 Starting input of type: kafka; ID: 14409252276502564738
INFO kafka/log.go:53 kafka message: Initializing new client
INFO kafka/log.go:53 client/metadata fetching metadata for all topics from broker the.kafka.server.com:9092
INFO crawler/crawler.go:106 Loading and starting Inputs completed. Enabled inputs: 1
INFO cfgfile/reload.go:171 Config reloader started
INFO cfgfile/reload.go:226 Loading of config files completed.
INFO kafka/log.go:53 kafka message: Successful SASL handshake. Available mechanisms: %!(EXTRA []string=[PLAIN OAUTHBEARER])
INFO kafka/log.go:53 Failed to read response while authenticating with SASL to broker the.kafka.server.com:9092: EOF
INFO kafka/log.go:53 Closed connection to broker the.kafka.server.com:9092
INFO kafka/log.go:53 client/metadata got error from broker -1 while fetching metadata: EOF
I am not sure what is going on here. I also tried adding compression: none, but that did not help, and I used openssl to verify that the server certificate can be validated. What am I doing wrong? The Kafka server in question is cloud-hosted, so I cannot see the server configuration; I got the "connection string" from the provider's cloud UI.
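For reference, the openssl check was roughly along these lines (broker host and port taken from the config above; the exact invocation may differ on your system):

openssl s_client -connect the.kafka.server.com:9092 \
  -servername the.kafka.server.com < /dev/null
# the "Verify return code" at the end of the output came back as 0 (ok)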
Answer 1
I found that the problem was that the $ConnectionString syntax does not work with Confluent Cloud Kafka clusters. The correct syntax is the following:
filebeat.inputs:
- type: kafka
  hosts: 'the.kafka.server.com:9092'
  topics: 'my_topic'
  group_id: 'my_group'
  ssl.enabled: yes
  username: <API KEY>
  password: <API SECRET>
That was enough to get it connecting and consuming.
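As a sanity check outside Filebeat, a tool such as kcat (formerly kafkacat) can confirm that the API key and secret work directly against the broker; the host and credential placeholders below are just the ones from this question:

kcat -L -b the.kafka.server.com:9092 \
  -X security.protocol=SASL_SSL \
  -X sasl.mechanisms=PLAIN \
  -X 'sasl.username=<API KEY>' \
  -X 'sasl.password=<API SECRET>'
# -L asks for cluster metadata; if the broker and topic list come back,
# the SASL_SSL settings and credentials are good.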