Kafka in Practice: Sending Messages from Flume to Kafka

Tags: kafka

Flume is a highly available, highly reliable, distributed system for collecting, aggregating, and moving large volumes of log data. It is commonly used to gather logs and deliver them to various destinations, such as files, the network, databases, or Kafka, with Kafka being one of the most frequently used sinks. As a first hands-on exercise with Kafka, I ran a simple test of pushing Flume messages into a Kafka topic; the process is recorded below.

 

Machine: one Linux server

Dependency: JDK 1.8

Steps:

1. Install and test Flume

        1) Download Flume: http://flume.apache.org/download.html

        2) Extract the archive: tar xvzf apache-flume-1.8.0-bin.tar.gz

        3) Enter the directory: cd apache-flume-1.8.0-bin

        4) Verify the installation: bin/flume-ng version
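
If the installation succeeded, the version command prints the release information. The output looks roughly like the following (the repository, revision, and compile details will differ on your machine):

Flume 1.8.0
Source code repository: ...
Revision: ...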

2. Install and test Kafka
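
A minimal single-node sketch of this step, assuming a Kafka 1.x binary release (the kafka_2.11-1.0.0 file name below is only an example; adjust it to whatever you download from http://kafka.apache.org/downloads):

# Extract the downloaded release and enter the directory
tar xvzf kafka_2.11-1.0.0.tgz
cd kafka_2.11-1.0.0

# Kafka 1.x still depends on ZooKeeper; start it first, then the broker
bin/zookeeper-server-start.sh config/zookeeper.properties &
bin/kafka-server-start.sh config/server.properties &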

3. Write the conf/flume-kafka.properties configuration file

# Name the components on this agent
a1.sources = r1
a1.sinks = k1
a1.channels = c1

# Describe/configure the source
a1.sources.r1.type = netcat
a1.sources.r1.bind = localhost
a1.sources.r1.port = 44444

# Describe the sink
a1.sinks.k1.type = org.apache.flume.sink.kafka.KafkaSink
a1.sinks.k1.kafka.bootstrap.servers = 192.168.23.128:9092
a1.sinks.k1.kafka.topic = flume2kafka
a1.sinks.k1.serializer.class = kafka.serializer.StringEncoder
a1.sinks.k1.kafka.producer.acks = 1
a1.sinks.k1.custom.encoding = UTF-8

# Use a channel which buffers events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100

# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1

The sink can be specified directly as KafkaSink because the flume/lib directory already ships with the flume-ng-kafka-sink-1.8.0.jar package.
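
To double-check that the jar is present, run the following from the apache-flume-1.8.0-bin directory:

ls lib | grep kafka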

4. Create the Kafka topic

 bin/kafka-topics.sh --zookeeper 192.168.23.128:2181 --create --topic flume2kafka --partitions 1 --replication-factor 1
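
An optional check that the topic was created:

bin/kafka-topics.sh --zookeeper 192.168.23.128:2181 --describe --topic flume2kafka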

5. Start the Flume agent (-n gives the agent name used in the properties file, -c the configuration directory, -f the properties file written in step 3)

 bin/flume-ng agent -n a1 -c conf -f conf/flume-kafka.properties -Dflume.root.logger=INFO,console

6. Start a Kafka console consumer to see the data delivered by Flume

bin/kafka-console-consumer.sh --bootstrap-server 192.168.23.128:9092 --topic flume2kafka --from-beginning

7. Connect with telnet to the netcat source (localhost:44444) and type some data; the same lines appear in the consumer window from step 6.
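
A sample session, for illustration only ("hello kafka" is an arbitrary test message; the Flume netcat source acknowledges each received line with "OK"):

telnet localhost 44444
Trying 127.0.0.1...
Connected to localhost.
Escape character is '^]'.
hello kafka
OK

The console consumer started in step 6 should then print hello kafka.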

 
