Goal: monitor a log file as it grows, and store the incoming log data in HDFS.
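The setup below hinges on Flume's exec source running `tail -f` against the Apache access log. A minimal local sketch of what that command sees, using a temporary file in place of /var/log/httpd/access_log (file path and log lines here are illustrative):

```shell
# Simulate an append-only access log; the real source path is
# /var/log/httpd/access_log, tailed continuously with "tail -f".
LOG=$(mktemp)
echo 'GET /index.html 200' >> "$LOG"
echo 'GET /about.html 404' >> "$LOG"
# "tail -f" would keep following new appends; "tail -n 1" shows the newest line.
tail -n 1 "$LOG"
# prints: GET /about.html 404
rm -f "$LOG"
```

Each new line appended to the file becomes one Flume event flowing through the channel to the sink.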
First, copy the Hadoop jars that the HDFS sink depends on into Flume's lib directory:
htrace-core-3.1.0-incubating.jar
hadoop-hdfs-2.7.3.jar
hadoop-common-2.7.3.jar
hadoop-auth-2.7.3.jar
commons-io-2.4.jar
commons-configuration-1.6.jar
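A sketch of the copy step, assuming HADOOP_HOME and FLUME_HOME point at your installations (the default paths below are placeholders, not from the original article):

```shell
# Locate each required jar under the Hadoop installation and copy it into
# Flume's lib directory. HADOOP_HOME/FLUME_HOME defaults are assumptions;
# adjust them to your environment.
HADOOP_HOME=${HADOOP_HOME:-/opt/hadoop-2.7.3}
FLUME_HOME=${FLUME_HOME:-/opt/flume}
for jar in htrace-core-3.1.0-incubating.jar hadoop-hdfs-2.7.3.jar \
           hadoop-common-2.7.3.jar hadoop-auth-2.7.3.jar \
           commons-io-2.4.jar commons-configuration-1.6.jar; do
  # find handles the fact that the jars live in different subdirectories
  # of the Hadoop distribution (share/hadoop/common, share/hadoop/hdfs, ...).
  find "$HADOOP_HOME" -name "$jar" -exec cp {} "$FLUME_HOME/lib/" \; 2>/dev/null
done
```

Without these jars on Flume's classpath, the agent fails at startup with ClassNotFoundException when the HDFS sink initializes.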
Next, write the agent configuration (conf/apache-sink-hdfs.conf):

# a1 is the agent name
# Name the components on this agent
a1.sources = r1
a1.sinks = k1
a1.channels = c1

# Describe/configure the source
a1.sources.r1.type = exec
a1.sources.r1.command = tail -f /var/log/httpd/access_log
a1.sources.r1.channels = c1

# Use a channel which buffers events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100

# Describe the sink
a1.sinks.k1.type = hdfs
a1.sinks.k1.hdfs.path = hdfs://bigdata.ibeifeng.com:8020/flume02
a1.sinks.k1.channel = c1
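The configuration above relies on the HDFS sink's defaults, which write SequenceFiles and roll output files frequently. A few optional properties worth considering (a sketch; the values shown are illustrative, not from the original article):

```
# Optional tuning: write plain text instead of SequenceFiles, and roll
# files by time/size rather than the defaults. Values are illustrative.
a1.sinks.k1.hdfs.fileType = DataStream
a1.sinks.k1.hdfs.filePrefix = access_log
a1.sinks.k1.hdfs.rollInterval = 60
a1.sinks.k1.hdfs.rollSize = 134217728
a1.sinks.k1.hdfs.rollCount = 0
```

Setting rollCount to 0 disables rolling by event count, so files roll only on the time or size thresholds.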
(1) Start HDFS.
(2) Start the Flume agent:
bin/flume-ng agent --name a1 --conf conf --conf-file conf/apache-sink-hdfs.conf
Reposted from: http://tvygi.baihongyu.com/