1. Configure filebeat_nginx.yml

filebeat.modules:
- module: nginx
  access:
    enabled: true
    var.paths: ["/var/log/nginx/access.log*"]
  error:
    enabled: true
    var.paths: ["/var/log/nginx/error.log*"]

#----------------------------------Kafka output--------------------------------#
output.kafka:
  enabled: true
  version: "1.0.1"
  hosts: ['xxx:9092', 'xxx:9092', 'xxx:9092']
  topic: 'temp'
  required_acks: 1            # default
  compression: gzip           # default
  max_message_bytes: 1000000  # default
  codec.format:
    string: '%{[message]}'
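The codec.format setting writes only the raw message field to Kafka, so each record arrives as the original JSON line from the nginx log rather than the full Filebeat event envelope. Before starting, the file syntax and the connection to the brokers can be checked with Filebeat's built-in test subcommands (available in recent 6.x/7.x releases):

# Validate the configuration file
./filebeat test config -c filebeat_nginx.yml

# Verify that Filebeat can reach the brokers defined in output.kafka
./filebeat test output -c filebeat_nginx.yml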

2. Start Filebeat

./filebeat -e -c filebeat_nginx.yml
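The -e flag logs to stderr and keeps Filebeat in the foreground, which is convenient for debugging. For long-running use, a common pattern (not from the original post) is to detach it with nohup:

nohup ./filebeat -e -c filebeat_nginx.yml > filebeat.log 2>&1 &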

3. Access nginx and watch the access log

tail -f /var/log/nginx/access.log
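Any request against the local server will generate the entries shown below; a simple example (the URL is assumed from the log output):

curl http://localhost/index.html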

Log file output:

{"ts":"2019-10-14 10:53:22","host":"127.0.0.1","clientip":"127.0.0.1","size":0,"responsetime":0.000,"upstreamtime":"-","upstreamhost":"-","http_host":"localhost","url":"/index.html","domain":"localhost","xff":"-","referer":"-","status":"304"}
{"ts":"2019-10-14 10:53:23","host":"127.0.0.1","clientip":"127.0.0.1","size":0,"responsetime":0.000,"upstreamtime":"-","upstreamhost":"-","http_host":"localhost","url":"/index.html","domain":"localhost","xff":"-","referer":"-","status":"304"}
{"ts":"2019-10-14 10:53:23","host":"127.0.0.1","clientip":"127.0.0.1","size":0,"responsetime":0.000,"upstreamtime":"-","upstreamhost":"-","http_host":"localhost","url":"/index.html","domain":"localhost","xff":"-","referer":"-","status":"304"}
{"ts":"2019-10-14 10:53:30","host":"127.0.0.1","clientip":"127.0.0.1","size":0,"responsetime":0.000,"upstreamtime":"-","upstreamhost":"-","http_host":"localhost","url":"/index.html","domain":"localhost","xff":"-","referer":"-","status":"304"}
{"ts":"2019-10-14 10:53:31","host":"127.0.0.1","clientip":"127.0.0.1","size":0,"responsetime":0.000,"upstreamtime":"-","upstreamhost":"-","http_host":"localhost","url":"/index.html","domain":"localhost","xff":"-","referer":"-","status":"304"}

Kafka output:

{"ts":"2019-10-14 10:53:23","host":"127.0.0.1","clientip":"127.0.0.1","size":0,"responsetime":0.000,"upstreamtime":"-","upstreamhost":"-","http_host":"localhost","url":"/index.html","domain":"localhost","xff":"-","referer":"-","status":"304"}
{"ts":"2019-10-14 10:53:23","host":"127.0.0.1","clientip":"127.0.0.1","size":0,"responsetime":0.000,"upstreamtime":"-","upstreamhost":"-","http_host":"localhost","url":"/index.html","domain":"localhost","xff":"-","referer":"-","status":"304"}
{"ts":"2019-10-14 10:53:22","host":"127.0.0.1","clientip":"127.0.0.1","size":0,"responsetime":0.000,"upstreamtime":"-","upstreamhost":"-","http_host":"localhost","url":"/index.html","domain":"localhost","xff":"-","referer":"-","status":"304"}
{"ts":"2019-10-14 10:53:30","host":"127.0.0.1","clientip":"127.0.0.1","size":0,"responsetime":0.000,"upstreamtime":"-","upstreamhost":"-","http_host":"localhost","url":"/index.html","domain":"localhost","xff":"-","referer":"-","status":"304"}
{"ts":"2019-10-14 10:53:31","host":"127.0.0.1","clientip":"127.0.0.1","size":0,"responsetime":0.000,"upstreamtime":"-","upstreamhost":"-","http_host":"localhost","url":"/index.html","domain":"localhost","xff":"-","referer":"-","status":"304"}
