When working with logs, besides access logs you usually also have to handle runtime logs, which are written by the application itself, for example via log4j. The biggest difference between runtime logs and access logs is that runtime logs are multi-line: several consecutive lines together describe a single event.

In the filter section, add the following:

filter {
  multiline { }
}

Once the lines are grouped into multi-line events, splitting them into fields becomes straightforward.
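For reference, here is a minimal sketch of what a filled-in filter block might look like once the three options explained below are set. This is the filter-plugin form (logstash-filter-multiline, which may need to be installed separately); the worked example later in this post uses the multiline codec on the input instead:

filter {
  multiline {
    pattern => "^\["       # a line starting with "[" begins a new event
    negate  => true        # lines that do NOT match the pattern...
    what    => "previous"  # ...are appended to the previous event
  }
}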

Field attributes:

For the multiline plugin, three settings matter the most: negate, pattern, and what.

negate: type boolean, defaults to false.

pattern: required, no default value, type string; the regular expression that each line is matched against.

what: required, no default value; either previous (attach the line to the previous event) or next (attach it to the next event).
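A quick way to get a feel for how these three settings interact is a stdin-based test pipeline with the multiline codec (a minimal sketch; type or paste log lines into the terminal and watch how they are grouped):

input {
  stdin {
    codec => multiline {
      pattern => "^\["
      negate => true
      what => "previous"
    }
  }
}
output {
  stdout { codec => rubydebug }
}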

Let's look at an example:

# cat logstash_multiline_shipper.conf
input {
  file {
    path => "/apps/logstash/conf/test/c.out"
    type => "runtimelog"
    codec => multiline {
      pattern => "^\["
      negate => true
      what => "previous"
    }
    start_position => "beginning"
    sincedb_path => "/apps/logstash/logs/sincedb-access"
    ignore_older =>
  }
}
output {
  stdout {
    codec => rubydebug
  }
}

Explanation: lines beginning with "[" are matched as the start of a new event; a line that does not match must belong to the previous line, so it is merged into that event.
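One practical caveat: with what => "previous", the codec only knows an event is complete once the next line matching the pattern arrives, so the last multi-line entry in a file can stay buffered. The multiline codec has an auto_flush_interval option for this; a hedged sketch of adding it to the codec block above:

codec => multiline {
  pattern => "^\["
  negate => true
  what => "previous"
  auto_flush_interval => 5   # flush a pending event after 5 seconds without new lines
}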

The test data is as follows:

[-- :: DEBUG] model.MappingNode:- ['/store/shopclass'] matched over.
[-- :: DEBUG] impl.JdbcEntityInserter:- from product_category product_category
where product_category.PARENT_ID is null and product_category.STATUS = ? and product_category.DEALER_ID is null
order by product_category.ORDERS asc
[-- :: DEBUG] model.MappingNode:- ['/store/shopclass'] matched over.
[-- :: DEBUG] model.MappingNode:- ['/store/shopclass'] matched over.
[-- :: DEBUG] impl.JdbcEntityInserter:- from product_category product_category
where product_category.PARENT_ID is null and product_category.STATUS = ? and product_category.DEALER_ID is null
order by product_category.ORDERS desc
[-- :: DEBUG] impl.JdbcEntityInserter:- from product_category product_category
where product_category.PARENT_ID is null and product_category.STATUS = ? and product_category.DEALER_ID is null
order by product_category.ORDERS asc
[-- :: DEBUG] model.MappingNode:- ['/store/shopclass'] matched over.
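With the codec settings above, each line beginning with "[" starts a new event and the where / order by lines are appended to the event before them, so this sample should be grouped into seven events, three of which span three lines. As a rough sketch (not captured output), one of the merged SQL events would carry a message with embedded newlines:

"message" => "[-- :: DEBUG] impl.JdbcEntityInserter:- from product_category product_category\nwhere product_category.PARENT_ID is null ...\norder by product_category.ORDERS asc",
"tags"    => [ [0] "multiline" ]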

Start Logstash:

# ./../bin/logstash -f logstash_multiline_shipper.conf
Sending Logstash's logs to /apps/logstash/logs which is now configured via log4j2.properties
[2016-12-09T15:16:59,173][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
[2016-12-09T15:16:59,192][INFO ][logstash.pipeline ] Pipeline main started
[--09T15::,][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>}

After appending the test data to the monitored log file, check the output:

# ./../bin/logstash -f logstash_multiline_shipper.conf
Sending Logstash's logs to /apps/logstash/logs which is now configured via log4j2.properties
[2016-12-09T15:16:59,173][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
[2016-12-09T15:16:59,192][INFO ][logstash.pipeline ] Pipeline main started
[--09T15::,][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>}
{
          "path" => "/apps/logstash/conf/test/c.out",
    "@timestamp" => --09T07::.403Z,
      "@version" => "",
          "host" => "ofs1",
       "message" => "# ./../bin/logstash -f logstash_multiline_shipper.conf \nSending Logstash's logs to /apps/logstash/logs which is now configured via log4j2.properties",
          "type" => "runtimelog",
          "tags" => [
        [0] "multiline"
    ]
}
{
          "path" => "/apps/logstash/conf/test/c.out",
    "@timestamp" => --09T07::.409Z,
      "@version" => "",
          "host" => "ofs1",
       "message" => "[2016-12-09T15:16:59,173][INFO ][logstash.pipeline ] Starting pipeline {\"id\"=>\"main\", \"pipeline.workers\"=>4, \"pipeline.batch.size\"=>125, \"pipeline.batch.delay\"=>5, \"pipeline.max_inflight\"=>500}",
          "type" => "runtimelog",
          "tags" => []
}
{
          "path" => "/apps/logstash/conf/test/c.out",
    "@timestamp" => --09T07::.410Z,
      "@version" => "",
          "host" => "ofs1",
       "message" => "[2016-12-09T15:16:59,192][INFO ][logstash.pipeline ] Pipeline main started",
          "type" => "runtimelog",
          "tags" => []
}
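With events merged like this, splitting them into fields (as promised at the top) becomes a grok exercise. The sketch below is an illustration under assumptions rather than part of the original post: it assumes the runtime log lines begin with a bracketed timestamp and level such as [2016-12-09 15:16:59,173 DEBUG] followed by a logger name, and the field names log_time, level, logger and log_message are chosen here for the example. The (?m) prefix lets %{GREEDYDATA} capture the embedded newlines of a merged event:

filter {
  grok {
    # parse "[<timestamp> <LEVEL>] <logger>:- <multi-line message body>"
    match => {
      "message" => "(?m)\[%{TIMESTAMP_ISO8601:log_time}\s+%{LOGLEVEL:level}\]\s+%{JAVACLASS:logger}:-\s+%{GREEDYDATA:log_message}"
    }
  }
}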
