Logstash multiline plugin: matching multi-line logs
When processing logs, besides access logs you also have to deal with runtime logs, which are mostly written by application code, for example via log4j. The biggest difference between runtime logs and access logs is that a runtime log entry can span multiple lines: only several consecutive lines together express one complete event.
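For example, a single exception logged through log4j usually spans several lines; the stack-trace lines carry no timestamp of their own and only make sense together with the first line. The entry below is made up purely for illustration:

[2016-12-09 15:16:59 ERROR] impl.JdbcEntityInserter:- insert failed
java.sql.SQLException: Connection refused
    at com.example.dao.ProductCategoryDao.insert(ProductCategoryDao.java:42)
    at com.example.service.ShopClassService.save(ShopClassService.java:18)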
In the filter section, add the following:
filter {
  multiline { }
}
Once the lines can be handled as one multi-line event, splitting them into fields becomes easy (see the grok sketch at the end of this article).
Field attributes:
For the multiline plugin, three settings matter most: negate, pattern, and what.
negate: boolean, defaults to false; inverts the pattern match (merge the lines that do NOT match).
pattern: required, no default, string; the regular expression to match against each line.
what: required, no default; either previous or next, indicating whether a merged line is appended to the previous event or held for the next one.
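Putting the three together: with negate => true, every line that does NOT match pattern is merged in the direction given by what. Filling in the empty filter block from above, a minimal sketch (assuming the logstash-filter-multiline plugin is installed; recent Logstash versions prefer the codec form used in the example below):

filter {
  multiline {
    pattern => "^\["        # a new event starts with "["
    negate  => true         # lines NOT starting with "[" ...
    what    => "previous"   # ... are appended to the previous event
  }
}

If continuation is instead marked on the leading line (for example lines ending with a backslash), you would keep negate => false and use what => "next".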
Now look at the following example:
# cat logstash_multiline_shipper.conf
input {
  file {
    path => "/apps/logstash/conf/test/c.out"
    type => "runtimelog"
    codec => multiline {
      pattern => "^\["
      negate => true
      what => "previous"
    }
    start_position => "beginning"
    sincedb_path => "/apps/logstash/logs/sincedb-access"
    ignore_older =>
  }
}
output {
  stdout {
    codec => rubydebug
  }
}
Note: the pattern matches lines that start with "["; any line that does not match is treated as part of the previous line and merged into it.
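Before pointing Logstash at the real file, the pattern can be checked interactively with a stdin input. A minimal sketch: paste a few sample lines and each merged event is printed as soon as the next line starting with "[" arrives:

input {
  stdin {
    codec => multiline {
      pattern => "^\["
      negate => true
      what => "previous"
    }
  }
}
output {
  stdout { codec => rubydebug }
}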
The test data is as follows:
[-- :: DEBUG] model.MappingNode:- ['/store/shopclass'] matched over.
[-- :: DEBUG] impl.JdbcEntityInserter:- from product_category product_category where product_category.PARENT_ID is null and product_category.STATUS = ? and product_category.DEALER_ID is null order by product_category.ORDERS asc
[-- :: DEBUG] model.MappingNode:- ['/store/shopclass'] matched over.
[-- :: DEBUG] model.MappingNode:- ['/store/shopclass'] matched over.
[-- :: DEBUG] impl.JdbcEntityInserter:- from product_category product_category where product_category.PARENT_ID is null and product_category.STATUS = ? and product_category.DEALER_ID is null order by product_category.ORDERS desc
[-- :: DEBUG] impl.JdbcEntityInserter:- from product_category product_category where product_category.PARENT_ID is null and product_category.STATUS = ? and product_category.DEALER_ID is null order by product_category.ORDERS asc
[-- :: DEBUG] model.MappingNode:- ['/store/shopclass'] matched over.
Start Logstash:
# ./../bin/logstash -f logstash_multiline_shipper.conf
Sending Logstash's logs to /apps/logstash/logs which is now configured via log4j2.properties
[--09T15::,][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>, "pipeline.batch.size"=>, "pipeline.batch.delay"=>, "pipeline.max_inflight"=>}
[--09T15::,][INFO ][logstash.pipeline ] Pipeline main started
[--09T15::,][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>}
After appending the test data to the monitored log file, the output looks like this:
{
"path" => "/apps/logstash/conf/test/c.out",
"@timestamp" => --09T07::.403Z,
"@version" => "",
"host" => "ofs1",
"message" => "# ./../bin/logstash -f logstash_multiline_shipper.conf \nSending Logstash's logs to /apps/logstash/logs which is now configured via log4j2.properties",
"type" => "runtimelog",
"tags" => [
[0] "multiline"
]
}
{
"path" => "/apps/logstash/conf/test/c.out",
"@timestamp" => --09T07::.409Z,
"@version" => "",
"host" => "ofs1",
"message" => "[2016-12-09T15:16:59,173][INFO ][logstash.pipeline ] Starting pipeline {\"id\"=>\"main\", \"pipeline.workers\"=>4, \"pipeline.batch.size\"=>125, \"pipeline.batch.delay\"=>5, \"pipeline.max_inflight\"=>500}",
"type" => "runtimelog",
"tags" => []
}
{
"path" => "/apps/logstash/conf/test/c.out",
"@timestamp" => --09T07::.410Z,
"@version" => "",
"host" => "ofs1",
"message" => "[2016-12-09T15:16:59,192][INFO ][logstash.pipeline ] Pipeline main started",
"type" => "runtimelog",
"tags" => []
}
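Once the lines are merged into a single message, splitting it into fields (as promised earlier) is a matter of adding a grok filter. A minimal sketch, assuming the entries look like "[timestamp LEVEL] logger:- text"; the field names logtime, level, logger and msg are my own choice, and the timestamp pattern has to be adapted to the real format:

filter {
  grok {
    # (?m) lets GREEDYDATA match across the newlines of a merged event
    match => {
      "message" => "(?m)\[%{TIMESTAMP_ISO8601:logtime} %{LOGLEVEL:level}\] %{NOTSPACE:logger}:- %{GREEDYDATA:msg}"
    }
  }
}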