Logstash

As a data collector, Logstash processes events as a pipeline of three stages: input -> filter -> output. The pipeline supports fairly complex operations, such as sending email when an event matches.

input: configures where data comes in, plus simple transformations on the way in
filter: configures field extraction from the data, most commonly with grok
output: configures where data goes out, plus simple transformations on the way out

Run it: logstash -f /etc/logstash.conf

-f specifies a configuration file
-e takes the configuration as a string directly on the command line, handy for quick tests in the console

See the official site for the full configuration reference:
https://www.elastic.co/products/logstash
"Centralize, Transform & Stash Your Data"
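To make the three stages concrete, here is a minimal sketch of such a configuration file (the filter stage is optional and omitted here; save the file and start Logstash with -f):

```
# Minimal pipeline: read lines from stdin, print them as structured events
input {
  stdin { }
}
output {
  stdout { codec => rubydebug }   # rubydebug pretty-prints the full event
}
```

The same pipeline can be passed inline with -e for a quick console test:

```
logstash -e 'input { stdin { } } output { stdout { codec => rubydebug } }'
```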
input

The plugins below are hosted in the github.com/logstash-plugins organization on GitHub, named logstash-input-<plugin> (and likewise logstash-filter-<plugin> and logstash-output-<plugin> for the later sections).

Plugin | Description
------ | -----------
beats | Receives events from the Elastic Beats framework
couchdb_changes | Streams events from CouchDB's _changes URI
elasticsearch | Reads query results from an Elasticsearch cluster
file | Streams events from files
gelf | Reads GELF-format messages from Graylog2 as events
generator | Generates random log events for test purposes
graphite | Reads metrics from the graphite tool
heartbeat | Generates heartbeat events for testing
http | Receives events over HTTP or HTTPS
http_poller | Decodes the output of an HTTP API into events
jdbc | Creates events from JDBC data
kafka | Reads events from a Kafka topic
log4j | Reads events over a TCP socket from a Log4j SocketAppender object
lumberjack | Receives events using the Lumberjack protocol
rabbitmq | Pulls events from a RabbitMQ exchange
redis | Reads events from a Redis instance
s3 | Streams events from files in an S3 bucket
sqs | Pulls events from an Amazon Web Services Simple Queue Service queue
stdin | Reads events from standard input
syslog | Reads syslog messages as events
tcp | Reads events from a TCP socket
twitter | Reads events from the Twitter Streaming API
udp | Reads events over UDP
Community supported plugins

These plugins are maintained and supported by the community, and have met the Logstash development and testing criteria for integration. Contributors include community maintainers, the Logstash core team at Elastic, and the broader community.

Plugin | Description
------ | -----------
cloudwatch | Pulls events from the Amazon Web Services CloudWatch API
drupal_dblog | Retrieves watchdog log events from Drupal installations with DBLog enabled
eventlog | Pulls events from the Windows Event Log
exec | Captures the output of a shell command as an event
ganglia | Reads Ganglia packets over UDP
gemfire | Pushes events to a GemFire region
github | Reads events from a GitHub webhook
heroku | Streams events from the logs of a Heroku app
imap | Reads mail from an IMAP server
irc | Reads events from an IRC server
jmx | Retrieves metrics from remote Java applications over JMX
kinesis | Receives events through an AWS Kinesis stream
meetup | Captures the output of command line tools as an event
pipe | Streams events from a long-running command pipe
puppet_facter | Receives facts from a Puppet server
rackspace | Receives events from a Rackspace Cloud Queue service
relp | Receives RELP events over a TCP socket
rss | Captures the output of command line tools as an event
salesforce | Creates events based on a Salesforce SOQL query
snmptrap | Creates events based on SNMP trap messages
sqlite | Creates events based on rows in an SQLite database
stomp | Creates events received with the STOMP protocol
unix | Reads events over a UNIX socket
varnishlog | Reads from the varnish cache shared memory log
websocket | Reads events from a websocket
wmi | Creates events based on the results of a WMI query
xmpp | Receives events over the XMPP/Jabber protocol
zenoss | Reads Zenoss events from the fanout exchange
zeromq | Reads events from a ZeroMQ SUB socket
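As an example from the tables above, a sketch of a file input that tails a log file (the path is a hypothetical placeholder):

```
input {
  file {
    path => "/var/log/nginx/access.log"   # hypothetical path to tail
    start_position => "beginning"         # also read content that existed before startup
  }
}
```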
filter

Plugin | Description
------ | -----------
aggregate | Aggregates information from several events originating with a single task
anonymize | Replaces field values with a consistent hash
csv | Parses comma-separated value data into individual fields
date | Parses dates from fields to use as the Logstash timestamp for an event
de_dot | Computationally expensive filter that removes dots from a field name
dissect | Extracts unstructured event data into fields using delimiters
dns | Performs a standard or reverse DNS lookup
drop | Drops all events
fingerprint | Fingerprints fields by replacing values with a consistent hash
geoip | Adds geographical information about an IP address
grok | Parses unstructured event data into fields
json | Parses JSON events
kv | Parses key-value pairs
multiline | Merges multiple lines into a single event
mutate | Performs mutations on fields
ruby | Executes arbitrary Ruby code
sleep | Sleeps for a specified time span
split | Splits multi-line messages into distinct events
syslog_pri | Parses the PRI (priority) field of a syslog message
throttle | Throttles the number of events
translate | Replaces field contents based on a hash or YAML file
urldecode | Decodes URL-encoded fields
useragent | Parses user agent strings into fields
uuid | Adds a UUID to events
xml | Parses XML into fields
Community supported plugins

These plugins are maintained and supported by the community, and have met the Logstash development and testing criteria for integration. Contributors include community maintainers, the Logstash core team at Elastic, and the broader community.

Plugin | Description
------ | -----------
alter | Performs general alterations to fields that the mutate filter does not handle
cidr | Checks IP addresses against a list of network blocks
cipher | Applies or removes a cipher to an event
clone | Duplicates events
collate | Collates events by time or count
elapsed | Calculates the elapsed time between a pair of events
elasticsearch | Copies fields from previous log events in Elasticsearch to current events
environment | Stores environment variables as metadata sub-fields
extractnumbers | Extracts numbers from a string
i18n | Removes special characters from a field
json_encode | Serializes a field to JSON
metaevent | Adds arbitrary fields to an event
metricize | Takes complex events containing a number of metrics and splits these up into multiple events, each holding a single metric
metrics | Aggregates metrics
oui | Parses OUI data from MAC addresses
prune | Prunes event data based on a list of fields to blacklist or whitelist
punct | Strips all non-punctuation content from a field
range | Checks that specified fields stay within given size or length limits
 | Replaces the contents of the default message field with whatever you specify in the configuration
yaml | Takes an existing field that contains YAML and expands it into an actual data structure within the Logstash event
zeromq | Sends an event to ZeroMQ
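For instance, a sketch combining the two most commonly used filters: grok parses an Apache-style access log line into fields, and date promotes the parsed timestamp to the event's @timestamp:

```
filter {
  grok {
    # COMBINEDAPACHELOG is a stock pattern shipped with Logstash
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    # "timestamp" is the field produced by the grok pattern above
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}
```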
output

Elastic supported plugins

These plugins are maintained and supported by Elastic.

Plugin | Description
------ | -----------
csv | Writes events to disk in a delimited format
elasticsearch | Stores logs in Elasticsearch
email | Sends email to a specified address when output is received
file | Writes events to files on disk
graphite | Writes metrics to Graphite
http | Sends events to a generic HTTP or HTTPS endpoint
kafka | Writes events to a Kafka topic
lumberjack | Sends events using the lumberjack protocol
rabbitmq | Pushes events to a RabbitMQ exchange
redis | Sends events to a Redis queue using the RPUSH command
s3 | Sends Logstash events to the Amazon Simple Storage Service
stdout | Prints events to the standard output
tcp | Writes events over a TCP socket
udp | Sends events over UDP
Community supported plugins

These plugins are maintained and supported by the community, and have met the Logstash development and testing criteria for integration. Contributors include community maintainers, the Logstash core team at Elastic, and the broader community.

Plugin | Description
------ | -----------
boundary | Sends annotations to Boundary based on Logstash events
circonus | Sends annotations to Circonus based on Logstash events
cloudwatch | Aggregates and sends metric data to AWS CloudWatch
datadog | Sends events to DataDogHQ based on Logstash events
datadog_metrics | Sends metrics to DataDogHQ based on Logstash events
elasticsearch_java | Stores logs in Elasticsearch using the java protocol
exec | Runs a command for a matching event
ganglia | Writes metrics to Ganglia's gmond
gelf | Generates GELF formatted output for Graylog2
google_bigquery | Writes events to Google BigQuery
google_cloud_storage | Writes events to Google Cloud Storage
graphtastic | Sends metric data on Windows
hipchat | Writes events to HipChat
influxdb | Writes metrics to InfluxDB
irc | Writes events to IRC
jira | Writes structured JSON events to JIRA
juggernaut | Pushes messages to the Juggernaut websockets server
librato | Sends metrics, annotations, and alerts to Librato based on Logstash events
loggly | Ships logs to Loggly
metriccatcher | Writes metrics to MetricCatcher
mongodb | Writes events to MongoDB
nagios | Sends passive check results to Nagios
nagios_nsca | Sends passive check results to Nagios using the NSCA protocol
newrelic | Sends Logstash events to New Relic Insights as custom events
opentsdb | Writes metrics to OpenTSDB
pagerduty | Sends notifications based on preconfigured services and escalation policies
pipe | Pipes events to another program's standard input
rackspace | Sends events to a Rackspace Cloud Queue service
redmine | Creates tickets using the Redmine API
riak | Writes events to the Riak distributed key/value store
riemann | Sends metrics to Riemann
sns | Sends events to Amazon's Simple Notification Service
solr_http | Stores and indexes logs in Solr
sqs | Pushes events to an Amazon Web Services Simple Queue Service queue
statsd | Sends metrics using the statsd network daemon
stomp | Writes events using the STOMP protocol
syslog | Sends events to a syslog server
webhdfs | Sends Logstash events to HDFS using the webhdfs REST API
websocket | Publishes messages to a websocket
xmpp | Posts events over XMPP
zabbix | Sends events to a Zabbix server
zeromq | Writes events to a ZeroMQ PUB socket
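To close the loop, a sketch that ships events to Elasticsearch while echoing them to the console; the host address and index name are illustrative:

```
output {
  elasticsearch {
    hosts => ["localhost:9200"]          # illustrative cluster address
    index => "logstash-%{+YYYY.MM.dd}"   # conventional daily index naming
  }
  stdout { codec => rubydebug }          # keep a console copy for debugging
}
```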