Without further ado, straight to the point! Everything comes from the official documentation: http://kafka.apache.org/documentation/ Step 8: Use Kafka Streams to process data. Kafka Streams is a client library of Kafka for real-time stream processing and analyzing data stored in Kafka brokers. This…
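For orientation, this step of the quickstart centers on the WordCount demo that ships with Kafka; the following is only a minimal sketch of launching it, assuming the demo class and the input topic name below match the version you downloaded:
> bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic streams-file-input
> echo -e "all streams lead to kafka\nhello kafka streams" | bin/kafka-console-producer.sh --broker-list localhost:9092 --topic streams-file-input
# run the bundled WordCount demo; it continuously writes word counts to an output topic
> bin/kafka-run-class.sh org.apache.kafka.streams.examples.wordcount.WordCountDemo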
Without further ado, straight to the point! Everything comes from the official documentation: http://kafka.apache.org/documentation/ Step 7: Use Kafka Connect to import/export data. Writing data from the console and writing it back to the console is a convenient place to start, but you'll p…
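A rough sketch of the standalone file source/sink exercise this step describes, assuming the stock connector property files that ship with the distribution (the file names are the quickstart's defaults):
> echo -e "foo\nbar" > test.txt
> bin/connect-standalone.sh config/connect-standalone.properties config/connect-file-source.properties config/connect-file-sink.properties
# the source connector feeds test.txt into a Kafka topic, and the sink connector writes it back out to a file
> cat test.sink.txt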
Without further ado, straight to the point! Everything comes from the official documentation: http://kafka.apache.org/documentation/ Kafka as a Storage System. Any message queue that allows publishing messages decoupled from consuming them is effectively acting as a storage system for the in-flight messages. Wh…
Without further ado, straight to the point! Everything comes from the official documentation: http://kafka.apache.org/documentation/ Kafka as a Messaging System. How does Kafka's notion of streams compare to a traditional enterprise messaging system? Messaging traditionally has two mode…
Without further ado, straight to the point! Everything comes from the official documentation: http://kafka.apache.org/documentation/ Step 6: Setting up a multi-broker cluster. So far we have been running against a single broker, but that's no fun. For Kafka, a single broker is just a cluster of size one,…
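A minimal sketch of what expanding to a small cluster usually involves, assuming the extra brokers run on the same machine (the ids, ports, and log directories below are illustrative, not mandated):
> cp config/server.properties config/server-1.properties
> cp config/server.properties config/server-2.properties
# in config/server-1.properties set: broker.id=1, listeners=PLAINTEXT://:9093, log.dirs=/tmp/kafka-logs-1
# in config/server-2.properties set: broker.id=2, listeners=PLAINTEXT://:9094, log.dirs=/tmp/kafka-logs-2
> bin/kafka-server-start.sh config/server-1.properties &
> bin/kafka-server-start.sh config/server-2.properties &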
Without further ado, straight to the point! Everything comes from the official documentation: http://kafka.apache.org/documentation/ Step 4: Send some messages. Kafka comes with a command line client that will take input from a file or from standard input and send it out as messages to the Kafka cluster. By defau…
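As a sketch of what taking input from standard input looks like in practice, assuming a broker listening on localhost:9092 and the test topic from the earlier step:
> bin/kafka-console-producer.sh --broker-list localhost:9092 --topic test
This is a message
This is another message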
Without further ado, straight to the point! Everything comes from the official documentation: http://kafka.apache.org/documentation/ Step 2: Start the server. Kafka uses ZooKeeper so you need to first start a ZooKeeper server if you don't already have one. You can use the convenience script packaged with kafka to…
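For reference, the two start commands look roughly like this, assuming the default configuration files bundled under config/ are used:
> bin/zookeeper-server-start.sh config/zookeeper.properties
> bin/kafka-server-start.sh config/server.properties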
Without further ado, straight to the point! Everything comes from the official documentation: http://kafka.apache.org/documentation/ Step 5: Start a consumer. Kafka also has a command line consumer that will dump out messages to standard output. > bin/kafka-console-consumer.sh --…
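The consumer command is cut off in the excerpt; a typical invocation in the 0.10.x quickstart looks like the following, assuming a broker on localhost:9092 and the topic named test:
> bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test --from-beginning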
Without further ado, straight to the point! Everything comes from the official documentation: http://kafka.apache.org/documentation/ Step 3: Create a topic. Let's create a topic named "test" with a single partition and only one replica: > bin/kafka-topics.sh --c…
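The excerpt truncates the command; in the quickstart this step is usually completed with something like the following, assuming a local ZooKeeper on port 2181 (the topic name test comes from the excerpt itself):
> bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic test
> bin/kafka-topics.sh --list --zookeeper localhost:2181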
Redis installation. Before installing, you need to obtain the Redis installation package; we won't go into how to get it here. Taking the Redis-x64-3.2.100.zip package as an example, install it from the DOS command line: change to the installation directory, then run redis-server --service-install redis.windows-service.conf --loglevel verbose, and the Redis service is installed. 2. Stop Redis: redis-server --service-stop 3. Start Redis: r…
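Putting the commands from the excerpt together as a sketch (the --service-start flag is an assumption, since the excerpt is cut off right after "Start Redis"):
REM register Redis as a Windows service using the bundled service config
redis-server --service-install redis.windows-service.conf --loglevel verbose
REM stop the service
redis-server --service-stop
REM start the service (assumed flag; the original text truncates here)
redis-server --service-start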
MQTT (Message Queuing Telemetry Transport) is an instant-messaging protocol developed by IBM that may become an important building block of the Internet of Things. It is supported on virtually every platform, can connect almost any networked device to the outside world, and is used as the data communication protocol for all kinds of sensors and smart-home products. MQTT is built on top of TCP and was designed for communication with remote sensors and control devices over low-bandwidth, unreliable networks and on hardware with limited computing power. Its protocol header is only two bytes, which keeps data transfer and protocol exchange to a minimum and reduces network traffic, making it very well suited to embedded devices. At present the major cloud platforms…
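The post stays at the protocol level; as an illustrative aside that is not part of the original, here is what a publish/subscribe round trip looks like with the Mosquitto command-line clients, with the broker host and topic name chosen purely for the example:
# terminal 1: subscribe to a sensor topic at QoS 1
mosquitto_sub -h test.mosquitto.org -t "home/livingroom/temperature" -q 1
# terminal 2: publish a reading to the same topic
mosquitto_pub -h test.mosquitto.org -t "home/livingroom/temperature" -q 1 -m "21.5"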
Without further ado, straight to the point! Everything comes from the official documentation: http://kafka.apache.org/documentation/ Kafka for Stream Processing. It isn't enough to just read, write, and store streams of data; the purpose is to enable real-time processing of streams. In…
Without further ado, straight to the point! Everything comes from the official documentation: http://kafka.apache.org/documentation/ Guarantees. At a high level Kafka gives the following guarantees: Messages sent by a producer to a particular topic partition will be appended in the order they are sent. T…
Without further ado, straight to the point! Everything comes from the official documentation: http://kafka.apache.org/documentation/ Consumers. Consumers label themselves with a consumer group name, and each record published to a topic is delivered to one consumer instance within each subscribing consumer grou…
Without further ado, straight to the point! Everything comes from the official documentation: http://kafka.apache.org/documentation/ Producers. Producers publish data to the topics of their choice. The producer is responsible for choosing which record to assign to which partition within the topic. This can be…
Without further ado, straight to the point! Everything comes from the official documentation: http://kafka.apache.org/documentation/ Distribution. The partitions of the log are distributed over the servers in the Kafka cluster with each server handling data and requests for a share of the partitions. Each p…
Without further ado, straight to the point! Everything comes from the official documentation: http://kafka.apache.org/documentation/ Topics and Logs. Let's first dive into the core abstraction Kafka provides for a stream of records: the topic. A topic is a category or feed name to whic…
Without further ado, straight to the point! Everything comes from the official documentation: http://kafka.apache.org/documentation/ Apache Kafka™ is a distributed streaming platform. What exactly does that mean? We think of a streaming platform as having three key capabilities:…
Without further ado, straight to the point! Everything comes from the official documentation: http://kafka.apache.org/documentation/ Don't get hung up on this particular version; I'm simply using the latest release as a starting point so that everyone learns how to read the official documentation for any Kafka version on their own. > tar -xzf kafka_2.11-0.10.2.0.tgz > cd kafka_2.11-0.10.2.0 over…
Without further ado, straight to the point! Everything comes from the official documentation: http://kafka.apache.org/documentation/ Commit Log. Kafka can serve as a kind of external commit-log for a distributed system. The log helps replicate data between nodes and acts as a re-syncing mechanism for failed nodes to res…
Without further ado, straight to the point! Everything comes from the official documentation: http://kafka.apache.org/documentation/ Event Sourcing. Event sourcing is a style of application design where state changes are logged as a time-ordered sequence of records. Kafka's support for very large stored log data makes i…
Without further ado, straight to the point! Everything comes from the official documentation: http://kafka.apache.org/documentation/ Stream Processing. Many users of Kafka process data in processing pipelines consisting of multiple stages, where raw input data is consumed from Kafka topics and then aggregated, enriched,…
Without further ado, straight to the point! Everything comes from the official documentation: http://kafka.apache.org/documentation/ Log Aggregation. Many people use Kafka as a replacement for a log aggregation solution. Log aggregation typically collects physical log files off servers and puts them in a central place (…
Without further ado, straight to the point! Everything comes from the official documentation: http://kafka.apache.org/documentation/ Metrics. Kafka is often used for operational monitoring data. This involves aggregating statistics from distributed applications to produce centralized feeds of operational data.…
Without further ado, straight to the point! Everything comes from the official documentation: http://kafka.apache.org/documentation/ Website Activity Tracking. The original use case for Kafka was to be able to rebuild a user activity tracking pipeline as a set of real-time publish-subscribe feeds. This means site acti…
Without further ado, straight to the point! Everything comes from the official documentation: http://kafka.apache.org/documentation/ Here are some popular use cases for Apache Kafka; for an overview of these areas in action, see the blog post. Messaging. Kafka works well as a replacement for a more traditional message broker. Message brokers are used for a variety of reasons (to decoupl…
Without further ado, straight to the point! Everything comes from the official documentation: http://kafka.apache.org/documentation/ Putting the Pieces Together. This combination of messaging, storage, and stream processing may seem unusual but it is essential to Kafka's role as a streaming platform.…
enode framework step by step: an analysis of the goals the framework aims to achieve, part 1. Index of the enode step-by-step series: sharing enode, an application development framework based on DDD and event-driven architecture (EDA); enode framework step by step: how event-driven architecture (EDA) thinking shows up in the framework; enode framework step by step: the idea behind sagas and their implementation. This post introduces the goals the enode framework wants to achieve and analyzes part of the implementation approach. In short, enode is an application development framework based on the CQRS architecture and message-driven design. Regarding the implementation approach…
Recently the project team needed a Ceph environment. It was my first time setting up Ceph and nothing went smoothly: install, uninstall, install again, following the official instructions over and over. The worst part was the slow network; pulling packages from the official site was so slow that a single attempt wasted an entire morning. That is what led to the idea of building a local yum repository. First, follow the official guide at http://docs.ceph.com/docs/master/start/ (the self-assured developer naturally turns off the firewall entirely and runs everything as root, but let's set that aside). When you reach STEP 2: STORAGE CLUSTER and run "ceph-deploy install node1 node2 no…
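The excerpt stops before the local yum repository is actually built; one common way to do it, sketched here with hypothetical paths and not necessarily the steps the author took, is to download the RPMs once, index them with createrepo, and point a .repo file at the directory:
mkdir -p /opt/ceph-repo            # put the ceph/ceph-deploy RPMs and their dependencies here
createrepo /opt/ceph-repo          # generate the repodata so yum can use this directory
cat > /etc/yum.repos.d/ceph-local.repo <<'EOF'
[ceph-local]
name=Local Ceph packages
baseurl=file:///opt/ceph-repo
enabled=1
gpgcheck=0
EOF
yum clean all && yum makecache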
A few months ago I was studying Microsoft Power BI. With no documentation or reference material available, I could only fumble around on the English official site, and I also published a series of articles based on the English documentation. I had just finished posting the series last week when the Simplified Chinese version went live. That stung a little; had I known it would arrive this quickly, I could have saved myself some time. 1. Why the Simplified Chinese version arrived so late. I had mainly been using the English and Traditional Chinese versions; my guesses as to why there was no Simplified Chinese version at first are: 1) Domestic cloud regulation policies delayed the rollout of the Power BI services in China; 2) Power BI Desktop itself still has many rough edges, so updates this year have been quite freq…