kafka-php

The kafka-php GitHub repository: https://github.com/jacky5059/kafka-php

Producer example code

<?php
set_include_path(
    implode(PATH_SEPARATOR, array(
        realpath(__DIR__ . '/../lib'),
        get_include_path(),
    ))
);
require 'autoloader.php';

$host  = 'localhost';
$port  = 9092;
$topic = 'test';

$producer = new Kafka_Producer($host, $port, Kafka_Encoder::COMPRESSION_NONE);
$in = fopen('php://stdin', 'r');
while (true) {
    echo "\nEnter comma separated messages:\n";
    $messages = explode(',', fgets($in));
    // strip surrounding whitespace from each message before sending
    foreach (array_keys($messages) as $k) {
        $messages[$k] = trim($messages[$k]);
    }
    $bytes = $producer->send($messages, $topic);
    printf("\nSuccessfully sent %d messages (%d bytes)\n\n", count($messages), $bytes);
}
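
The producer above reads its input from stdin. For sending messages from application code instead, a minimal non-interactive sketch might look like the following; it uses only the constructor and send() call shown above, and the host, port, and topic values are assumptions to adjust for your setup.

<?php
require 'autoloader.php'; // assumes the same include-path setup as the example above

// hypothetical connection settings; point these at your own broker
$producer = new Kafka_Producer('localhost', 9092, Kafka_Encoder::COMPRESSION_NONE);

// send a batch of messages to the "test" topic in one call
$batch = array('message one', 'message two', 'message three');
$bytes = $producer->send($batch, 'test');
printf("Sent %d messages (%d bytes)\n", count($batch), $bytes);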

Simple consumer example code

<?php
set_include_path(
    implode(PATH_SEPARATOR, array(
        realpath(__DIR__ . '/../lib'),
        get_include_path(),
    ))
);
require 'autoloader.php';

$host          = 'localhost';
$zkPort        = 2181;      // zookeeper port (not used by the simple consumer)
$kPort         = 9092;      // kafka server port
$topic         = 'test';
$maxSize       = 10000000;  // maximum fetch size in bytes (~10 MB)
$socketTimeout = 2;
$offset        = 0;
$partition     = 0;
$nMessages     = 0;

$consumer = new Kafka_SimpleConsumer($host, $kPort, $socketTimeout, $maxSize);
while (true) {
    try {
        // create a fetch request for topic "test", partition 0, the current offset and the maximum fetch size
        $fetchRequest = new Kafka_FetchRequest($topic, $partition, $offset, $maxSize);
        // get the message set from the consumer and print the messages out
        $partialOffset = 0;
        $messages = $consumer->fetch($fetchRequest);
        foreach ($messages as $msg) {
            ++$nMessages;
            echo "\nconsumed[$offset][$partialOffset][msg #{$nMessages}]: " . $msg->payload();
            $partialOffset = $messages->validBytes();
        }
        // advance the offset after consuming the message set
        $offset += $messages->validBytes();
        //echo "\n---[Advancing offset to $offset]------(".date('H:i:s').")";
        unset($fetchRequest);
        //sleep(2);
    } catch (Exception $e) {
        // probably consumed all items in the queue
        echo "\nERROR: " . get_class($e) . ': ' . $e->getMessage() . "\n" . $e->getTraceAsString() . "\n";
        sleep(2);
    }
}
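
The simple consumer keeps its offset only in memory, so a restarted script starts again from offset 0. One way to make consumption resumable is to persist the byte offset between runs; the library does not do this for the simple consumer, so the sketch below is an assumption that uses standard PHP file functions together with the fetch calls shown above, and the offset file path is hypothetical.

<?php
// load the last saved offset, defaulting to 0 on the first run
$offsetFile = '/tmp/kafka-test-offset'; // hypothetical path
$offset = is_readable($offsetFile) ? (int) file_get_contents($offsetFile) : 0;

$consumer = new Kafka_SimpleConsumer('localhost', 9092, 2, 10000000);
$fetchRequest = new Kafka_FetchRequest('test', 0, $offset, 10000000);
$messages = $consumer->fetch($fetchRequest);
foreach ($messages as $msg) {
    echo $msg->payload(), "\n";
}

// advance and persist the offset so the next run resumes where this one stopped
$offset += $messages->validBytes();
file_put_contents($offsetFile, (string) $offset);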

ZooKeeper-based consumer (zkconsumer) example code

<?php
set_include_path(
    implode(PATH_SEPARATOR, array(
        realpath(__DIR__ . '/../lib'),
        get_include_path(),
    ))
);
require 'autoloader.php';

// zookeeper address (one or more, separated by commas)
$zkaddress = 'localhost:8121';
// kafka topic to consume from
$topic = 'testtopic';
// kafka consumer group
$group = 'testgroup';
// socket buffer size: must be greater than the largest message in the queue
$socketBufferSize = 10485760; // 10 MB
// approximate maximum number of bytes to read in one batch
$maxBatchSize = 20971520; // 20 MB

$zookeeper  = new Zookeeper($zkaddress);
$zkconsumer = new Kafka_ZookeeperConsumer(
    new Kafka_Registry_Topic($zookeeper),
    new Kafka_Registry_Broker($zookeeper),
    new Kafka_Registry_Offset($zookeeper, $group),
    $topic,
    $socketBufferSize
);

$messages  = array();
$exception = null; // initialised so the check below is valid when no exception is thrown
try {
    foreach ($zkconsumer as $message) {
        // either process each message one by one, or collect them and process them in batches
        $messages[] = $message;
        if ($zkconsumer->getReadBytes() >= $maxBatchSize) {
            break;
        }
    }
} catch (Kafka_Exception_OffsetOutOfRange $exception) {
    // if we haven't received any messages, resync the offsets for the next run, then bail out
    if ($zkconsumer->getReadBytes() == 0) {
        $zkconsumer->resyncOffsets();
        die($exception->getMessage());
    }
    // if we did receive some messages before the exception, carry on
} catch (Kafka_Exception_Socket_Connection $exception) {
    // dealt with below
} catch (Kafka_Exception $exception) {
    // dealt with below
}
if (null !== $exception) {
    // if we haven't received any messages, bail out
    if ($zkconsumer->getReadBytes() == 0) {
        die($exception->getMessage());
    }
    // otherwise log the error, commit the offsets for the messages read so far and return the data
}

// process the data in batches and wait for an ACK
$success = doSomethingWithTheMessages($messages);
// once the data has been processed successfully, commit the byte offsets
if ($success) {
    $zkconsumer->commitOffsets();
}

// get an approximate figure for the size of the queue
try {
    echo "\nRemaining bytes in queue: " . $zkconsumer->getRemainingSize();
} catch (Kafka_Exception_Socket_Connection $exception) {
    die($exception->getMessage());
} catch (Kafka_Exception $exception) {
    die($exception->getMessage());
}
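
doSomethingWithTheMessages() is only a placeholder in the example above. A hypothetical implementation that appends each payload to a log file and reports success (so the caller knows it is safe to commit the offsets) could look like this; the log file path is an assumption, and payload() is the accessor used in the simple-consumer example, on the assumption that the ZooKeeper consumer yields the same kind of message object.

<?php
/**
 * Hypothetical batch handler for the ZooKeeper consumer example.
 * Returns true only if every payload was written, so offsets are
 * committed only for batches that were actually stored.
 */
function doSomethingWithTheMessages(array $messages)
{
    $handle = fopen('/tmp/kafka-batch.log', 'a'); // hypothetical destination
    if ($handle === false) {
        return false; // do not commit offsets if the batch could not be stored
    }
    foreach ($messages as $message) {
        fwrite($handle, $message->payload() . "\n");
    }
    fclose($handle);
    return true;
}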
