Connecting to Kafka from Java
1. Maven dependencies (pom.xml):
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.xxx.test</groupId>
    <artifactId>xxx</artifactId>
    <version>1.0-SNAPSHOT</version>
    <inceptionYear>2008</inceptionYear>

    <properties>
        <scala.version>2.11.12</scala.version>
        <kafka.version>0.10.2.0</kafka.version>
        <!--<kafka.version>1.1.0</kafka.version>-->
    </properties>

    <repositories>
        <repository>
            <id>scala-tools.org</id>
            <name>Scala-Tools Maven2 Repository</name>
            <url>http://scala-tools.org/repo-releases</url>
        </repository>
    </repositories>

    <pluginRepositories>
        <pluginRepository>
            <id>scala-tools.org</id>
            <name>Scala-Tools Maven2 Repository</name>
            <url>http://scala-tools.org/repo-releases</url>
        </pluginRepository>
    </pluginRepositories>

    <dependencies>
        <!-- https://mvnrepository.com/artifact/org.apache.kafka/kafka -->
        <dependency>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka_2.12</artifactId>
            <version>${kafka.version}</version>
        </dependency>
        <!-- https://mvnrepository.com/artifact/org.apache.kafka/kafka-clients -->
        <dependency>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka-clients</artifactId>
            <version>${kafka.version}</version>
        </dependency>
    </dependencies>

    <build>
        <sourceDirectory>src/main/scala</sourceDirectory>
        <testSourceDirectory>src/test/scala</testSourceDirectory>
        <plugins>
            <plugin>
                <groupId>org.scala-tools</groupId>
                <artifactId>maven-scala-plugin</artifactId>
                <executions>
                    <execution>
                        <goals>
                            <goal>compile</goal>
                            <goal>testCompile</goal>
                        </goals>
                    </execution>
                </executions>
                <configuration>
                    <scalaVersion>${scala.version}</scalaVersion>
                    <args>
                        <arg>-target:jvm-1.8</arg>
                    </args>
                </configuration>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-eclipse-plugin</artifactId>
                <configuration>
                    <downloadSources>true</downloadSources>
                    <buildcommands>
                        <buildcommand>ch.epfl.lamp.sdt.core.scalabuilder</buildcommand>
                    </buildcommands>
                    <additionalProjectnatures>
                        <projectnature>ch.epfl.lamp.sdt.core.scalanature</projectnature>
                    </additionalProjectnatures>
                    <classpathContainers>
                        <classpathContainer>org.eclipse.jdt.launching.JRE_CONTAINER</classpathContainer>
                        <classpathContainer>ch.epfl.lamp.sdt.launching.SCALA_CONTAINER</classpathContainer>
                    </classpathContainers>
                </configuration>
            </plugin>
            <plugin>
                <artifactId>maven-assembly-plugin</artifactId>
                <configuration>
                    <appendAssemblyId>false</appendAssemblyId>
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                    <archive>
                        <manifest>
                            <!-- Specify the class containing the main method here -->
                            <mainClass></mainClass>
                        </manifest>
                    </archive>
                </configuration>
                <executions>
                    <execution>
                        <id>make-assembly</id>
                        <phase>package</phase>
                        <goals>
                            <goal>assembly</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>

    <reporting>
        <plugins>
            <plugin>
                <groupId>org.scala-tools</groupId>
                <artifactId>maven-scala-plugin</artifactId>
                <configuration>
                    <scalaVersion>${scala.version}</scalaVersion>
                </configuration>
            </plugin>
        </plugins>
    </reporting>
</project>
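With the maven-assembly-plugin bound to the package phase as above, a standard Maven build (assuming <mainClass> has been filled in with your entry class) produces a single runnable jar containing all dependencies:

    mvn clean package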
2. Java code (a simple consumer):
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

import java.util.Collections;
import java.util.Properties;

public class MyConsumer {
    private final static String TOPIC = "test_topic";
    private final static String KAFKA_SERVER_URL = "10.31.7.200";
    private final static String KAFKA_SERVER_PORT = "9092";
    public static final String HOST_NAME = KAFKA_SERVER_URL;

    public static KafkaConsumer<String, String> getConsumer() {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, KAFKA_SERVER_URL + ":" + KAFKA_SERVER_PORT);
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "DemoConsumer");
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "true");
        props.put(ConsumerConfig.AUTO_COMMIT_INTERVAL_MS_CONFIG, "1000");
        props.put(ConsumerConfig.SESSION_TIMEOUT_MS_CONFIG, "30000");
        props.put("host.name", HOST_NAME);
        // Both deserializers must implement org.apache.kafka.common.serialization.Deserializer
        // and must match the types the producer actually wrote to the topic (see section 3).
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringDeserializer");
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringDeserializer");
        // props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.IntegerDeserializer");
        // props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringSerializer");   // wrong: a Serializer, see section 3
        // props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, "org.apache.kafka.connect.json.JsonConverter");              // wrong: not a Deserializer, see section 3
        return new KafkaConsumer<String, String>(props);
    }

    public static void main(String[] args) {
        KafkaConsumer<String, String> consumer = getConsumer();
        consumer.subscribe(Collections.singletonList(TOPIC));
        ConsumerRecords<String, String> records = consumer.poll(1000);
        for (ConsumerRecord<String, String> record : records) {
            System.out.println("Received message: (" + record.key() + ", " + record.value() + ") at offset " + record.offset());
        }
        consumer.close();
    }
}
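The consumer above only reads from test_topic. For completeness, here is a minimal producer sketch (not part of the original post) that writes a string message to the same topic; the broker address reuses the constants above, and the serializers are chosen to match the consumer's StringDeserializer configuration:

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;

import java.util.Properties;

public class MyProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Same broker address as the consumer above
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "10.31.7.200:9092");
        // Producers use Serializer implementations; the consumer side uses Deserializers
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringSerializer");
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringSerializer");

        KafkaProducer<String, String> producer = new KafkaProducer<String, String>(props);
        producer.send(new ProducerRecord<String, String>("test_topic", "key1", "hello kafka"));
        producer.flush();
        producer.close();
    }
}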
3. Possible errors:
Set props.put("host.name", ip); otherwise the following error may appear:
org.apache.kafka.common.errors.TimeoutException
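In many cases this timeout is fixed not on the client but in the broker's listener settings: if the broker advertises a hostname the client cannot resolve or reach, requests time out. A sketch of the relevant server.properties entries on the broker (illustrative values, using the example IP from the code above; adjust for your environment):

# server.properties (broker side), illustrative values
listeners=PLAINTEXT://0.0.0.0:9092
advertised.listeners=PLAINTEXT://10.31.7.200:9092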
For the errors below, check the key/value deserializer classes: the configured class must be an implementation of org.apache.kafka.common.serialization.Deserializer, and it must match the type of data actually written to the topic.
Exception in thread "main" org.apache.kafka.common.errors.SerializationException: Error deserializing key/value for partition event_topic-0 at offset 161529729
Caused by: org.apache.kafka.common.errors.SerializationException: Size of data received by IntegerDeserializer is not 4
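This SerializationException means the configured deserializer does not match the bytes stored in the topic: IntegerDeserializer expects exactly 4 bytes per key, so it fails on keys that were produced as strings. A sketch of a consistent configuration, assuming the producer wrote Integer keys and String values (these lines would replace the deserializer settings in getConsumer() above, with the generic types adjusted to match):

props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,
        "org.apache.kafka.common.serialization.IntegerDeserializer");
props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
        "org.apache.kafka.common.serialization.StringDeserializer");
KafkaConsumer<Integer, String> consumer = new KafkaConsumer<Integer, String>(props);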
1. value.deserializer must be set; it has no default value:
Exception in thread "main" org.apache.kafka.common.config.ConfigException: Missing required configuration "value.deserializer" which has no default value.
at org.apache.kafka.common.config.ConfigDef.parse(ConfigDef.java:436)
at org.apache.kafka.common.config.AbstractConfig.<init>(AbstractConfig.java:56)
at org.apache.kafka.common.config.AbstractConfig.<init>(AbstractConfig.java:63)
at org.apache.kafka.clients.consumer.ConsumerConfig.<init>(ConsumerConfig.java:426)
at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:597)
at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:579)
at com.fumi.test.MyConsumer2.get(MyConsumer2.java:46)
at com.fumi.test.MyConsumer2.main(MyConsumer2.java:56)
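Both key.deserializer and value.deserializer are mandatory for KafkaConsumer and have no defaults. One way to avoid typos in the long class-name strings (an equivalent alternative to the string literals used in getConsumer() above) is to reference the classes directly:

import org.apache.kafka.common.serialization.StringDeserializer;

props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());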
2. The key/value deserializer was set to org.apache.kafka.connect.json.JsonConverter, which is not an implementation of org.apache.kafka.common.serialization.Deserializer, so constructing the consumer fails:
Exception in thread "main" org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:717)
at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:597)
at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:579)
at com.fumi.test.MyConsumer2.get(MyConsumer2.java:46)
at com.fumi.test.MyConsumer2.main(MyConsumer2.java:56)
Caused by: org.apache.kafka.common.KafkaException: org.apache.kafka.connect.json.JsonConverter is not an instance of org.apache.kafka.common.serialization.Deserializer
at org.apache.kafka.common.config.AbstractConfig.getConfiguredInstance(AbstractConfig.java:205)
at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:645)
... 4 more
The following error has the same cause: org.apache.kafka.common.serialization.StringSerializer is a producer-side serializer, not a Deserializer implementation:
Exception in thread "main" org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:717)
at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:597)
at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:579)
at com.fumi.test.MyConsumer2.get(MyConsumer2.java:46)
at com.fumi.test.MyConsumer2.main(MyConsumer2.java:56)
Caused by: org.apache.kafka.common.KafkaException: org.apache.kafka.common.serialization.StringSerializer is not an instance of org.apache.kafka.common.serialization.Deserializer
at org.apache.kafka.common.config.AbstractConfig.getConfiguredInstance(AbstractConfig.java:205)
at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:637)
... 4 more