hbase_exception_03_java.io.EOFException: Premature EOF: no length prefix available
1. Symptoms
After changing the Hadoop configuration files core-site.xml and mapred-site.xml and restarting Hadoop and HBase, the following exceptions appeared in the HBase log:
2018-03-22 15:56:09,948 WARN [ResponseProcessor for block BP-792111345-192.168.1.102-1521639243869:blk_1073741858_1034] hdfs.DFSClient: DFSOutputStream ResponseProcessor exception for block BP-792111345-192.168.1.102-1521639243869:blk_1073741858_1034
java.io.EOFException: Premature EOF: no length prefix available
at org.apache.hadoop.hdfs.protocolPB.PBHelper.vintPrefixed(PBHelper.java:2294)
at org.apache.hadoop.hdfs.protocol.datatransfer.PipelineAck.readFields(PipelineAck.java:244)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer$ResponseProcessor.run(DFSOutputStream.java:847)
2018-03-22 15:56:09,951 WARN [ResponseProcessor for block BP-792111345-192.168.1.102-1521639243869:blk_1073741857_1033] hdfs.DFSClient: DFSOutputStream ResponseProcessor exception for block BP-792111345-192.168.1.102-1521639243869:blk_1073741857_1033
java.io.EOFException: Premature EOF: no length prefix available
at org.apache.hadoop.hdfs.protocolPB.PBHelper.vintPrefixed(PBHelper.java:2294)
at org.apache.hadoop.hdfs.protocol.datatransfer.PipelineAck.readFields(PipelineAck.java:244)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer$ResponseProcessor.run(DFSOutputStream.java:847)
2018-03-22 15:56:22,866 INFO [regionserver/rayner/192.168.1.102:0.logRoller] wal.FSHLog:
java.io.IOException: All datanodes DatanodeInfoWithStorage[127.0.0.1:50010,DS-44043b27-9b72-419d-9b17-372546490c57,DISK] are bad. Aborting...
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.setupPipelineForAppendOrRecovery(DFSOutputStream.java:1224)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.processDatanodeError(DFSOutputStream.java:990)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:507)
2018-03-22 15:56:23,381 INFO [RS_OPEN_META-rayner:48626-0-MetaLogRoller] wal.FSHLog:
java.io.IOException: All datanodes DatanodeInfoWithStorage[127.0.0.1:50010,DS-44043b27-9b72-419d-9b17-372546490c57,DISK] are bad. Aborting...
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.setupPipelineForAppendOrRecovery(DFSOutputStream.java:1224)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.processDatanodeError(DFSOutputStream.java:990)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:507)
2018-03-22 15:56:25,296 WARN [LeaseRenewer:ray@localhost:9000] hdfs.LeaseRenewer: Failed to renew lease for [DFSClient_NONMAPREDUCE_-1494745515_1] for 30 seconds. Will retry shortly ...
java.net.ConnectException: Call From rayner/192.168.1.102 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:792)
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:732)
at org.apache.hadoop.ipc.Client.call(Client.java:1480)
at org.apache.hadoop.ipc.Client.call(Client.java:1413)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
at com.sun.proxy.$Proxy15.renewLease(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.renewLease(ClientNamenodeProtocolTranslatorPB.java:595)
at sun.reflect.GeneratedMethodAccessor22.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy16.renewLease(Unknown Source)
at sun.reflect.GeneratedMethodAccessor22.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.hbase.fs.HFileSystem$1.invoke(HFileSystem.java:307)
at com.sun.proxy.$Proxy17.renewLease(Unknown Source)
at sun.reflect.GeneratedMethodAccessor22.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.hbase.fs.HFileSystem$1.invoke(HFileSystem.java:307)
at com.sun.proxy.$Proxy17.renewLease(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.renewLease(DFSClient.java:892)
at org.apache.hadoop.hdfs.LeaseRenewer.renew(LeaseRenewer.java:423)
at org.apache.hadoop.hdfs.LeaseRenewer.run(LeaseRenewer.java:448)
at org.apache.hadoop.hdfs.LeaseRenewer.access$700(LeaseRenewer.java:71)
at org.apache.hadoop.hdfs.LeaseRenewer$1.run(LeaseRenewer.java:304)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:744)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:615)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:713)
at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:376)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1529)
at org.apache.hadoop.ipc.Client.call(Client.java:1452)
... 26 more
2. Cause
Judging from the stack traces above, the key entry is the lease-renewal failure: java.net.ConnectException: Call From rayner/192.168.1.102 to localhost:9000 failed on connection exception: Connection refused. After the configuration change and restart, the NameNode at localhost:9000 could not be reached: either HDFS had not finished coming back up, or the address it now listens on no longer matched what the HBase client was using. With the NameNode unreachable and the datanode write pipeline broken, the RegionServer's open WAL streams stop receiving acks, the DFSClient's ResponseProcessor reads an empty ack stream ("Premature EOF: no length prefix available"), and it eventually gives up with "All datanodes ... are bad. Aborting...".
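The HDFS address the client uses comes from fs.defaultFS in core-site.xml. The original post does not show the file, so the entry below is only an illustrative sketch that matches the localhost:9000 address seen in the log:

<!-- core-site.xml (illustrative sketch; the actual file from this setup is not shown in the post) -->
<configuration>
  <property>
    <!-- NameNode URI that HBase's DFS client connects to; must match a running NameNode -->
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>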
3. Solution
The exception is a symptom of HBase losing its HDFS connection during the restart rather than a problem inside HBase itself. Bring the services back up in order: stop HBase, make sure HDFS is fully started with the NameNode listening on the address configured in core-site.xml (localhost:9000 here) and out of safe mode, then start HBase again so the RegionServers open fresh WAL files. A rough command sequence is sketched below.
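A possible restart sequence, assuming a standard Hadoop 2.x / HBase installation with the sbin and bin scripts on the PATH (adjust to your own layout):

# Stop HBase first so RegionServers stop writing to the broken pipelines
stop-hbase.sh

# Restart HDFS (restart YARN/MapReduce too if its configuration also changed)
stop-dfs.sh
start-dfs.sh

# Verify HDFS is healthy before touching HBase again
jps                          # should list NameNode and DataNode processes
hdfs dfsadmin -report        # datanodes should be reported as live
hdfs dfsadmin -safemode get  # should print "Safe mode is OFF"

# Start HBase once HDFS is confirmed up
start-hbase.sh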
4. References
1. java.io.EOFException: Premature EOF: no length prefix available