SSH and the rest of the setup were already done, but running one of the examples from the book throws a Connection Refused exception, as follows:

12/04/09 01:00:54 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 0 time(s).
12/04/09 01:00:56 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 1 time(s).
12/04/09 01:00:57 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 2 time(s).
12/04/09 01:00:58 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 3 time(s).
12/04/09 01:00:59 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 4 time(s).
12/04/09 01:01:00 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 5 time(s).
12/04/09 01:01:01 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 6 time(s).
12/04/09 01:01:02 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 7 time(s).
12/04/09 01:01:03 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 8 time(s).
12/04/09 01:01:04 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8020. Already tried 9 time(s).
Exception in thread "main" java.net.ConnectException: Call to localhost/127.0.0.1:8020 failed on connection exception: java.net.ConnectException: Connection refused
at org.apache.hadoop.ipc.Client.wrapException(Client.java:1095)
at org.apache.hadoop.ipc.Client.call(Client.java:1071)
at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
at $Proxy1.getProtocolVersion(Unknown Source)
at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:119)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:238)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:203)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1386)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1404)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
at com.lan.hadoop.FileSystemCat.main(FileSystemCat.java:15)

.............................

I remember hitting this exception before; back then it was because the NameNode wasn't running. This time, `hadoop fs -ls`, `-mkdir` and the like all worked fine, and jps showed a running namenode process, yet `telnet localhost 8020` failed and I couldn't figure out why. Finally Dong asked me whether the NameNode port was actually 8020, which hadn't even occurreded to me to check. Sure enough, looking at core-site.xml I found I had set fs.default.name to

hdfs://localhost:9000 — so the port number was simply wrong. Changing the port to 9000 in the code fixed it.
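For reference, here is a minimal sketch of what the fixed program might look like. It assumes the code follows the book's FileSystemCat example; only the package/class name (com.lan.hadoop.FileSystemCat) comes from the stack trace above, the rest is illustrative:

package com.lan.hadoop;

import java.io.InputStream;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class FileSystemCat {
    public static void main(String[] args) throws Exception {
        // The port in the URI must match fs.default.name in core-site.xml,
        // e.g. hdfs://localhost:9000/user/lan/test.txt (9000 here, not 8020).
        String uri = args[0];
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(uri), conf);
        InputStream in = null;
        try {
            in = fs.open(new Path(uri));
            IOUtils.copyBytes(in, System.out, 4096, false);
        } finally {
            IOUtils.closeStream(in);
        }
    }
}

Alternatively, if the cluster's core-site.xml is on the program's classpath, FileSystem.get(conf) will resolve fs.default.name itself, so the NameNode address and port never need to be hardcoded in the code at all.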
