Without further ado, straight to the useful part!

Problem details

After starting the Hadoop cluster, the DataNode daemons fail to come up on the worker nodes; the DataNode log reports java.io.IOException: Incorrect configuration: namenode address dfs.namenode.servicerpc-address or dfs.namenode.rpc-address is not configured.

Troubleshooting

Start the cluster and check which daemons are actually running:

spark@master:~/app/hadoop$ sbin/start-all.sh
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
Starting namenodes on [master]
master: starting namenode, logging to /home/spark/app/hadoop-2.7.3/logs/hadoop-spark-namenode-master.out
slave1: starting datanode, logging to /home/spark/app/hadoop-2.7.3/logs/hadoop-spark-datanode-slave1.out
slave2: starting datanode, logging to /home/spark/app/hadoop-2.7.3/logs/hadoop-spark-datanode-slave2.out
Starting secondary namenodes [master]
master: starting secondarynamenode, logging to /home/spark/app/hadoop-2.7.3/logs/hadoop-spark-secondarynamenode-master.out
starting yarn daemons
starting resourcemanager, logging to /home/spark/app/hadoop-2.7.3/logs/yarn-spark-resourcemanager-master.out
slave2: starting nodemanager, logging to /home/spark/app/hadoop-2.7.3/logs/yarn-spark-nodemanager-slave2.out
slave1: starting nodemanager, logging to /home/spark/app/hadoop-2.7.3/logs/yarn-spark-nodemanager-slave1.out
spark@master:~/app/hadoop$ jps
SecondaryNameNode
NameNode
ResourceManager
sun.tools.jps.Jps
spark@master:~/app/hadoop$
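jps on the master shows the NameNode, SecondaryNameNode, and ResourceManager, but the DataNodes on the workers never came up. You can confirm this from the master over ssh — a quick sketch, assuming passwordless ssh to the workers (which start-all.sh already requires) and that jps is on the remote PATH:

spark@master:~/app/hadoop$ ssh slave1 jps
spark@master:~/app/hadoop$ ssh slave2 jps

If no DataNode appears in either output, the daemon failed during startup.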

Solution

First, inspect the DataNode log on the worker where the daemon failed to start:

spark@slave1:~/app/hadoop-2.7.3/logs$ cat hadoop-spark-datanode-slave1.log
INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG: host = slave1/192.168.80.146
STARTUP_MSG: args = []
STARTUP_MSG: version = 2.7.3
STARTUP_MSG: classpath = /home/spark/app/hadoop-2.7.3/etc/hadoop:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/slf4j-api-1.7.10.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/commons-io-2.4.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/commons-net-3.1.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/htrace-core-3.1.0-incubating.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/jersey-server-1.9.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/commons-configuration-1.6.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/hamcrest-core-1.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/commons-digester-1.8.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/guava-11.0.2.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/gson-2.2.4.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/netty-3.6.2.Final.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/servlet-api-2.5.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/hadoop-auth-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/jsp-api-2.1.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/httpclient-4.2.5.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/commons-collections-3.2.2.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/xmlenc-0.52.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/jersey-core-1.9.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/curator-client-2.7.1.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/jsch-0.1.42.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/httpcore-4.2.5.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/commons-cli-1.2.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/commons-lang-2.6.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/jets3t-0.9.0.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/activation-1.1.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/xz-1.0.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/commons-logging-1.1.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/paranamer-2.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/commons-compress-1.4.1.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/curator-framework-2.7.1.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/asm-3.2.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/junit-4.11.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/curator-recipes-2.7.1.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/jersey-json-1.9.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/commons-math3-3.1.1.jar:/home/spark/app/hadoop-2
.7.3/share/hadoop/common/lib/avro-1.7.4.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/zookeeper-3.4.6.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/commons-httpclient-3.1.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/stax-api-1.0-2.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/jetty-6.1.26.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/commons-codec-1.4.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/jettison-1.1.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/hadoop-annotations-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/jetty-util-6.1.26.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/java-xmlbuilder-0.4.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/mockito-all-1.8.5.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/log4j-1.2.17.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/api-util-1.0.0-M20.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/lib/jsr305-3.0.0.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/hadoop-common-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/hadoop-common-2.7.3-tests.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/common/hadoop-nfs-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/commons-io-2.4.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/htrace-core-3.1.0-incubating.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/leveldbjni-all-1.8.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/guava-11.0.2.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/netty-all-4.0.23.Final.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/commons-lang-2.6.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/asm-3.2.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/xercesImpl-2.9.1.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/xml-apis-1.3.04.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/lib/jsr305-3.0.0.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/hadoop-hdfs-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/hadoop-hdfs-nfs-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/hdfs/hadoop-hdfs-2.7.3-tests.jar:/hom
e/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/commons-io-2.4.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/jackson-mapper-asl-1.9.13.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/jersey-server-1.9.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/leveldbjni-all-1.8.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/guava-11.0.2.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/servlet-api-2.5.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/zookeeper-3.4.6-tests.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/commons-collections-3.2.2.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/jersey-core-1.9.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/commons-cli-1.2.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/commons-lang-2.6.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/activation-1.1.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/xz-1.0.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/aopalliance-1.0.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/commons-logging-1.1.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/asm-3.2.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/jersey-json-1.9.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/javax.inject-1.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/jersey-client-1.9.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/zookeeper-3.4.6.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/stax-api-1.0-2.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/jetty-6.1.26.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/commons-codec-1.4.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/jettison-1.1.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/jetty-util-6.1.26.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/guice-3.0.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/log4j-1.2.17.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/lib/jsr305-3.0.0.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/hadoop-yarn-client-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/hadoop-yarn-server-tests-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/hadoop-yarn-common-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/home/spark/app/hadoop-2.7.3/sh
are/hadoop/yarn/hadoop-yarn-api-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/hadoop-yarn-server-common-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/yarn/hadoop-yarn-registry-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/commons-io-2.4.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/jackson-core-asl-1.9.13.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/leveldbjni-all-1.8.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/xz-1.0.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/asm-3.2.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/junit-4.11.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/javax.inject-1.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/hadoop-annotations-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/guice-3.0.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/home/spark/app/hadoop-2.7.3/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar:/contrib/capacity-scheduler/*.jar
STARTUP_MSG: build = https://git-wip-us.apache.org/repos/asf/hadoop.git -r baa91f7c6bc9cb92be5982de4719c1c8af91ccff; compiled by 'root' on 2016-08-18T01:41Z
STARTUP_MSG: java = 1.8.0_60
************************************************************/
INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT]
INFO org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at second(s).
INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started
INFO org.apache.hadoop.hdfs.server.datanode.BlockScanner: Initialized block scanner with targetBytesPerSec
INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is slave1
INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory =
INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /0.0.0.0:
INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwith is bytes/s
INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is
INFO org.mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
INFO org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets.
INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.datanode is not defined
INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context datanode
INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context logs
INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port
INFO org.mortbay.log: jetty-6.1.
INFO org.mortbay.log: Started HttpServer2$SelectChannelConnectorWithSafeStartup@localhost:
INFO org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer: Listening HTTP traffic on /0.0.0.0:
INFO org.apache.hadoop.hdfs.server.datanode.DataNode: dnUserName = spark
INFO org.apache.hadoop.hdfs.server.datanode.DataNode: supergroup = supergroup
INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue class java.util.concurrent.LinkedBlockingQueue
INFO org.apache.hadoop.ipc.Server: Starting Socket Reader # for port
INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened IPC server at /0.0.0.0:
INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Refresh request received for nameservices: null
INFO org.mortbay.log: Stopped HttpServer2$SelectChannelConnectorWithSafeStartup@localhost:
INFO org.apache.hadoop.ipc.Server: Stopping server on
INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Stopping DataNode metrics system...
INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system stopped.
INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system shutdown complete.
INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Shutdown complete.
FATAL org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in secureMain
java.io.IOException: Incorrect configuration: namenode address dfs.namenode.servicerpc-address or dfs.namenode.rpc-address is not configured.
at org.apache.hadoop.hdfs.DFSUtil.getNNServiceRpcAddressesForCluster(DFSUtil.java:)
at org.apache.hadoop.hdfs.server.datanode.BlockPoolManager.refreshNamenodes(BlockPoolManager.java:)
at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:)
at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:)
at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:)
at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:)
at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:)
at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:)
at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:)
INFO org.apache.hadoop.util.ExitUtil: Exiting with status
INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at slave1/192.168.80.146
************************************************************/
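The FATAL entry at the end is the real clue: the DataNode shut itself down because it could not determine the NameNode's RPC address. In a non-HA cluster that address is derived from fs.defaultFS in core-site.xml (or from dfs.namenode.rpc-address in hdfs-site.xml, if set), so "is not configured" almost always means core-site.xml on the worker is missing this property or was never synced from the master. As a quick check, you can ask Hadoop which value the worker actually resolves — hdfs getconf is a standard Hadoop 2.x command; the output shown is what you should see once the configuration below is in place:

spark@slave1:~/app/hadoop-2.7.3$ bin/hdfs getconf -confKey fs.defaultFS
hdfs://master:9000

The fix is to give every node a core-site.xml whose fs.defaultFS points at the NameNode. The working configuration used here: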

<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://master:9000</value>
  </property>
  <property>
    <name>io.file.buffer.size</name>
    <value></value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/usr/local/hadoop/hadoop-2.6./tmp</value>
  </property>
  <property>
    <name>hadoop.proxyuser.hadoop.hosts</name>
    <value>*</value>
  </property>
  <property>
    <name>hadoop.proxyuser.hadoop.groups</name>
    <value>*</value>
  </property>
</configuration>
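After correcting core-site.xml on the master, copy it to every worker and restart HDFS, since a DataNode only reads its configuration at startup. A minimal sketch, assuming the same install layout on all nodes as in the logs above and passwordless ssh between them:

spark@master:~/app/hadoop$ scp etc/hadoop/core-site.xml slave1:~/app/hadoop/etc/hadoop/
spark@master:~/app/hadoop$ scp etc/hadoop/core-site.xml slave2:~/app/hadoop/etc/hadoop/
spark@master:~/app/hadoop$ sbin/stop-dfs.sh
spark@master:~/app/hadoop$ sbin/start-dfs.sh
spark@master:~/app/hadoop$ ssh slave1 jps    # a DataNode process should now be listed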

Success! The DataNodes now start normally.
