Hadoop is an open-source project of the Apache Foundation. It provides developers with the foundation of a distributed system, so that users can write distributed applications without understanding the low-level details of distributed systems, exploiting the power of a cluster for high-speed computation and storage. The Hadoop project includes a distributed file system, HDFS, a distributed parallel programming framework, MapReduce, and many subprojects including Hive, HBase, Mahout, Pig, ZooKeeper, Avro, and Chukwa.

The two main parts of Hadoop are distributed storage (HDFS) and distributed computation (MapReduce). HDFS has a master/slave architecture: in a typical deployment, a single NameNode runs on the master, and one DataNode runs on each slave. MapReduce is a programming model for computations over large volumes of data. Its name comes from the model's two core operations: map and reduce. Map transforms one set of data, element by element, into another set; reduce folds a set of data down to a summary result. Both the mapping and the reduction rules are specified by user-supplied functions.
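The map and reduce operations above can be sketched in plain Python, with no Hadoop involved. This is only an illustration of the programming model (here counting words), not Hadoop's actual API:

```python
from itertools import groupby
from operator import itemgetter

def map_fn(line):
    """Map: turn one input record into (key, value) pairs (here: word counts)."""
    for word in line.split():
        yield (word, 1)

def reduce_fn(key, values):
    """Reduce: fold all values that share a key into one result."""
    return (key, sum(values))

def run_mapreduce(records):
    # "Shuffle" phase: sort and group intermediate pairs by key,
    # as the framework would before handing them to the reducers.
    pairs = sorted(p for r in records for p in map_fn(r))
    return [reduce_fn(k, (v for _, v in g))
            for k, g in groupby(pairs, key=itemgetter(0))]

print(run_mapreduce(["a b a", "b c"]))  # [('a', 2), ('b', 2), ('c', 1)]
```

In real Hadoop the mapper and reducer run in parallel on many nodes, and the shuffle moves data across the network; the per-key grouping contract is the same.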

The Hadoop family includes many projects, such as:

HDFS: a distributed file system; an open-source implementation of GFS.

MapReduce: a distributed parallel programming model and program execution framework; an open-source implementation of Google's MapReduce.

Common: the core of the whole Hadoop project, comprising a set of components and interfaces for distributed file systems and general I/O (serialization, Java RPC, and persistent data structures).

Avro: a serialization system supporting efficient, cross-language RPC and persistent data storage.

Pig: a dataflow language and runtime for exploring very large datasets, running on HDFS and MapReduce clusters.

Hive: a distributed data warehouse. Hive manages data stored in HDFS and provides an SQL-based query language (translated into MapReduce jobs by the runtime engine) for querying the data.

HBase: a distributed, column-oriented database. HBase uses HDFS for its underlying storage, and supports both batch-style computation with MapReduce and point queries (random reads).

Mahout: a library of machine learning algorithms that run on Hadoop.

ZooKeeper: a distributed, highly available coordination service. ZooKeeper provides primitives such as distributed locks that can be used to build distributed applications.

Cassandra: an open-source distributed NoSQL database, originally used to store simple-format data such as inboxes; it combines Google Bigtable's data model with Amazon Dynamo's fully distributed architecture.

From an end user's point of view, HDFS looks like a traditional file system: files can be created, read, updated, and deleted through directory paths. An HDFS cluster consists of one NameNode and a number of DataNodes. The NameNode is the master server: it manages the file system namespace and clients' access to files. The DataNodes are the ordinary nodes of the cluster, responsible for storing data on their nodes. A client contacts the NameNode to obtain file metadata, while file I/O is performed directly against the DataNodes. HDFS lets users store data as files; each file is split into blocks, with a typical block size of 64 MB, and the blocks are spread across different DataNodes as far as possible. The NameNode executes the namespace operations of the file system and maintains the mapping from blocks to DataNodes. The DataNodes serve read and write requests from file system clients, and create, delete, and replicate blocks under the NameNode's coordination.
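The block splitting described above is simple arithmetic. A quick sketch, using a hypothetical num_blocks helper (64 MB is the Hadoop 1.x default, configurable via dfs.block.size):

```python
import math

BLOCK_SIZE = 64 * 1024 * 1024  # HDFS default block size in Hadoop 1.x: 64 MB

def num_blocks(file_size_bytes):
    """How many HDFS blocks a file of the given size occupies."""
    return max(1, math.ceil(file_size_bytes / BLOCK_SIZE))

# A 200 MB file is split into four blocks: 64 + 64 + 64 + 8 MB.
# The final, partial block only consumes its actual 8 MB on disk,
# not a full 64 MB.
print(num_blocks(200 * 1024 * 1024))  # 4
```

Each of those blocks is then replicated to several DataNodes (dfs.replication copies), which is why losing a single node does not lose the file.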

Both programs written against MapReduce and the compilation of Hadoop itself depend on the JDK.

After installation, the Hadoop directory looks like this:

drwxr-xr-x  3 yj70978 retailfi    4096 Jan 30 2013 share

drwxr-xr-x  9 yj70978 retailfi    4096 Jan 30 2013 webapps

-rw-r--r--  1 yj70978 retailfi  306534 Jan 30 2013 hadoop-tools-1.1.2.jar

-rw-r--r--  1 yj70978 retailfi 2778017 Jan 30  2013 hadoop-test-1.1.2.jar

-rw-r--r--  1 yj70978 retailfi     414 Jan 30 2013 hadoop-minicluster-1.1.2.jar

-rw-r--r--  1 yj70978 retailfi  142453 Jan 30 2013 hadoop-examples-1.1.2.jar

-rw-r--r--  1 yj70978 retailfi 4035539 Jan 30  2013 hadoop-core-1.1.2.jar

-rw-r--r--  1 yj70978 retailfi     410 Jan 30 2013 hadoop-client-1.1.2.jar

-rw-r--r--  1 yj70978 retailfi    6840 Jan 30 2013 hadoop-ant-1.1.2.jar

drwxr-xr-x 10 yj70978 retailfi    4096 Jan 30  2013 contrib

-rw-r--r--  1 yj70978 retailfi    1366 Jan 30 2013 README.txt

-rw-r--r--  1 yj70978 retailfi     101 Jan 30 2013 NOTICE.txt

-rw-r--r--  1 yj70978 retailfi   13366 Jan 30 2013 LICENSE.txt

-rw-r--r--  1 yj70978 retailfi   10525 Jan 30 2013 ivy.xml

-rw-r--r--  1 yj70978 retailfi  467130 Jan 30 2013 CHANGES.txt

drwxr-xr-x  4 yj70978 retailfi    4096 Jan 30 2013 c++

-rw-r--r--  1 yj70978 retailfi  120025 Jan 30 2013 build.xml

drwxr-xr-x  5 yj70978 retailfi    4096 Jul 22 07:49 lib

drwxr-xr-x  2 yj70978 retailfi    4096 Jul 22 07:49 ivy

drwxr-xr-x  6 yj70978 retailfi    4096 Jul 22 07:49 docs

drwxr-xr-x 16 yj70978 retailfi    4096 Jul 22 07:49 src

drwxr-xr-x  2 yj70978 retailfi    4096 Jul 22 07:49 sbin

drwxr-xr-x  2 yj70978 retailfi    4096 Jul 22 07:49 libexec

drwxr-xr-x  2 yj70978 retailfi    4096 Jul 22 07:58 bin

drwxr-xr-x  2 yj70978 retailfi    4096 Jul 22 07:59 output

drwxr-xr-x  2 yj70978 retailfi    4096 Jul 22 21:34 conf

drwxr-xr-x  2 yj70978 retailfi    4096 Jul 22 23:42 logs

Edit conf/hadoop-env.sh to define JAVA_HOME, then run ./hadoop from the bin directory; it prints usage information:

$ ./hadoop

Usage: hadoop [--config confdir] COMMAND
where COMMAND is one of:
  namenode -format     format the DFS filesystem
  secondarynamenode    run the DFS secondary namenode
  namenode             run the DFS namenode
  datanode             run a DFS datanode
  dfsadmin             run a DFS admin client
  mradmin              run a Map-Reduce admin client
  fsck                 run a DFS filesystem checking utility
  fs                   run a generic filesystem user client
  balancer             run a cluster balancing utility
  fetchdt              fetch a delegation token from the NameNode
  jobtracker           run the MapReduce job Tracker node
  pipes                run a Pipes job
  tasktracker          run a MapReduce task Tracker node
  historyserver        run job history servers as a standalone daemon
  job                  manipulate MapReduce jobs
  queue                get information regarding JobQueues
  version              print the version
  jar <jar>            run a jar file
  distcp <srcurl> <desturl> copy file or directories recursively
  archive -archiveName NAME -p <parent path> <src>* <dest> create a hadoop archive
  classpath            prints the class path needed to get the
                       Hadoop jar and the required libraries
  daemonlog            get/set the log level for each daemon
 or
  CLASSNAME            run the class named CLASSNAME
Most commands print help when invoked w/o parameters.

Hadoop can run in three modes:

Local (standalone) mode

Pseudo-distributed mode

Fully distributed mode

Standalone operation:

This is the default mode.

$ mkdir input
$ cp conf/*.xml input

Run the bundled hadoop-examples*.jar to search the input folder for strings matching dfs[a-z.] and count the occurrences:

$ bin/hadoop jar hadoop-examples-*.jar grep input output1 'dfs[a-z.]'

Output:

13/08/19 04:57:39 INFO util.NativeCodeLoader: Loaded the native-hadoop library
13/08/19 04:57:39 WARN snappy.LoadSnappy: Snappy native library not loaded
13/08/19 04:57:39 INFO mapred.FileInputFormat: Total input paths to process : 7
13/08/19 04:57:39 INFO mapred.JobClient: Running job: job_local_0001
13/08/19 04:57:40 INFO util.ProcessTree: setsid exited with exit code 0
13/08/19 04:57:40 INFO mapred.Task:  Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@8474463
13/08/19 04:57:40 INFO mapred.MapTask: numReduceTasks: 1
13/08/19 04:57:40 INFO mapred.MapTask: io.sort.mb = 100
13/08/19 04:57:40 INFO mapred.MapTask: data buffer = 79691776/99614720
13/08/19 04:57:40 INFO mapred.MapTask: record buffer = 262144/327680
13/08/19 04:57:40 INFO mapred.MapTask: Starting flush of map output
13/08/19 04:57:40 INFO mapred.Task: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
13/08/19 04:57:40 INFO mapred.LocalJobRunner: file:/home/yj70978/hadoop/hadoop-1.1.2/input/capacity-scheduler.xml:0+7457
13/08/19 04:57:40 INFO mapred.Task: Task 'attempt_local_0001_m_000000_0' done.
13/08/19 04:57:40 INFO mapred.Task:  Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@8fae75c
13/08/19 04:57:40 INFO mapred.MapTask: numReduceTasks: 1
13/08/19 04:57:40 INFO mapred.MapTask: io.sort.mb = 100
13/08/19 04:57:40 INFO mapred.MapTask: data buffer = 79691776/99614720
13/08/19 04:57:40 INFO mapred.MapTask: record buffer = 262144/327680
13/08/19 04:57:40 INFO mapred.MapTask: Starting flush of map output
13/08/19 04:57:40 INFO mapred.MapTask: Finished spill 0
13/08/19 04:57:40 INFO mapred.Task: Task:attempt_local_0001_m_000001_0 is done. And is in the process of commiting
13/08/19 04:57:40 INFO mapred.LocalJobRunner: file:/home/yj70978/hadoop/hadoop-1.1.2/input/hadoop-policy.xml:0+4644
13/08/19 04:57:40 INFO mapred.Task: Task 'attempt_local_0001_m_000001_0' done.
13/08/19 04:57:40 INFO mapred.Task:  Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@8fbdd2c
13/08/19 04:57:40 INFO mapred.MapTask: numReduceTasks: 1
13/08/19 04:57:40 INFO mapred.MapTask: io.sort.mb = 100
13/08/19 04:57:40 INFO mapred.MapTask: data buffer = 79691776/99614720
13/08/19 04:57:40 INFO mapred.MapTask: record buffer = 262144/327680
13/08/19 04:57:40 INFO mapred.MapTask: Starting flush of map output
13/08/19 04:57:40 INFO mapred.Task: Task:attempt_local_0001_m_000002_0 is done. And is in the process of commiting
13/08/19 04:57:40 INFO mapred.LocalJobRunner: file:/home/yj70978/hadoop/hadoop-1.1.2/input/mapred-queue-acls.xml:0+2033
13/08/19 04:57:40 INFO mapred.Task: Task 'attempt_local_0001_m_000002_0' done.
13/08/19 04:57:40 INFO mapred.Task:  Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@81ab78f
13/08/19 04:57:40 INFO mapred.MapTask: numReduceTasks: 1
13/08/19 04:57:40 INFO mapred.MapTask: io.sort.mb = 100
13/08/19 04:57:40 INFO mapred.MapTask: data buffer = 79691776/99614720
13/08/19 04:57:40 INFO mapred.MapTask: record buffer = 262144/327680
13/08/19 04:57:40 INFO mapred.MapTask: Starting flush of map output
13/08/19 04:57:40 INFO mapred.Task: Task:attempt_local_0001_m_000003_0 is done. And is in the process of commiting
13/08/19 04:57:40 INFO mapred.LocalJobRunner: file:/home/yj70978/hadoop/hadoop-1.1.2/input/fair-scheduler.xml:0+327
13/08/19 04:57:40 INFO mapred.Task: Task 'attempt_local_0001_m_000003_0' done.
13/08/19 04:57:40 INFO mapred.Task:  Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@81ab476
13/08/19 04:57:40 INFO mapred.MapTask: numReduceTasks: 1
13/08/19 04:57:40 INFO mapred.MapTask: io.sort.mb = 100
13/08/19 04:57:40 INFO mapred.MapTask: data buffer = 79691776/99614720
13/08/19 04:57:40 INFO mapred.MapTask: record buffer = 262144/327680
13/08/19 04:57:40 INFO mapred.MapTask: Starting flush of map output
13/08/19 04:57:40 INFO mapred.Task: Task:attempt_local_0001_m_000004_0 is done. And is in the process of commiting
13/08/19 04:57:40 INFO mapred.LocalJobRunner: file:/home/yj70978/hadoop/hadoop-1.1.2/input/hdfs-site.xml:0+178
13/08/19 04:57:40 INFO mapred.Task: Task 'attempt_local_0001_m_000004_0' done.
13/08/19 04:57:40 INFO mapred.Task:  Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@81ab588
13/08/19 04:57:40 INFO mapred.MapTask: numReduceTasks: 1
13/08/19 04:57:40 INFO mapred.MapTask: io.sort.mb = 100
13/08/19 04:57:40 INFO mapred.MapTask: data buffer = 79691776/99614720
13/08/19 04:57:40 INFO mapred.MapTask: record buffer = 262144/327680
13/08/19 04:57:40 INFO mapred.MapTask: Starting flush of map output
13/08/19 04:57:40 INFO mapred.Task: Task:attempt_local_0001_m_000005_0 is done. And is in the process of commiting
13/08/19 04:57:40 INFO mapred.LocalJobRunner: file:/home/yj70978/hadoop/hadoop-1.1.2/input/core-site.xml:0+178
13/08/19 04:57:40 INFO mapred.Task: Task 'attempt_local_0001_m_000005_0' done.
13/08/19 04:57:40 INFO mapred.Task:  Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@81ab9fc
13/08/19 04:57:40 INFO mapred.MapTask: numReduceTasks: 1
13/08/19 04:57:40 INFO mapred.MapTask: io.sort.mb = 100
13/08/19 04:57:40 INFO mapred.MapTask: data buffer = 79691776/99614720
13/08/19 04:57:40 INFO mapred.MapTask: record buffer = 262144/327680
13/08/19 04:57:40 INFO mapred.MapTask: Starting flush of map output
13/08/19 04:57:40 INFO mapred.Task: Task:attempt_local_0001_m_000006_0 is done. And is in the process of commiting
13/08/19 04:57:40 INFO mapred.LocalJobRunner: file:/home/yj70978/hadoop/hadoop-1.1.2/input/mapred-site.xml:0+178
13/08/19 04:57:40 INFO mapred.Task: Task 'attempt_local_0001_m_000006_0' done.
13/08/19 04:57:40 INFO mapred.Task:  Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@81abe1c
13/08/19 04:57:40 INFO mapred.LocalJobRunner:
13/08/19 04:57:40 INFO mapred.Merger: Merging 7 sorted segments
13/08/19 04:57:40 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 17 bytes
13/08/19 04:57:40 INFO mapred.LocalJobRunner:
13/08/19 04:57:40 INFO mapred.Task: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
13/08/19 04:57:40 INFO mapred.LocalJobRunner:
13/08/19 04:57:40 INFO mapred.Task: Task attempt_local_0001_r_000000_0 is allowed to commit now
13/08/19 04:57:40 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/home/yj70978/hadoop/hadoop-1.1.2/grep-temp-1225056667
13/08/19 04:57:40 INFO mapred.LocalJobRunner: reduce > reduce
13/08/19 04:57:40 INFO mapred.Task: Task 'attempt_local_0001_r_000000_0' done.
13/08/19 04:57:41 INFO mapred.JobClient:  map 100% reduce 100%
13/08/19 04:57:41 INFO mapred.JobClient: Job complete: job_local_0001
13/08/19 04:57:41 INFO mapred.JobClient: Counters: 21
13/08/19 04:57:41 INFO mapred.JobClient:   File Input Format Counters
13/08/19 04:57:41 INFO mapred.JobClient:     Bytes Read=14995
13/08/19 04:57:41 INFO mapred.JobClient:   File Output Format Counters
13/08/19 04:57:41 INFO mapred.JobClient:     Bytes Written=119
13/08/19 04:57:41 INFO mapred.JobClient:   FileSystemCounters
13/08/19 04:57:41 INFO mapred.JobClient:     FILE_BYTES_READ=1274764
13/08/19 04:57:41 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=1548698
13/08/19 04:57:41 INFO mapred.JobClient:   Map-Reduce Framework
13/08/19 04:57:41 INFO mapred.JobClient:     Map output materialized bytes=57
13/08/19 04:57:41 INFO mapred.JobClient:     Map input records=369
13/08/19 04:57:41 INFO mapred.JobClient:     Reduce shuffle bytes=0
13/08/19 04:57:41 INFO mapred.JobClient:     Spilled Records=2
13/08/19 04:57:41 INFO mapred.JobClient:     Map output bytes=13
13/08/19 04:57:41 INFO mapred.JobClient:     Total committed heap usage (bytes)=2130739200
13/08/19 04:57:41 INFO mapred.JobClient:     CPU time spent (ms)=0
13/08/19 04:57:41 INFO mapred.JobClient:     Map input bytes=14995
13/08/19 04:57:41 INFO mapred.JobClient:     SPLIT_RAW_BYTES=805
13/08/19 04:57:41 INFO mapred.JobClient:     Combine input records=1
13/08/19 04:57:41 INFO mapred.JobClient:     Reduce input records=1
13/08/19 04:57:41 INFO mapred.JobClient:     Reduce input groups=1
13/08/19 04:57:41 INFO mapred.JobClient:     Combine output records=1
13/08/19 04:57:41 INFO mapred.JobClient:     Physical memory (bytes) snapshot=0
13/08/19 04:57:41 INFO mapred.JobClient:     Reduce output records=1
13/08/19 04:57:41 INFO mapred.JobClient:     Virtual memory (bytes) snapshot=0
13/08/19 04:57:41 INFO mapred.JobClient:     Map output records=1

13/08/19 04:57:41 INFO mapred.FileInputFormat: Total input paths to process : 1
13/08/19 04:57:41 INFO mapred.JobClient: Running job: job_local_0002
13/08/19 04:57:41 INFO mapred.Task:  Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@81ab5be
13/08/19 04:57:41 INFO mapred.MapTask: numReduceTasks: 1
13/08/19 04:57:41 INFO mapred.MapTask: io.sort.mb = 100
13/08/19 04:57:41 INFO mapred.MapTask: data buffer = 79691776/99614720
13/08/19 04:57:41 INFO mapred.MapTask: record buffer = 262144/327680
13/08/19 04:57:41 INFO mapred.MapTask: Starting flush of map output
13/08/19 04:57:41 INFO mapred.MapTask: Finished spill 0
13/08/19 04:57:41 INFO mapred.Task: Task:attempt_local_0002_m_000000_0 is done. And is in the process of commiting
13/08/19 04:57:41 INFO mapred.LocalJobRunner: file:/home/yj70978/hadoop/hadoop-1.1.2/grep-temp-1225056667/part-00000:0+107
13/08/19 04:57:41 INFO mapred.Task: Task 'attempt_local_0002_m_000000_0' done.
13/08/19 04:57:41 INFO mapred.Task:  Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@81ab734
13/08/19 04:57:41 INFO mapred.LocalJobRunner:
13/08/19 04:57:41 INFO mapred.Merger: Merging 1 sorted segments
13/08/19 04:57:41 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 17 bytes
13/08/19 04:57:41 INFO mapred.LocalJobRunner:
13/08/19 04:57:41 INFO mapred.Task: Task:attempt_local_0002_r_000000_0 is done. And is in the process of commiting
13/08/19 04:57:41 INFO mapred.LocalJobRunner:
13/08/19 04:57:41 INFO mapred.Task: Task attempt_local_0002_r_000000_0 is allowed to commit now
13/08/19 04:57:41 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0002_r_000000_0' to file:/home/yj70978/hadoop/hadoop-1.1.2/output1
13/08/19 04:57:41 INFO mapred.LocalJobRunner: reduce > reduce
13/08/19 04:57:41 INFO mapred.Task: Task 'attempt_local_0002_r_000000_0' done.
13/08/19 04:57:42 INFO mapred.JobClient:  map 100% reduce 100%
13/08/19 04:57:42 INFO mapred.JobClient: Job complete: job_local_0002
13/08/19 04:57:42 INFO mapred.JobClient: Counters: 21
13/08/19 04:57:42 INFO mapred.JobClient:   File Input Format Counters
13/08/19 04:57:42 INFO mapred.JobClient:     Bytes Read=119
13/08/19 04:57:42 INFO mapred.JobClient:   File Output Format Counters
13/08/19 04:57:42 INFO mapred.JobClient:     Bytes Written=19
13/08/19 04:57:42 INFO mapred.JobClient:   FileSystemCounters
13/08/19 04:57:42 INFO mapred.JobClient:     FILE_BYTES_READ=610339
13/08/19 04:57:42 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=770969
13/08/19 04:57:42 INFO mapred.JobClient:   Map-Reduce Framework
13/08/19 04:57:42 INFO mapred.JobClient:     Map output materialized bytes=21
13/08/19 04:57:42 INFO mapred.JobClient:     Map input records=1
13/08/19 04:57:42 INFO mapred.JobClient:     Reduce shuffle bytes=0
13/08/19 04:57:42 INFO mapred.JobClient:     Spilled Records=2
13/08/19 04:57:42 INFO mapred.JobClient:     Map output bytes=13
13/08/19 04:57:42 INFO mapred.JobClient:     Total committed heap usage (bytes)=532684800
13/08/19 04:57:42 INFO mapred.JobClient:     CPU time spent (ms)=0
13/08/19 04:57:42 INFO mapred.JobClient:     Map input bytes=21
13/08/19 04:57:42 INFO mapred.JobClient:     SPLIT_RAW_BYTES=123
13/08/19 04:57:42 INFO mapred.JobClient:     Combine input records=0
13/08/19 04:57:42 INFO mapred.JobClient:     Reduce input records=1
13/08/19 04:57:42 INFO mapred.JobClient:     Reduce input groups=1
13/08/19 04:57:42 INFO mapred.JobClient:     Combine output records=0
13/08/19 04:57:42 INFO mapred.JobClient:     Physical memory (bytes) snapshot=0
13/08/19 04:57:42 INFO mapred.JobClient:     Reduce output records=1
13/08/19 04:57:42 INFO mapred.JobClient:     Virtual memory (bytes) snapshot=0
13/08/19 04:57:42 INFO mapred.JobClient:     Map output records=1

$ cat *

1           dfsa
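For intuition, what the Grep example computed can be approximated locally with Python's re module. The grep_count function here is a hypothetical stand-in for the job's map (regex matching) and reduce (summing) phases, not the actual example code:

```python
import re
from collections import Counter

def grep_count(lines, pattern):
    """Count every match of `pattern` across all lines,
    like the Grep example's regex map followed by a summing reduce."""
    counts = Counter()
    for line in lines:
        counts.update(re.findall(pattern, line))
    return counts

# A tiny stand-in for the conf/*.xml input files copied earlier.
conf_lines = ['<name>dfsa.example</name>', '<value>10</value>']
print(grep_count(conf_lines, r'dfs[a-z.]'))  # Counter({'dfsa': 1})
```

On the cluster the matching happens in parallel map tasks and the per-pattern totals are summed in the reduce; the local result has the same shape as the job's output above.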

Pseudo-distributed operation:

Edit the following files:

conf/core-site.xml:

<configuration>
     <property>
         <name>fs.default.name</name>
         <value>hdfs://localhost:9000</value>
     </property>
</configuration>

conf/hdfs-site.xml:

<configuration>
     <property>
         <name>dfs.replication</name>
         <value>1</value>
     </property>
</configuration>

conf/mapred-site.xml:

<configuration>
     <property>
         <name>mapred.job.tracker</name>
         <value>localhost:9001</value>
     </property>
</configuration>

Check that you can ssh to localhost:

$ ssh localhost

If you cannot, run the following commands:

$ ssh-keygen -t dsa -P '' -f ~/.ssh/id_dsa
$ cat ~/.ssh/id_dsa.pub >> ~/.ssh/authorized_keys

Next, format a new distributed file system:

$ bin/hadoop namenode -format

[JRockit] Local management server started.
13/08/19 05:39:36 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = retailvm1d/169.193.171.159
STARTUP_MSG:   args = [-format]
STARTUP_MSG:   version = 1.1.2
STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.1 -r 1440782; compiled by 'hortonfo' on Thu Jan 31 02:03:24 UTC 2013
************************************************************/
13/08/19 05:39:36 INFO util.GSet: VM type       = 64-bit
13/08/19 05:39:36 INFO util.GSet: 2% max memory = 20.0 MB
13/08/19 05:39:36 INFO util.GSet: capacity      = 2^21 = 2097152 entries
13/08/19 05:39:36 INFO util.GSet: recommended=2097152, actual=2097152
13/08/19 05:39:37 INFO namenode.FSNamesystem: fsOwner=yj70978
13/08/19 05:39:37 INFO namenode.FSNamesystem: supergroup=supergroup
13/08/19 05:39:37 INFO namenode.FSNamesystem: isPermissionEnabled=true
13/08/19 05:39:37 INFO namenode.FSNamesystem: dfs.block.invalidate.limit=100
13/08/19 05:39:37 INFO namenode.FSNamesystem: isAccessTokenEnabled=false accessKeyUpdateInterval=0 min(s), accessTokenLifetime=0 min(s)
13/08/19 05:39:37 INFO namenode.NameNode: Caching file names occuring more than 10 times
13/08/19 05:39:37 INFO common.Storage: Image file of size 113 saved in 0 seconds.
13/08/19 05:39:37 INFO namenode.FSEditLog: closing edit log: position=4, editlog=/tmp/hadoop-yj70978/dfs/name/current/edits
13/08/19 05:39:37 INFO namenode.FSEditLog: close success: truncate to 4, editlog=/tmp/hadoop-yj70978/dfs/name/current/edits
13/08/19 05:39:38 INFO common.Storage: Storage directory /tmp/hadoop-yj70978/dfs/name has been successfully formatted.
13/08/19 05:39:38 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at retailvm1d/169.193.171.159
************************************************************/

Start the Hadoop daemons:

$ bin/start-all.sh

starting namenode, logging to /home/yj70978/hadoop/hadoop-1.1.2/libexec/../logs/hadoop-yj70978-namenode-retailvm1d.out
localhost:
localhost: You are authorized to use this System for approved business purposes only.
localhost: Use for any other purpose is prohibited. All transactional records, reports,
localhost: email, software and other data generated by or residing upon this System,
localhost: to the extent permitted by local law, are the property of Citigroup Inc.
localhost: or one of its subsidiaries or their affiliates
localhost: (individually or collectively ' Citigroup ') and may be used by Citigroup
localhost: for any purpose authorized and permissible in your country of work.
localhost: Activities on this System are monitored to the extent permitted by local law.
localhost:
localhost:
localhost: PAM Authentication
localhost: Password:
localhost: PAM Authentication
localhost: Password:
localhost: starting datanode, logging to /home/yj70978/hadoop/hadoop-1.1.2/libexec/../logs/hadoop-yj70978-datanode-retailvm1d.out

localhost: PAM Authentication
localhost: Password:

localhost: starting secondarynamenode, logging to /home/yj70978/hadoop/hadoop-1.1.2/libexec/../logs/hadoop-yj70978-secondarynamenode-retailvm1d.out
starting jobtracker, logging to /home/yj70978/hadoop/hadoop-1.1.2/libexec/../logs/hadoop-yj70978-jobtracker-retailvm1d.out

localhost: PAM Authentication
localhost: Password:

localhost: starting tasktracker, logging to /home/yj70978/hadoop/hadoop-1.1.2/libexec/../logs/hadoop-yj70978-tasktracker-retailvm1d.out

$ ps -ef | grep hadoop

yj70978  25808    1  3 05:40 pts/2 00:00:04 /export/opt/jrockit/6.0_14R27.6.5l64/bin/java -Dproc_namenode -Xmx1000m ... (JVM flags and long classpath omitted)
yj70978  26105    1 17 05:42 ?     00:00:04 /export/opt/jrockit/6.0_14R27.6.5l64/bin/java -Dproc_datanode -Xmx1000m ... (JVM flags and long classpath omitted)
yj70978  26261    1 16 05:42 ?     00:00:02 /export/opt/jrockit/6.0_14R27.6.5l64/bin/java -Dproc_secondarynamenode -Xmx1000m ... (JVM flags and long classpath omitted)
og4j12-1.4.3.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/xmlenc-0.52.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/js

yj70978  26346    1 27 05:42 pts/2    00:00:04/export/opt/jrockit/6.0_14R27.6.5l64/bin/java -Dproc_jobtracker -Xmx1000m-Dcom.sun.management.jmxremote -Dcom.sun.management.jmxremote-Dcom.sun.management.jmxremote -Dcom.sun.management.jmxremote-Dcom.sun.management.jmxremote-Dhadoop.log.dir=/home/yj70978/hadoop/hadoop-1.1.2/libexec/../logs-Dhadoop.log.file=hadoop-yj70978-jobtracker-retailvm1d.log-Dhadoop.home.dir=/home/yj70978/hadoop/hadoop-1.1.2/libexec/..-Dhadoop.id.str=yj70978 -Dhadoop.root.logger=INFO,DRFA-Dhadoop.security.logger=INFO,DRFAS-Djava.library.path=/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/native/Linux-amd64-64-Dhadoop.policy.file=hadoop-policy.xml -classpath /home/yj70978/hadoop/hadoop-1.1.2/libexec/../conf:/export/opt/jrockit/6.0_14R27.6.5l64/bin/java/lib/tools.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/..:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../hadoop-core-1.1.2.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/asm-3.2.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/aspectjrt-1.6.11.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/aspectjtools-1.6.11.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/commons-beanutils-1.7.0.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/commons-beanutils-core-1.8.0.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/commons-cli-1.2.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/commons-codec-1.4.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/commons-collections-3.2.1.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/commons-configuration-1.6.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/commons-daemon-1.0.1.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/commons-digester-1.8.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/commons-el-1.0.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/commons-httpclient-3.0.1.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/commons-io-2.1.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/commons-lang-2.4.jar
:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/commons-logging-1.1.1.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/commons-logging-api-1.0.4.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/commons-math-2.1.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/commons-net-3.1.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/core-3.1.1.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/hadoop-capacity-scheduler-1.1.2.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/hadoop-fairscheduler-1.1.2.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/hadoop-thriftfs-1.1.2.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/hsqldb-1.8.0.10.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/jackson-core-asl-1.8.8.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/jasper-compiler-5.5.12.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/jasper-runtime-5.5.12.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/jdeb-0.8.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/jersey-core-1.8.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/jersey-json-1.8.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/jersey-server-1.8.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/jets3t-0.6.1.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/jetty-6.1.26.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/jetty-util-6.1.26.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/jsch-0.1.42.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/junit-4.5.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/kfs-0.2.2.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/log4j-1.2.15.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/mockito-all-1.8.5.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/oro-2.0.8.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/servlet-api-2.5-20081211.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/slf4j-api-1.4.3.jar:/home/yj70978/ha
doop/hadoop-1.1.2/libexec/../lib/slf4j-log4j12-1.4.3.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/xmlenc-0.52.jar:/home/yj70

yj70978  26518    1 35 05:42 ?        00:00:03/export/opt/jrockit/6.0_14R27.6.5l64/bin/java -Dproc_tasktracker -Xmx1000m-Dhadoop.log.dir=/home/yj70978/hadoop/hadoop-1.1.2/libexec/../logs-Dhadoop.log.file=hadoop-yj70978-tasktracker-retailvm1d.log-Dhadoop.home.dir=/home/yj70978/hadoop/hadoop-1.1.2/libexec/..-Dhadoop.id.str=yj70978 -Dhadoop.root.logger=INFO,DRFA-Dhadoop.security.logger=INFO,NullAppender -Djava.library.path=/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/native/Linux-amd64-64-Dhadoop.policy.file=hadoop-policy.xml -classpath/home/yj70978/hadoop/hadoop-1.1.2/libexec/../conf:/export/opt/jrockit/6.0_14R27.6.5l64/bin/java/lib/tools.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/..:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../hadoop-core-1.1.2.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/asm-3.2.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/aspectjrt-1.6.11.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/aspectjtools-1.6.11.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/commons-beanutils-1.7.0.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/commons-beanutils-core-1.8.0.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/commons-cli-1.2.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/commons-codec-1.4.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/commons-collections-3.2.1.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/commons-configuration-1.6.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/commons-daemon-1.0.1.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/commons-digester-1.8.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/commons-el-1.0.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/commons-httpclient-3.0.1.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/commons-io-2.1.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/commons-lang-2.4.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/commons-logging-1.1.1.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/commons-logging-ap
i-1.0.4.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/commons-math-2.1.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/commons-net-3.1.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/core-3.1.1.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/hadoop-capacity-scheduler-1.1.2.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/hadoop-fairscheduler-1.1.2.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/hadoop-thriftfs-1.1.2.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/hsqldb-1.8.0.10.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/jackson-core-asl-1.8.8.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/jasper-compiler-5.5.12.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/jasper-runtime-5.5.12.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/jdeb-0.8.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/jersey-core-1.8.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/jersey-json-1.8.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/jersey-server-1.8.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/jets3t-0.6.1.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/jetty-6.1.26.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/jetty-util-6.1.26.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/jsch-0.1.42.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/junit-4.5.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/kfs-0.2.2.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/log4j-1.2.15.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/mockito-all-1.8.5.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/oro-2.0.8.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/servlet-api-2.5-20081211.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/slf4j-api-1.4.3.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/slf4j-log4j12-1.4.3.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/xmlenc-0.52.jar:/home/yj70978/hadoop/
hadoop-1.1.2/libexec/../lib/jsp-2.1/jsp-2.1.jar:/home/yj70978/hadoop/hadoop-1.1.2/libexec/../lib/jsp-2.1/jsp-api-2.1.jarorg.apache.had

Logs are written to:

${HADOOP_HOME}/logs
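A minimal sketch for inspecting those logs (the default path below assumes the 1.1.2 install shown earlier; adjust HADOOP_HOME to your layout). Each daemon writes to a file named hadoop-<user>-<daemon>-<host>.log:

```shell
# Show the last few lines of every daemon log. The HADOOP_HOME default here
# is an assumption matching the install path used in this article.
HADOOP_HOME="${HADOOP_HOME:-$HOME/hadoop/hadoop-1.1.2}"
for f in "${HADOOP_HOME}"/logs/hadoop-*-*.log; do
  [ -e "$f" ] || continue   # skip when no daemon has logged yet
  echo "== $f =="
  tail -n 5 "$f"            # tail the end, where startup errors usually land
done
```

When a daemon fails to start, its .log file here is the first place to look.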

You can then check the status of the NameNode and JobTracker through their web interfaces:

·        NameNode - http://localhost:50070/

·        JobTracker - http://localhost:50030/

To stop all Hadoop daemons:

$ bin/stop-all.sh
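After running the stop script, it is worth confirming that no daemon JVMs were left behind. A small sketch, relying on the -Dproc_<name> flag visible in the ps output above:

```shell
# Count Hadoop daemon JVMs still alive; each carries a -Dproc_<name> flag.
# The [D] bracket trick stops grep from matching its own process entry.
# grep -c exits nonzero on zero matches, hence the || true guard.
remaining=$(ps -ef | grep -c '[D]proc_' || true)
echo "hadoop daemons still running: $remaining"
```

If the count is not zero, check the corresponding .log files under ${HADOOP_HOME}/logs before retrying.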
