The official Hadoop site only provides a 32-bit build of hadoop-2.2.0.tar.gz, so deploying hadoop-2.2.0 on 64-bit Ubuntu requires recompiling the source package to produce a 64-bit distribution.
It is recommended to perform the following steps as root, to avoid permission problems.

Install the JDK

See the article "Installing the JDK on Ubuntu".

Install Maven

See the article "Installing Maven on Ubuntu".

Download the Hadoop source

wget http://mirror.bit.edu.cn/apache/hadoop/common/hadoop-2.2.0/hadoop-2.2.0-src.tar.gz

Extract it

tar -xzf hadoop-2.2.0-src.tar.gz

Build the source

cd hadoop-2.2.0-src
mvn package -Pdist,native -DskipTests -Dtar

Here -Pdist,native builds the full distribution together with the native libraries, -DskipTests skips the test suite, and -Dtar packages the result as a tar.gz.
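Before launching the (fairly long) build, a quick pre-flight check can save a failed run. This is a minimal sketch; the tool list is an assumption drawn from the missing dependencies encountered in the failed attempts described below:

```shell
# Pre-flight check: list any build tools missing from PATH.
# The tool list is an assumption based on the failures documented below.
missing=""
for tool in java mvn gcc make cmake protoc; do
    command -v "$tool" >/dev/null 2>&1 || missing="$missing $tool"
done
if [ -n "$missing" ]; then
    echo "missing:$missing"
else
    echo "all tools found"
fi
```

Each missing tool corresponds to one of the failed attempts below, so installing them up front avoids several rebuild cycles.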

Build attempt 1: failed (bug in Hadoop's pom.xml)

Error message:

[ERROR] Failed to execute goal on project hadoop-auth: Could not resolve dependencies for project org.apache.hadoop:hadoop-auth:jar:2.2.0: Could not transfer artifact org.mortbay.jetty:jetty:jar:6.1.26 from/to central (https://repo.maven.apache.org/maven2): GET request of: org/mortbay/jetty/jetty/6.1.26/jetty-6.1.26.jar from central failed: SSL peer shut down incorrectly -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn -rf :hadoop-auth

Fix:
This is a known Hadoop bug; adding the patch below to the pom.xml resolves it. See https://issues.apache.org/jira/browse/HADOOP-10110 for details.

Edit the `hadoop-common-project/hadoop-auth/pom.xml` file:

vi hadoop-common-project/hadoop-auth/pom.xml

Insert the following inside the `<dependencies></dependencies>` section:

<dependency>
  <groupId>org.mortbay.jetty</groupId>
  <artifactId>jetty-util</artifactId>
  <scope>test</scope>
</dependency>
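After saving the file, it may be worth confirming the dependency actually landed in the pom before re-running the long build; a simple grep (using the path from the step above) is enough:

```shell
# Should print the inserted <dependency> block; empty output means the
# edit did not take effect.
grep -A 3 "jetty-util" hadoop-common-project/hadoop-auth/pom.xml
```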

Build attempt 2: failed (protoc not installed)

Error message:

[ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:2.2.0:protoc (compile-protoc) on project hadoop-common: org.apache.maven.plugin.MojoExecutionException: 'protoc --version' did not return a version -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn -rf :hadoop-common

Fix:

The error message tells us that protoc is not installed.

wget https://protobuf.googlecode.com/files/protobuf-2.5.0.tar.gz
tar -xzf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0
./configure
make
make check
make install

When running the ./configure command, the following error was reported:

checking whether to enable maintainer-specific portions of Makefiles... yes
checking build system type... x86_64-unknown-linux-gnu
checking host system type... x86_64-unknown-linux-gnu
checking target system type... x86_64-unknown-linux-gnu
checking for a BSD-compatible install... /usr/bin/install -c
checking whether build environment is sane... yes
checking for a thread-safe mkdir -p... /bin/mkdir -p
checking for gawk... gawk
checking whether make sets $(MAKE)... no
checking for gcc... no
checking for cc... no
checking for cl.exe... no
configure: error: in `/home/hadoop/protobuf-2.5.0':
configure: error: no acceptable C compiler found in $PATH
See `config.log' for more details

It says no acceptable C compiler was found, so we also need to install a C compiler.

Ubuntu provides `build-essential`, a package that bundles gcc and the other basic build tools; installing it takes just one command:

apt-get install build-essential

If the installation complains that packages cannot be found, update the package index first:

apt-get update

After installation, verifying protobuf may fail with the following error:

$ protoc --version
protoc: error while loading shared libraries: libprotoc.so.8: cannot open shared object file: No such file or directory

Fix:

$ export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/lib
$ protoc --version
libprotoc 2.5.0
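Note that the export above only lasts for the current shell session. One way to make it permanent (an addition not in the original steps; adjust to your setup) is to register /usr/local/lib with the dynamic linker via ldconfig. Since this article runs everything as root, no sudo is needed:

```shell
# Register /usr/local/lib with ld.so permanently, instead of exporting
# LD_LIBRARY_PATH in every new shell. Run as root.
echo "/usr/local/lib" > /etc/ld.so.conf.d/protobuf.conf
ldconfig
protoc --version
```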

Build attempt 3: failed (cmake not installed)

Error message:

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (make) on project hadoop-common: An Ant BuildException has occured: Execute failed: java.io.IOException: Cannot run program "cmake" (in directory "/home/hadoop/hadoop-2.2.0-src/hadoop-common-project/hadoop-common/target/native"): error=2, No such file or directory -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn -rf :hadoop-common

Fix:

apt-get install cmake

Build attempt 4: failed (libglib2.0-dev not installed)

Error message:

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (make) on project hadoop-common: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn -rf :hadoop-common

Fix:

apt-get install libglib2.0-dev

Build attempt 5: failed (libssl-dev not installed)

Error message:

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (make) on project hadoop-pipes: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn -rf :hadoop-pipes

Fix:

apt-get install libssl-dev

Build attempt 6: success

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................. SUCCESS [ 13.578 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [ 5.183 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [ 9.527 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [ 1.268 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [ 4.717 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [ 9.966 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [ 7.368 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [ 3.971 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [: min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [ 14.996 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [ 0.078 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [: min]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [ 30.260 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [ 19.083 s]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [ 8.313 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [ 0.071 s]
[INFO] hadoop-yarn ........................................ SUCCESS [ 0.542 s]
[INFO] hadoop-yarn-api .................................... SUCCESS [: min]
[INFO] hadoop-yarn-common ................................. SUCCESS [ 48.948 s]
[INFO] hadoop-yarn-server ................................. SUCCESS [ 0.314 s]
[INFO] hadoop-yarn-server-common .......................... SUCCESS [ 18.413 s]
[INFO] hadoop-yarn-server-nodemanager ..................... SUCCESS [ 23.891 s]
[INFO] hadoop-yarn-server-web-proxy ....................... SUCCESS [ 5.687 s]
[INFO] hadoop-yarn-server-resourcemanager ................. SUCCESS [ 24.345 s]
[INFO] hadoop-yarn-server-tests ........................... SUCCESS [ 0.721 s]
[INFO] hadoop-yarn-client ................................. SUCCESS [ 8.261 s]
[INFO] hadoop-yarn-applications ........................... SUCCESS [ 0.168 s]
[INFO] hadoop-yarn-applications-distributedshell .......... SUCCESS [ 6.632 s]
[INFO] hadoop-mapreduce-client ............................ SUCCESS [ 0.261 s]
[INFO] hadoop-mapreduce-client-core ....................... SUCCESS [ 40.147 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [ 3.497 s]
[INFO] hadoop-yarn-site ................................... SUCCESS [ 0.164 s]
[INFO] hadoop-yarn-project ................................ SUCCESS [ 6.054 s]
[INFO] hadoop-mapreduce-client-common ..................... SUCCESS [ 29.892 s]
[INFO] hadoop-mapreduce-client-shuffle .................... SUCCESS [ 5.450 s]
[INFO] hadoop-mapreduce-client-app ........................ SUCCESS [ 18.558 s]
[INFO] hadoop-mapreduce-client-hs ......................... SUCCESS [ 9.045 s]
[INFO] hadoop-mapreduce-client-jobclient .................. SUCCESS [ 7.740 s]
[INFO] hadoop-mapreduce-client-hs-plugins ................. SUCCESS [ 2.819 s]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [ 12.523 s]
[INFO] hadoop-mapreduce ................................... SUCCESS [ 5.321 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [ 8.999 s]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 13.044 s]
[INFO] Apache Hadoop Archives ............................. SUCCESS [ 3.739 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [ 11.307 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [ 8.223 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [ 6.296 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [ 6.341 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [ 14.662 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 2.694 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [ 0.063 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [ 44.996 s]
[INFO] Apache Hadoop Client ............................... SUCCESS [ 16.908 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [ 5.014 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: : min
[INFO] Finished at: --04T14::+:
[INFO] Final Memory: 69M/215M
[INFO] ------------------------------------------------------------------------

Build artifacts

The generated files are in the `~/hadoop-2.2.0-src/hadoop-dist/target` directory.

$ ls ~/hadoop-2.2.0-src/hadoop-dist/target
antrun hadoop-2.2.0 hadoop-dist-2.2.0-javadoc.jar test-dir
dist-layout-stitching.sh hadoop-2.2.0.tar.gz javadoc-bundle-options
dist-tar-stitching.sh hadoop-dist-2.2.0.jar maven-archiver

Here hadoop-2.2.0 is the unpacked build output, and hadoop-2.2.0.tar.gz is the packaged distribution file.

Verify

$ cd ~/hadoop-2.2.0-src/hadoop-dist/target/hadoop-2.2.0/lib/native/
$ file libhadoop.so.1.0.0
libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, BuildID[sha1]=fb43b4ebd092ae8b4a427719b8907e6fdb223ed9, not stripped

As you can see, libhadoop.so.1.0.0 is now 64-bit.
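libhadoop is not the only native library in the distribution; the same check can be run over all of them in one pass (paths assume the build tree used throughout this article):

```shell
# Print any native libraries that are NOT 64-bit; empty output means
# everything was built as 64-bit.
cd ~/hadoop-2.2.0-src/hadoop-dist/target/hadoop-2.2.0/lib/native/
file -- *.so* | grep -v "64-bit"
```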

Copy

Copy the compiled 64-bit hadoop-2.2.0.tar.gz distribution package to the current user's home directory.

cp ~/hadoop-2.2.0-src/hadoop-dist/target/hadoop-2.2.0.tar.gz ~
