For Hadoop 2.x, the binaries on the Apache download site ship as 32-bit builds, so a production deployment on 64-bit systems requires compiling 64-bit binaries yourself. The procedure for building Hadoop 2.x is as follows:

Install the low-level libraries that the source build depends on
  yum install glibc-headers
  yum install gcc
  yum install gcc-c++
  yum install make
  yum install cmake
  yum install openssl-devel
  yum install ncurses-devel
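
  A quick sanity check that the toolchain is in place (assuming all the yum installs above completed without errors):
  gcc --version && g++ --version && make --version && cmake --version && openssl version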

Install protobuf-2.5.0 (inter-node RPC in Hadoop is implemented on top of Google's Protocol Buffers)
  tar -zxvf /home/tools/protobuf-2.5.0.tar.gz -C /home/tools/
  cd /home/tools/protobuf-2.5.0 && ./configure && make && make check && make install
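
  If the install succeeded, protoc should report exactly the version the Hadoop 2.x build requires:
  protoc --version    # should print: libprotoc 2.5.0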

Install apache-maven-3.0.5 (JDK 1.7 is used here; Maven is required to compile the Hadoop source)
  tar -zxvf /home/tools/apache-maven-3.0.5-bin.tar.gz -C /home/tools/
  vi /etc/profile
  export JAVA_HOME=/usr/local/java
  export M2_HOME=/home/tools/apache-maven-3.0.5
  export PATH=.:$M2_HOME/bin:$JAVA_HOME/bin:$PATH
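
  Reload the profile and confirm that Maven and the JDK both resolve before starting the build:
  source /etc/profile
  mvn -version     # should report Apache Maven 3.0.5 running on Java 1.7.x
  java -version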

Unpack the hadoop-2.4.1-src.tar.gz source tarball and build it
  tar -zxvf /home/tools/hadoop-2.4.1-src.tar.gz -C /home/tools/
  cd /home/tools/hadoop-2.4.1-src
  mvn package -DskipTests -Pdist,native
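
  The first build downloads a large number of Maven dependencies, so expect it to take a while on a fresh machine. If you also want the distribution packaged as a tarball, the BUILDING.txt that ships with the source documents an additional -Dtar flag:
  mvn package -DskipTests -Pdist,native -Dtar
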
  Log output like the following indicates that the build succeeded:
        main:
             [exec]
             [exec] Current directory /home/tools/hadoop-2.4.1-src/hadoop-dist/target
             [exec]
             [exec] $ rm -rf hadoop-2.4.1
             [exec] $ mkdir hadoop-2.4.1
             [exec] $ cd hadoop-2.4.1
             [exec] $ cp -r /home/tools/hadoop-2.4.1-src/hadoop-common-project/hadoop-common/target/hadoop-common-2.4.1/bin /home/tools/hadoop-2.4.1-src/hadoop-common-project/hadoop-common/target/hadoop-common-2.4.1/etc /home/tools/hadoop-2.4.1-src/hadoop-common-project/hadoop-common/target/hadoop-common-2.4.1/lib /home/tools/hadoop-2.4.1-src/hadoop-common-project/hadoop-common/target/hadoop-common-2.4.1/libexec /home/tools/hadoop-2.4.1-src/hadoop-common-project/hadoop-common/target/hadoop-common-2.4.1/sbin /home/tools/hadoop-2.4.1-src/hadoop-common-project/hadoop-common/target/hadoop-common-2.4.1/share .
             [exec] $ cp -r /home/tools/hadoop-2.4.1-src/hadoop-common-project/hadoop-nfs/target/hadoop-nfs-2.4.1/share .
             [exec] $ cp -r /home/tools/hadoop-2.4.1-src/hadoop-hdfs-project/hadoop-hdfs/target/hadoop-hdfs-2.4.1/bin /home/tools/hadoop-2.4.1-src/hadoop-hdfs-project/hadoop-hdfs/target/hadoop-hdfs-2.4.1/etc /home/tools/hadoop-2.4.1-src/hadoop-hdfs-project/hadoop-hdfs/target/hadoop-hdfs-2.4.1/include /home/tools/hadoop-2.4.1-src/hadoop-hdfs-project/hadoop-hdfs/target/hadoop-hdfs-2.4.1/lib /home/tools/hadoop-2.4.1-src/hadoop-hdfs-project/hadoop-hdfs/target/hadoop-hdfs-2.4.1/libexec /home/tools/hadoop-2.4.1-src/hadoop-hdfs-project/hadoop-hdfs/target/hadoop-hdfs-2.4.1/sbin /home/tools/hadoop-2.4.1-src/hadoop-hdfs-project/hadoop-hdfs/target/hadoop-hdfs-2.4.1/share .
             [exec] $ cp -r /home/tools/hadoop-2.4.1-src/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/hadoop-hdfs-httpfs-2.4.1/etc /home/tools/hadoop-2.4.1-src/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/hadoop-hdfs-httpfs-2.4.1/libexec /home/tools/hadoop-2.4.1-src/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/hadoop-hdfs-httpfs-2.4.1/sbin /home/tools/hadoop-2.4.1-src/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/hadoop-hdfs-httpfs-2.4.1/share .
             [exec] $ cp -r /home/tools/hadoop-2.4.1-src/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-hdfs-nfs-2.4.1/share .
             [exec] $ cp -r /home/tools/hadoop-2.4.1-src/hadoop-yarn-project/target/hadoop-yarn-project-2.4.1/bin /home/tools/hadoop-2.4.1-src/hadoop-yarn-project/target/hadoop-yarn-project-2.4.1/etc /home/tools/hadoop-2.4.1-src/hadoop-yarn-project/target/hadoop-yarn-project-2.4.1/libexec /home/tools/hadoop-2.4.1-src/hadoop-yarn-project/target/hadoop-yarn-project-2.4.1/sbin /home/tools/hadoop-2.4.1-src/hadoop-yarn-project/target/hadoop-yarn-project-2.4.1/share .
             [exec] $ cp -r /home/tools/hadoop-2.4.1-src/hadoop-mapreduce-project/target/hadoop-mapreduce-2.4.1/bin /home/tools/hadoop-2.4.1-src/hadoop-mapreduce-project/target/hadoop-mapreduce-2.4.1/etc /home/tools/hadoop-2.4.1-src/hadoop-mapreduce-project/target/hadoop-mapreduce-2.4.1/libexec /home/tools/hadoop-2.4.1-src/hadoop-mapreduce-project/target/hadoop-mapreduce-2.4.1/sbin /home/tools/hadoop-2.4.1-src/hadoop-mapreduce-project/target/hadoop-mapreduce-2.4.1/share .
             [exec] $ cp -r /home/tools/hadoop-2.4.1-src/hadoop-tools/hadoop-tools-dist/target/hadoop-tools-dist-2.4.1/include /home/tools/hadoop-2.4.1-src/hadoop-tools/hadoop-tools-dist/target/hadoop-tools-dist-2.4.1/lib /home/tools/hadoop-2.4.1-src/hadoop-tools/hadoop-tools-dist/target/hadoop-tools-dist-2.4.1/share .
             [exec]
             [exec] Hadoop dist layout available at: /home/tools/hadoop-2.4.1-src/hadoop-dist/target/hadoop-2.4.1
             [exec]
        [INFO] Executed tasks
        [INFO]
        [INFO] --- maven-jar-plugin:2.3.1:jar (default-jar) @ hadoop-dist ---
        [WARNING] JAR will be empty - no content was marked for inclusion!
        [INFO] Building jar: /home/tools/hadoop-2.4.1-src/hadoop-dist/target/hadoop-dist-2.4.1.jar
        [INFO]
        [INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-dist ---
        [INFO] No sources in project. Archive not created.
        [INFO]
        [INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-dist ---
        [INFO] No sources in project. Archive not created.
        [INFO]
        [INFO] --- maven-site-plugin:3.3:attach-descriptor (attach-descriptor) @ hadoop-dist ---
        [INFO]
        [INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-dist ---
        [INFO] Executing tasks

main:
        [INFO] Executed tasks
        [INFO]
        [INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-dist ---
        [INFO] Building jar: /home/tools/hadoop-2.4.1-src/hadoop-dist/target/hadoop-dist-2.4.1-javadoc.jar
        [INFO] ------------------------------------------------------------------------
        [INFO] Reactor Summary:
        [INFO]
        [INFO] Apache Hadoop Main ................................ SUCCESS [1.176s]
        [INFO] Apache Hadoop Project POM ......................... SUCCESS [0.937s]
        [INFO] Apache Hadoop Annotations ......................... SUCCESS [3.426s]
        [INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.302s]
        [INFO] Apache Hadoop Project Dist POM .................... SUCCESS [1.582s]
        [INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [3.132s]
        [INFO] Apache Hadoop MiniKDC ............................. SUCCESS [2.916s]
        [INFO] Apache Hadoop Auth ................................ SUCCESS [3.873s]
        [INFO] Apache Hadoop Auth Examples ....................... SUCCESS [2.328s]
        [INFO] Apache Hadoop Common .............................. SUCCESS [1:36.564s]
        [INFO] Apache Hadoop NFS ................................. SUCCESS [5.527s]
        [INFO] Apache Hadoop Common Project ...................... SUCCESS [0.038s]
        [INFO] Apache Hadoop HDFS ................................ SUCCESS [2:44.338s]
        [INFO] Apache Hadoop HttpFS .............................. SUCCESS [21.785s]
        [INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [25.123s]
        [INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [3.578s]
        [INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.046s]
        [INFO] hadoop-yarn ....................................... SUCCESS [0.039s]
        [INFO] hadoop-yarn-api ................................... SUCCESS [1:19.929s]
        [INFO] hadoop-yarn-common ................................ SUCCESS [1:30.724s]
        [INFO] hadoop-yarn-server ................................ SUCCESS [0.032s]
        [INFO] hadoop-yarn-server-common ......................... SUCCESS [8.375s]
        [INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [52.226s]
        [INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [2.878s]
        [INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS [12.762s]
        [INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [12.406s]
        [INFO] hadoop-yarn-server-tests .......................... SUCCESS [0.483s]
        [INFO] hadoop-yarn-client ................................ SUCCESS [5.208s]
        [INFO] hadoop-yarn-applications .......................... SUCCESS [0.029s]
        [INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [2.614s]
        [INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [2.137s]
        [INFO] hadoop-yarn-site .................................. SUCCESS [0.037s]
        [INFO] hadoop-yarn-project ............................... SUCCESS [3.164s]
        [INFO] hadoop-mapreduce-client ........................... SUCCESS [0.059s]
        [INFO] hadoop-mapreduce-client-core ...................... SUCCESS [18.276s]
        [INFO] hadoop-mapreduce-client-common .................... SUCCESS [18.034s]
        [INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [2.728s]
        [INFO] hadoop-mapreduce-client-app ....................... SUCCESS [8.973s]
        [INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [7.420s]
        [INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [12.076s]
        [INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [1.988s]
        [INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [5.648s]
        [INFO] hadoop-mapreduce .................................. SUCCESS [2.431s]
        [INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [9.437s]
        [INFO] Apache Hadoop Distributed Copy .................... SUCCESS [20.544s]
        [INFO] Apache Hadoop Archives ............................ SUCCESS [2.163s]
        [INFO] Apache Hadoop Rumen ............................... SUCCESS [5.710s]
        [INFO] Apache Hadoop Gridmix ............................. SUCCESS [4.467s]
        [INFO] Apache Hadoop Data Join ........................... SUCCESS [2.770s]
        [INFO] Apache Hadoop Extras .............................. SUCCESS [3.014s]
        [INFO] Apache Hadoop Pipes ............................... SUCCESS [10.174s]
        [INFO] Apache Hadoop OpenStack support ................... SUCCESS [4.523s]
        [INFO] Apache Hadoop Client .............................. SUCCESS [3.611s]
        [INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.136s]
        [INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [9.834s]
        [INFO] Apache Hadoop Tools Dist .......................... SUCCESS [3.285s]
        [INFO] Apache Hadoop Tools ............................... SUCCESS [0.025s]
        [INFO] Apache Hadoop Distribution ........................ SUCCESS [12.173s]
        [INFO] ------------------------------------------------------------------------
        [INFO] BUILD SUCCESS
        [INFO] ------------------------------------------------------------------------
        [INFO] Total time: 13:01.056s
        [INFO] Finished at: Tue Jul 02 10:28:07 CST 2014
        [INFO] Final Memory: 165M/512M
        [INFO] ------------------------------------------------------------------------

Check the build output
  cd /home/tools/hadoop-2.4.1-src/hadoop-dist/target
  ls
  antrun  dist-layout-stitching.sh  hadoop-2.4.1  hadoop-dist-2.4.1.jar  hadoop-dist-2.4.1-javadoc.jar  javadoc-bundle-options  maven-archiver  test-dir
  The hadoop-2.4.1 directory is the 64-bit Hadoop binary distribution, which is exactly what we need.
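
  To double-check that the compiled native libraries are really 64-bit, run file against libhadoop.so (the library path below is the standard 2.4.1 layout; adjust for other versions):
  file hadoop-2.4.1/lib/native/libhadoop.so.1.0.0
  # expected output includes: ELF 64-bit LSB shared object, x86-64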
