I. Basic Environment Setup

  1. Prerequisites

  1. hadoop-2.5.0-src.tar.gz
  2. apache-maven-3.0.5-bin.tar.gz
  3. jdk-7u67-linux-x64.tar.gz
  4. protobuf-2.5.0.tar.gz
  5. Access to the external network (for downloading build dependencies)
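  Before anything else, a ten-second check that all four archives actually made it onto the build machine saves a failed build later; a minimal sketch, assuming the files were staged under /opt/files (the staging path is an assumption, adjust it to your own layout):

  # Verify every required archive is present before starting.
  for f in hadoop-2.5.0-src.tar.gz apache-maven-3.0.5-bin.tar.gz \
           jdk-7u67-linux-x64.tar.gz protobuf-2.5.0.tar.gz; do
    [ -f "/opt/files/$f" ] || echo "missing: $f"
  done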

  2. Install jdk-7u67-linux-x64.tar.gz and apache-maven-3.0.5-bin.tar.gz

  [liuwl@centos66-bigdata-hadoop ~]$ vi /etc/profile
  #JAVA_HOME
  export JAVA_HOME=/opt/modules/jdk1.7.0_67
  export PATH=$PATH:$JAVA_HOME/bin
  #MAVEN_HOME
  export MAVEN_HOME=/opt/modules/apache-maven-3.0.5
  export PATH=$PATH:$MAVEN_HOME/bin
  [liuwl@centos66-bigdata-hadoop ~]$ source /etc/profile
  [liuwl@centos66-bigdata-hadoop ~]$ java -version
  java version "1.7.0_67"
  Java(TM) SE Runtime Environment (build 1.7.0_67-b01)
  Java HotSpot(TM) 64-Bit Server VM (build 24.65-b04, mixed mode)
  [liuwl@centos66-bigdata-hadoop ~]$ echo $MAVEN_HOME
  /opt/modules/apache-maven-3.0.5
  [root@centos66-bigdata-hadoop ~]# mvn -v
  Apache Maven 3.0.5 (r01de14724cdef164cd33c7c8c2fe155faf9602da; 2013-02-19 05:51:28-0800)
  Maven home: /opt/modules/apache-maven-3.0.5
  Java version: 1.7.0_67, vendor: Oracle Corporation
  Java home: /opt/modules/jdk1.7.0_67/jre
  Default locale: en_US, platform encoding: UTF-8
  OS name: "linux", version: "2.6.32-504.el6.x86_64", arch: "amd64", family: "unix"
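  Note that /etc/profile is read only by login shells, so any terminal that was already open needs a fresh source /etc/profile before java or mvn will resolve. If you would rather not edit the system-wide file, the same exports work per user; a sketch with the same install paths assumed:

  # ~/.bashrc -- per-user alternative to editing /etc/profile
  export JAVA_HOME=/opt/modules/jdk1.7.0_67
  export MAVEN_HOME=/opt/modules/apache-maven-3.0.5
  export PATH=$PATH:$JAVA_HOME/bin:$MAVEN_HOME/bin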

  PS: It is best to seed Maven's local repository in advance along with the prepared files; otherwise the first build will spend a very long time downloading artifacts. A populated repository looks like this:

  [root@centos66-bigdata-hadoop hadoop-2.5.0-src]# ls /root/.m2/repository/
  ant biz commons-chain commons-el commons-validator junit sslext antlr bouncycastle commons-cli commons-httpclient dom4j log4j tomcat aopalliance bsh commons-codec commons-io doxia logkit xerces asm cglib commons-collections commons-lang io net xml-apis avalon-framework classworlds commons-configuration commons-logging javax org xmlenc backport-util-concurrent com commons-daemon commons-net jdiff oro xpp3 bcel commons-beanutils commons-digester commons-pool jline regexp
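  If the repository was seeded under a different account or copied over from another machine, Maven's standard maven.repo.local property can point the build at it explicitly instead of the default ~/.m2/repository; a sketch reusing the path from the listing above:

  # Reuse the pre-seeded repository when building Hadoop:
  mvn package -DskipTests -Pdist,native -Dmaven.repo.local=/root/.m2/repository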

  3. Install cmake, zlib-devel, openssl-devel, gcc, gcc-c++, and ncurses-devel via yum

  [root@centos66-bigdata-hadoop ~]# yum -y install cmake
  [root@centos66-bigdata-hadoop ~]# yum -y install zlib-devel
  [root@centos66-bigdata-hadoop ~]# yum -y install openssl-devel
  [root@centos66-bigdata-hadoop ~]# yum -y install gcc gcc-c++
  [root@centos66-bigdata-hadoop hadoop-2.5.0-src]# yum -y install ncurses-devel
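  A single rpm query confirms all six build dependencies actually installed (any missing package is reported by name):

  [root@centos66-bigdata-hadoop ~]# rpm -q cmake zlib-devel openssl-devel gcc gcc-c++ ncurses-devel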

  4. Install protobuf-2.5.0.tar.gz (extract it, then run the following from the protobuf top-level directory)

  [root@centos66-bigdata-hadoop protobuf-2.5.0]# mkdir -p /opt/modules/protobuf
  [root@centos66-bigdata-hadoop protobuf-2.5.0]# ./configure --prefix=/opt/modules/protobuf
  ...
  [root@centos66-bigdata-hadoop protobuf-2.5.0]# make
  ...
  [root@centos66-bigdata-hadoop protobuf-2.5.0]# make install
  ...
  [root@centos66-bigdata-hadoop protobuf-2.5.0]# vi /etc/profile
  ...
  #PROTOBUF_HOME
  export PROTOBUF_HOME=/opt/modules/protobuf
  export PATH=$PATH:$PROTOBUF_HOME/bin
  [root@centos66-bigdata-hadoop protobuf-2.5.0]# source /etc/profile
  [root@centos66-bigdata-hadoop protobuf-2.5.0]# protoc --version
  libprotoc 2.5.0
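  The Hadoop build shells out to protoc to generate Java sources, so a smoke test one step beyond --version is cheap insurance; a minimal sketch (the Ping message and test.proto file are made up purely for illustration):

  # Generate Java from a trivial proto2 message to confirm protoc works end to end:
  cd /tmp
  printf 'message Ping { optional string msg = 1; }\n' > test.proto
  protoc --java_out=. test.proto   # should emit Test.java and print nothing on success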

  5. Extract the Hadoop source tarball, change into its top-level directory, and build

  [root@centos66-bigdata-hadoop protobuf-2.5.0]# cd ../../files/
  [root@centos66-bigdata-hadoop files]# tar -zxf hadoop-2.5.0-src.tar.gz -C ../src/
  [root@centos66-bigdata-hadoop files]# cd ../src/hadoop-2.5.0-src/
  [root@centos66-bigdata-hadoop hadoop-2.5.0-src]# mvn package -DskipTests -Pdist,native
  ...
  [INFO] Executed tasks
  [INFO]
  [INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-dist ---
  [INFO] Building jar: /home/liuwl/opt/src/hadoop-2.5.0-src/hadoop-dist/target/hadoop-dist-2.5.0-javadoc.jar
  [INFO] ------------------------------------------------------------------------
  [INFO] Reactor Summary:
  [INFO]
  [INFO] Apache Hadoop Main ................................ SUCCESS [8:22.179s]
  [INFO] Apache Hadoop Project POM ......................... SUCCESS [5:14.366s]
  [INFO] Apache Hadoop Annotations ......................... SUCCESS [1:50.627s]
  [INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.795s]
  [INFO] Apache Hadoop Project Dist POM .................... SUCCESS [1:11.384s]
  [INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [1:55.962s]
  [INFO] Apache Hadoop MiniKDC ............................. SUCCESS [10:21.736s]
  [INFO] Apache Hadoop Auth ................................ SUCCESS [4:01.790s]
  [INFO] Apache Hadoop Auth Examples ....................... SUCCESS [35.829s]
  [INFO] Apache Hadoop Common .............................. SUCCESS [12:51.374s]
  [INFO] Apache Hadoop NFS ................................. SUCCESS [29.567s]
  [INFO] Apache Hadoop Common Project ...................... SUCCESS [0.220s]
  [INFO] Apache Hadoop HDFS ................................ SUCCESS [1:04:44.352s]
  [INFO] Apache Hadoop HttpFS .............................. SUCCESS [1:40.397s]
  [INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [1:24.100s]
  [INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [12.020s]
  [INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.239s]
  [INFO] hadoop-yarn ....................................... SUCCESS [0.298s]
  [INFO] hadoop-yarn-api ................................... SUCCESS [2:07.150s]
  [INFO] hadoop-yarn-common ................................ SUCCESS [3:13.690s]
  [INFO] hadoop-yarn-server ................................ SUCCESS [1.009s]
  [INFO] hadoop-yarn-server-common ......................... SUCCESS [54.750s]
  [INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [2:53.418s]
  [INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [23.570s]
  [INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS [16.137s]
  [INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [1:17.456s]
  [INFO] hadoop-yarn-server-tests .......................... SUCCESS [9.170s]
  [INFO] hadoop-yarn-client ................................ SUCCESS [17.790s]
  [INFO] hadoop-yarn-applications .......................... SUCCESS [0.132s]
  [INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [6.689s]
  [INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [3.015s]
  [INFO] hadoop-yarn-site .................................. SUCCESS [0.102s]
  [INFO] hadoop-yarn-project ............................... SUCCESS [13.562s]
  [INFO] hadoop-mapreduce-client ........................... SUCCESS [0.526s]
  [INFO] hadoop-mapreduce-client-core ...................... SUCCESS [1:27.794s]
  [INFO] hadoop-mapreduce-client-common .................... SUCCESS [1:32.320s]
  [INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [19.368s]
  [INFO] hadoop-mapreduce-client-app ....................... SUCCESS [26.041s]
  [INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [31.938s]
  [INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [38.261s]
  [INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [5.923s]
  [INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [12.856s]
  [INFO] hadoop-mapreduce .................................. SUCCESS [15.510s]
  [INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [20.631s]
  [INFO] Apache Hadoop Distributed Copy .................... SUCCESS [51.096s]
  [INFO] Apache Hadoop Archives ............................ SUCCESS [13.185s]
  [INFO] Apache Hadoop Rumen ............................... SUCCESS [22.877s]
  [INFO] Apache Hadoop Gridmix ............................. SUCCESS [25.861s]
  [INFO] Apache Hadoop Data Join ........................... SUCCESS [9.764s]
  [INFO] Apache Hadoop Extras .............................. SUCCESS [7.152s]
  [INFO] Apache Hadoop Pipes ............................... SUCCESS [23.914s]
  [INFO] Apache Hadoop OpenStack support ................... SUCCESS [21.289s]
  [INFO] Apache Hadoop Client .............................. SUCCESS [18.486s]
  [INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.966s]
  [INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [37.039s]
  [INFO] Apache Hadoop Tools Dist .......................... SUCCESS [9.809s]
  [INFO] Apache Hadoop Tools ............................... SUCCESS [0.192s]
  [INFO] Apache Hadoop Distribution ........................ SUCCESS [34.114s]
  [INFO] ------------------------------------------------------------------------
  [INFO] BUILD SUCCESS
  [INFO] ------------------------------------------------------------------------
  [INFO] Total time: 2:21:11.103s
  [INFO] Finished at: Wed Sep 14 11:49:38 PDT 2016
  [INFO] Final Memory: 86M/239M
  [INFO] ------------------------------------------------------------------------
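  After BUILD SUCCESS, the packaged distribution sits under hadoop-dist/target (the hadoop-2.5.0 directory and matching tarball are what the dist profile produces); whether -Pnative actually worked can be checked with Hadoop's checknative command:

  [root@centos66-bigdata-hadoop hadoop-2.5.0-src]# ls hadoop-dist/target/
  ...
  [root@centos66-bigdata-hadoop hadoop-2.5.0-src]# hadoop-dist/target/hadoop-2.5.0/bin/hadoop checknative -a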
