I. Basic Environment Setup

  1. Prerequisites

hadoop-2.5.0-src.tar.gz
apache-maven-3.0.5-bin.tar.gz
jdk-7u67-linux-x64.tar.gz
protobuf-2.5.0.tar.gz
Access to the Internet (for downloading build dependencies)
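
Before starting, it can help to verify that all four archives are in place. The sketch below is illustrative (the `check_archives` helper and the scratch directory are not part of the original walkthrough); the file names come from the list above.

```shell
# Preflight check: report any of the required archives that are absent.
check_archives() {
  dir=$1; shift
  for f in "$@"; do
    [ -f "$dir/$f" ] || echo "missing: $f"
  done
}

# Demo: a scratch directory holding three of the four files, so the
# check flags the absent one.
demo=$(mktemp -d)
touch "$demo/hadoop-2.5.0-src.tar.gz" \
      "$demo/apache-maven-3.0.5-bin.tar.gz" \
      "$demo/jdk-7u67-linux-x64.tar.gz"

missing=$(check_archives "$demo" \
  hadoop-2.5.0-src.tar.gz \
  apache-maven-3.0.5-bin.tar.gz \
  jdk-7u67-linux-x64.tar.gz \
  protobuf-2.5.0.tar.gz)
echo "$missing"
# -> missing: protobuf-2.5.0.tar.gz
rm -rf "$demo"
```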

  2. Install jdk-7u67-linux-x64.tar.gz and apache-maven-3.0.5-bin.tar.gz

[liuwl@centos66-bigdata-hadoop ~]$ vi /etc/profile
#JAVA_HOME
export JAVA_HOME=/opt/modules/jdk1.7.0_67
export PATH=$PATH:$JAVA_HOME/bin
#MAVEN_HOME
export MAVEN_HOME=/opt/modules/apache-maven-3.0.5
export PATH=$PATH:$MAVEN_HOME/bin
[liuwl@centos66-bigdata-hadoop ~]$ source /etc/profile
[liuwl@centos66-bigdata-hadoop ~]$ java -version
java version "1.7.0_67"
Java(TM) SE Runtime Environment (build 1.7.0_67-b01)
Java HotSpot(TM) 64-Bit Server VM (build 24.65-b04, mixed mode)
[liuwl@centos66-bigdata-hadoop ~]$ echo $MAVEN_HOME
/opt/modules/apache-maven-3.0.5
[root@centos66-bigdata-hadoop ~]# mvn -v
Apache Maven 3.0.5 (r01de14724cdef164cd33c7c8c2fe155faf9602da; 2013-02-19 05:51:28-0800)
Maven home: /opt/modules/apache-maven-3.0.5
Java version: 1.7.0_67, vendor: Oracle Corporation
Java home: /opt/modules/jdk1.7.0_67/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "2.6.32-504.el6.x86_64", arch: "amd64", family: "unix" 

  Note: it is best to include a pre-populated Maven repository among the prepared files; otherwise the first build will spend a very long time downloading dependencies.

[root@centos66-bigdata-hadoop hadoop-2.5.0-src]# ls /root/.m2/repository/
ant biz commons-chain commons-el commons-validator junit sslext antlr bouncycastle commons-cli commons-httpclient dom4j log4j tomcat aopalliance bsh commons-codec commons-io doxia logkit xerces asm cglib commons-collections commons-lang io net xml-apis avalon-framework classworlds commons-configuration commons-logging javax org xmlenc backport-util-concurrent com commons-daemon commons-net jdiff oro xpp3 bcel commons-beanutils commons-digester commons-pool jline regexp
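
The pre-seeding mentioned in the note can be sketched as follows. The archive name `repository.tar.gz` and the scratch paths are assumptions for this runnable demo; on the real machine, extract your prepared repository archive into `$HOME/.m2` before running `mvn`.

```shell
home=$(mktemp -d)   # stands in for $HOME in this demo
src=$(mktemp -d)

# Build a tiny stand-in "repository" archive for the demo.
mkdir -p "$src/repository/junit"
tar -C "$src" -zcf "$src/repository.tar.gz" repository

# The actual pre-seeding step: unpack into .m2 before running mvn.
mkdir -p "$home/.m2"
tar -zxf "$src/repository.tar.gz" -C "$home/.m2/"

seeded=$(ls "$home/.m2/repository")
echo "$seeded"
# -> junit
rm -rf "$home" "$src"
```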

  3. Install cmake, zlib-devel, openssl-devel, gcc, gcc-c++, and ncurses-devel via yum

[root@centos66-bigdata-hadoop ~]# yum -y install cmake
[root@centos66-bigdata-hadoop ~]# yum -y install zlib-devel
[root@centos66-bigdata-hadoop ~]# yum -y install openssl-devel
[root@centos66-bigdata-hadoop ~]# yum -y install gcc gcc-c++
[root@centos66-bigdata-hadoop hadoop-2.5.0-src]# yum -y install ncurses-devel
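
After the yum installs, a quick check that the compiler tool-chain is on PATH can catch a failed install early. This is a sketch: only the commands are checked (the `-devel` packages install headers and libraries, not binaries), and `command -v` is the portable lookup.

```shell
# Report which of the build tools are resolvable on PATH.
status=$(for tool in cmake gcc g++ make; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "found: $tool"
  else
    echo "MISSING: $tool"
  fi
done)
echo "$status"
```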

  4. Install protobuf-2.5.0.tar.gz (after extracting, cd into the protobuf top-level directory); note that Hadoop 2.5.0's native build requires protoc 2.5.0 specifically

[root@centos66-bigdata-hadoop protobuf-2.5.0]# mkdir -p /opt/modules/protobuf
[root@centos66-bigdata-hadoop protobuf-2.5.0]# ./configure --prefix=/opt/modules/protobuf
...
[root@centos66-bigdata-hadoop protobuf-2.5.0]# make
...
[root@centos66-bigdata-hadoop protobuf-2.5.0]# make install
...
[root@centos66-bigdata-hadoop protobuf-2.5.0]# vi /etc/profile
...
#PROTOBUF_HOME
export PROTOBUF_HOME=/opt/modules/protobuf
export PATH=$PATH:$PROTOBUF_HOME/bin
[root@centos66-bigdata-hadoop protobuf-2.5.0]# source /etc/profile
[root@centos66-bigdata-hadoop protobuf-2.5.0]# protoc --version
libprotoc 2.5.0

  5. Extract the Hadoop source tarball, cd into its top-level directory, and build

[root@centos66-bigdata-hadoop protobuf-2.5.0]# cd ../../files/
[root@centos66-bigdata-hadoop files]# tar -zxf hadoop-2.5.0-src.tar.gz -C ../src/
[root@centos66-bigdata-hadoop files]# cd ../src/hadoop-2.5.0-src/
[root@centos66-bigdata-hadoop hadoop-2.5.0-src]# mvn package -DskipTests -Pdist,native
...
[INFO] Executed tasks
[INFO]
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-dist ---
[INFO] Building jar: /home/liuwl/opt/src/hadoop-2.5.0-src/hadoop-dist/target/hadoop-dist-2.5.0-javadoc.jar
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS [8:22.179s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [5:14.366s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [1:50.627s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.795s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [1:11.384s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [1:55.962s]
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [10:21.736s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [4:01.790s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [35.829s]
[INFO] Apache Hadoop Common .............................. SUCCESS [12:51.374s]
[INFO] Apache Hadoop NFS ................................. SUCCESS [29.567s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [0.220s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [1:04:44.352s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [1:40.397s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [1:24.100s]
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [12.020s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.239s]
[INFO] hadoop-yarn ....................................... SUCCESS [0.298s]
[INFO] hadoop-yarn-api ................................... SUCCESS [2:07.150s]
[INFO] hadoop-yarn-common ................................ SUCCESS [3:13.690s]
[INFO] hadoop-yarn-server ................................ SUCCESS [1.009s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [54.750s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [2:53.418s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [23.570s]
[INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS [16.137s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [1:17.456s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [9.170s]
[INFO] hadoop-yarn-client ................................ SUCCESS [17.790s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [0.132s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [6.689s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [3.015s]
[INFO] hadoop-yarn-site .................................. SUCCESS [0.102s]
[INFO] hadoop-yarn-project ............................... SUCCESS [13.562s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [0.526s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [1:27.794s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [1:32.320s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [19.368s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [26.041s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [31.938s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [38.261s]
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [5.923s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [12.856s]
[INFO] hadoop-mapreduce .................................. SUCCESS [15.510s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [20.631s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [51.096s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [13.185s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [22.877s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [25.861s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [9.764s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [7.152s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [23.914s]
[INFO] Apache Hadoop OpenStack support ................... SUCCESS [21.289s]
[INFO] Apache Hadoop Client .............................. SUCCESS [18.486s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.966s]
[INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [37.039s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [9.809s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [0.192s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [34.114s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 2:21:11.103s
[INFO] Finished at: Wed Sep 14 11:49:38 PDT 2016
[INFO] Final Memory: 86M/239M
[INFO] ------------------------------------------------------------------------
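
After BUILD SUCCESS, the compiled distribution is assembled under `hadoop-dist/target/` in the source tree (e.g. `hadoop-2.5.0/` and `hadoop-2.5.0.tar.gz`). The check below is a runnable sketch: the helper looks for the expected artifacts, and the mock directory stands in for a real build output; on the real machine, point `check_dist` at `<source-root>/hadoop-dist/target` instead.

```shell
# Report whether the expected build artifacts exist under the target dir.
check_dist() {
  target=$1
  for artifact in hadoop-2.5.0 hadoop-2.5.0.tar.gz; do
    if [ -e "$target/$artifact" ]; then
      echo "ok: $artifact"
    else
      echo "MISSING: $artifact"
    fi
  done
}

# Mock build output for demonstration only.
mock=$(mktemp -d)
mkdir -p "$mock/hadoop-2.5.0/lib/native"
: > "$mock/hadoop-2.5.0.tar.gz"

result=$(check_dist "$mock")
echo "$result"
# -> ok: hadoop-2.5.0
#    ok: hadoop-2.5.0.tar.gz
rm -rf "$mock"
```

On a real build it is also worth running `file` on `hadoop-dist/target/hadoop-2.5.0/lib/native/libhadoop.so.1.0.0`; a 64-bit ELF shared object confirms the native build succeeded.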
