4.1 Environment:

1) 64-bit Linux: CentOS 6.4, running in a VMware virtual machine

2) The virtual machine has Internet access

4.2 Official build instructions:

Extract the source: tar -zxvf hadoop-2.4.0-src.tar.gz

Then change into the extracted directory and read the BUILDING.txt file (more BUILDING.txt; press the space bar to page down). Its contents include:

Requirements:

* Unix System

* JDK 1.6+

* Maven 3.0 or later

* Findbugs 1.3.9 (if running findbugs)

* ProtocolBuffer 2.5.0

* CMake 2.6 or newer (if compiling native code)

* Internet connection for first build (to fetch all Maven and Hadoop dependencies)

----------------------------------------------------------------------------------

Maven main modules:

hadoop (Main Hadoop project)

- hadoop-project (Parent POM for all Hadoop Maven modules. )

(All plugins & dependencies versions are defined here.)

- hadoop-project-dist (Parent POM for modules that generate distributions.)

- hadoop-annotations (Generates the Hadoop doclet used to generate the Java docs)

- hadoop-assemblies (Maven assemblies used by the different modules)

- hadoop-common-project (Hadoop Common)

- hadoop-hdfs-project (Hadoop HDFS)

- hadoop-mapreduce-project (Hadoop MapReduce)

- hadoop-tools (Hadoop tools like Streaming, Distcp, etc.)

- hadoop-dist (Hadoop distribution assembler)

----------------------------------------------------------------------------------

After the build completes, you can inspect the native library that was produced (confirming it is a 64-bit build):

libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, not stripped 
[root@centos native]# pwd 
/opt/hadoop-2.4.0-src/hadoop-dist/target/hadoop-2.4.0/lib/native 
[root@centos native]#
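
The first line above appears to be the output of the file command run in the native library directory; a minimal sketch of how to reproduce the check (paths as used in this guide):

# Check that the native Hadoop library was built for 64-bit x86
cd /opt/hadoop-2.4.0-src/hadoop-dist/target/hadoop-2.4.0/lib/native
file libhadoop.so.1.0.0   # should report: ELF 64-bit LSB shared object, x86-64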

4.3 Pre-build preparation: installing dependency packages

Install the required Linux system packages (a combined command is shown after the list below):

  • yum install autoconf automake libtool cmake
  • yum install ncurses-devel
  • yum install openssl-devel
  • yum install lzo-devel zlib-devel gcc gcc-c++
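
If you prefer a single invocation, the packages above can be installed in one pass (a sketch; -y only skips the confirmation prompts):

# Install build tool chain and development headers in one pass
yum install -y autoconf automake libtool cmake \
    ncurses-devel openssl-devel \
    lzo-devel zlib-devel gcc gcc-c++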

Install Maven (the whole sequence is shown after the list below):

  • Download: apache-maven-3.0.5-bin.tar.gz
  • Extract: tar -zxvf apache-maven-3.0.5-bin.tar.gz
  • Set environment variables: open /etc/profile and add
    • export MAVEN_HOME=/opt/apache-maven-3.0.5
    • export PATH=$PATH:$MAVEN_HOME/bin
  • Make the change take effect: source /etc/profile (or . /etc/profile)
  • Verify: mvn -v
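
Putting the Maven steps together (a sketch; the archive is assumed to be extracted under /opt as in the rest of this guide):

# Unpack Maven under /opt and put it on the PATH system-wide
tar -zxvf apache-maven-3.0.5-bin.tar.gz -C /opt
cat >> /etc/profile <<'EOF'
export MAVEN_HOME=/opt/apache-maven-3.0.5
export PATH=$PATH:$MAVEN_HOME/bin
EOF
source /etc/profile
mvn -v    # should report Apache Maven 3.0.5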
Install protobuf (as the root user; a combined sketch appears at the end of this subsection):
  • Extract: tar -zxvf protobuf-2.5.0.tar.gz
  • Enter the extracted directory and run the configuration step: ./configure
  • Build and install: make && make check && make install
  • Verify: protoc --version

vi /etc/profile

export PROTOC_HOME=/opt/protobuf-2.5.0

export PATH=$PATH:$PROTOC_HOME/src

Then,

$ protoc --version

libprotoc 2.5.0
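
Putting the protobuf steps together (a sketch; the source is assumed to live under /opt, matching the PROTOC_HOME setting above):

# Build and install protobuf 2.5.0 from source (run as root)
tar -zxvf protobuf-2.5.0.tar.gz -C /opt
cd /opt/protobuf-2.5.0
./configure
make && make check && make install
protoc --version    # expect: libprotoc 2.5.0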

Install findbugs (a combined sketch follows the list):
  • Extract: tar -zxvf findbugs.tar.gz
  • Set environment variables:
  • vi /etc/profile
  • export FINDBUGS_HOME=/opt/findbugs-3.0.0
  • export PATH=$PATH:$FINDBUGS_HOME/bin
  • Verify: findbugs -version
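
The same steps as one shell sequence (a sketch; the archive is assumed to unpack to /opt/findbugs-3.0.0, matching FINDBUGS_HOME above):

# Unpack findbugs under /opt and put it on the PATH
tar -zxvf findbugs.tar.gz -C /opt
cat >> /etc/profile <<'EOF'
export FINDBUGS_HOME=/opt/findbugs-3.0.0
export PATH=$PATH:$FINDBUGS_HOME/bin
EOF
source /etc/profile
findbugs -version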
Install Java
After downloading the rpm package, install it with rpm -ivh jre-7u71-linux-x64.rpm (note that a full JDK, not just a JRE, is required, since javac is used during the build). Once installation finishes:
[root@centos ~]# java -version 
java version "1.7.0_71" 
Java(TM) SE Runtime Environment (build 1.7.0_71-b14) 
Java HotSpot(TM) 64-Bit Server VM (build 24.71-b01, mixed mode) 
[root@centos ~]# javac -version 
javac 1.7.0_71 
 
Note

Hadoop is written in Java, and this build may not work with the OpenJDK preinstalled on Linux, so install a JDK (1.6 or later) before building Hadoop.


4.4 How to build

Change into the Hadoop source directory /opt/hadoop-2.4.0-src and run one of the following commands (the parts in square brackets are optional; the exact command used in this guide is shown after the list):

Building distributions:

Create binary distribution without native code and without documentation:

mvn package -Pdist -DskipTests -Dtar

Create binary distribution with native code and with documentation:

mvn package -Pdist,native,docs -DskipTests -Dtar

Create source distribution:

mvn package -Psrc -DskipTests

Create source and binary distributions with native code and documentation:

 $ mvn -e -X package -Pdist,native[,docs,src] -DskipTests -Dtar

Create a local staging version of the website (in /tmp/hadoop-site)

mvn clean site; mvn site:stage -DstagingDirectory=/tmp/hadoop-site
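
In this guide the full binary distribution with native code and documentation is built; a minimal sketch of the invocation, run from the source root, is:

# Build the binary distribution with native libraries and docs, skipping tests
cd /opt/hadoop-2.4.0-src
mvn package -Pdist,native,docs -DskipTests -Dtar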

4.5 Before building, you may need to configure a Maven mirror inside China

  1. Go to the Maven installation's conf directory /opt/modules/apache-maven-3.0.5/conf and edit the settings.xml file

* Add the following to the <mirrors> section:

<mirror>
  <id>nexus-osc</id>
  <mirrorOf>*</mirrorOf>
  <name>Nexus osc</name>
  <url>http://maven.oschina.net/content/groups/public/</url>
</mirror>

* Add the following to the <profiles> section:

<profile>
  <id>jdk-1.6</id>
  <activation>
    <jdk>1.6</jdk>
  </activation>
  <repositories>
    <repository>
      <id>nexus</id>
      <name>local private nexus</name>
      <url>http://maven.oschina.net/content/groups/public/</url>
      <releases>
        <enabled>true</enabled>
      </releases>
      <snapshots>
        <enabled>false</enabled>
      </snapshots>
    </repository>
  </repositories>
  <pluginRepositories>
    <pluginRepository>
      <id>nexus</id>
      <name>local private nexus</name>
      <url>http://maven.oschina.net/content/groups/public/</url>
      <releases>
        <enabled>true</enabled>
      </releases>
      <snapshots>
        <enabled>false</enabled>
      </snapshots>
    </pluginRepository>
  </pluginRepositories>
</profile>

Copy the configuration

Copy this settings file to the user's home directory so that every Maven build picks up the configuration:

* Check whether the .m2 directory exists under the user's home directory (/home/hadoop); create it if it does not

$ cd /home/hadoop

$ mkdir .m2

* Copy the file (a combined form of both steps is shown below)

$ cp /opt/modules/apache-maven-3.0.5/conf/settings.xml   ~/.m2/
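
The two steps can also be combined; a sketch (mkdir -p succeeds whether or not .m2 already exists):

# Create .m2 if needed and drop the mirror-enabled settings into it
mkdir -p ~/.m2
cp /opt/modules/apache-maven-3.0.5/conf/settings.xml ~/.m2/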

4.6 Configure DNS

Edit the file with vi /etc/resolv.conf and add the following (a quick connectivity check is shown after the entries):

nameserver 8.8.8.8

nameserver 8.8.4.4
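
A quick way to confirm the new resolvers work before starting the build (repo.maven.apache.org is just an example of a host the build needs to reach):

# Verify DNS resolution and connectivity to Maven Central
ping -c 2 repo.maven.apache.org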

4.7 Importing the Hadoop project into Eclipse

Importing projects to eclipse

When you import the project to eclipse, install hadoop-maven-plugins at first.

$ cd hadoop-maven-plugins

$ mvn install

Then, generate eclipse project files.

$ mvn eclipse:eclipse -DskipTests

At last, import to eclipse by specifying the root directory of the project via

[File] > [Import] > [Existing Projects into Workspace].
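
Putting those steps together from the source root used in this guide (a sketch; /opt/hadoop-2.4.0-src is assumed):

# Install the Hadoop Maven plugins, then generate the Eclipse project files
cd /opt/hadoop-2.4.0-src/hadoop-maven-plugins
mvn install
cd /opt/hadoop-2.4.0-src
mvn eclipse:eclipse -DskipTests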

Error encountered: Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: input file /opt/hadoop-2.4.0-src/hadoop-hdfs-project/hadoop-hdfs/target/findbugsXml.xml does not exist
Solution:

cd ~/hadoop-2.4.0-src/ and run mvn clean package -Pdist,native,docs -DskipTests -Dtar. If the build fails partway through, fix the problem and then resume from a specific module by changing the last argument (-rf :<module>). If the error hadoop-hdfs/target/findbugsXml.xml does not exist appears, drop the docs profile from the command and rerun: mvn package -Pdist,native -DskipTests -Dtar -rf :hadoop-pipes

 
After a successful build, go to /opt/hadoop-2.4.0-src/hadoop-dist/target; the hadoop-2.4.0.tar.gz there is the finished distribution tarball.
 
Error encountered: Could not find goal 'protoc' in plugin org.apache.hadoop:hadoop-maven-plugins:2.2.0 among available
Solution: add the following to /etc/profile, then run source /etc/profile
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/protobuf/lib
export PATH=$PATH:/usr/local/bin
It is usually recommended to install protobuf under /usr/local: pass --prefix=/usr/local/protobuf to configure. If errors occur, run make clean first and then repeat the steps (a sketch follows).
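
A sketch of rebuilding protobuf under a dedicated prefix, following the advice above (the exact prefix and /etc/profile entries are assumptions; adjust them to match your layout):

# Rebuild protobuf with an explicit prefix and point PATH/LD_LIBRARY_PATH at it
cd /opt/protobuf-2.5.0
make clean
./configure --prefix=/usr/local/protobuf
make && make install
cat >> /etc/profile <<'EOF'
export PATH=$PATH:/usr/local/protobuf/bin
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/protobuf/lib
EOF
source /etc/profile
protoc --version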
 
The contents of my /etc/profile:
export MAVEN_HOME=/opt/apache-maven-3.0.5
export PATH=$PATH:$MAVEN_HOME/bin
export JAVA_HOME=/usr/lib/jvm/jdk1.7.0_71/
export JRE_HOME=/usr/lib/jvm/jdk1.7.0_71/jre
export ANT_HOME=/usr/lib/jvm/apache-ant/
export CLASSPATH=.:$JRE_HOME/lib/rt.jar:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
export PATH=$PATH:$JAVA_HOME/bin:$JRE_HOME/bin:$ANT_HOME/bin
export FINDBUGS_HOME=/opt/findbugs-3.0.0
export PATH=$PATH:$FINDBUGS_HOME/bin:/opt/protoc/bin
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/protoc/lib

export PROTOC_HOME=/home/hadoop/protobuf-2.5.0
export PATH=${PATH}:${FINDBUGS_HOME}/bin:$PROTOC_HOME/src
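
After editing /etc/profile, it is worth reloading it and confirming that each tool still resolves (a quick sanity check, not part of the original steps):

# Reload the profile and verify every prerequisite is on the PATH
source /etc/profile
mvn -v && protoc --version && findbugs -version && java -version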

When the build finishes, the output ends like this:

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................. SUCCESS [ 0.923 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [ 0.734 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [ 2.009 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [ 0.416 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [ 3.871 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [ 3.672 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [ 2.528 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [ 17.347 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [ 3.163 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [03:46 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [ 11.383 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [ 0.032 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [08:17 min]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [04:10 min]
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [ 27.153 s]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [ 4.014 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [ 0.076 s]
[INFO] hadoop-yarn ........................................ SUCCESS [ 0.074 s]
[INFO] hadoop-yarn-api .................................... SUCCESS [ 55.567 s]
[INFO] hadoop-yarn-common ................................. SUCCESS [ 30.243 s]
[INFO] hadoop-yarn-server ................................. SUCCESS [ 0.027 s]
[INFO] hadoop-yarn-server-common .......................... SUCCESS [ 8.851 s]
[INFO] hadoop-yarn-server-nodemanager ..................... SUCCESS [ 33.811 s]
[INFO] hadoop-yarn-server-web-proxy ....................... SUCCESS [ 3.315 s]
[INFO] hadoop-yarn-server-applicationhistoryservice ....... SUCCESS [ 8.813 s]
[INFO] hadoop-yarn-server-resourcemanager ................. SUCCESS [ 12.100 s]
[INFO] hadoop-yarn-server-tests ........................... SUCCESS [ 0.343 s]
[INFO] hadoop-yarn-client ................................. SUCCESS [ 4.797 s]
[INFO] hadoop-yarn-applications ........................... SUCCESS [ 0.027 s]
[INFO] hadoop-yarn-applications-distributedshell .......... SUCCESS [ 3.495 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [ 2.208 s]
[INFO] hadoop-yarn-site ................................... SUCCESS [ 0.038 s]
[INFO] hadoop-yarn-project ................................ SUCCESS [ 6.086 s]
[INFO] hadoop-mapreduce-client ............................ SUCCESS [ 0.125 s]
[INFO] hadoop-mapreduce-client-core ....................... SUCCESS [ 18.008 s]
[INFO] hadoop-mapreduce-client-common ..................... SUCCESS [ 14.628 s]
[INFO] hadoop-mapreduce-client-shuffle .................... SUCCESS [ 3.223 s]
[INFO] hadoop-mapreduce-client-app ........................ SUCCESS [ 9.358 s]
[INFO] hadoop-mapreduce-client-hs ......................... SUCCESS [ 8.184 s]
[INFO] hadoop-mapreduce-client-jobclient .................. SUCCESS [ 12.318 s]
[INFO] hadoop-mapreduce-client-hs-plugins ................. SUCCESS [ 1.600 s]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [ 5.915 s]
[INFO] hadoop-mapreduce ................................... SUCCESS [ 4.150 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [ 15.438 s]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 7.712 s]
[INFO] Apache Hadoop Archives ............................. SUCCESS [ 2.838 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [ 6.190 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [ 4.524 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [ 3.694 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [ 3.687 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [ 0.023 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [ 6.197 s]
[INFO] Apache Hadoop Client ............................... SUCCESS [ 7.037 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [ 0.072 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [ 25.116 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 6.242 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [ 0.023 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [ 46.024 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 23:56 min
[INFO] Finished at: 2016-02-17T16:34:55+08:00
[INFO] Final Memory: 221M/6213M
[INFO] ------------------------------------------------------------------------

After the build, a new hadoop-dist directory appears under the source tree; its target subdirectory holds the build output:

total 599636

drwxrwxr-x. 2 hadoop hadoop      4096 Feb 17 16:34 antrun
-rw-rw-r--. 1 hadoop hadoop      1625 Feb 17 16:34 dist-layout-stitching.sh
-rw-rw-r--. 1 hadoop hadoop       642 Feb 17 16:34 dist-tar-stitching.sh
drwxrwxr-x. 8 hadoop hadoop      4096 Feb 17 16:34 hadoop-2.4.0
-rw-rw-r--. 1 hadoop hadoop 202726573 Feb 17 16:34 hadoop-2.4.0.tar.gz
-rw-rw-r--. 1 hadoop hadoop      2746 Feb 17 16:34 hadoop-dist-2.4.0.jar
-rw-rw-r--. 1 hadoop hadoop 411264073 Feb 17 16:34 hadoop-dist-2.4.0-javadoc.jar
drwxrwxr-x. 2 hadoop hadoop      4096 Feb 17 16:34 javadoc-bundle-options
drwxrwxr-x. 2 hadoop hadoop      4096 Feb 17 16:34 maven-archiver
drwxrwxr-x. 2 hadoop hadoop      4096 Feb 17 16:34 test-dir
