Download location for Apache Hadoop ecosystem packages: http://archive.apache.org/dist/

Software installation directory: ~/app

jdk: jdk-7u45-linux-x64.rpm
hadoop: hadoop-2.5.1-src.tar.gz
maven: apache-maven-3.0.5-bin.zip
protobuf: protobuf-2.5.0.tar.gz

1. Download Hadoop

wget http://archive.apache.org/dist/hadoop/core/stable/hadoop-2.5.1-src.tar.gz
tar -zxvf hadoop-2.5.1-src.tar.gz

The root of the extracted Hadoop source contains a BUILDING.txt file that lists the environment requirements for building Hadoop; a quick way to check which of them are already installed is shown right after the list:

Requirements:
* Unix System
* JDK 1.6+
* Maven 3.0 or later
* Findbugs 1.3.9 (if running findbugs)
* ProtocolBuffer 2.5.0
* CMake 2.6 or newer (if compiling native code)
* Zlib devel (if compiling native code)
* openssl devel ( if compiling native hadoop-pipes )
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)
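
Before installing anything, you can quickly check which of these prerequisites are already present (each command below simply reports "command not found" for a tool that still needs to be installed):

java -version      # JDK 1.6+
mvn -version       # Maven 3.0 or later
protoc --version   # ProtocolBuffer 2.5.0
cmake --version    # only needed when compiling native code
gcc --version      # needed to build protobuf and the native code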

2. Install the JDK

sudo yum install jdk-7u45-linux-x64.rpm

Check where the JDK was installed:

which java
/usr/java/jdk1.7.0_45/bin/java

Add the JDK to the environment variables (~/.bash_profile):

export JAVA_HOME=/usr/java/jdk1.7.0_45
export PATH=.:$JAVA_HOME/bin:$PATH
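
After editing ~/.bash_profile, reload it so the new variables take effect in the current shell (the same applies after the Maven and protobuf steps below):

source ~/.bash_profile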

Verify:

java -version
java version "1.7.0_45"
Java(TM) SE Runtime Environment (build 1.7.0_45-b18)
Java HotSpot(TM) 64-Bit Server VM (build 24.45-b08, mixed mode)

3. Install Maven

wget http://apache.fayea.com/apache-mirror/maven/maven-3/3.0.5/binaries/apache-maven-3.0.5-bin.zip
unzip apache-maven-3.0.5-bin.zip

Add Maven to the environment variables (~/.bash_profile):

export MAVEN_HOME=/home/hadoop/app/apache-maven-3.0.5
export PATH=.:$MAVEN_HOME/bin:$PATH
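
If the Hadoop build later fails with a Java out-of-memory error, this is also a convenient place to raise Maven's heap through the standard MAVEN_OPTS variable (the values below are only a reasonable starting point, not a requirement):

export MAVEN_OPTS="-Xms256m -Xmx512m"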

Verify:

mvn -version
Apache Maven 3.0.5 (r01de14724cdef164cd33c7c8c2fe155faf9602da; ...)
Maven home: /home/hadoop/app/apache-maven-3.0.5
Java version: 1.7.0_45, vendor: Oracle Corporation
Java home: /usr/java/jdk1.7.0_45/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "2.6.32-358.el6.x86_64", arch: "amd64", family: "unix"

4. Install protobuf

The official protobuf download site does not seem to be reachable, so obtain the protobuf source package yourself. Building and installing protobuf requires gcc, gcc-c++, and make to be installed first:

sudo yum install gcc
sudo yum install gcc-c++
sudo yum install make
tar -zvxf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0
./configure --prefix=/usr/local/protoc/
sudo make
sudo make install
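
Optionally, protobuf's autotools build also provides a test target; running it after make (and before make install) gives a quick sanity check of the build:

make check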

Add protobuf to the environment variables (~/.bash_profile):

export PATH=.:/usr/local/protoc/bin:$PATH

Verify:

protoc --version
libprotoc 2.5.0

5. Install other dependencies

sudo yum install cmake
sudo yum install openssl-devel
sudo yum install ncurses-devel
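
The requirements list above also mentions Zlib devel for native code; if the zlib headers are not already on the system, install them here as well so the native build does not fail later (zlib-devel is the CentOS/RHEL package name):

sudo yum install zlib-devel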

6. Build the Hadoop source

cd ~/app/hadoop-2.5.1-src
mvn package -DskipTests -Pdist,native
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS
[INFO] Apache Hadoop Project POM ......................... SUCCESS
[INFO] Apache Hadoop Annotations ......................... SUCCESS
[INFO] Apache Hadoop Assemblies .......................... SUCCESS
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS
[INFO] Apache Hadoop Auth ................................ SUCCESS
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS
[INFO] Apache Hadoop Common .............................. SUCCESS
[INFO] Apache Hadoop NFS ................................. SUCCESS
[INFO] Apache Hadoop Common Project ...................... SUCCESS
[INFO] Apache Hadoop HDFS ................................ SUCCESS
[INFO] Apache Hadoop HttpFS .............................. SUCCESS
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS
[INFO] hadoop-yarn ....................................... SUCCESS
[INFO] hadoop-yarn-api ................................... SUCCESS
[INFO] hadoop-yarn-common ................................ SUCCESS
[INFO] hadoop-yarn-server ................................ SUCCESS
[INFO] hadoop-yarn-server-common ......................... SUCCESS
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS
[INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS
[INFO] hadoop-yarn-server-tests .......................... SUCCESS
[INFO] hadoop-yarn-client ................................ SUCCESS
[INFO] hadoop-yarn-applications .......................... SUCCESS
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS
[INFO] hadoop-yarn-site .................................. SUCCESS
[INFO] hadoop-yarn-project ............................... SUCCESS
[INFO] hadoop-mapreduce-client ........................... SUCCESS
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS
[INFO] hadoop-mapreduce-client-common .................... SUCCESS
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS
[INFO] hadoop-mapreduce .................................. SUCCESS
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS
[INFO] Apache Hadoop Archives ............................ SUCCESS
[INFO] Apache Hadoop Rumen ............................... SUCCESS
[INFO] Apache Hadoop Gridmix ............................. SUCCESS
[INFO] Apache Hadoop Data Join ........................... SUCCESS
[INFO] Apache Hadoop Extras .............................. SUCCESS
[INFO] Apache Hadoop Pipes ............................... SUCCESS
[INFO] Apache Hadoop OpenStack support ................... SUCCESS
[INFO] Apache Hadoop Client .............................. SUCCESS
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS
[INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS
[INFO] Apache Hadoop Tools ............................... SUCCESS
[INFO] Apache Hadoop Distribution ........................ SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: ...
[INFO] Finished at: ...
[INFO] Final Memory: 91M/324M
[INFO] ------------------------------------------------------------------------

The compiled distribution is placed under hadoop-2.5.1-src/hadoop-dist/target/hadoop-2.5.1; whenever you need to set up a Hadoop environment later, deploy directly from that hadoop-2.5.1 folder.
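
To confirm that 64-bit native libraries were actually produced, you can inspect the build output directly (a quick check assuming the default build layout; hadoop checknative reports which native libraries the build picked up):

cd ~/app/hadoop-2.5.1-src/hadoop-dist/target/hadoop-2.5.1
file lib/native/libhadoop.so.1.0.0   # should report an ELF 64-bit LSB shared object
bin/hadoop checknative -a            # lists which native libraries were detected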
