Compiling hadoop-2.5.1 on 64-bit Linux
Download location for Apache Hadoop ecosystem packages: http://archive.apache.org/dist/
Software installation directory: ~/app
jdk: jdk-7u45-linux-x64.rpm
hadoop: hadoop-2.5.1-src.tar.gz
maven: apache-maven-3.0.5-bin.zip
protobuf: protobuf-2.5.0.tar.gz
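A possible way to stage these packages under ~/app before starting (a minimal sketch; it assumes the three downloaded files are in the current directory, and the Hadoop source is fetched in step 1):
mkdir -p ~/app
mv jdk-7u45-linux-x64.rpm apache-maven-3.0.5-bin.zip protobuf-2.5.0.tar.gz ~/app
cd ~/app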
1. Download Hadoop
wget http://archive.apache.org/dist/hadoop/core/stable/hadoop-2.5.1-src.tar.gz
tar -zxvf hadoop-2.5.1-src.tar.gz
In the root of the extracted source tree there is a BUILDING.txt file that lists the requirements for building Hadoop:
Requirements:
* Unix System
* JDK 1.6+
* Maven 3.0 or later
* Findbugs 1.3.9 (if running findbugs)
* ProtocolBuffer 2.5.0
* CMake 2.6 or newer (if compiling native code)
* Zlib devel (if compiling native code)
* openssl devel ( if compiling native hadoop-pipes )
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)
2. Install the JDK
sudo yum install jdk-7u45-linux-x64.rpm
Check where the JDK was installed:
which java
/usr/java/jdk1.7.0_45/bin/java
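On some RPM-based installs, which java points at a symlink such as /usr/bin/java rather than the real JDK directory; following the link chain reveals the actual install path (optional check):
readlink -f $(which java)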
Add the JDK to the environment variables (~/.bash_profile):
export JAVA_HOME=/usr/java/jdk1.7.0_45
export PATH=.:$JAVA_HOME/bin:$PATH
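Reload the profile so the new variables take effect in the current shell; the same step applies after each later edit to ~/.bash_profile (a quick sketch, assuming bash):
source ~/.bash_profile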
Verify:
java -version
java version "1.7.0_45"
Java(TM) SE Runtime Environment (build 1.7.0_45-b18)
Java HotSpot(TM) 64-Bit Server VM (build 24.45-b08, mixed mode)
3. Install Maven
wget http://apache.fayea.com/apache-mirror/maven/maven-3/3.0.5/binaries/apache-maven-3.0.5-bin.zip
unzip apache-maven-3.0.5-bin.zip
Add Maven to the environment variables (~/.bash_profile):
export MAVEN_HOME=/home/hadoop/app/apache-maven-3.0.5
export PATH=.:$MAVEN_HOME/bin:$PATH
Verify:
mvn -version
Apache Maven 3.0.5 (r01de14724cdef164cd33c7c8c2fe155faf9602da)
Maven home: /home/hadoop/app/apache-maven-3.0.5
Java version: 1.7.0_45, vendor: Oracle Corporation
Java home: /usr/java/jdk1.7.0_45/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "2.6.32-358.el6.x86_64", arch: "amd64", family: "unix"
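If the Hadoop build later aborts with an out-of-memory error from Maven, giving the JVM more heap via MAVEN_OPTS usually helps (the values below are illustrative, not mandatory):
export MAVEN_OPTS="-Xms256m -Xmx512m"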
4. Install protobuf
The official protobuf site seems to be unreachable, so download the protobuf tarball from a mirror yourself. To build and install protobuf you first need gcc, gcc-c++, and make:
sudo yum install gcc
sudo yum install gcc-c++
sudo yum install make
tar -zvxf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0
./configure --prefix=/usr/local/protoc/
sudo make
sudo make install
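For extra confidence in the protobuf build, the bundled self-tests can also be run (optional; they take several minutes):
make check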
Add protobuf to the environment variables (~/.bash_profile):
export PATH=.:/usr/local/protoc/bin:$PATH
Verify:
protoc --version
libprotoc 2.5.0
5. Install other dependencies
sudo yum install cmake
sudo yum install openssl-devel
sudo yum install ncurses-devel
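BUILDING.txt also lists Zlib devel for compiling native code; if it is not already present, it can be installed the same way (package name assumes CentOS/RHEL):
sudo yum install zlib-devel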
6. Build the Hadoop source code
cd ~/app/hadoop-2.5.1-src
mvn package -DskipTests -Pdist,native
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS [.980s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [.575s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [.324s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [.318s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [.550s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [.548s]
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [.410s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [.503s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [.915s]
[INFO] Apache Hadoop Common .............................. SUCCESS [:.913s]
[INFO] Apache Hadoop NFS ................................. SUCCESS [.324s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [.064s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [:.023s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [.389s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [.235s]
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [.493s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [.041s]
[INFO] hadoop-yarn ....................................... SUCCESS [.031s]
[INFO] hadoop-yarn-api ................................... SUCCESS [:.828s]
[INFO] hadoop-yarn-common ................................ SUCCESS [.542s]
[INFO] hadoop-yarn-server ................................ SUCCESS [.047s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [.953s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [.537s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [.270s]
[INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS [.840s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [.877s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [.421s]
[INFO] hadoop-yarn-client ................................ SUCCESS [.406s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [.025s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [.208s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [.885s]
[INFO] hadoop-yarn-site .................................. SUCCESS [.058s]
[INFO] hadoop-yarn-project ............................... SUCCESS [.870s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [.065s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [.292s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [.197s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [.229s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [.322s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [.640s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [.154s]
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [.939s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [.088s]
[INFO] hadoop-mapreduce .................................. SUCCESS [.979s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [.615s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [.668s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [.014s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [.567s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [.398s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [.151s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [.251s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [.901s]
[INFO] Apache Hadoop OpenStack support ................... SUCCESS [.722s]
[INFO] Apache Hadoop Client .............................. SUCCESS [.021s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [.095s]
[INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [.776s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [.768s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [.035s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [.571s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: :.071s
[INFO] Finished at: Sat Nov :: PST
[INFO] Final Memory: 91M/324M
[INFO] ------------------------------------------------------------------------
The compiled distribution ends up in hadoop-2.5.1-src/hadoop-dist/target/hadoop-2.5.1; to set up a Hadoop environment later, simply deploy the hadoop-2.5.1 folder directly.
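To double-check that the native libraries really came out 64-bit, the ELF class of libhadoop can be inspected; the library file name below is an assumption based on the usual layout of the dist output:
cd ~/app/hadoop-2.5.1-src
file hadoop-dist/target/hadoop-2.5.1/lib/native/libhadoop.so.1.0.0
The output should report something like "ELF 64-bit LSB shared object, x86-64".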