Compiling Hadoop 2.6.0 on 64-bit CentOS
The hadoop-2.6.0.tar.gz release tarball was compiled on a 32-bit machine, so loading its native .so libraries on a 64-bit machine fails with errors such as:
java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSumsByteArray(II[BI[BIILjava/lang/String;JZ)V
Hadoop therefore needs to be recompiled from source.
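Before rebuilding, you can confirm that the bundled native library really is 32-bit. A minimal sketch (the Hadoop path in the comment is illustrative): byte 5 of an ELF file header (EI_CLASS) is 1 for a 32-bit binary and 2 for a 64-bit one.

```shell
# Print whether an ELF binary is 32-bit or 64-bit by reading EI_CLASS,
# the single byte at offset 4 of the ELF header.
elf_class() {
  case "$(od -An -j4 -N1 -tu1 "$1" | tr -d ' ')" in
    1) echo 32-bit ;;
    2) echo 64-bit ;;
    *) echo unknown ;;
  esac
}

# On the stock tarball this would report 32-bit (path is illustrative):
# elf_class $HADOOP_HOME/lib/native/libhadoop.so.1.0.0
```

Hadoop 2.x also ships a self-check, `hadoop checknative -a`, which reports whether each native library can be loaded on the current machine.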
1. Build environment
yum install cmake lzo-devel zlib-devel gcc gcc-c++ autoconf automake libtool ncurses-devel openssl-devel libXtst
2. Install the JDK (download JDK 1.7; only 1.7 works, other versions cause build errors)
Download page: http://www.oracle.com/technetwork/java/javase/downloads/jdk7-downloads-1880260.html
tar -zxvf jdk-7u75-linux-x64.tar.gz -C /usr/local
Add the following to /etc/profile:
export JAVA_HOME=/usr/local/jdk1.7.0_75
export JRE_HOME=$JAVA_HOME/jre
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
export PATH=$PATH:$JAVA_HOME/bin
3. Install protobuf
Download protobuf-2.5.0; newer versions will not work, the Hadoop build fails with them.
wget https://protobuf.googlecode.com/files/protobuf-2.5.0.tar.gz or download from Baidu Yun: http://yun.baidu.com/share/link?shareid=830873155&uk=3573928349
tar -zxvf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0
./configure
make
make install
protoc --version   # should print: libprotoc 2.5.0
4. Install Ant
wget http://mirror.bit.edu.cn/apache/ant/binaries/apache-ant-1.9.4-bin.tar.gz
tar -zxvf apache-ant-1.9.4-bin.tar.gz -C /usr/local
vi /etc/profile
export ANT_HOME=/usr/local/apache-ant-1.9.4
export PATH=$PATH:$ANT_HOME/bin
5. Install Maven
wget http://mirror.bit.edu.cn/apache/maven/maven-3/3.3.1/binaries/apache-maven-3.3.1-bin.tar.gz
tar -zxvf apache-maven-3.3.1-bin.tar.gz -C /usr/local
vi /etc/profile
export MAVEN_HOME=/usr/local/apache-maven-3.3.1
export PATH=$PATH:$MAVEN_HOME/bin
Edit the Maven configuration file:
vi /usr/local/apache-maven-3.3.1/conf/settings.xml
Point Maven at a closer repository mirror by adding the following inside <mirrors></mirrors>:
<mirror>
<id>nexus-osc</id>
<mirrorOf>*</mirrorOf>
<name>Nexusosc</name>
<url>http://maven.oschina.net/content/groups/public/</url>
</mirror>
Add the following inside <profiles></profiles>:
<profile>
<id>jdk-1.7</id>
<activation>
<jdk>1.7</jdk>
</activation>
<repositories>
<repository>
<id>nexus</id>
<name>local private nexus</name>
<url>http://maven.oschina.net/content/groups/public/</url>
<releases>
<enabled>true</enabled>
</releases>
<snapshots>
<enabled>false</enabled>
</snapshots>
</repository>
</repositories>
<pluginRepositories>
<pluginRepository>
<id>nexus</id>
<name>local private nexus</name>
<url>http://maven.oschina.net/content/groups/public/</url>
<releases>
<enabled>true</enabled>
</releases>
<snapshots>
<enabled>false</enabled>
</snapshots>
</pluginRepository>
</pluginRepositories>
</profile>
6. Make the environment variables take effect by running in the shell:
source /etc/profile
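As a sanity check before starting the build, the helper below (a sketch; the tool list is simply the prerequisites installed in the steps above) reports which tools resolve from the freshly sourced PATH:

```shell
# Report which of the required build tools are reachable on PATH.
check_tools() {
  for tool in "$@"; do
    if command -v "$tool" >/dev/null 2>&1; then
      echo "found: $tool"
    else
      echo "missing: $tool"
    fi
  done
}

check_tools java protoc ant mvn cmake
```

Any "missing" line means the corresponding installation step or /etc/profile entry needs another look before running Maven.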
7. Compile Hadoop 2.6.0
wget http://mirror.bit.edu.cn/apache/hadoop/core/hadoop-2.6.0/hadoop-2.6.0-src.tar.gz
tar -zxvf hadoop-2.6.0-src.tar.gz
cd hadoop-2.6.0-src
mvn package -DskipTests -Pdist,native -Dtar
Here -Pdist,native builds the distribution including the native libraries, and -Dtar packages the result as a tarball.
If this is the first time Maven is run, it will print a long stream of download logs such as:
Downloading: http://maven.oschina.net/...
Scanning for projects...
...
[INFO] Apache Hadoop Main ................................. SUCCESS [ 4.590 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [ 3.503 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [ 5.870 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [ 0.540 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [ 3.921 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [ 7.731 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [ 6.805 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [ 9.008 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [ 6.991 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [03:12 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [ 16.557 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [ 24.476 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [ 0.115 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [05:09 min]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [ 40.145 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [ 15.876 s]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [ 9.236 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [ 0.125 s]
[INFO] hadoop-yarn ........................................ SUCCESS [ 0.129 s]
[INFO] hadoop-yarn-api .................................... SUCCESS [02:49 min]
[INFO] hadoop-yarn-common ................................. SUCCESS [01:01 min]
[INFO] hadoop-yarn-server ................................. SUCCESS [ 0.099 s]
[INFO] hadoop-yarn-server-common .......................... SUCCESS [ 25.019 s]
[INFO] hadoop-yarn-server-nodemanager ..................... SUCCESS [ 33.655 s]
[INFO] hadoop-yarn-server-web-proxy ....................... SUCCESS [ 5.761 s]
[INFO] hadoop-yarn-server-applicationhistoryservice ....... SUCCESS [ 13.714 s]
[INFO] hadoop-yarn-server-resourcemanager ................. SUCCESS [ 41.930 s]
[INFO] hadoop-yarn-server-tests ........................... SUCCESS [ 13.364 s]
[INFO] hadoop-yarn-client ................................. SUCCESS [ 17.408 s]
[INFO] hadoop-yarn-applications ........................... SUCCESS [ 0.042 s]
[INFO] hadoop-yarn-applications-distributedshell .......... SUCCESS [ 5.131 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [ 3.710 s]
[INFO] hadoop-yarn-site ................................... SUCCESS [ 0.107 s]
[INFO] hadoop-yarn-registry ............................... SUCCESS [ 12.531 s]
[INFO] hadoop-yarn-project ................................ SUCCESS [ 7.781 s]
[INFO] hadoop-mapreduce-client ............................ SUCCESS [ 0.116 s]
[INFO] hadoop-mapreduce-client-core ....................... SUCCESS [ 47.915 s]
[INFO] hadoop-mapreduce-client-common ..................... SUCCESS [ 38.104 s]
[INFO] hadoop-mapreduce-client-shuffle .................... SUCCESS [ 9.073 s]
[INFO] hadoop-mapreduce-client-app ........................ SUCCESS [01:01 min]
[INFO] hadoop-mapreduce-client-hs ......................... SUCCESS [ 18.149 s]
[INFO] hadoop-mapreduce-client-jobclient .................. SUCCESS [ 9.002 s]
[INFO] hadoop-mapreduce-client-hs-plugins ................. SUCCESS [ 3.222 s]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [ 13.224 s]
[INFO] hadoop-mapreduce ................................... SUCCESS [ 6.571 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [ 9.781 s]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 16.254 s]
[INFO] Apache Hadoop Archives ............................. SUCCESS [ 5.302 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [ 13.760 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [ 8.858 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [ 6.252 s]
[INFO] Apache Hadoop Ant Tasks ............................ SUCCESS [ 4.276 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [ 6.206 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [ 1.945 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [ 12.239 s]
[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [ 38.137 s]
[INFO] Apache Hadoop Client ............................... SUCCESS [ 13.213 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [ 0.169 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [ 13.206 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 15.248 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [ 0.162 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [01:09 min]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 25:19 min
[INFO] Finished at: 2015-03-26T17:54:10+08:00
[INFO] Final Memory: 106M/402M
[INFO] ------------------------------------------------------------------------
After the long build finishes, the compiled distribution is packaged and placed under hadoop-dist/target:
# ll hadoop-dist/target
total 528824
drwxr-xr-x 2 root root 4096 Mar 26 17:53 antrun
-rw-r--r-- 1 root root 1874 Mar 26 17:53 dist-layout-stitching.sh
-rw-r--r-- 1 root root 647 Mar 26 17:53 dist-tar-stitching.sh
drwxr-xr-x 9 root root 4096 Mar 26 17:53 hadoop-2.6.0
-rw-r--r-- 1 root root 180222548 Mar 26 17:53 hadoop-2.6.0.tar.gz
-rw-r--r-- 1 root root 2777 Mar 26 17:53 hadoop-dist-2.6.0.jar
-rw-r--r-- 1 root root 361254421 Mar 26 17:54 hadoop-dist-2.6.0-javadoc.jar
drwxr-xr-x 2 root root 4096 Mar 26 17:53 javadoc-bundle-options
drwxr-xr-x 2 root root 4096 Mar 26 17:53 maven-archiver
drwxr-xr-x 2 root root 4096 Mar 26 17:53 test-dir
The compiled files are also available on Baidu Yun.
Finally, overwrite the files in your Hadoop installation's lib/native directory with the ones from lib/native in the build output, and the UnsatisfiedLinkError is gone.
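That final copy step can be sketched as a small helper. The two paths in the example invocation are assumptions (source tree extracted in the current directory, Hadoop installed under /usr/local); adjust them to your layout.

```shell
# Overwrite the installed 32-bit native libraries with the freshly built
# 64-bit ones.  src = build output directory, dst = installed Hadoop's
# native directory.
replace_native() {
  src=$1; dst=$2
  mkdir -p "$dst"
  cp -f "$src"/* "$dst"/
}

# Typical invocation after the build (paths are assumptions):
# replace_native hadoop-dist/target/hadoop-2.6.0/lib/native /usr/local/hadoop-2.6.0/lib/native
```

Afterwards, rerunning the job that threw the UnsatisfiedLinkError should load the native CRC32 code without complaint.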