Hadoop does not ship a prebuilt 64-bit binary release; to get a 64-bit version you have to compile it from source yourself.

Learning a technology starts with installing it; learning Hadoop starts with compiling it.

1. Operating system build environment



yum install cmake lzo-devel zlib-devel gcc gcc-c++ autoconf automake libtool ncurses-devel openssl-devel libXtst



2. Install the JDK



Download JDK 1.7. Note that only 1.7 works here; other versions cause build errors.

http://www.oracle.com/technetwork/java/javase/downloads/jdk7-downloads-1880260.html



tar zxvf jdk-7u75-linux-x64.tar.gz -C /app



export JAVA_HOME=/app/jdk1.7.0_75

export JRE_HOME=$JAVA_HOME/jre

export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar



PATH=$PATH:$JAVA_HOME/bin
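These exports normally go in /etc/profile. A minimal sanity check, assuming the /app/jdk1.7.0_75 install path used above, that the JDK bin directory really landed on PATH:

```shell
# Assumes the install path used above; adjust JAVA_HOME if yours differs.
export JAVA_HOME=/app/jdk1.7.0_75
PATH=$PATH:$JAVA_HOME/bin

case ":$PATH:" in
  *":$JAVA_HOME/bin:"*) echo "JDK on PATH" ;;
  *)                    echo "JDK missing from PATH" ;;
esac
```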



3. Install protobuf



Download protobuf 2.5.0. Do not use a newer version, or the Hadoop build will fail:

wget https://protobuf.googlecode.com/files/protobuf-2.5.0.tar.gz



tar xvf protobuf-2.5.0.tar.gz

cd protobuf-2.5.0

./configure 

make

make install

ldconfig



protoc --version
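A wrong protobuf version only surfaces partway through the long Hadoop build, so it pays to fail fast. A small guard script, assuming protoc is on PATH after ldconfig:

```shell
# Hadoop 2.6.0 requires exactly protobuf 2.5.0; anything newer breaks the build.
v=$(protoc --version 2>/dev/null || true)
if [ "$v" = "libprotoc 2.5.0" ]; then
  echo "protobuf OK"
else
  echo "WARNING: expected libprotoc 2.5.0, got: ${v:-nothing (protoc not found)}"
fi
```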



4. Install Ant



wget http://mirror.bit.edu.cn/apache/ant/binaries/apache-ant-1.9.4-bin.tar.gz

tar zxvf apache-ant-1.9.4-bin.tar.gz -C /app



vi /etc/profile

export ANT_HOME=/app/apache-ant-1.9.4

PATH=$PATH:$ANT_HOME/bin



5. Install Maven



wget http://mirror.bit.edu.cn/apache/maven/maven-3/3.3.1/binaries/apache-maven-3.3.1-bin.tar.gz



tar zxvf apache-maven-3.3.1-bin.tar.gz -C /app



vi /etc/profile

export MAVEN_HOME=/app/apache-maven-3.3.1

export PATH=$PATH:$MAVEN_HOME/bin



Edit the Maven configuration file:

vi /app/apache-maven-3.3.1/conf/settings.xml



Point Maven at a faster mirror repository. Inside <mirrors></mirrors>, add the following:



   <mirror>

     <id>nexus-osc</id>

     <mirrorOf>*</mirrorOf>

     <name>Nexusosc</name>

     <url>http://maven.oschina.net/content/groups/public/</url>

   </mirror>





Then add a new profile inside <profiles></profiles>:



<profile>

       <id>jdk-1.7</id>

       <activation>

         <jdk>1.7</jdk>

       </activation>

       <repositories>

         <repository>

           <id>nexus</id>

           <name>local private nexus</name>

           <url>http://maven.oschina.net/content/groups/public/</url>

           <releases>

             <enabled>true</enabled>

           </releases>

           <snapshots>

             <enabled>false</enabled>

           </snapshots>

         </repository>

       </repositories>

       <pluginRepositories>

         <pluginRepository>

           <id>nexus</id>

           <name>local private nexus</name>

           <url>http://maven.oschina.net/content/groups/public/</url>

           <releases>

             <enabled>true</enabled>

           </releases>

           <snapshots>

             <enabled>false</enabled>

           </snapshots>

         </pluginRepository>

       </pluginRepositories>

</profile>



6. Install FindBugs (optional)

wget -O findbugs-3.0.1.tar.gz http://prdownloads.sourceforge.net/findbugs/findbugs-3.0.1.tar.gz?download

tar zxvf findbugs-3.0.1.tar.gz -C /app



vi /etc/profile

export FINDBUGS_HOME=/app/findbugs-3.0.1

PATH=$PATH:$FINDBUGS_HOME/bin

export PATH



Note:

After all of the above, the final PATH setting in /etc/profile should read:

PATH=$PATH:$JAVA_HOME/bin:$ANT_HOME/bin:$MAVEN_HOME/bin:$FINDBUGS_HOME/bin

export PATH



Run this in a shell to make the environment variables take effect:

. /etc/profile
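Before kicking off the half-hour build, a quick hedge: loop over the expected tool names (the standard binaries provided by the packages installed above) and report anything missing:

```shell
# Sanity check before building: report any build tool not found on PATH.
missing=""
for tool in java ant mvn protoc cmake gcc g++; do
  command -v "$tool" >/dev/null 2>&1 || missing="$missing $tool"
done
if [ -z "$missing" ]; then
  echo "all build tools found"
else
  echo "missing tools:$missing"
fi
```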



7. Compile Hadoop 2.6.0



wget http://mirror.bit.edu.cn/apache/hadoop/core/hadoop-2.6.0/hadoop-2.6.0-src.tar.gz

cd hadoop-2.6.0-src

Build the full distribution, including the native libraries and a tar package, with tests skipped:

mvn package -DskipTests -Pdist,native -Dtar



[INFO] Reactor Summary:

[INFO]

[INFO] Apache Hadoop Main ................................. SUCCESS [  4.401 s]

[INFO] Apache Hadoop Project POM .......................... SUCCESS [  3.864 s]

[INFO] Apache Hadoop Annotations .......................... SUCCESS [  7.591 s]

[INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.535 s]

[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  3.585 s]

[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  6.623 s]

[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  4.722 s]

[INFO] Apache Hadoop Auth ................................. SUCCESS [  7.787 s]

[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  5.500 s]

[INFO] Apache Hadoop Common ............................... SUCCESS [02:47 min]

[INFO] Apache Hadoop NFS .................................. SUCCESS [ 12.793 s]

[INFO] Apache Hadoop KMS .................................. SUCCESS [ 20.443 s]

[INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.111 s]

[INFO] Apache Hadoop HDFS ................................. SUCCESS [04:35 min]

[INFO] Apache Hadoop HttpFS ............................... SUCCESS [ 29.896 s]

[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [ 11.100 s]

[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [  8.262 s]

[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.069 s]

[INFO] hadoop-yarn ........................................ SUCCESS [  0.066 s]

[INFO] hadoop-yarn-api .................................... SUCCESS [02:05 min]

[INFO] hadoop-yarn-common ................................. SUCCESS [ 46.132 s]

[INFO] hadoop-yarn-server ................................. SUCCESS [  0.123 s]

[INFO] hadoop-yarn-server-common .......................... SUCCESS [ 19.166 s]

[INFO] hadoop-yarn-server-nodemanager ..................... SUCCESS [ 25.552 s]

[INFO] hadoop-yarn-server-web-proxy ....................... SUCCESS [  5.456 s]

[INFO] hadoop-yarn-server-applicationhistoryservice ....... SUCCESS [ 11.781 s]

[INFO] hadoop-yarn-server-resourcemanager ................. SUCCESS [ 30.557 s]

[INFO] hadoop-yarn-server-tests ........................... SUCCESS [  9.765 s]

[INFO] hadoop-yarn-client ................................. SUCCESS [ 14.016 s]

[INFO] hadoop-yarn-applications ........................... SUCCESS [  0.101 s]

[INFO] hadoop-yarn-applications-distributedshell .......... SUCCESS [  4.116 s]

[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [  2.993 s]

[INFO] hadoop-yarn-site ................................... SUCCESS [  0.093 s]

[INFO] hadoop-yarn-registry ............................... SUCCESS [  9.036 s]

[INFO] hadoop-yarn-project ................................ SUCCESS [  6.557 s]

[INFO] hadoop-mapreduce-client ............................ SUCCESS [  0.267 s]

[INFO] hadoop-mapreduce-client-core ....................... SUCCESS [ 36.775 s]

[INFO] hadoop-mapreduce-client-common ..................... SUCCESS [ 28.049 s]

[INFO] hadoop-mapreduce-client-shuffle .................... SUCCESS [  7.285 s]

[INFO] hadoop-mapreduce-client-app ........................ SUCCESS [ 17.333 s]

[INFO] hadoop-mapreduce-client-hs ......................... SUCCESS [ 15.283 s]

[INFO] hadoop-mapreduce-client-jobclient .................. SUCCESS [  7.110 s]

[INFO] hadoop-mapreduce-client-hs-plugins ................. SUCCESS [  3.843 s]

[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [ 12.559 s]

[INFO] hadoop-mapreduce ................................... SUCCESS [  6.331 s]

[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [ 45.863 s]

[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 46.304 s]

[INFO] Apache Hadoop Archives ............................. SUCCESS [  3.575 s]

[INFO] Apache Hadoop Rumen ................................ SUCCESS [ 12.991 s]

[INFO] Apache Hadoop Gridmix .............................. SUCCESS [ 10.105 s]

[INFO] Apache Hadoop Data Join ............................ SUCCESS [  5.021 s]

[INFO] Apache Hadoop Ant Tasks ............................ SUCCESS [  3.804 s]

[INFO] Apache Hadoop Extras ............................... SUCCESS [  5.298 s]

[INFO] Apache Hadoop Pipes ................................ SUCCESS [ 10.290 s]

[INFO] Apache Hadoop OpenStack support .................... SUCCESS [  9.220 s]

[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [11:12 min]

[INFO] Apache Hadoop Client ............................... SUCCESS [ 10.714 s]

[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [  0.143 s]

[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [  7.664 s]

[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 29.970 s]

[INFO] Apache Hadoop Tools ................................ SUCCESS [  0.057 s]

[INFO] Apache Hadoop Distribution ......................... SUCCESS [ 49.425 s]

[INFO] ------------------------------------------------------------------------

[INFO] BUILD SUCCESS

[INFO] ------------------------------------------------------------------------

[INFO] Total time: 32:26 min

[INFO] Finished at: 2015-03-19T19:56:40+08:00

[INFO] Final Memory: 99M/298M

[INFO] ------------------------------------------------------------------------



After a successful build, the packaged distribution is placed in hadoop-dist/target:

# ls

antrun                    dist-tar-stitching.sh  hadoop-2.6.0.tar.gz    hadoop-dist-2.6.0-javadoc.jar  maven-archiver

dist-layout-stitching.sh  hadoop-2.6.0           hadoop-dist-2.6.0.jar  javadoc-bundle-options         test-dir
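The whole point of this exercise is a 64-bit native library, so it is worth checking the ELF class of the build output. A minimal sketch: byte 4 (EI_CLASS) of an ELF header is 2 for 64-bit and 1 for 32-bit. It is demonstrated here on /bin/sh, which exists on any Linux system; the commented libhadoop.so path assumes the default build layout above.

```shell
# Print whether a binary is a 32-bit or 64-bit ELF by reading EI_CLASS
# (byte at offset 4 of the ELF header): 2 = 64-bit, 1 = 32-bit.
elf_class() {
  c=$(od -An -tu1 -j4 -N1 "$1" | tr -d ' ')
  case "$c" in
    2) echo "64-bit" ;;
    1) echo "32-bit" ;;
    *) echo "not an ELF file" ;;
  esac
}

elf_class /bin/sh   # demo on the system shell
# After the build, check the freshly compiled native library the same way:
# elf_class hadoop-dist/target/hadoop-2.6.0/lib/native/libhadoop.so.1.0.0
```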
