0. install xubuntu

we recommend setting the username to "hadoop"

after installation, set up the "hadoop" user as an administrator:

  1. sudo addgroup hadoop
  2. sudo adduser --ingroup hadoop hadoop
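
a quick optional check that the account and group exist:

    id hadoop        # the output should mention the hadoop group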

open the /etc/sudoers file:

sudo gedit /etc/sudoers

below the line "root  ALL=(ALL:ALL)  ALL", add "hadoop  ALL=(ALL:ALL)  ALL"
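
after the edit, that part of /etc/sudoers should look like this:

    root    ALL=(ALL:ALL) ALL
    hadoop  ALL=(ALL:ALL) ALL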

1. install java

  1. unzip the Java archive into /usr/java (a newly created directory); it is ready to use right after extraction (a sketch of this step follows at the end of this section)
  2. 
  3. configure the environment variables as follows
  4. in the /etc/profile file, append the following at the end
  5. #set java environment
  6. export JAVA_HOME=/usr/java/jdk1.7.0_67
  7. export JRE_HOME=/usr/java/jdk1.7.0_67/jre
  8. export PATH=$PATH:/usr/java/jdk1.7.0_67/bin
  9. export CLASSPATH=./:/usr/java/jdk1.7.0_67/lib:/usr/java/jdk1.7.0_67/jre/lib
  10. 
  11. make the configuration take effect immediately:
  12. source /etc/profile
  13. 
  14. check whether the configuration succeeded:
  15. java -version
  16. 
  17. if it does not work, restart linux
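
a minimal sketch of the unpack step in item 1, assuming the JDK tarball is named jdk-7u67-linux-x64.tar.gz and sits in the current directory (adjust the name to the archive you actually downloaded):

    sudo mkdir -p /usr/java
    sudo tar -zxf jdk-7u67-linux-x64.tar.gz -C /usr/java
    ls /usr/java                      # should now contain jdk1.7.0_67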

2. configure passwordless ssh login

perform the following steps as user "hadoop"

  1. su - hadoop
  2. sudo apt-get install openssh-server
    sudo /etc/init.d/ssh start  
  3.  
  4. mkdir -p ~/.ssh
    cd ~/.ssh
    ssh-keygen -t rsa -P ""
    cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
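
to check that passwordless login works (the very first connection may still ask you to confirm the host key):

    ssh localhost                     # should log in without asking for a password
    exit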

3. install hadoop

  1. unzip hadoop.tar.gz into /usr/hadoop (a sketch of this step follows at the end of this section)
      then, ensure user "hadoop" owns /usr/hadoop
      
    sudo chown -R hadoop:hadoop hadoop  
  1. edit the environment
  2. 2.1 gedit /etc/profile append these:
  3.  
  4. export JAVA_HOME=/usr/java/
  5. export JRE_HOME=/usr/java/jre
  6. export HADOOP_INSTALL=/usr/hadoop
  7. export PATH=$PATH:$JAVA_HOME/bin:$HADOOP_INSTALL/bin:$HADOOP_INSTALL/sbin
  8. export CLASSPATH=./:/usr/java/lib:/usr/java/jre/lib
  9.  
  10. 2.2 gedit /usr/hadoop/conf/hadoop-env.sh append these:
  11.  
  12. # The java implementation to use. Required.
  13. export JAVA_HOME=/usr/java
  14. export HADOOP_INSTALL=/usr/hadoop
  15. export PATH=$PATH:$JAVA_HOME/bin:$HADOOP_INSTALL/bin:$HADOOP_INSTALL/sbin
  16.  
  17. restart linux
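
a minimal sketch of step 1, assuming the archive is named hadoop-1.2.1.tar.gz (the version shown in the test section below); adjust the name to your download:

    sudo tar -zxf hadoop-1.2.1.tar.gz -C /usr
    sudo mv /usr/hadoop-1.2.1 /usr/hadoop        # rename so it matches the paths above
    cd /usr && sudo chown -R hadoop:hadoop hadoop
    source /etc/profile                          # pick up the new PATH without a restart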

4. test

  1. hadoop@ms:~$
  2. hadoop@ms:~$ java -version
  3. java version "1.7.0_79"
  4. Java(TM) SE Runtime Environment (build 1.7.0_79-b15)
  5. Java HotSpot(TM) 64-Bit Server VM (build 24.79-b02, mixed mode)
  6. hadoop@ms:~$ hadoop version
  7. Hadoop 1.2.1
  8. Subversion https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.2 -r 1503152
  9. Compiled by mattf on Mon Jul 22 15:23:09 PDT 2013
  10. From source with checksum 6923c86528809c4e7e6f493b6b413a9a
  11. This command was run using /usr/hadoop/hadoop-core-1.2.1.jar
  12. hadoop@ms:~$
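
if hadoop or java is reported as "command not found", the PATH changes from /etc/profile have probably not been loaded in the current shell yet:

    source /etc/profile
    hadoop version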

5. hadoop pseudo-distributed mode

  1. edit three files:
  2. 1). core-site.xml:
  3.  
  4. <configuration>
  5. <property>
  6. <name>fs.default.name</name>
  7. <value>hdfs://localhost:9000</value>
  8. </property>
  9. <property>
  10. <name>hadoop.tmp.dir</name>
  11. <value>/usr/hadoop/tmp</value>
  12. </property>
  13. </configuration>
  14.  
  15. 2). hdfs-site.xml:
  16.  
  17. <configuration>
  18. <property>
  19. <name>dfs.replication</name>
  20. <value>1</value>
  21. </property>
  22. <property>
  23. <name>dfs.name.dir</name>
  24. <value>/usr/hadoop/datalog1,/usr/hadoop/datalog2</value>
  25. </property>
  26. <property>
  27. <name>dfs.data.dir</name>
  28. <value>/usr/hadoop/data1,/usr/hadoop/data2</value>
  29. </property>
  30. </configuration>
  31.  
  32. 3). mapred-site.xml:
  33.  
  34. <configuration>
  35. <property>
  36. <name>mapred.job.tracker</name>
  37. <value>localhost:9001</value>
  38. </property>
  39. </configuration>
  40.  
  41. to bring up the related Hadoop services (namenode, secondarynamenode, tasktracker), first format the namenode:
  42. source /usr/hadoop/conf/hadoop-env.sh
  43. hadoop namenode -format
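
after formatting, the daemons can be started and checked; a short sketch, assuming the Hadoop bin directory is on PATH as configured in section 3:

    start-all.sh        # starts NameNode, DataNode, SecondaryNameNode, JobTracker, TaskTracker
    jps                 # should list the five daemons above
    stop-all.sh         # stops them again when you are done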

6*. install hbase [pseudo-distributed]

  1. unzip hbase.tar.gz into /usr/hbase
  2.   then, ensure user "hadoop" owns /usr/hbase
  3.   
  4. sudo chown -R hadoop:hadoop hbase
  5.  
  6. edit the environment
  7. 2.1 gedit /etc/profile append these:
  8.  
  9. export HBASE_HOME="/usr/hbase"
  10. export PATH=$HBASE_HOME/bin:$PATH
  11.  
  12. 2.2 gedit /usr/hbase/conf/hbase-site.xml and append these inside the <configuration> element:
  13.  
  14. <property>
  15. <name>hbase.rootdir</name>
  16. <!-- corresponds to the hadoop hdfs configuration (fs.default.name) -->
  17. <value>hdfs://localhost:9000/hbase</value>
  18. </property>
  19. <property>
  20. <name>hbase.cluster.distributed</name>
  21. <value>true</value>
  22. </property>
  23. <property>
  24. <name>hbase.master.info.port</name>
  25. <value></value>
  26. </property>
  27.  
  28. 2.3 gedit /usr/hbase/conf/hbase-env.sh modify these:
  29.  
  30. # The java implementation to use. Java 1.6 required.
  31. export JAVA_HOME=/usr/java/
  32.  
  33. # Extra Java CLASSPATH elements. Optional.
  34. export HBASE_CLASSPATH=/usr/hadoop/conf
  35.  
  36. # Tell HBase whether it should manage its own instance of Zookeeper or not.
  37. export HBASE_MANAGES_ZK=true
  38.  
  39. restart linux
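
to try HBase after the restart, a sketch assuming HDFS is already running (start-all.sh, as in section 5) and $HBASE_HOME/bin is on PATH:

    start-hbase.sh      # starts HQuorumPeer, HMaster and HRegionServer
    jps                 # the three HBase processes should appear alongside the Hadoop ones
    hbase shell         # type "status" or "list" to check, then "exit"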

7. references

  1. http://blog.csdn.net/zhaoyl03/article/details/8657104#
  2. http://www.tuicool.com/articles/VZn6zi
  3. http://blog.csdn.net/pdw2009/article/details/21261417
  4. http://www.th7.cn/db/nosql/201510/134214.shtml
