Linux_hadoop_install
1、Build the Linux environment
My environment is a VM running Red Hat Enterprise Linux 6.5 (64-bit).
Set a static IP:
vim /etc/sysconfig/network-scripts/ifcfg-eth0
Set the IP to 192.168.38.128.
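A minimal static-IP sketch for ifcfg-eth0 (the NETMASK and GATEWAY values are assumptions for a typical /24 lab subnet; adjust them to your network):
DEVICE=eth0
BOOTPROTO=static
ONBOOT=yes
IPADDR=192.168.38.128
NETMASK=255.255.255.0
GATEWAY=192.168.38.1
Restart networking afterwards: service network restart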
Map the hostname: vim /etc/hosts
Use the hostname itbuilder1.
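The /etc/hosts entry maps the static IP to the hostname:
192.168.38.128 itbuilder1
(On RHEL 6 the hostname itself is set via HOSTNAME=itbuilder1 in /etc/sysconfig/network; a reboot or running hostname itbuilder1 applies it.)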
2、Install the JDK
Configure the JDK environment variables.
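A minimal sketch of the JDK variables, assuming the JDK is installed under /usr/java/jdk1.8.0_20 (the path used later in this guide), appended to /etc/profile:
export JAVA_HOME=/usr/java/jdk1.8.0_20
export PATH=$PATH:$JAVA_HOME/bin
Verify with: java -version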
3、Install Hadoop
Download the Apache Hadoop package from:
http://archive.apache.org/dist/hadoop/core/stable2/hadoop-2.7.1.tar.gz
3.1 Extract the package to the specified directory
Create a directory: mkdir /usr/local/hadoop
Extract the archive into /usr/local/hadoop: tar -zxvf hadoop-2.7.1.tar.gz -C /usr/local/hadoop
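A quick check that the layout is in place (directory names per the 2.7.1 tarball):
ls /usr/local/hadoop/hadoop-2.7.1
# expect bin, etc, lib, sbin, share, ... among the entries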
3.2 Modify the configuration files
Hadoop 2.7.1 requires changes to 5 configuration files:
1、hadoop-env.sh
2、core-site.xml
3、hdfs-site.xml
4、mapred-site.xml (created from mapred-site.xml.template)
5、yarn-site.xml
All of these files live under the etc directory of the Hadoop installation: /usr/local/hadoop/hadoop-2.7.1/etc/hadoop/
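The edits below assume you first change into that directory:
cd /usr/local/hadoop/hadoop-2.7.1/etc/hadoop/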
3.2.1 Modify the environment variables (hadoop-env.sh)
vim hadoop-env.sh
Set the JDK root directory, as shown below:
export JAVA_HOME=/usr/java/jdk1.8.0_20
3.2.2 core-site.xml — set the HDFS NameNode address and the temp file directory.
<configuration>
    <!-- set the HDFS (NameNode) address -->
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://itbuilder1:9000</value>
    </property>
    <!-- set the directory for Hadoop runtime files -->
    <property>
        <name>hadoop.tmp.dir</name>
        <value>/usr/local/hadoop/hadoop-2.7.1/tmp</value>
    </property>
</configuration>
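The temp directory can be created in advance (Hadoop will also create it when the NameNode is formatted in step 5):
mkdir -p /usr/local/hadoop/hadoop-2.7.1/tmp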
3.2.3 hdfs-site.xml (set the replication factor)
<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
</configuration>
3.2.4 mapred-site.xml (tell Hadoop that MapReduce should run on YARN)
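Hadoop 2.7.1 ships only mapred-site.xml.template, so create the file first (run inside the config directory):
cp mapred-site.xml.template mapred-site.xml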
<configuration>
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
</configuration>
3.2.5 yarn-site.xml
<configuration>
    <!-- tell the NodeManager to fetch data via the MapReduce shuffle service -->
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
    <!-- set the YARN (ResourceManager) address -->
    <property>
        <name>yarn.resourcemanager.hostname</name>
        <value>itbuilder1</value>
    </property>
</configuration>
4、Add Hadoop to the environment variables
vim /etc/profile
export JAVA_HOME=/usr/java/jdk1.8.0_20
export HADOOP_HOME=/usr/local/hadoop/hadoop-2.7.1
export PATH=$PATH:$JAVA_HOME/bin:$HADOOP_HOME/bin
#refresh /etc/profile
source /etc/profile
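A quick sanity check that the variables took effect:
echo $HADOOP_HOME
hadoop version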
5、Initialize (format) the file system (HDFS)
# the older command hadoop namenode -format is deprecated in 2.x; use:
hdfs namenode -format
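On success the output should contain a line similar to the following (the path follows hadoop.tmp.dir as set above):
Storage directory /usr/local/hadoop/hadoop-2.7.1/tmp/dfs/name has been successfully formatted.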
6、Start Hadoop (HDFS and YARN)
The start scripts live in $HADOOP_HOME/sbin:
./start-all.sh (prompts for the Linux password unless passwordless SSH is configured)
or start the components separately:
./start-dfs.sh
./start-yarn.sh
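The password prompts come from SSH. Optionally set up passwordless SSH to the node first (a common companion step, assuming the root user as in this walkthrough):
ssh-keygen -t rsa        # accept the defaults
ssh-copy-id root@itbuilder1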
Check which processes are running with the jps command:
[root@linuxidc ~]# jps
3461 ResourceManager
3142 DataNode
3751 NodeManager
3016 NameNode
5034 Jps
3307 SecondaryNameNode
Access the web management interfaces:
http://192.168.38.128:50070 (HDFS NameNode web UI)
http://192.168.38.128:8088 (YARN ResourceManager web UI)
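A quick smoke test that HDFS works end to end (the paths here are arbitrary examples):
hdfs dfs -mkdir /test
hdfs dfs -put /etc/profile /test
hdfs dfs -ls /test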