1. Install MySQL. You can refer to the link below:

http://www.cnblogs.com/liuchangchun/p/4099003.html
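
If you only need the packages, MySQL can also be installed straight from the Ubuntu repositories; a minimal sketch (package names can differ slightly between Ubuntu releases):

sudo apt-get update
sudo apt-get install mysql-server mysql-client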

2. Before installing Hive, first create a database named hive in MySQL, and create a user table inside it:

create database hive;
use hive;
create table user(Host char(),User char(),Password char());

3. Log in to MySQL and grant privileges (change the password to your own):

mysql -u root -p
insert into user(Host,User,Password) values("localhost","hive",password(""));
FLUSH PRIVILEGES;
GRANT ALL PRIVILEGES ON *.* TO 'hive'@'localhost' IDENTIFIED BY 'hive';
FLUSH PRIVILEGES;
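
To confirm the account works, exit and try logging in again as the new user (a standard MySQL client command, nothing specific to this setup):

mysql -u hive -p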

4. After extracting the Hive package, configure the environment variables:

sudo gedit /etc/profile
#hive
export HIVE_HOME=/home/sendi/apache-hive-1.1.-bin
export PATH=$PATH:$HIVE_HOME/bin
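
For the new variables to take effect in the current shell, reload the profile and check the result:

source /etc/profile
echo $HIVE_HOME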

5. Create working copies of the template files under hive/conf:

cp hive-env.sh.template hive-env.sh
cp hive-default.xml.template hive-site.xml

6. Edit hive-env.sh and set HADOOP_HOME:

HADOOP_HOME=/home/sendi/hadoop-2.6.

7. Edit hive-site.xml to specify the MySQL JDBC driver, database name, username, and password:

<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true</value>
<description>JDBC connect string for a JDBC metastore</description>
</property>
<property>
<name>javax.jdo.option.ConnectionDriverName</name>
<value>com.mysql.jdbc.Driver</value>
<description>Driver class name for a JDBC metastore</description>
</property>
<property>
<name>javax.jdo.option.ConnectionUserName</name>
<value>hive</value>
<description>username to use against metastore database</description>
</property>
<property>
<name>javax.jdo.option.ConnectionPassword</name>
<value>hive</value>
<description>password to use against metastore database</description>
</property>
<property>
<name>hive.metastore.local</name>
<value>true</value>
<description></description>
</property>

8. Edit hive-config.sh under hive/bin and set JAVA_HOME, HADOOP_HOME, and HIVE_HOME:

export JAVA_HOME=/usr/lib/jdk/jdk1..0_67
export HADOOP_HOME=/home/sendi/hadoop-2.6.
export HIVE_HOME=/home/sendi/apache-hive-1.1.-bin

9. Download mysql-connector-java-5.1.27-bin.jar and put it in the $HIVE_HOME/lib directory.
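
For example (assuming the jar was downloaded to the current directory):

cp mysql-connector-java-5.1.27-bin.jar $HIVE_HOME/lib/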

10. Create /tmp and /user/hive/warehouse in HDFS and set their permissions:

hadoop fs -mkdir -p /tmp
hadoop fs -mkdir -p /user/hive/warehouse
hadoop fs -chmod g+w /tmp
hadoop fs -chmod g+w /user/hive/warehouse

11. Start Hive.
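
With the Hadoop daemons already running, this is just the hive command from the shell:

hive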

12. On startup, you may run into the following problem:

Logging initialized using configuration in jar:file:/hive/apache-hive-1.1.-bin/lib/hive-common-1.1..jar!/hive-log4j.properties
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/hadoop-2.5./share/hadoop/common/lib/slf4j-log4j12-1.7..jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/hive/apache-hive-1.1.-bin/lib/hive-jdbc-1.1.-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
[ERROR] Terminal initialization failed; falling back to unsupported
java.lang.IncompatibleClassChangeError: Found class jline.Terminal, but interface was expected
at jline.TerminalFactory.create(TerminalFactory.java:)
at jline.TerminalFactory.get(TerminalFactory.java:)
at jline.console.ConsoleReader.<init>(ConsoleReader.java:)
at jline.console.ConsoleReader.<init>(ConsoleReader.java:)
at jline.console.ConsoleReader.<init>(ConsoleReader.java:)
at org.apache.hadoop.hive.cli.CliDriver.getConsoleReader(CliDriver.java:)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:)
at java.lang.reflect.Method.invoke(Method.java:)
at org.apache.hadoop.util.RunJar.main(RunJar.java:)

13. The cause is an old version of jline under the Hadoop directory. To fix it:

1. Go into Hive's lib directory and copy the newer jline jar into the following Hadoop directory (commands are sketched after step 2):

/home/sendi/hadoop-2.6./share/hadoop/yarn/lib

2. Delete Hadoop's old jline jar from that directory.
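
A sketch of the two steps, assuming the Hive-bundled jar is jline-2.12.jar and Hadoop's old copy is jline-0.9.94.jar (check the actual file names in both directories first):

cp $HIVE_HOME/lib/jline-2.12.jar $HADOOP_HOME/share/hadoop/yarn/lib/
rm $HADOOP_HOME/share/hadoop/yarn/lib/jline-0.9.94.jar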

14. If you also run into the following problem:

jiahong@jiahongPC:~/apache/apache-hive-1.1.-bin$ hive
// :: WARN conf.HiveConf: HiveConf of name hive.metastore.local does not exist Logging initialized using configuration in jar:file:/home/jiahong/apache/apache-hive-1.1.-bin/lib/hive-common-1.1..jar!/hive-log4j.properties
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/jiahong/apache/hadoop-2.6./share/hadoop/common/lib/slf4j-log4j12-1.7..jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/jiahong/apache/apache-hive-1.1.-bin/lib/hive-jdbc-1.1.-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Exception in thread "main" java.lang.RuntimeException: java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir%7D/$%7Bsystem:user.name%7D
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:472)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:)
at java.lang.reflect.Method.invoke(Method.java:)
at org.apache.hadoop.util.RunJar.run(RunJar.java:)
at org.apache.hadoop.util.RunJar.main(RunJar.java:)
Caused by: java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir%7D/$%7Bsystem:user.name%7D
at org.apache.hadoop.fs.Path.initialize(Path.java:)
at org.apache.hadoop.fs.Path.<init>(Path.java:)
at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:)
... more
Caused by: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir%7D/$%7Bsystem:user.name%7D
at java.net.URI.checkPath(URI.java:)
at java.net.URI.<init>(URI.java:)
at org.apache.hadoop.fs.Path.initialize(Path.java:)
... more
jiahong@jiahongPC:~/apache/apache-hive-1.1.-bin$ hadoop dfs -ls /
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.

15. Edit hive-site.xml again. Before making this change, create the corresponding directories (a sketch of the commands follows the snippet below). The properties are:

<property>
<name>hive.exec.scratchdir</name>
<value>/tmp/hive</value>
<description>HDFS root scratch dir for Hive jobs which gets created with write all () permission. For each connecting user, an HDFS scratch dir: ${hive.exec.scratchdir}/&lt;username&gt; is created, with ${hive.scratch.dir.permission}.</description>
</property>
<property>
<name>hive.exec.local.scratchdir</name>
<value>/tmp/hive/local</value>
<description>Local scratch space for Hive jobs</description>
</property>
<property>
<name>hive.downloaded.resources.dir</name>
<value>/tmp/hive/resources</value>
<description>Temporary local directory for added resources in the remote file system.</description>
</property>
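
A sketch of the directory setup this step refers to, assuming /tmp/hive is the HDFS scratch dir from the snippet above while the local scratch and resources directories sit on the local filesystem:

hadoop fs -mkdir -p /tmp/hive
hadoop fs -chmod g+w /tmp/hive
mkdir -p /tmp/hive/local /tmp/hive/resources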

16. Start Hadoop first, then start Hive:

sendi@sendijia:~/hadoop-2.6.$ hive

Logging initialized using configuration in jar:file:/home/sendi/apache-hive-1.1.-bin/lib/hive-common-1.1..jar!/hive-log4j.properties
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/sendi/hadoop-2.6./share/hadoop/common/lib/slf4j-log4j12-1.7..jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/sendi/apache-hive-1.1.-bin/lib/hive-jdbc-1.1.-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
hive>
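
Once the hive> prompt appears, any simple statement confirms that the metastore connection is working, e.g.:

show databases;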
