1. Install MySQL; you can refer to the link below:

http://www.cnblogs.com/liuchangchun/p/4099003.html

2. Before installing Hive, first create a database named hive in MySQL and create a user table inside that database:

  create database hive;
  use hive;
  create table user(Host char(),User char(),Password char());
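Optionally, a quick check that the database and table were created (a minimal sketch, assuming MySQL is running locally and the root password is already set):

  mysql -u root -p -e "USE hive; SHOW TABLES;"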

3. Log in to MySQL and grant privileges (change the password to your own):

  mysql -u root -p
  insert into user(Host,User,Password) values("localhost","hive",password(""));
  FLUSH PRIVILEGES;

  GRANT ALL PRIVILEGES ON *.* TO 'hive'@'localhost' IDENTIFIED BY 'hive';
  FLUSH PRIVILEGES;
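To confirm the grant took effect, you can try logging in as the new account (a minimal sketch; 'hive' is the password set in the GRANT statement above):

  mysql -u hive -phive -e "SHOW DATABASES;"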

4. After extracting the Hive package, configure the environment variables:

  sudo gedit /etc/profile

  #hive
  export HIVE_HOME=/home/sendi/apache-hive-1.1.-bin
  export PATH=$PATH:$HIVE_HOME/bin
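After editing /etc/profile, reload it in the current shell and check that the variables took effect, for example:

  source /etc/profile
  echo $HIVE_HOME
  which hive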

5. Copy the following template files under hive/conf:

  cp hive-env.sh.template hive-env.sh
  cp hive-default.xml.template hive-site.xml

6. Edit the hive-env.sh file and set HADOOP_HOME:

  HADOOP_HOME=/home/sendi/hadoop-2.6.

7. Edit the hive-site.xml file and set the MySQL JDBC driver, database name, username, and password:

  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true</value>
    <description>JDBC connect string for a JDBC metastore</description>
  </property>

  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
    <description>Driver class name for a JDBC metastore</description>
  </property>

  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hive</value>
    <description>username to use against metastore database</description>
  </property>

  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hive</value>
    <description>password to use against metastore database</description>
  </property>

  <property>
    <name>hive.metastore.local</name>
    <value>true</value>
    <description></description>
  </property>
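If your Hive release ships the schematool utility (present in recent 1.x releases), you can also initialize the metastore schema in MySQL explicitly rather than relying on automatic creation; a hedged sketch:

  $HIVE_HOME/bin/schematool -dbType mysql -initSchema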

8. Edit the hive-config.sh file under hive/bin and set JAVA_HOME, HADOOP_HOME, and HIVE_HOME:

  export JAVA_HOME=/usr/lib/jdk/jdk1..0_67
  export HADOOP_HOME=/home/sendi/hadoop-2.6.
  export HIVE_HOME=/home/sendi/apache-hive-1.1.-bin

9. Download mysql-connector-java-5.1.27-bin.jar and put it in the $HIVE_HOME/lib directory.
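Assuming the connector archive has already been downloaded from the MySQL site and unpacked, copying the jar is a single step (the exact file name depends on the version you downloaded):

  cp mysql-connector-java-5.1.27-bin.jar $HIVE_HOME/lib/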

10. Create /tmp and /user/hive/warehouse in HDFS and set their permissions:

  hadoop fs -mkdir /tmp
  hadoop fs -mkdir /user/hive/warehouse
  hadoop fs -chmod g+w /tmp
  hadoop fs -chmod g+w /user/hive/warehouse
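If the parent directories do not exist yet, hadoop fs -mkdir will fail; on Hadoop 2.x the -p flag creates them in one step, and a listing confirms the result:

  hadoop fs -mkdir -p /user/hive/warehouse
  hadoop fs -ls /user/hive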

11. Start Hive.

12. At startup, you may run into the following problem:

  Logging initialized using configuration in jar:file:/hive/apache-hive-1.1.-bin/lib/hive-common-1.1..jar!/hive-log4j.properties
  SLF4J: Class path contains multiple SLF4J bindings.
  SLF4J: Found binding in [jar:file:/hadoop-2.5./share/hadoop/common/lib/slf4j-log4j12-1.7..jar!/org/slf4j/impl/StaticLoggerBinder.class]
  SLF4J: Found binding in [jar:file:/hive/apache-hive-1.1.-bin/lib/hive-jdbc-1.1.-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
  SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
  SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
  [ERROR] Terminal initialization failed; falling back to unsupported
  java.lang.IncompatibleClassChangeError: Found class jline.Terminal, but interface was expected
      at jline.TerminalFactory.create(TerminalFactory.java:)
      at jline.TerminalFactory.get(TerminalFactory.java:)
      at jline.console.ConsoleReader.<init>(ConsoleReader.java:)
      at jline.console.ConsoleReader.<init>(ConsoleReader.java:)
      at jline.console.ConsoleReader.<init>(ConsoleReader.java:)
      at org.apache.hadoop.hive.cli.CliDriver.getConsoleReader(CliDriver.java:)
      at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:)
      at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:)
      at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:)
      at java.lang.reflect.Method.invoke(Method.java:)
      at org.apache.hadoop.util.RunJar.main(RunJar.java:)

13. The cause is an old version of jline under the Hadoop directory. The fix:

1. Go into Hive's lib directory and copy the newer jline jar into the following Hadoop directory (a command sketch follows this list):

  /home/sendi/hadoop-2.6./share/hadoop/yarn/lib

2. Delete the old jline jar that Hadoop ships.
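A hedged sketch of the two steps (the exact jar file names depend on the Hive and Hadoop versions installed, so adjust them as needed):

  # Copy the newer jline shipped with Hive into Hadoop's yarn lib directory
  cp $HIVE_HOME/lib/jline-2.*.jar $HADOOP_HOME/share/hadoop/yarn/lib/
  # Remove (or move aside) the old jline jar that Hadoop ships
  mv $HADOOP_HOME/share/hadoop/yarn/lib/jline-0.9*.jar /tmp/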

14. If you then run into the following problem:

  jiahong@jiahongPC:~/apache/apache-hive-1.1.-bin$ hive
  // :: WARN conf.HiveConf: HiveConf of name hive.metastore.local does not exist

  Logging initialized using configuration in jar:file:/home/jiahong/apache/apache-hive-1.1.-bin/lib/hive-common-1.1..jar!/hive-log4j.properties
  SLF4J: Class path contains multiple SLF4J bindings.
  SLF4J: Found binding in [jar:file:/home/jiahong/apache/hadoop-2.6./share/hadoop/common/lib/slf4j-log4j12-1.7..jar!/org/slf4j/impl/StaticLoggerBinder.class]
  SLF4J: Found binding in [jar:file:/home/jiahong/apache/apache-hive-1.1.-bin/lib/hive-jdbc-1.1.-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
  SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
  SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
  Exception in thread "main" java.lang.RuntimeException: java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir%7D/$%7Bsystem:user.name%7D
      at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:472)
      at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:)
      at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:)
      at java.lang.reflect.Method.invoke(Method.java:)
      at org.apache.hadoop.util.RunJar.run(RunJar.java:)
      at org.apache.hadoop.util.RunJar.main(RunJar.java:)
  Caused by: java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir%7D/$%7Bsystem:user.name%7D
      at org.apache.hadoop.fs.Path.initialize(Path.java:)
      at org.apache.hadoop.fs.Path.<init>(Path.java:)
      at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:)
      at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:)
      ... more
  Caused by: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir%7D/$%7Bsystem:user.name%7D
      at java.net.URI.checkPath(URI.java:)
      at java.net.URI.<init>(URI.java:)
      at org.apache.hadoop.fs.Path.initialize(Path.java:)
      ... more
  jiahong@jiahongPC:~/apache/apache-hive-1.1.-bin$ hadoop dfs - ls /
  DEPRECATED: Use of this script to execute hdfs command is deprecated.
  Instead use the hdfs command for it.

15. Edit the hive-site.xml file as follows (create the corresponding directories before making the change; the scratch dir lives on HDFS, while the local scratch and resources directories are on the local file system, as shown in the sketch after the snippet):

  <property>
    <name>hive.exec.scratchdir</name>
    <value>/tmp/hive</value>
    <description>HDFS root scratch dir for Hive jobs which gets created with write all () permission. For each connecting user, an HDFS scratch dir: ${hive.exec.scratchdir}/&lt;username&gt; is created, with ${hive.scratch.dir.permission}.</description>
  </property>
  <property>
    <name>hive.exec.local.scratchdir</name>
    <value>/tmp/hive/local</value>
    <description>Local scratch space for Hive jobs</description>
  </property>
  <property>
    <name>hive.downloaded.resources.dir</name>
    <value>/tmp/hive/resources</value>
    <description>Temporary local directory for added resources in the remote file system.</description>
  </property>
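A hedged sketch of creating the directories referenced above, assuming the default paths shown in the snippet are kept:

  # Scratch dir on HDFS
  hadoop fs -mkdir -p /tmp/hive
  hadoop fs -chmod g+w /tmp/hive
  # Local scratch and resources dirs on the local file system
  mkdir -p /tmp/hive/local /tmp/hive/resources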

16. Start Hadoop first, then start Hive:

  sendi@sendijia:~/hadoop-2.6.$ hive

  Logging initialized using configuration in jar:file:/home/sendi/apache-hive-1.1.-bin/lib/hive-common-1.1..jar!/hive-log4j.properties
  SLF4J: Class path contains multiple SLF4J bindings.
  SLF4J: Found binding in [jar:file:/home/sendi/hadoop-2.6./share/hadoop/common/lib/slf4j-log4j12-1.7..jar!/org/slf4j/impl/StaticLoggerBinder.class]
  SLF4J: Found binding in [jar:file:/home/sendi/apache-hive-1.1.-bin/lib/hive-jdbc-1.1.-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
  SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
  SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
  hive>
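Once the hive> prompt appears, a quick smoke test from the shell confirms that the metastore connection works (a minimal sketch):

  hive -e "show databases;"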
