Hive: Deploying on a Separate Machine
Environment
- CentOS 7, hadoop-2.6.5, hive-1.2.2, MariaDB-5.5.60, JDK 1.8
- It is assumed that the Hive machine already has MariaDB installed (running, with a hive account created that has full privileges on the hive database) and the JDK.
Copy hadoop-2.6.5 to the Hive machine
Only the bin, etc, libexec and share directories need to be kept; the share/doc directory can be deleted. A minimal copy sketch follows the listing below.
[root@wadeyu hadoop-2.6.5]# pwd
/usr/local/src/hadoop-2.6.5
[root@wadeyu hadoop-2.6.5]# ll
total 16
drwxrwxr-x. 2 root root 4096 Oct 3 2016 bin
drwxrwxr-x. 3 root root 4096 Oct 3 2016 etc
drwxrwxr-x. 2 root root 4096 Oct 3 2016 libexec
drwxrwxr-x. 3 root root 4096 Sep 18 11:27 share
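A minimal copy sketch, run from an existing Hadoop node (it assumes SSH access and that the Hive machine is 192.168.1.9, the address used in the client configuration later; rsync is just one option):
cd /usr/local/src/hadoop-2.6.5
# copy only the directories Hive needs
rsync -a bin etc libexec share root@192.168.1.9:/usr/local/src/hadoop-2.6.5/
# the bundled documentation is not needed on the Hive machine
ssh root@192.168.1.9 rm -rf /usr/local/src/hadoop-2.6.5/share/doc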
Install and configure hive-1.2.2 on the Hive machine
- Download: http://archive.apache.org/dist/hive/hive-1.2.2/
- Extract the archive
- Download the MySQL JDBC driver and put it in Hive's lib directory: https://dev.mysql.com/downloads/connector/j/
- Create an iotmp directory: /usr/local/src/hive-1.2.2/iotmp
- Edit conf/hive-env.sh
# Set HADOOP_HOME to point to a specific hadoop install directory
HADOOP_HOME=/usr/local/src/hadoop-2.6.5
# Hive Configuration Directory can be controlled by:
export HIVE_CONF_DIR=/usr/local/src/hive-1.2.2/conf
- Edit conf/hive-site.xml
<configuration>
<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://127.0.0.1:3306/hive?createDatabaseIfNotExist=true&amp;characterEncoding=UTF-8</value>
<description>JDBC connect string for a JDBC metastore</description>
</property>
<property>
<name>javax.jdo.option.ConnectionDriverName</name>
<value>com.mysql.jdbc.Driver</value>
<description>Driver class name for a JDBC metastore</description>
</property>
<property>
<name>javax.jdo.option.ConnectionUserName</name>
<value>hive</value>
<description>username to use against metastore database</description>
</property>
<property>
<name>javax.jdo.option.ConnectionPassword</name>
<value>hive</value>
<description>password to use against metastore database</description>
</property>
<property>
<name>hive.querylog.location</name>
<value>/usr/local/src/hive-1.2.2/iotmp</value>
<description>Location of Hive run time structured log file</description>
</property>
<property>
<name>hive.exec.local.scratchdir</name>
<value>/usr/local/src/hive-1.2.2/iotmp</value>
<description>Local scratch space for Hive jobs</description>
</property>
<property>
<name>hive.downloaded.resources.dir</name>
<value>/usr/local/src/hive-1.2.2/iotmp</value>
<description>Temporary local directory for added resources in the remote file system.</description>
</property>
<property>
<name>hive.hwi.listen.host</name>
<value>0.0.0.0</value>
</property>
<property>
<name>hive.hwi.listen.port</name>
<value>9999</value>
</property>
<property>
<name>hive.hwi.war.file</name>
<value>lib/hive-hwi-1.2.2.war</value>
</property>
<property>
<name>hive.cli.print.current.db</name>
<value>true</value>
</property>
<property>
<name>hive.cli.print.header</name>
<value>true</value>
</property>
</configuration>
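Before starting Hive for the first time, it can also help to initialize the metastore schema explicitly with the bundled schematool (the same command reappears under problem 10 below):
/usr/local/src/hive-1.2.2/bin/schematool -dbType mysql -initSchema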
Start the client
- CLI shell: /usr/local/src/hive-1.2.2/bin/hive
- To let other machines use the Hive client, the metastore service must be started:
/usr/local/src/hive-1.2.2/bin/hive --service metastore &
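A sketch for keeping the metastore running after the shell exits and checking that it listens on the default port 9083 (the log path here is arbitrary):
nohup /usr/local/src/hive-1.2.2/bin/hive --service metastore > /tmp/hive-metastore.log 2>&1 &
ss -lnt | grep 9083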
Start the web UI (HWI)
[root@wadeyu lib]# /usr/local/src/hive-1.2.2/bin/hive --service hwi &
[1] 7349
[root@wadeyu lib]# jps
7410 Jps
7349 RunJar
5161 RunJar
[root@wadeyu lib]# 18/09/18 16:44:22 INFO hwi.HWIServer: HWI is starting up
18/09/18 16:44:28 INFO mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
18/09/18 16:44:28 INFO mortbay.log: jetty-6.1.26
18/09/18 16:44:29 INFO mortbay.log: Extract /usr/local/src/hive-1.2.2/lib/hive-hwi-1.2.2.war to /tmp/Jetty_0_0_0_0_9999_hive.hwi.1.2.2.war__hwi__21w1ka/webapp
18/09/18 16:44:31 INFO mortbay.log: Started SocketConnector@0.0.0.0:9999
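Once the SocketConnector line appears, the HWI web interface should be reachable in a browser at http://<hive-host>:9999/hwi (host and port follow the hive.hwi.listen.* settings above).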
Operations on other Hive client machines
- Copy hive-1.2.2 to the other machine
- If Hadoop and JDK 1.8 are not installed there yet, install those two components as well
- Edit the hive-site.xml configuration file:
[root@master conf]# cat hive-site.xml
<configuration>
<property>
<name>hive.metastore.uris</name>
<value>thrift://192.168.1.9:9083</value>
</property>
</configuration>
Basic operations
Usage is largely the same as with a MySQL client; a brief HiveQL sketch is shown below.
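For a quick sanity check, a few typical statements (they reuse the dep table and test file that appear in the problems below):
hive> create table dep(id int, name string) row format delimited fields terminated by '\t' lines terminated by '\n';
hive> load data local inpath '/home/wadeyu/test2.log' into table dep;
hive> select * from dep;
hive> drop table dep;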
Problems encountered
- Problem 1
Exception in thread "main" java.lang.RuntimeException: java.net.ConnectException: Call From master/192.168.1.15 to master:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:677)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Cause: the Hadoop cluster was not running.
Fix: start the Hadoop cluster.
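On the Hadoop master that is roughly (assuming the stock sbin scripts under the same install path used elsewhere in this article):
/usr/local/src/hadoop-2.6.5/sbin/start-dfs.sh
/usr/local/src/hadoop-2.6.5/sbin/start-yarn.sh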
- Problem 2
Exception in thread "main" java.lang.RuntimeException: java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir%7D/$%7Bsystem:user.name%7D
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:677)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir%7D/$%7Bsystem:user.name%7D
Cause: missing configuration.
Fix: add the following to hive-site.xml:
<property>
<name>system:java.io.tmpdir</name>
<value>/usr/local/src/hive-1.2.2/iotmp</value>
<description/>
</property>
<property>
<name>system:user.name</name>
<value>hive</value>
<description/>
</property>
- Problem 3
[ERROR] Terminal initialization failed; falling back to unsupported
java.lang.IncompatibleClassChangeError: Found class jline.Terminal, but interface was expected
at jline.TerminalFactory.create(TerminalFactory.java:101)
at jline.TerminalFactory.get(TerminalFactory.java:158)
at jline.console.ConsoleReader.<init>(ConsoleReader.java:229)
at jline.console.ConsoleReader.<init>(ConsoleReader.java:221)
at jline.console.ConsoleReader.<init>(ConsoleReader.java:209)
at org.apache.hadoop.hive.cli.CliDriver.setupConsoleReader(CliDriver.java:787)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:721)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:681)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Cause: the jline jar bundled with Hadoop is too old; Hive's newer jline jar needs to be copied into the Hadoop library directory.
Fix:
[root@master lib]# find /usr/local/src -name '*jline*'
/usr/local/src/hadoop-2.6.5/share/hadoop/httpfs/tomcat/webapps/webhdfs/WEB-INF/lib/jline-0.9.94.jar
/usr/local/src/hadoop-2.6.5/share/hadoop/yarn/lib/jline-0.9.94.jar
/usr/local/src/hadoop-2.6.5/share/hadoop/kms/tomcat/webapps/kms/WEB-INF/lib/jline-0.9.94.jar
/usr/local/src/zookeeper-3.4.12/lib/jline-0.9.94.jar
/usr/local/src/zookeeper-3.4.12/lib/jline-0.9.94.LICENSE.txt
/usr/local/src/zookeeper-3.4.12/src/java/lib/jline-0.9.94.LICENSE.txt
/usr/local/src/hive-1.2.2/lib/jline-2.12.jar
[root@master lib]# cp jline-2.12.jar /usr/local/src/hadoop-2.6.5/share/hadoop/yarn/lib/jline-2.12.jar
[root@master lib]# ll /usr/local/src/hadoop-2.6.5/share/hadoop/yarn/lib/ | grep jline
-rw-rw-r--. 1 wadeyu wadeyu 87325 Oct 3 2016 jline-0.9.94.jar
-rw-r--r--. 1 root root 213854 Sep 12 17:58 jline-2.12.jar
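Alternatively (not tested here), exporting HADOOP_USER_CLASSPATH_FIRST before starting Hive, or deleting the old jline-0.9.94.jar from share/hadoop/yarn/lib, also makes the newer jline from Hive's lib win:
export HADOOP_USER_CLASSPATH_FIRST=true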
- Problem 4
hive> create table dep(id int, name string) row format delimited fields terminated by '\t' lines terminated by '\n';
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:For direct MetaStore DB connections, we don't support retries at the client level.)
Cause: the JDBC client and the Hive metastore database use different character encodings.
Fix (either option works):
# Option 1: change the hive database encoding to latin1
MariaDB [hive]> alter database hive character set latin1;
Query OK, 1 row affected (0.00 sec)
# Option 2: make the client encoding match the metastore database, e.g. switch both sides to utf8
jdbc:mysql://192.168.1.9:3306/hive?createDatabaseIfNotExist=true&characterEncoding=UTF-8
With the mysql-connector-java-8.0.12.jar driver, this second approach resolved the problem.
- Problem 5
[root@master conf]# hive --service cli
[Fatal Error] hive-site.xml:406:94: The reference to entity "characterEncoding" must end with the ';' delimiter.
18/09/12 18:55:00 FATAL conf.Configuration: error parsing conf file:/usr/local/src/hive-1.2.2/conf/hive-site.xml
org.xml.sax.SAXParseException; systemId: file:/usr/local/src/hive-1.2.2/conf/hive-site.xml; lineNumber: 406; columnNumber: 94; The reference to entity "characterEncoding" must end with the ';' delimiter.
at org.apache.xerces.parsers.DOMParser.parse(Unknown Source)
at org.apache.xerces.jaxp.DocumentBuilderImpl.parse(Unknown Source)
at javax.xml.parsers.DocumentBuilder.parse(DocumentBuilder.java:150)
at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2432)
at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2420)
at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2488)
at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2454)
at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2361)
at org.apache.hadoop.conf.Configuration.get(Configuration.java:1188)
at org.apache.hadoop.hive.conf.HiveConf.getVar(HiveConf.java:2615)
at org.apache.hadoop.hive.conf.HiveConf.getVar(HiveConf.java:2636)
at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:2707)
at org.apache.hadoop.hive.conf.HiveConf.<init>(HiveConf.java:2651)
at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jCommon(LogUtils.java:74)
at org.apache.hadoop.hive.common.LogUtils.initHiveLog4j(LogUtils.java:58)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:637)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Cause: the value contains the special character & which is not escaped for XML.
Fix: escape & as the XML entity &amp;:
<value>jdbc:mysql://192.168.1.9:3306/hive?createDatabaseIfNotExist=true&amp;characterEncoding=UTF-8</value>
- Problem 6
hive> load data local inpath '/home/wadeyu/test2.log' into table dep;
Loading data to table default.dep
Table default.dep stats: [numFiles=1, totalSize=56]
OK
Time taken: 1.779 seconds
hive> select * from dep;
OK
NULL NULL
NULL NULL
NULL NULL
NULL NULL
NULL NULL
NULL NULL
NULL NULL
Cause: vim converted the tab characters into spaces.
Fix:
The text editor must not expand tabs into spaces, because the table was defined with fields terminated by '\t' and lines terminated by '\n'.
Temporary setting inside vim: :set noexpandtab
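To check whether the data file really contains tab characters (shown as ^I by cat -A) rather than spaces:
cat -A /home/wadeyu/test2.log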
- Problem 7
ls: cannot access /usr/local/src/hive-1.2.2/lib/hive-hwi-*.war: No such file or directory
18/09/18 16:21:31 INFO hwi.HWIServer: HWI is starting up
18/09/18 16:21:36 INFO mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
18/09/18 16:21:36 INFO mortbay.log: jetty-6.1.26
18/09/18 16:21:38 INFO mortbay.log: Started SocketConnector@0.0.0.0:9999
Cause: the hive-hwi war file is missing.
Fix:
Extract it from the source tree of the matching Hive version:
cd hwi/web
Package the contents of the web directory into a .war file with the jar command, then move it into Hive's lib directory:
[root@wadeyu web]# jar cvf hive-hwi-1.2.2.war ./*
added manifest
adding: authorize.jsp(in = 2729) (out= 1201)(deflated 55%)
adding: css/(in = 0) (out= 0)(stored 0%)
adding: css/bootstrap.min.css(in = 90193) (out= 14754)(deflated 83%)
adding: diagnostics.jsp(in = 2365) (out= 1062)(deflated 55%)
adding: error_page.jsp(in = 1867) (out= 931)(deflated 50%)
adding: img/(in = 0) (out= 0)(stored 0%)
adding: img/glyphicons-halflings-white.png(in = 4352) (out= 4190)(deflated 3%)
adding: img/glyphicons-halflings.png(in = 4352) (out= 4192)(deflated 3%)
adding: index.jsp(in = 1876) (out= 981)(deflated 47%)
adding: left_navigation.jsp(in = 1553) (out= 709)(deflated 54%)
adding: navbar.jsp(in = 1345) (out= 681)(deflated 49%)
adding: session_create.jsp(in = 2690) (out= 1248)(deflated 53%)
adding: session_diagnostics.jsp(in = 2489) (out= 1155)(deflated 53%)
adding: session_history.jsp(in = 3150) (out= 1334)(deflated 57%)
adding: session_kill.jsp(in = 2236) (out= 1108)(deflated 50%)
adding: session_list.jsp(in = 2298) (out= 1059)(deflated 53%)
adding: session_manage.jsp(in = 6738) (out= 2198)(deflated 67%)
adding: session_remove.jsp(in = 2359) (out= 1151)(deflated 51%)
adding: session_result.jsp(in = 2488) (out= 1149)(deflated 53%)
adding: show_database.jsp(in = 2346) (out= 1133)(deflated 51%)
adding: show_databases.jsp(in = 2096) (out= 1039)(deflated 50%)
adding: show_table.jsp(in = 4996) (out= 1607)(deflated 67%)
adding: view_file.jsp(in = 2653) (out= 1257)(deflated 52%)
adding: WEB-INF/(in = 0) (out= 0)(stored 0%)
adding: WEB-INF/web.xml(in = 1438) (out= 741)(deflated 48%)
- Problem 8
Unable to find a javac compiler;
com.sun.tools.javac.Main is not on the classpath.
Perhaps JAVA_HOME does not point to the JDK.
It is currently set to "/usr/local/src/jdk1.8.0_181/jre"
Caused by:
Unable to find a javac compiler;
com.sun.tools.javac.Main is not on the classpath.
Perhaps JAVA_HOME does not point to the JDK.
It is currently set to "/usr/local/src/jdk1.8.0_181/jre"
at org.apache.tools.ant.taskdefs.compilers.CompilerAdapterFactory.getCompiler(CompilerAdapterFactory.java:129)
Cause: the JDK's tools.jar is not on the classpath (JAVA_HOME points at the jre directory, which does not contain it).
Fix: copy the JDK's lib/tools.jar into Hive's lib directory.
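With the JDK path reported in the error above, that would be roughly:
cp /usr/local/src/jdk1.8.0_181/lib/tools.jar /usr/local/src/hive-1.2.2/lib/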
- Problem 9
18/10/09 15:44:58 [main]: ERROR DataNucleus.Datastore: Error thrown executing CREATE TABLE `PARTITION_PARAMS`
(
`PART_ID` BIGINT NOT NULL,
`PARAM_KEY` VARCHAR(256) BINARY NOT NULL,
`PARAM_VALUE` VARCHAR(4000) BINARY NULL,
CONSTRAINT `PARTITION_PARAMS_PK` PRIMARY KEY (`PART_ID`,`PARAM_KEY`)
) ENGINE=INNODB : Specified key was too long; max key length is 767 bytes
java.sql.SQLSyntaxErrorException: Specified key was too long; max key length is 767 bytes
at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:120)
at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:97)
at com.mysql.cj.jdbc.exceptions.SQLExceptionsMapping.translateException(SQLExceptionsMapping.java:122)
at com.mysql.cj.jdbc.StatementImpl.executeInternal(StatementImpl.java:781)
Cause: the metastore database uses a utf8 character set, so the indexed VARCHAR(256) column needs 256 x 3 = 768 bytes, exceeding InnoDB's 767-byte key limit (rather than a Hive bug).
Fix: MariaDB [hive]> alter database hive default character set latin1;
- Problem 10
18/10/09 16:06:53 [main]: ERROR DataNucleus.Datastore: Error thrown executing ALTER TABLE `PARTITIONS` ADD COLUMN `TBL_ID` BIGINT NULL : Table 'hive.PARTITIONS' doesn't exist
java.sql.SQLSyntaxErrorException: Table 'hive.PARTITIONS' doesn't exist
at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:120)
at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:97)
at com.mysql.cj.jdbc.exceptions.SQLExceptionsMapping.translateException(SQLExceptionsMapping.java:122)
at com.mysql.cj.jdbc.StatementImpl.executeInternal(StatementImpl.java:781)
at com.mysql.cj.jdbc.StatementImpl.execute(StatementImpl.java:666)
at com.jolbox.bonecp.StatementHandle.execute(StatementHandle.java:254)
Cause: the metastore tables are missing.
Fix: drop and recreate the Hive metastore database, then initialize the schema with the schematool utility (both steps sketched below).
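Recreating the metastore database might look roughly like this (note this removes any existing metastore data; latin1 avoids problem 9 above):
drop database hive;
create database hive default character set latin1;
Then re-run the schema initialization: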
[root@wadeyu bin]# ./schematool -dbType mysql -initSchema
- Problem 11
[root@wadeyu hive-1.2.2]# ./bin/beeline -u jdbc:hive2://
Connecting to jdbc:hive2://
18/10/09 16:24:45 [main]: WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Loading class `com.mysql.jdbc.Driver'. This is deprecated. The new driver class is `com.mysql.cj.jdbc.Driver'. The driver is automatically registered via the SPI and manual loading of the driver class is generally unnecessary.
18/10/09 16:25:30 [main]: WARN metastore.ObjectStore: Failed to get database default, returning NoSuchObjectException
Error applying authorization policy on hive configuration: java.net.NoRouteToHostException: No Route to Host from wadeyu/192.168.1.7 to master:9000 failed on socket timeout exception: java.net.NoRouteToHostException: No route to host; For more details see: http://wiki.apache.org/hadoop/NoRouteToHost
Beeline version 1.2.2 by Apache Hive
Cause: the Hadoop cluster is not running, or there is a network/firewall problem reaching master:9000.
Fix: start the Hadoop cluster and check network connectivity.
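A few quick checks from the client machine (a sketch; master:9000 is the NameNode address from the error above, and a firewall on the master is a common cause of NoRouteToHostException):
ping -c 3 master
nc -zv master 9000            # requires nc/ncat to be installed
systemctl status firewalld    # run on the master to see whether a firewall is blocking the port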