I ran into the following problem while integrating Spark with HBase:

java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionStateBuilder':
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:)
at org.apache.spark.sql.SparkSession$$anonfun$sessionState$.apply(SparkSession.scala:)
at org.apache.spark.sql.SparkSession$$anonfun$sessionState$.apply(SparkSession.scala:)
at scala.Option.getOrElse(Option.scala:)
at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:)
at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$.apply(SparkSession.scala:)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$.apply(SparkSession.scala:)
at scala.collection.mutable.HashMap$$anonfun$foreach$.apply(HashMap.scala:)
at scala.collection.mutable.HashMap$$anonfun$foreach$.apply(HashMap.scala:)
at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:)
at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:)
at scala.collection.mutable.HashMap.foreach(HashMap.scala:)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:)
at org.apache.spark.repl.Main$.createSparkSession(Main.scala:)
... elided
Caused by: java.lang.ClassNotFoundException: java.lang.NoClassDefFoundError: org/htrace/Trace when creating Hive client using classpath:...... Please make sure that jars for your version of hive and hadoop are included in the paths passed to spark.sql.hive.metastore.jars.
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:)
at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:)
at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$.apply$mcZ$sp(HiveExternalCatalog.scala:)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$.apply(HiveExternalCatalog.scala:)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$.apply(HiveExternalCatalog.scala:)
at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:)
at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:)
at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:)
at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:)
at org.apache.spark.sql.internal.BaseSessionStateBuilder.build(BaseSessionStateBuilder.scala:)
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:)
... more
Caused by: java.lang.reflect.InvocationTargetException: java.lang.NoClassDefFoundError: org/htrace/Trace
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:)
at java.lang.reflect.Constructor.newInstance(Constructor.java:)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:)
... more
Caused by: java.lang.NoClassDefFoundError: org/htrace/Trace
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:)
at com.sun.proxy.$Proxy17.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:)
at java.lang.reflect.Method.invoke(Method.java:)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:)
at com.sun.proxy.$Proxy18.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:)
at org.apache.hadoop.hdfs.DistributedFileSystem$.doCall(DistributedFileSystem.java:)
at org.apache.hadoop.hdfs.DistributedFileSystem$.doCall(DistributedFileSystem.java:)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:)
at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:)
at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:)
at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:)
at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:)
... more
Caused by: java.lang.ClassNotFoundException: org.htrace.Trace
at java.net.URLClassLoader.findClass(URLClassLoader.java:)
at java.lang.ClassLoader.loadClass(ClassLoader.java:)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:)
at java.lang.ClassLoader.loadClass(ClassLoader.java:)
... more
<console>:: error: not found: value spark
       import spark.implicits._
              ^
<console>:: error: not found: value spark
       import spark.sql
              ^
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.2.
      /_/

Using Scala version 2.11. (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_60)
Type in expressions to have them evaluated.
Type :help for more information.

scala>

I had already copied every required jar to the jars directory under the Spark installation on each node, yet bin/spark-shell kept failing with the error above. Most answers online suggest copying the missing jars (hbase*.jar, metrics*.jar, htrace*.jar and so on from hbase/lib) into spark/jars, but that is exactly the preparation I had already done, so this advice tells me nothing new!

Looking at the root cause: the failure happens inside IsolatedClientLoader (see the stack trace above), that is, while Spark is building the Hive metastore client with its own classloader. This is why Spark fails to load the jar it needs at startup (here, htrace-core-3.0.4.jar) even though the jar sits in the jars directory. Simply dropping the jar into spark/jars therefore cannot fix the problem. What we need to do instead is point Spark at a CLASSPATH it can fall back to whenever it cannot find a required jar. A quick confirmation of the diagnosis is sketched below.
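Before editing any config files, one way to confirm the diagnosis is to hand the classpath to the shell explicitly. This is only a sketch, assuming the hbase command is available on the PATH of the driver machine:

# If the error disappears once the HBase runtime classpath is passed
# explicitly, the jars themselves are fine and the real problem is
# classpath resolution.
bin/spark-shell --driver-class-path "$(hbase classpath)"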

The fix:

Add the following lines to the spark-env.sh configuration file under spark/conf:

# Use command substitution so each variable holds the full runtime
# classpath reported by the hbase / hadoop tools (with literal quotes,
# the variable would just contain the words "hbase classpath").
export HBASE_CLASSPATH=$(hbase classpath)
export HADOOP_CLASSPATH=$(hadoop classpath)
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HBASE_CLASSPATH
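
Before restarting the shell, it is worth verifying that the expanded classpath really does contain the htrace jar. A minimal check, assuming the hbase command is on the PATH (the jar version shown is just the one from my environment):

# Print the HBase classpath one entry per line and look for the htrace jar,
# e.g. .../hbase/lib/htrace-core-3.0.4.jar
hbase classpath | tr ':' '\n' | grep htrace

If the jar shows up, restart bin/spark-shell: the HiveSessionStateBuilder error should be gone and the spark session value should be defined again.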


That is all I wanted to cover in this installment. It comes straight from my own learning process, and I hope it gives you some useful pointers. If it helped, please leave a like; if it did not, bear with me, and do point out any mistakes. Follow the blog if you want updates as soon as they are posted. Reposting is welcome, but the original URL must be credited prominently; the author reserves the right of final interpretation. Thanks!
