Spark error: Caused by: java.lang.ClassCastException: scala.collection.mutable.WrappedArray$ofRef cannot be cast to [Lscala.collection.immutable.Map;

Background

I wrote a UDF to check whether two columns are equal. The two columns share the same structure, but it is fairly complex:

|-- list: array (nullable = true)
| |-- element: map (containsNull = true)
| | |-- key: string
| | |-- value: array (valueContainsNull = true)
| | | |-- element: struct (containsNull = true)
| | | | |-- Date: integer (nullable = true)
| | | | |-- Name: string (nullable = true)
|-- list2: array (nullable = true)
| |-- element: map (containsNull = true)
| | |-- key: string
| | |-- value: array (valueContainsNull = true)
| | | |-- element: struct (containsNull = true)
| | | | |-- Date: integer (nullable = true)
| | | | |-- Name: string (nullable = true)

In other words, an Array of Maps whose values are themselves Arrays, so the comparison has to descend level by level. The UDF I wrote:

import org.apache.spark.sql.functions.udf

case class AppList(Date: Int, Name: String)  // fields mirror the struct in the schema above

def isMapEqual(map1: Map[String, Array[AppList]], map2: Map[String, Array[AppList]]): Boolean = {
  try {
    if (map1.size != map2.size) {
      false
    } else {
      for (x <- map1.keys) {
        if (map1(x) != map2(x)) {
          return false
        }
      }
      true
    }
  } catch {
    case e: Exception => false
  }
}

def isListEqual(list1: Array[Map[String, Array[AppList]]], list2: Array[Map[String, Array[AppList]]]): Boolean = {
  try {
    if (list1.length != list2.length) {
      false
    } else if (list1.isEmpty || list2.isEmpty) {
      false
    } else {
      isMapEqual(list1(0), list2(0))
    }
  } catch {
    case e: Exception => false
  }
}

val isColumnEqual = udf((list1: Array[Map[String, Array[AppList]]], list2: Array[Map[String, Array[AppList]]]) => {
  isListEqual(list1, list2)
})

I then pasted this into spark-shell and ran:

val dat = df.withColumn("equal", isColumnEqual($"list1", $"list2"))
dat.show()

which failed with the following error:

Caused by: org.apache.spark.SparkException: Failed to execute user defined function($anonfun$1: (array<map<string,array<struct<Date:int,Name:string>>>>, array<map<string,array<struct<Date:int,Name:string>>>>) => boolean)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(Unknown Source)
at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$8$$anon$1.hasNext(WholeStageCodegenExec.scala:377)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$2.apply(SparkPlan.scala:231)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$2.apply(SparkPlan.scala:225)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:827)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:827)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
at org.apache.spark.scheduler.Task.run(Task.scala:99)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:322)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassCastException: scala.collection.mutable.WrappedArray$ofRef cannot be cast to [Lscala.collection.immutable.Map;
at $anonfun$1.apply(<console>:42)
... 16 more
Solution

The so-called solution, naturally, was to Google it…

A post I found suggested changing Array to Seq in the UDF's parameter types. I tried it, and sure enough, it worked.
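For concreteness, here is the reworked UDF (a sketch of the fix, not that post's verbatim code). Every Array in the parameter types becomes Seq; I have also declared the innermost elements as org.apache.spark.sql.Row rather than AppList, since a Scala UDF receives struct values as Row. The Seq[AppList] variant happens to work too, but only because the code never reads the struct's fields and relies purely on structural equality.

import org.apache.spark.sql.Row
import org.apache.spark.sql.functions.udf

// Spark hands array<...> columns to a UDF as Seq (concretely WrappedArray),
// maps as immutable Map, and struct<...> values as Row.
def isMapEqual(map1: Map[String, Seq[Row]], map2: Map[String, Seq[Row]]): Boolean = {
  try {
    if (map1.size != map2.size) {
      false
    } else {
      // Seq and Row both define structural equality, so == compares contents;
      // a key missing from map2 throws and falls through to the catch.
      map1.keys.forall(x => map1(x) == map2(x))
    }
  } catch {
    case e: Exception => false
  }
}

def isListEqual(list1: Seq[Map[String, Seq[Row]]], list2: Seq[Map[String, Seq[Row]]]): Boolean = {
  try {
    if (list1.length != list2.length || list1.isEmpty) {
      false
    } else {
      isMapEqual(list1(0), list2(0))  // as in the original: only the first element is compared
    }
  } catch {
    case e: Exception => false
  }
}

val isColumnEqual = udf((list1: Seq[Map[String, Seq[Row]]], list2: Seq[Map[String, Seq[Row]]]) => {
  isListEqual(list1, list2)
})

With the Seq signatures in place, the same withColumn call runs without the cast error.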

Why it happens

One answer explains:

So it looks like the ArrayType on Dataframe "idDF" is really a WrappedArray and not an Array - So the function call to "filterMapKeysWithSet" failed as it expected an Array but got a WrappedArray/ Seq instead (which doesn't implicitly convert to Array in Scala 2.8 and above).

In other words, what Spark passes to the UDF for an ArrayType column is not a native Scala Array but a WrappedArray, a Seq wrapping the underlying array. A WrappedArray is not an instance of Array, and since Scala 2.8 there is no implicit conversion back, so the cast implied by the Array parameter type fails at runtime; declaring the parameters as Seq matches what Spark actually supplies. (Corrections welcome; I had never written Scala before this.)
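The mismatch is easy to see without Spark at all. A minimal REPL sketch (Scala 2.x; WrappedArray.make wraps an existing JVM array much like Spark's deserializer does):

import scala.collection.mutable.WrappedArray

// Stand-in for what Spark hands the UDF for an array<map<...>> column:
val fromSpark: AnyRef = WrappedArray.make[Map[String, Int]](Array(Map("a" -> 1)))

fromSpark.isInstanceOf[Seq[_]]    // true: WrappedArray is a Seq
fromSpark.isInstanceOf[Array[_]]  // false: it is not a JVM array

// The Array parameter type in the UDF forces exactly this cast:
fromSpark.asInstanceOf[Array[Map[String, Int]]]
// java.lang.ClassCastException: scala.collection.mutable.WrappedArray$ofRef
//   cannot be cast to [Lscala.collection.immutable.Map;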
