// :: WARN RandomBlockReplicationPolicy: Expecting replicas with only peer/s.
// :: WARN BlockManager: Block input-- replicated to only peer(s) instead of peers
// :: ERROR Executor: Exception in task 0.0 in stage 113711.0 (TID )
java.lang.AssertionError: assertion failed
    at scala.Predef$.assert(Predef.scala:)
    at org.apache.spark.storage.BlockInfo.checkInvariants(BlockInfoManager.scala:)
    at org.apache.spark.storage.BlockInfo.readerCount_$eq(BlockInfoManager.scala:)
    at org.apache.spark.storage.BlockInfoManager$$anonfun$releaseAllLocksForTask$$$anonfun$apply$.apply(BlockInfoManager.scala:)
    at org.apache.spark.storage.BlockInfoManager$$anonfun$releaseAllLocksForTask$$$anonfun$apply$.apply(BlockInfoManager.scala:)
    at scala.Option.foreach(Option.scala:)
    at org.apache.spark.storage.BlockInfoManager$$anonfun$releaseAllLocksForTask$.apply(BlockInfoManager.scala:)
    at org.apache.spark.storage.BlockInfoManager$$anonfun$releaseAllLocksForTask$.apply(BlockInfoManager.scala:)
    at scala.collection.Iterator$class.foreach(Iterator.scala:)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:)
    at org.apache.spark.storage.BlockInfoManager.releaseAllLocksForTask(BlockInfoManager.scala:)
    at org.apache.spark.storage.BlockManager.releaseAllLocksForTask(BlockManager.scala:)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:)
    at java.lang.Thread.run(Thread.java:)
// :: WARN TaskSetManager: Lost task 0.0 in stage 113711.0 (TID , localhost, executor driver): java.lang.AssertionError: assertion failed
    at scala.Predef$.assert(Predef.scala:)
    at org.apache.spark.storage.BlockInfo.checkInvariants(BlockInfoManager.scala:)
    at org.apache.spark.storage.BlockInfo.readerCount_$eq(BlockInfoManager.scala:)
    at org.apache.spark.storage.BlockInfoManager$$anonfun$releaseAllLocksForTask$$$anonfun$apply$.apply(BlockInfoManager.scala:)
    at org.apache.spark.storage.BlockInfoManager$$anonfun$releaseAllLocksForTask$$$anonfun$apply$.apply(BlockInfoManager.scala:)
    at scala.Option.foreach(Option.scala:)
    at org.apache.spark.storage.BlockInfoManager$$anonfun$releaseAllLocksForTask$.apply(BlockInfoManager.scala:)
    at org.apache.spark.storage.BlockInfoManager$$anonfun$releaseAllLocksForTask$.apply(BlockInfoManager.scala:)
    at scala.collection.Iterator$class.foreach(Iterator.scala:)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:)
    at org.apache.spark.storage.BlockInfoManager.releaseAllLocksForTask(BlockInfoManager.scala:)
    at org.apache.spark.storage.BlockManager.releaseAllLocksForTask(BlockManager.scala:)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:)
    at java.lang.Thread.run(Thread.java:)

// :: ERROR TaskSetManager: Task in stage 113711.0 failed times; aborting job
// :: ERROR JobScheduler: Error running job streaming job ms.
org.apache.spark.SparkException: An exception was raised by Python:
Traceback (most recent call last):
  File "/home/admin/agent/spark/python/lib/pyspark.zip/pyspark/streaming/util.py", line , in call
    r = self.func(t, *rdds)
  File "/home/admin/agent/spark/python/lib/pyspark.zip/pyspark/streaming/dstream.py", line , in takeAndPrint
    taken = rdd.take(num + )
  File "/home/admin/agent/spark/python/lib/pyspark.zip/pyspark/rdd.py", line , in take
    res = self.context.runJob(self, takeUpToNumLeft, p)
  File "/home/admin/agent/spark/python/lib/pyspark.zip/pyspark/context.py", line , in runJob
    port = self._jvm.PythonRDD.runJob(self._jsc.sc(), mappedRDD._jrdd, partitions)
  File "/home/admin/agent/spark/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line , in __call__
    answer, self.gateway_client, self.target_id, self.name)
  File "/home/admin/agent/spark/python/lib/py4j-0.10.4-src.zip/py4j/protocol.py", line , in get_return_value
    format(target_id, ".", name), value)
py4j.protocol.Py4JJavaError: An error occurred while calling z:org.apache.spark.api.python.PythonRDD.runJob.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task in stage 113711.0 failed times, most recent failure: Lost task 0.0 in stage 113711.0 (TID , localhost, executor driver): java.lang.AssertionError: assertion failed
    at scala.Predef$.assert(Predef.scala:)
    at org.apache.spark.storage.BlockInfo.checkInvariants(BlockInfoManager.scala:)
    at org.apache.spark.storage.BlockInfo.readerCount_$eq(BlockInfoManager.scala:)
    at org.apache.spark.storage.BlockInfoManager$$anonfun$releaseAllLocksForTask$$$anonfun$apply$.apply(BlockInfoManager.scala:)
    at org.apache.spark.storage.BlockInfoManager$$anonfun$releaseAllLocksForTask$$$anonfun$apply$.apply(BlockInfoManager.scala:)
    at scala.Option.foreach(Option.scala:)
    at org.apache.spark.storage.BlockInfoManager$$anonfun$releaseAllLocksForTask$.apply(BlockInfoManager.scala:)
    at org.apache.spark.storage.BlockInfoManager$$anonfun$releaseAllLocksForTask$.apply(BlockInfoManager.scala:)
    at scala.collection.Iterator$class.foreach(Iterator.scala:)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:)
    at org.apache.spark.storage.BlockInfoManager.releaseAllLocksForTask(BlockInfoManager.scala:)
    at org.apache.spark.storage.BlockManager.releaseAllLocksForTask(BlockManager.scala:)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:)
    at java.lang.Thread.run(Thread.java:)

Driver stacktrace:
    at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$.apply(DAGScheduler.scala:)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$.apply(DAGScheduler.scala:)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:)
    at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$.apply(DAGScheduler.scala:)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$.apply(DAGScheduler.scala:)
    at scala.Option.foreach(Option.scala:)
    at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:)
    at org.apache.spark.util.EventLoop$$anon$.run(EventLoop.scala:)
    at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:)
    at org.apache.spark.api.python.PythonRDD$.runJob(PythonRDD.scala:)
    at org.apache.spark.api.python.PythonRDD.runJob(PythonRDD.scala)
    at sun.reflect.GeneratedMethodAccessor55.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:)
    at java.lang.reflect.Method.invoke(Method.java:)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:)
    at py4j.Gateway.invoke(Gateway.java:)
    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:)
    at py4j.commands.CallCommand.execute(CallCommand.java:)
    at py4j.GatewayConnection.run(GatewayConnection.java:)
    at java.lang.Thread.run(Thread.java:)
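
For context, the Python frames in the traceback come from DStream.pprint(): on each batch it registers takeAndPrint, which calls rdd.take() and submits the runJob that dies with the assertion error on the JVM side. Below is a minimal sketch of a job shaped like this one, assuming the Kafka 0.8 direct-stream API (pyspark.streaming.kafka.KafkaUtils); the app name, broker address, topic name, and batch interval are placeholders, not details from the original setup.

    # Minimal sketch of a Kafka -> Spark Streaming job whose pprint() output
    # action follows the takeAndPrint -> rdd.take -> runJob path seen in the
    # traceback. Broker address and topic name are placeholders.
    from pyspark import SparkContext
    from pyspark.streaming import StreamingContext
    from pyspark.streaming.kafka import KafkaUtils  # Kafka 0.8 direct-stream API

    sc = SparkContext(appName="filebeat-kafka-streaming")
    ssc = StreamingContext(sc, 10)  # 10-second micro-batches

    # Direct stream over the topic that filebeat ships logs into.
    stream = KafkaUtils.createDirectStream(
        ssc,
        ["filebeat-logs"],                              # hypothetical topic
        {"metadata.broker.list": "kafka-broker:9092"},  # hypothetical broker
    )

    # pprint() registers takeAndPrint on every batch; each batch therefore
    # runs a take() job like the one that failed in stage 113711.0.
    stream.map(lambda kv: kv[1]).pprint()

    ssc.start()
    ssc.awaitTermination()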

Cause investigation, round 1:

  1. It is NOT caused by the checkpoint directory in the code pointing at the local filesystem: after setting up HDFS and moving the checkpoint there (a sketch of this configuration appears below), the job still died after running for about a day, with the same error as above.

  2. To be continued.

Any pointers from more experienced readers would be appreciated.
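
For reference, here is a minimal sketch of the change tried in step 1: pointing the streaming checkpoint at HDFS instead of a local directory, using the standard StreamingContext.getOrCreate recovery pattern. The namenode address and checkpoint path are hypothetical, not taken from the original cluster.

    # Sketch of moving the streaming checkpoint from a local directory to HDFS,
    # as tried in troubleshooting step 1. The HDFS path is a placeholder.
    from pyspark import SparkContext
    from pyspark.streaming import StreamingContext

    CHECKPOINT_DIR = "hdfs://namenode:8020/spark/checkpoint"  # hypothetical path

    def create_context():
        sc = SparkContext(appName="filebeat-kafka-streaming")
        ssc = StreamingContext(sc, 10)
        ssc.checkpoint(CHECKPOINT_DIR)  # checkpoint now lives on HDFS
        # The whole Kafka DStream pipeline (as in the sketch above) must be
        # built inside this function so it can be restored from the checkpoint.
        return ssc

    # Recover from the HDFS checkpoint if one exists, else build a fresh context.
    ssc = StreamingContext.getOrCreate(CHECKPOINT_DIR, create_context)
    ssc.start()
    ssc.awaitTermination()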
