> sc <- sparkR.init()
Re-using existing Spark Context. Please stop SparkR with sparkR.stop() or restart R to create a new Spark Context
> sqlContext <- sparkRSQL.init(sc)
> df <- createDataFrame(sqlContext, faithful)
17/03/01 15:05:56 INFO SparkContext: Starting job: collectPartitions at NativeMethodAccessorImpl.java:-2
17/03/01 15:05:56 INFO DAGScheduler: Got job 0 (collectPartitions at NativeMethodAccessorImpl.java:-2) with 1 output partitions
17/03/01 15:05:56 INFO DAGScheduler: Final stage: ResultStage 0 (collectPartitions at NativeMethodAccessorImpl.java:-2)
17/03/01 15:05:56 INFO DAGScheduler: Parents of final stage: List()
17/03/01 15:05:56 INFO DAGScheduler: Missing parents: List()
17/03/01 15:05:56 INFO DAGScheduler: Submitting ResultStage 0 (ParallelCollectionRDD[0] at parallelize at RRDD.scala:460), which has no missing parents
17/03/01 15:05:56 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 1280.0 B, free 1280.0 B)
17/03/01 15:05:56 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 854.0 B, free 2.1 KB)
17/03/01 15:05:56 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 172.16.31.137:49150 (size: 854.0 B, free: 511.5 MB)
17/03/01 15:05:56 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:1006
17/03/01 15:05:56 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 0 (ParallelCollectionRDD[0] at parallelize at RRDD.scala:460)
17/03/01 15:05:56 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
17/03/01 15:05:56 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, test3, partition 0,PROCESS_LOCAL, 12976 bytes)
17/03/01 15:05:56 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on test3:50531 (size: 854.0 B, free: 511.5 MB)
17/03/01 15:05:56 INFO DAGScheduler: ResultStage 0 (collectPartitions at NativeMethodAccessorImpl.java:-2) finished in 0.396 s
17/03/01 15:05:56 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 389 ms on test3 (1/1)
17/03/01 15:05:56 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
17/03/01 15:05:56 INFO DAGScheduler: Job 0 finished: collectPartitions at NativeMethodAccessorImpl.java:-2, took 0.526915 s
> showDF(df)
17/03/01 15:06:02 INFO SparkContext: Starting job: showString at NativeMethodAccessorImpl.java:-2
17/03/01 15:06:02 INFO DAGScheduler: Got job 1 (showString at NativeMethodAccessorImpl.java:-2) with 1 output partitions
17/03/01 15:06:02 INFO DAGScheduler: Final stage: ResultStage 1 (showString at NativeMethodAccessorImpl.java:-2)
17/03/01 15:06:02 INFO DAGScheduler: Parents of final stage: List()
17/03/01 15:06:02 INFO DAGScheduler: Missing parents: List()
17/03/01 15:06:02 INFO DAGScheduler: Submitting ResultStage 1 (MapPartitionsRDD[4] at showString at NativeMethodAccessorImpl.java:-2), which has no missing parents
17/03/01 15:06:02 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 8.7 KB, free 10.8 KB)
17/03/01 15:06:02 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 3.5 KB, free 14.4 KB)
17/03/01 15:06:02 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on 172.16.31.137:49150 (size: 3.5 KB, free: 511.5 MB)
17/03/01 15:06:02 INFO SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1006
17/03/01 15:06:02 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 1 (MapPartitionsRDD[4] at showString at NativeMethodAccessorImpl.java:-2)
17/03/01 15:06:02 INFO TaskSchedulerImpl: Adding task set 1.0 with 1 tasks
17/03/01 15:06:02 INFO TaskSetManager: Starting task 0.0 in stage 1.0 (TID 1, test2, partition 0,PROCESS_LOCAL, 12976 bytes)
17/03/01 15:06:03 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on test2:57552 (size: 3.5 KB, free: 511.5 MB)
17/03/01 15:06:04 WARN TaskSetManager: Lost task 0.0 in stage 1.0 (TID 1, test2): java.io.IOException: Cannot run program "Rscript": error=2, No such file or directory
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1047)
at org.apache.spark.api.r.RRDD$.createRProcess(RRDD.scala:413)
at org.apache.spark.api.r.RRDD$.createRWorker(RRDD.scala:429)
at org.apache.spark.api.r.BaseRRDD.compute(RRDD.scala:63)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
at org.apache.spark.scheduler.Task.run(Task.scala:89)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: error=2, No such file or directory
at java.lang.UNIXProcess.forkAndExec(Native Method)
at java.lang.UNIXProcess.<init>(UNIXProcess.java:187)
at java.lang.ProcessImpl.start(ProcessImpl.java:130)
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1028)
... 20 more
17/03/01 15:06:04 INFO TaskSetManager: Starting task 0.1 in stage 1.0 (TID 2, test2, partition 0,PROCESS_LOCAL, 12976 bytes)
17/03/01 15:06:04 INFO TaskSetManager: Lost task 0.1 in stage 1.0 (TID 2) on executor test2: java.io.IOException (Cannot run program "Rscript": error=2, No such file or directory) [duplicate 1]
17/03/01 15:06:04 INFO TaskSetManager: Starting task 0.2 in stage 1.0 (TID 3, test3, partition 0,PROCESS_LOCAL, 12976 bytes)
17/03/01 15:06:04 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on test3:50531 (size: 3.5 KB, free: 511.5 MB)
17/03/01 15:06:04 INFO TaskSetManager: Lost task 0.2 in stage 1.0 (TID 3) on executor test3: java.io.IOException (Cannot run program "Rscript": error=2, No such file or directory) [duplicate 2]
17/03/01 15:06:04 INFO TaskSetManager: Starting task 0.3 in stage 1.0 (TID 4, test3, partition 0,PROCESS_LOCAL, 12976 bytes)
17/03/01 15:06:04 INFO TaskSetManager: Lost task 0.3 in stage 1.0 (TID 4) on executor test3: java.io.IOException (Cannot run program "Rscript": error=2, No such file or directory) [duplicate 3]
17/03/01 15:06:04 ERROR TaskSetManager: Task 0 in stage 1.0 failed 4 times; aborting job
17/03/01 15:06:04 INFO TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks have all completed, from pool
17/03/01 15:06:04 INFO TaskSchedulerImpl: Cancelling stage 1
17/03/01 15:06:04 INFO DAGScheduler: ResultStage 1 (showString at NativeMethodAccessorImpl.java:-2) failed in 2.007 s
17/03/01 15:06:04 INFO DAGScheduler: Job 1 failed: showString at NativeMethodAccessorImpl.java:-2, took 2.027519 s
17/03/01 15:06:04 ERROR RBackendHandler: showString on 15 failed
Error in invokeJava(isStatic = FALSE, objId$id, methodName, ...) :
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 1.0 failed 4 times, most recent failure: Lost task 0.3 in stage 1.0 (TID 4, test3): java.io.IOException: Cannot run program "Rscript": error=2, No such file or directory
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1047)
at org.apache.spark.api.r.RRDD$.createRProcess(RRDD.scala:413)
at org.apache.spark.api.r.RRDD$.createRWorker(RRDD.scala:429)
at org.apache.spark.api.r.BaseRRDD.compute(RRDD.scala:63)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
at org.apache.spark.rdd.R
  org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 1.0 failed 4 times, most recent failure: Lost task 0.3 in stage 1.0 (TID 4, test3): java.io.IOException: Cannot run program "Rscript": error=2, No such file or directory

This is the key line.
Because of this error, after you define an object in SparkR whose class is

class(df)
[1] "DataFrame"
attr(,"package")
[1] "SparkR"
you can still inspect it with class, names, and show,

but using showDF or head raises the error above; in other words, the data cannot actually be read.
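
As a minimal illustration using the df created above: the first group of calls below only reads metadata that SparkR holds on the driver, while the second group forces the data to be evaluated, which for a DataFrame built from a local R data.frame means starting an Rscript worker on an executor (the commented output is what one would expect from R's built-in faithful data).

# Driver-side inspection only: these work even though the workers lack Rscript
class(df)    # "DataFrame" (package "SparkR")
names(df)    # "eruptions" "waiting"
show(df)     # prints the column names and types without evaluating any rows

# These launch a job whose tasks must start an R worker process (Rscript) on an
# executor, so they fail with the IOException shown above
showDF(df)
head(df)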

Looking at the key error line, it is clear that the other nodes are missing

Rscript

The fix is to log in to the other machines and copy Rscript to /usr/bin.
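
A hedged sketch of that fix, assuming R itself is already installed on each worker and only the Rscript binary is missing from the executor's PATH (the node names test2 and test3 come from the log above; adjust as needed):

# Run from a machine that can ssh to the workers that reported the error
for node in test2 test3; do
  ssh "$node" '
    # If Rscript is not already on the PATH, copy it from the R installation
    # into /usr/bin ("R RHOME" prints the R home directory)
    command -v Rscript || sudo cp "$(R RHOME)/bin/Rscript" /usr/bin/
  '
done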

Alternatively, switch to a single-node setup:

That is, drop --master when starting:

sparkR --driver-class-path /data1/mysql-connector-java-5.1.18.jar
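
For comparison, a cluster launch would also pass a --master URL; the URL below is only an illustration, not taken from the log:

sparkR --master spark://<master-host>:7077 --driver-class-path /data1/mysql-connector-java-5.1.18.jar

Without --master the shell runs in local mode, where the executors live inside the driver process, so Rscript only needs to exist on the machine where sparkR is launched, which is why this sidesteps the error.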
