Provided Maven Coordinates must be in the form 'groupId:artifactId:version'.
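The --packages option of spark-shell / spark-submit takes a comma-separated list of Maven coordinates, and each coordinate must be written as groupId:artifactId:version, with a colon between all three parts. In the first attempt below, the version is joined to the artifactId with a hyphen (mongo-spark-connector_2.10-2.2.1), so Spark's coordinate parser rejects it: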
[hadoop@hadoop1 bin]$ ./spark-shell --packages org.mongodb.spark:mongo-spark-connector_2.10-2.2.1
Exception in thread "main" java.lang.IllegalArgumentException: requirement failed: Provided Maven Coordinates must be in the form 'groupId:artifactId:version'. The coordinate provided is: org.mongodb.spark:mongo-spark-connector_2.10-2.2.1
at scala.Predef$.require(Predef.scala:224)
at org.apache.spark.deploy.SparkSubmitUtils$$anonfun$extractMavenCoordinates$1.apply(SparkSubmit.scala:901)
at org.apache.spark.deploy.SparkSubmitUtils$$anonfun$extractMavenCoordinates$1.apply(SparkSubmit.scala:899)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:186)
at org.apache.spark.deploy.SparkSubmitUtils$.extractMavenCoordinates(SparkSubmit.scala:899)
at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1127)
at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:298)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:153)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
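Changing the last separator from a hyphen to a colon (..._2.10:2.2.1) produces a well-formed coordinate, and the shell starts normally, resolving the connector and its mongo-java-driver dependency from Maven Central: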
[hadoop@hadoop1 bin]$ ./spark-shell --packages org.mongodb.spark:mongo-spark-connector_2.10:2.2.1
Ivy Default Cache set to: /home/hadoop/.ivy2/cache
The jars for the packages stored in: /home/hadoop/.ivy2/jars
:: loading settings :: url = jar:file:/usr/local/hadoop/spark-2.2.0-bin-hadoop2.7/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
org.mongodb.spark#mongo-spark-connector_2.10 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
confs: [default]
found org.mongodb.spark#mongo-spark-connector_2.10;2.2.1 in central
found org.mongodb#mongo-java-driver;3.4.2 in central
downloading https://repo1.maven.org/maven2/org/mongodb/spark/mongo-spark-connector_2.10/2.2.1/mongo-spark-connector_2.10-2.2.1.jar ...
[SUCCESSFUL ] org.mongodb.spark#mongo-spark-connector_2.10;2.2.1!mongo-spark-connector_2.10.jar (1138ms)
downloading https://repo1.maven.org/maven2/org/mongodb/mongo-java-driver/3.4.2/mongo-java-driver-3.4.2.jar ...
[SUCCESSFUL ] org.mongodb#mongo-java-driver;3.4.2!mongo-java-driver.jar (614ms)
:: resolution report :: resolve 2947ms :: artifacts dl 1756ms
:: modules in use:
org.mongodb#mongo-java-driver;3.4.2 from central in [default]
org.mongodb.spark#mongo-spark-connector_2.10;2.2.1 from central in [default]
---------------------------------------------------------------------
|                  |            modules            ||   artifacts   |
|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
|      default     |   2   |   2   |   2   |   0   ||   2   |   2   |
---------------------------------------------------------------------
:: retrieving :: org.apache.spark#spark-submit-parent
confs: [default]
2 artifacts copied, 0 already retrieved (2300kB/7ms)
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hadoop/spark-2.2.0-bin-hadoop2.7/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/hadoop-2.6.5/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/11/24 04:12:12 WARN util.Utils: Your hostname, hadoop1 resolves to a loopback address: 127.0.0.1; using 192.168.2.51 instead (on interface eno1)
17/11/24 04:12:12 WARN util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
17/11/24 04:12:14 WARN DataNucleus.General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/usr/local/hadoop/spark/jars/datanucleus-rdbms-3.2.9.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/usr/local/hadoop/spark-2.2.0-bin-hadoop2.7/jars/datanucleus-rdbms-3.2.9.jar."
17/11/24 04:12:14 WARN DataNucleus.General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/usr/local/hadoop/spark/jars/datanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/usr/local/hadoop/spark-2.2.0-bin-hadoop2.7/jars/datanucleus-api-jdo-3.2.6.jar."
17/11/24 04:12:14 WARN DataNucleus.General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/usr/local/hadoop/spark-2.2.0-bin-hadoop2.7/jars/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/usr/local/hadoop/spark/jars/datanucleus-core-3.2.10.jar."
17/11/24 04:12:17 WARN metastore.ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
Spark context Web UI available at http://192.168.2.51:4040
Spark context available as 'sc' (master = local[*], app id = local-1511467933297).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.2.0
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_144)
Type in expressions to have them evaluated.
Type :help for more information.

scala>
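One more point worth noting: the banner shows this Spark build runs on Scala 2.11, while the coordinate resolved above is the _2.10 build of the connector. The matching artifact for this shell would be org.mongodb.spark:mongo-spark-connector_2.11:2.2.1; the _2.10 build may hit binary-compatibility problems at runtime even though it resolves cleanly.

Once the shell is up with the connector on the classpath, a quick smoke test is to load a collection through MongoSpark. A minimal sketch, assuming a local mongod and a hypothetical test.myCollection collection (the URI, database, and collection names are placeholders):

import com.mongodb.spark._
import com.mongodb.spark.config._

// Point the connector at a source collection; in 2.2.x this can also be set
// globally via --conf spark.mongodb.input.uri when launching spark-shell.
val readConfig = ReadConfig(Map("uri" -> "mongodb://127.0.0.1/test.myCollection"))

// Load the collection as a DataFrame and inspect the inferred schema.
val df = MongoSpark.load(spark, readConfig)
df.printSchema()
df.show(5)

MongoSpark.load samples documents to infer a schema, so the printSchema() output is a quick check that both the coordinate and the connection are correct.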