[root@db02 scala-2.11.5]# spark-shell
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel).
Welcome to
       ____              __
      / __/__  ___ _____/ /__
     _\ \/ _ \/ _ `/ __/  '_/
    /___/ .__/\_,_/_/ /_/\_\   version 1.6.0
       /_/

Using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_67)
Type in expressions to have them evaluated.
Type :help for more information.

17/08/26 10:48:23 ERROR spark.SparkContext: Error initializing SparkContext.
java.lang.IllegalArgumentException: Required executor memory (1024+384 MB) is above the max threshold (1024 MB) of this cluster! Please check the values of 'yarn.scheduler.maximum-allocation-mb' and/or 'yarn.nodemanager.resource.memory-mb'.
     at org.apache.spark.deploy.yarn.Client.verifyClusterResources(Client.scala:292)
     at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:139)
     at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:57)
     at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:157)
     at org.apache.spark.SparkContext.<init>(SparkContext.scala:542)
     at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1022)
     at $line3.$read$$iwC$$iwC.<init>(<console>:15)
     at $line3.$read$$iwC.<init>(<console>:25)
     at $line3.$read.<init>(<console>:27)
     at $line3.$read$.<init>(<console>:31)
     at $line3.$read$.<clinit>(<console>)
     at $line3.$eval$.<init>(<console>:7)
     at $line3.$eval$.<clinit>(<console>)
     at $line3.$eval.$print(<console>)
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
     at java.lang.reflect.Method.invoke(Method.java:606)
     at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1045)
     at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1326)
     at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:821)
     at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:852)
     at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:800)
     at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
     at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
     at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
     at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
     at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
     at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:305)
     at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
     at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
     at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:160)
     at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
     at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
     at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
     at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
     at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
     at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1064)
     at org.apache.spark.repl.Main$.main(Main.scala:35)
     at org.apache.spark.repl.Main.main(Main.scala)
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
     at java.lang.reflect.Method.invoke(Method.java:606)
     at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:729)
     at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
     at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

17/08/26 10:48:23 WARN cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: Attempted to request executors before the AM has registered!
17/08/26 10:48:23 ERROR util.Utils: Uncaught exception in thread main
java.lang.NullPointerException
     at org.apache.spark.network.shuffle.ExternalShuffleClient.close(ExternalShuffleClient.java:152)
     at org.apache.spark.storage.BlockManager.stop(BlockManager.scala:1317)
     at org.apache.spark.SparkEnv.stop(SparkEnv.scala:96)
     at org.apache.spark.SparkContext$$anonfun$stop$12.apply$mcV$sp(SparkContext.scala:1768)
     at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1230)
     at org.apache.spark.SparkContext.stop(SparkContext.scala:1767)
     at org.apache.spark.SparkContext.<init>(SparkContext.scala:614)
     at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1022)
     at $line3.$read$$iwC$$iwC.<init>(<console>:15)
     at $line3.$read$$iwC.<init>(<console>:25)
     at $line3.$read.<init>(<console>:27)
     at $line3.$read$.<init>(<console>:31)
     at $line3.$read$.<clinit>(<console>)
     at $line3.$eval$.<init>(<console>:7)
     at $line3.$eval$.<clinit>(<console>)
     at $line3.$eval.$print(<console>)
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
     at java.lang.reflect.Method.invoke(Method.java:606)
     at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1045)
     at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1326)
     at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:821)
     at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:852)
     at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:800)
     at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
     at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
     at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
     at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
     at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
     at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:305)
     at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
     at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
     at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:160)
     at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
     at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
     at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
     at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
     at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
     at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1064)
     at org.apache.spark.repl.Main$.main(Main.scala:35)
     at org.apache.spark.repl.Main.main(Main.scala)
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
     at java.lang.reflect.Method.invoke(Method.java:606)
     at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:729)
     at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
     at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

java.lang.IllegalArgumentException: Required executor memory (1024+384 MB) is above the max threshold (1024 MB) of this cluster! Please check the values of 'yarn.scheduler.maximum-allocation-mb' and/or 'yarn.nodemanager.resource.memory-mb'.
     at org.apache.spark.deploy.yarn.Client.verifyClusterResources(Client.scala:292)
     at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:139)
     at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:57)
     at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:157)
     at org.apache.spark.SparkContext.<init>(SparkContext.scala:542)
     at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1022)
     at $iwC$$iwC.<init>(<console>:15)
     at $iwC.<init>(<console>:25)
     at <init>(<console>:27)
     at .<init>(<console>:31)
     at .<clinit>(<console>)
     at .<init>(<console>:7)
     at .<clinit>(<console>)
     at $print(<console>)
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
     at java.lang.reflect.Method.invoke(Method.java:606)
     at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1045)
     at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1326)
     at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:821)
     at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:852)
     at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:800)
     at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
     at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
     at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
     at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
     at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
     at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:305)
     at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
     at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
     at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:160)
     at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
     at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
     at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
     at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
     at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
     at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1064)
     at org.apache.spark.repl.Main$.main(Main.scala:35)
     at org.apache.spark.repl.Main.main(Main.scala)
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
     at java.lang.reflect.Method.invoke(Method.java:606)
     at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:729)
     at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
     at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

java.lang.NullPointerException
     at org.apache.spark.sql.SQLContext$.createListenerAndUI(SQLContext.scala:1375)
     at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
     at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
     at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1033)
     at $iwC$$iwC.<init>(<console>:15)
     at $iwC.<init>(<console>:24)
     at <init>(<console>:26)
     at .<init>(<console>:30)
     at .<clinit>(<console>)
     at .<init>(<console>:7)
     at .<clinit>(<console>)
     at $print(<console>)
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
     at java.lang.reflect.Method.invoke(Method.java:606)
     at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1045)
     at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1326)
     at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:821)
     at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:852)
     at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:800)
     at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
     at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
     at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
     at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:133)
     at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
     at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:305)
     at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
     at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
     at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:160)
     at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
     at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
     at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
     at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
     at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
     at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1064)
     at org.apache.spark.repl.Main$.main(Main.scala:35)
     at org.apache.spark.repl.Main.main(Main.scala)
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
     at java.lang.reflect.Method.invoke(Method.java:606)
     at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:729)
     at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
     at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

<console>:16: error: not found: value sqlContext
          import sqlContext.implicits._
                 ^

<console>:16: error: not found: value sqlContext
          import sqlContext.sql
                 ^

From the log above, the key message is the one below. (The later NullPointerException and the "not found: value sqlContext" errors are just fallout: because the SparkContext failed to initialize, the shell never created sqlContext.)

17/08/26 10:48:23 ERROR spark.SparkContext: Error initializing SparkContext.
java.lang.IllegalArgumentException: Required executor memory (1024+384 MB) is above the max threshold (1024 MB) of this cluster! Please check the values of 'yarn.scheduler.maximum-allocation-mb' and/or 'yarn.nodemanager.resource.memory-mb'.
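The "1024+384 MB" is the default executor memory (1 GB, spark.executor.memory) plus the YARN memory overhead, which in Spark 1.6 defaults to max(384 MB, 10% of the executor memory). If you cannot change the cluster configuration right away, one workaround is to request a smaller executor so that memory plus overhead fits under the 1024 MB cap, for example:

spark-shell --executor-memory 512m

With 512 MB requested, the overhead is max(384 MB, 51 MB) = 384 MB, so the container totals 896 MB and stays below the threshold.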

The proper fix, and the one we use here, is to raise 'yarn.scheduler.maximum-allocation-mb' and 'yarn.nodemanager.resource.memory-mb' from their current 1024 MB setting to 2 GB. On a Cloudera Manager (cm5.9.2) cluster, change both values on the YARN service's Configuration page, save, and restart YARN.
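For reference, the equivalent raw settings in yarn-site.xml would look like the sketch below; on a CM-managed cluster you would normally make this change through the Cloudera Manager UI rather than by editing the file directly (2048 mirrors the 2 GB chosen above):

<!-- yarn-site.xml -->
<property>
  <!-- Largest single container YARN will grant, in MB -->
  <name>yarn.scheduler.maximum-allocation-mb</name>
  <value>2048</value>
</property>
<property>
  <!-- Total memory each NodeManager offers to containers, in MB -->
  <name>yarn.nodemanager.resource.memory-mb</name>
  <value>2048</value>
</property>

After YARN restarts, spark-shell should start cleanly: the requested 1024+384 MB now fits within the 2048 MB cap.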
