Running ./sbin/start-master.sh fails:


  Spark Command: /usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java -cp /home/server/spark/conf/:/home/server/spark/jars/*:/home/server/hadoop/etc/hadoop/:/home/server/hadoop/share/hadoop/common/lib/:/home/server/hadoop/share/hadoop/common/:/home/server/hadoop/share/hadoop/mapreduce/:/home/server/hadoop/share/hadoop/mapreduce/lib/:/home/server/hadoop/share/hadoop/yarn/:/home/server/hadoop/share/hadoop/yarn/lib/ -Xmx1g org.apache.spark.deploy.master.Master --host ThinkPad-W550s-Lab --port 7077 --webui-port 8080
  ========================================
  Error: A JNI error has occurred, please check your installation and try again
  Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/Logger
          at java.lang.Class.getDeclaredMethods0(Native Method)
          at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
          at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
          at java.lang.Class.getMethod0(Class.java:3018)
          at java.lang.Class.getMethod(Class.java:1784)
          at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
          at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
  Caused by: java.lang.ClassNotFoundException: org.slf4j.Logger
          at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
          at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
          at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
          at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
          ... 7 more
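The `NoClassDefFoundError: org/slf4j/Logger` means no SLF4J jar made it onto the launcher's classpath: a "Hadoop free" Spark build does not bundle the Hadoop-side jars under spark/jars, so SLF4J has to come from the Hadoop install. One likely reason the Hadoop entries in the command above did not help is that they are bare directories; the JVM only loads the jars inside a `-cp` entry when the entry ends in `/*`, while a bare directory contributes `.class` files only. A small shell sketch (classpath shortened from the failing command, for illustration) that classifies the entries:

```shell
# Classify -cp entries: only entries ending in '/*' make the JVM load the
# jars inside that directory; a bare directory exposes .class files only.
# (Classpath shortened from the failing command above, for illustration.)
CP='/home/server/spark/jars/*:/home/server/hadoop/share/hadoop/common/lib/:/home/server/hadoop/share/hadoop/common/'
echo "$CP" | tr ':' '\n' | while read -r e; do
  case "$e" in
    */\*) echo "jars loaded:  $e" ;;   # wildcard entry
    *)    echo "jars IGNORED: $e" ;;   # bare directory
  esac
done
```

Run against the entries above, only the spark/jars entry is reported as "jars loaded", which matches the missing-SLF4J failure.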

See the Spark documentation: http://spark.apache.org/docs/latest/hadoop-provided.html

Using Spark's "Hadoop Free" Build

Spark uses Hadoop client libraries for HDFS and YARN. Starting in version Spark 1.4, the project packages “Hadoop free” builds that lets you more easily connect a single Spark binary to any Hadoop version. To use these builds, you need to modify SPARK_DIST_CLASSPATH to include Hadoop’s package jars. The most convenient place to do this is by adding an entry in conf/spark-env.sh.

This page describes how to connect Spark to Hadoop for different types of distributions.

Apache Hadoop

For Apache distributions, you can use Hadoop’s ‘classpath’ command. For instance:


  ### in conf/spark-env.sh ###
  # If 'hadoop' binary is on your PATH
  export SPARK_DIST_CLASSPATH=$(hadoop classpath)
  # With explicit path to 'hadoop' binary
  export SPARK_DIST_CLASSPATH=$(/path/to/hadoop/bin/hadoop classpath)
  # Passing a Hadoop configuration directory
  export SPARK_DIST_CLASSPATH=$(hadoop --config /path/to/configs classpath)

In the end, the following line was added to conf/spark-env.sh:


  export SPARK_DIST_CLASSPATH=$(/usr/local/hadoop/hadoop-2.7.3/bin/hadoop classpath)

After restarting, the master starts successfully.
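To sanity-check the fix without restarting Spark, you can expand `SPARK_DIST_CLASSPATH` and look for an SLF4J jar. The sketch below is self-contained: it simulates a Hadoop lib directory in a temp dir (the jar name and layout are made-up stand-ins); in practice you would point it at your real `$HADOOP_HOME/share/hadoop` directories.

```shell
# Self-contained sketch: verify that a wildcard classpath entry, like those
# emitted by 'hadoop classpath', actually resolves to an slf4j jar.
FAKE=$(mktemp -d)                                           # stand-in for $HADOOP_HOME
mkdir -p "$FAKE/share/hadoop/common/lib"
touch "$FAKE/share/hadoop/common/lib/slf4j-api-1.7.10.jar"  # hypothetical jar name

SPARK_DIST_CLASSPATH="$FAKE/share/hadoop/common/lib/*"      # wildcard entry

# Expand every ':'-separated entry and report any slf4j jars it matches:
echo "$SPARK_DIST_CLASSPATH" | tr ':' '\n' | while read -r e; do
  for j in $e; do                                           # unquoted: glob expands
    case "$j" in *slf4j*.jar) echo "found: ${j##*/}" ;; esac
  done
done
# prints: found: slf4j-api-1.7.10.jar

rm -rf "$FAKE"
```

If nothing is printed against the real directories, the jars are still not reachable and the master would fail the same way.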
