2. Install Scala
1. From http://www.scala-lang.org/download/, download the Scala release that matches your Spark version; Spark 1.2 requires Scala 2.10, so scala-2.10.4.tgz is used here.
2. Extract and install Scala
1) Run # tar -zxvf scala-2.10.4.tgz to extract it to /root/spark/scala-2.10.4.
2) Add the following to ~/.bash_profile:

export SCALA_HOME=/root/spark/scala-2.10.4
export PATH=$JAVA_HOME/bin:$HADOOP_HOME/bin:$HIVE_HOME/bin:$SCALA_HOME/bin:$PATH

3) Apply the changes: # source ~/.bash_profile
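A missing `:` between PATH entries is an easy typo to make in this step and fails silently (the two entries fuse into one nonexistent directory). A minimal sketch of a check, assuming the directory layout used in this tutorial, that looks for the exact `:`-delimited entry rather than a substring:

```shell
# Verify Scala's bin directory is a proper ':'-delimited PATH entry.
# /root/spark/scala-2.10.4 follows this tutorial's layout (an assumption).
SCALA_HOME=/root/spark/scala-2.10.4
PATH="$SCALA_HOME/bin:$PATH"

case ":$PATH:" in
  *":$SCALA_HOME/bin:"*) echo "scala on PATH" ;;
  *)                     echo "scala missing from PATH" ;;
esac
# prints: scala on PATH
```

The same check catches a fused entry such as `$JAVA_HOME/bin$HADOOP_HOME/bin`, which a plain `echo $PATH` eyeball inspection can miss.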
3. Verify the installation: running the scala command should start the Scala REPL.

# scala
Welcome to Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.6.0_45).
Type in expressions to have them evaluated.
Type :help for more information.

scala>
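Once at the scala> prompt, a one-line expression confirms the REPL is evaluating code; a stock 2.10 REPL binds each result to a numbered res name:

```
scala> 1 + 1
res0: Int = 2
```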

3. Install Spark
1. Download spark-1.2.0-bin-hadoop2.4.tgz from http://spark.apache.org/downloads.html and extract it to /root/spark/spark-1.2.0-bin-hadoop2.4.
2. Add the following to ~/.bash_profile:

export SPARK_HOME=/root/spark/spark-1.2.0-bin-hadoop2.4
export PATH=$JAVA_HOME/bin:$HADOOP_HOME/bin:$SCALA_HOME/bin:$SPARK_HOME/bin:$HIVE_HOME/bin:$PATH

3. Apply the changes: # source ~/.bash_profile

4. Configure Spark
1. Change to Spark's configuration directory: # cd $SPARK_HOME/conf
2. Create the environment file from its template: # cp spark-env.sh.template spark-env.sh
3. Add the following to spark-env.sh:

export JAVA_HOME=/usr/lib/jdk1.6.0_45
export SCALA_HOME=/root/spark/scala-2.10.4
export HADOOP_CONF_DIR=/root/hadoop/hadoop-2.6.0/etc/hadoop
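spark-env.sh.template documents further standalone-mode knobs worth setting explicitly; the values below are illustrative examples for this tutorial's host, not requirements:

```shell
export SPARK_MASTER_IP=datanode-4    # host the Master binds to (example value)
export SPARK_WORKER_MEMORY=1g        # memory each Worker may hand to executors
export SPARK_WORKER_CORES=1          # cores each Worker offers
```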

5. Start Spark
1. Change to the Spark installation directory: # cd /root/spark/spark-1.2.0-bin-hadoop2.4
2. Run # ./sbin/start-all.sh
3. Run # jps; the output should include Master and Worker processes:

# jps
38907 RunJar
39030 RunJar
54679 NameNode
26587 Jps
54774 DataNode
9850 Worker
9664 Master
55214 NodeManager
55118 ResourceManager
54965 SecondaryNameNode
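To shut the standalone cluster down again, Spark ships a matching stop script in sbin, run from the same installation directory:

```shell
./sbin/stop-all.sh   # stops the Master and Worker daemons started by start-all.sh
```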

4. Open the Spark web UI at http://datanode-4:8080/

5. Run # ./bin/spark-shell to enter the Spark shell; the Spark UI for that session is then available at http://datanode-4:4040.
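Inside the shell, the pre-built SparkContext is bound to sc; a tiny RDD job makes a quick end-to-end check (the sample values here are arbitrary):

```
scala> val rdd = sc.parallelize(1 to 100)
scala> rdd.sum
res0: Double = 5050.0
```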

Appendix: a full terminal transcript of the same setup on a single node. Note it runs as user hadoop with everything installed under /home/hadoop/app, rather than the /root/spark layout used above.

Last login: Sun Oct  8 05:35:42 2017 from 192.168.1.1
[hadoop@blm ~]$ java -version
java version "1.7.0_65"
Java(TM) SE Runtime Environment (build 1.7.0_65-b17)
Java HotSpot(TM) Client VM (build 24.65-b04, mixed mode)
[hadoop@blm ~]$ ifconfig
eth0      Link encap:Ethernet  HWaddr 00:0C:29:3C:BF:E3  
          inet addr:192.168.1.103  Bcast:192.168.1.255  Mask:255.255.255.0
          inet6 addr: fe80::20c:29ff:fe3c:bfe3/64 Scope:Link
          UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
          RX packets:4461 errors:0 dropped:0 overruns:0 frame:0
          TX packets:5051 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:1000
          RX bytes:362317 (353.8 KiB)  TX bytes:411434 (401.7 KiB)
          Interrupt:19 Base address:0x2024

lo        Link encap:Local Loopback  
          inet addr:127.0.0.1  Mask:255.0.0.0
          inet6 addr: ::1/128 Scope:Host
          UP LOOPBACK RUNNING  MTU:16436  Metric:1
          RX packets:325 errors:0 dropped:0 overruns:0 frame:0
          TX packets:325 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:0
          RX bytes:27918 (27.2 KiB)  TX bytes:27918 (27.2 KiB)

[hadoop@blm ~]$ uname -a
Linux blm 2.6.32-358.el6.i686 #1 SMP Thu Feb 21 21:50:49 UTC 2013 i686 i686 i386 GNU/Linux
[hadoop@blm ~]$ ll
total 449508
-rw-rw-r--.  1 hadoop hadoop  80288778 Oct  5 22:25 apache-hive-0.14.0-bin.tar.gz
drwxrwxr-x. 10 hadoop hadoop      4096 Oct  8 05:27 app
-rw-rw-r--.  1 hadoop hadoop         0 Oct  4 20:21 a.txt
-rw-rw-r--.  1 hadoop hadoop         0 Oct  4 20:21 b.txt
-rw-rw-r--.  1 hadoop hadoop         0 Oct  4 20:21 c.txt
drwxrwxr-x.  2 hadoop hadoop      4096 Oct  4 22:34 download
-rwxrw-rw-.  1 hadoop hadoop 160860571 Oct  2 14:19 eclipse-java-luna-SR2-linux-gtk.tar.gz
-rw-rw-r--.  1 hadoop hadoop     27315 Oct  4 00:32 flow.jar
-rw-rw-r--.  1 hadoop hadoop     17765 Oct  4 03:50 flowsum.jar
-rw-rw-r--.  1 hadoop hadoop  15417097 Oct  1 03:33 hadoop-2.4.1-src.tar.gz
-rw-rw-r--.  1 hadoop hadoop 138656756 Oct  1 03:33 hadoop-2.4.1.tar.gz
-rwxrw-rw-.  1 hadoop hadoop      2214 Jul 30  2013 HTTP_20130313143750.dat
drwxr-xr-x.  8 hadoop hadoop      4096 Jun 16  2014 jdk1.7.0_65
lrwxrwxrwx.  1 hadoop hadoop        32 Oct  2 02:33 Link to eclipse -> /home/hadoop/app/eclipse/eclipse
-rw-rw-r--.  1 hadoop hadoop  29937534 Oct 13 08:57 scala-2.10.4.tgz.gz
-rw-rw-r--.  1 hadoop hadoop     10808 Oct  3 01:57 wc.jar
-rw-rw-r--.  1 hadoop hadoop        96 Oct  3 01:41 word.log
drwxrwxr-x.  6 hadoop hadoop      4096 Oct  2 08:10 workspace
-r-xr--r--.  1 hadoop hadoop  35042811 Oct  4 22:39 zookeeper-3.4.10.tar.gz
[hadoop@blm ~]$ tar -zxvf scala-2.10.4.tgz.gz  -C app/
scala-2.10.4/
scala-2.10.4/man/
scala-2.10.4/man/man1/
scala-2.10.4/man/man1/scaladoc.1
...
scala-2.10.4/bin/scalap
[hadoop@blm ~]$ cd app
[hadoop@blm app]$ ll
total 22732
drwxrwxr-x.  8 hadoop hadoop     4096 Oct  5 22:32 apache-hive-0.14.0-bin
drwxrwxr-x.  9 hadoop hadoop     4096 Oct  3 21:36 eclipse
drwxr-xr-x. 11 hadoop hadoop     4096 Oct  1 05:16 hadoop-2.4.1
drwxr-xr-x. 15 hadoop hadoop     4096 Jun 20  2014 hadoop-2.4.1-src
drwxrwxr-x.  2 hadoop hadoop     4096 Oct  6 03:15 hive
drwxrwxr-x.  2 hadoop hadoop     4096 Oct  6 04:02 hivetestdata
-rw-rw-r--.  1 hadoop hadoop  7232487 Oct  5 23:59 MySQL-client-5.1.73-1.glibc23.i386.rpm
-rw-rw-r--.  1 hadoop hadoop 16004449 Oct  5 23:59 MySQL-server-5.1.73-1.glibc23.i386.rpm
drwxrwxr-x.  9 hadoop hadoop     4096 Mar 18  2014 scala-2.10.4
drwxr-xr-x. 11 root   root       4096 Oct  7 07:44 xx
drwxr-xr-x. 11 root   root       4096 Oct  8 05:27 zookeeper-3.4.5
[hadoop@blm app]$ clear
[hadoop@blm app]$ cd /etc/profile
-bash: cd: /etc/profile: Not a directory
[hadoop@blm app]$ su root
Password:
su: incorrect password
[hadoop@blm app]$ su
Password:
[root@blm app]# clear
[root@blm app]# vi /etc/profile
# /etc/profile

# System wide environment and startup programs, for login setup
# Functions and aliases go in /etc/bashrc

# It's NOT a good idea to change this file unless you know what you
# are doing. It's much better to create a custom.sh shell script in
# /etc/profile.d/ to make custom changes to your environment, as this
# will prevent the need for merging in future updates.

pathmunge () {
    case ":${PATH}:" in
        *:"$1":*)
            ;;
        *)
            if [ "$2" = "after" ] ; then
                PATH=$PATH:$1
            else
                PATH=$1:$PATH
            fi
    esac
}

if [ -x /usr/bin/id ]; then
    if [ -z "$EUID" ]; then
        # ksh workaround
        EUID=`id -u`
        UID=`id -ru`
    fi
    USER="`id -un`"
    LOGNAME=$USER
    MAIL="/var/spool/mail/$USER"
fi

# Path manipulation
if [ "$EUID" = "0" ]; then
    pathmunge /sbin
    pathmunge /usr/sbin
    pathmunge /usr/local/sbin
else
    pathmunge /usr/local/sbin after
    pathmunge /usr/sbin after
    pathmunge /sbin after
fi

HOSTNAME=`/bin/hostname 2>/dev/null`
HISTSIZE=1000
if [ "$HISTCONTROL" = "ignorespace" ] ; then
    export HISTCONTROL=ignoreboth
else
    export HISTCONTROL=ignoredups
fi

export PATH USER LOGNAME MAIL HOSTNAME HISTSIZE HISTCONTROL

export JAVA_HOME=/home/hadoop/jdk1.7.0_65
export SCALA_HOME=/home/hadoop/app/scala-2.10.4
export HADOOP_HOME=/home/hadoop/app/hadoop-2.4.1
export PATH=$PATH:$JAVA_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$SCALA_HOME/bin
export HIVE_HOME=/home/hadoop/app/apache-hive-0.14.0-bin

# By default, we want umask to get set. This sets it for login shell
# Current threshold for system reserved uid/gids is 200
# You could check uidgid reservation validity in
# /usr/share/doc/setup-*/uidgid file
if [ $UID -gt 199 ] && [ "`id -gn`" = "`id -un`" ]; then
    umask 002
else
    umask 022
fi

for i in /etc/profile.d/*.sh ; do
    if [ -r "$i" ]; then
        if [ "${-#*i}" != "$-" ]; then
            . "$i"
        else
            . "$i" >/dev/null 2>&1
        fi
    fi
done

unset i
"/etc/profile" 85L, 2078C written
[root@blm app]# scala
bash: scala: command not found
[root@blm app]# java
Usage: java [-options] class [args...]
           (to execute a class)
   or  java [-options] -jar jarfile [args...]
           (to execute a jar file)
...
[root@blm app]# su
[root@blm app]# vi /etc/profile
# /etc/profile
# ... (identical to the listing above, apart from the export block below) ...

export JAVA_HOME=/home/hadoop/jdk1.7.0_65
export SCALA_HOME=/home/hadoop/app/scala-2.10.4
export HADOOP_HOME=/home/hadoop/app/hadoop-2.4.1
export SPARK_HOME=/home/hadoop/app/spark-1.2.0-bin-hadoop2.4
export PATH=$PATH:$JAVA_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$SCALA_HOME/bin:$SPARK_HOME/bin
export HIVE_HOME=/home/hadoop/app/apache-hive-0.14.0-bin

# ... (remainder unchanged) ...
"/etc/profile" 86L, 2155C written

==============================================================================

[hadoop@blm spark-1.2.0-bin-hadoop2.4]$ cd logs
[hadoop@blm logs]$ ll
total 8
-rw-rw-r--. 1 hadoop hadoop 2014 Oct 13 09:40 spark-hadoop-org.apache.spark.deploy.master.Master-1-blm.out
-rw-rw-r--. 1 hadoop hadoop 2091 Oct 13 09:40 spark-hadoop-org.apache.spark.deploy.worker.Worker-1-blm.out
[hadoop@blm logs]$ tail -100f spark-hadoop-org.apache.spark.deploy.master.Master-1-blm.out
Spark assembly has been built with Hive, including Datanucleus jars on classpath
Spark Command: /home/hadoop/jdk1.7.0_65/bin/java -cp ::/home/hadoop/app/spark-1.2.0-bin-hadoop2.4/sbin/../conf:/home/hadoop/app/spark-1.2.0-bin-hadoop2.4/lib/spark-assembly-1.2.0-hadoop2.4.0.jar:/home/hadoop/app/spark-1.2.0-bin-hadoop2.4/lib/datanucleus-api-jdo-3.2.6.jar:/home/hadoop/app/spark-1.2.0-bin-hadoop2.4/lib/datanucleus-core-3.2.10.jar:/home/hadoop/app/spark-1.2.0-bin-hadoop2.4/lib/datanucleus-rdbms-3.2.9.jar:/home/hadoop/app/hadoop-2.4.1 -XX:MaxPermSize=128m -Dspark.akka.logLifecycleEvents=true -Xms512m -Xmx512m org.apache.spark.deploy.master.Master --ip blm --port 7077 --webui-port 8080
========================================

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/10/13 09:40:11 INFO Master: Registered signal handlers for [TERM, HUP, INT]
17/10/13 09:40:12 INFO SecurityManager: Changing view acls to: hadoop
17/10/13 09:40:12 INFO SecurityManager: Changing modify acls to: hadoop
17/10/13 09:40:12 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop); users with modify permissions: Set(hadoop)
17/10/13 09:40:15 INFO Slf4jLogger: Slf4jLogger started
17/10/13 09:40:16 INFO Remoting: Starting remoting
17/10/13 09:40:17 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkMaster@blm:7077]
17/10/13 09:40:17 INFO Remoting: Remoting now listens on addresses: [akka.tcp://sparkMaster@blm:7077]
17/10/13 09:40:17 INFO Utils: Successfully started service 'sparkMaster' on port 7077.
17/10/13 09:40:18 INFO Master: Starting Spark master at spark://blm:7077
17/10/13 09:40:28 INFO Utils: Successfully started service 'MasterUI' on port 8080.
17/10/13 09:40:28 INFO MasterWebUI: Started MasterWebUI at http://blm:8080
17/10/13 09:40:29 INFO Master: I have been elected leader! New state: ALIVE
17/10/13 09:40:32 INFO Master: Registering worker blm:38727 with 1 cores, 512.0 MB RAM
^C
[hadoop@blm logs]$ cat  spark-hadoop-org.apache.spark.deploy.worker.Worker-1-blm.out
Spark assembly has been built with Hive, including Datanucleus jars on classpath
Spark Command: /home/hadoop/jdk1.7.0_65/bin/java -cp ::/home/hadoop/app/spark-1.2.0-bin-hadoop2.4/sbin/../conf:/home/hadoop/app/spark-1.2.0-bin-hadoop2.4/lib/spark-assembly-1.2.0-hadoop2.4.0.jar:/home/hadoop/app/spark-1.2.0-bin-hadoop2.4/lib/datanucleus-api-jdo-3.2.6.jar:/home/hadoop/app/spark-1.2.0-bin-hadoop2.4/lib/datanucleus-core-3.2.10.jar:/home/hadoop/app/spark-1.2.0-bin-hadoop2.4/lib/datanucleus-rdbms-3.2.9.jar:/home/hadoop/app/hadoop-2.4.1 -XX:MaxPermSize=128m -Dspark.akka.logLifecycleEvents=true -Xms512m -Xmx512m org.apache.spark.deploy.worker.Worker spark://blm:7077
========================================

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/10/13 09:40:16 INFO Worker: Registered signal handlers for [TERM, HUP, INT]
17/10/13 09:40:16 INFO SecurityManager: Changing view acls to: hadoop
17/10/13 09:40:16 INFO SecurityManager: Changing modify acls to: hadoop
17/10/13 09:40:16 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop); users with modify permissions: Set(hadoop)
17/10/13 09:40:18 INFO Slf4jLogger: Slf4jLogger started
17/10/13 09:40:19 INFO Remoting: Starting remoting
17/10/13 09:40:19 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkWorker@blm:38727]
17/10/13 09:40:19 INFO Remoting: Remoting now listens on addresses: [akka.tcp://sparkWorker@blm:38727]
17/10/13 09:40:19 INFO Utils: Successfully started service 'sparkWorker' on port 38727.
17/10/13 09:40:20 INFO Worker: Starting Spark worker blm:38727 with 1 cores, 512.0 MB RAM
17/10/13 09:40:20 INFO Worker: Spark home: /home/hadoop/app/spark-1.2.0-bin-hadoop2.4
17/10/13 09:40:30 INFO Utils: Successfully started service 'WorkerUI' on port 8081.
17/10/13 09:40:30 INFO WorkerWebUI: Started WorkerWebUI at http://blm:8081
17/10/13 09:40:30 INFO Worker: Connecting to master spark://blm:7077...
17/10/13 09:40:32 INFO Worker: Successfully registered with master spark://blm:7077

export JAVA_HOME=/home/hadoop/jdk1.7.0_65
export SCALA_HOME=/home/hadoop/app/scala-2.10.4
export HADOOP_CONF_DIR=/home/hadoop/app/hadoop-2.4.1

"spark-env.sh" 59L, 3361C written                                                            
[hadoop@blm conf]$ ll
total 28
-rw-rw-r--. 1 hadoop hadoop  303 Dec 10  2014 fairscheduler.xml.template
-rw-rw-r--. 1 hadoop hadoop  620 Dec 10  2014 log4j.properties.template
-rw-rw-r--. 1 hadoop hadoop 5308 Dec 10  2014 metrics.properties.template
-rw-rw-r--. 1 hadoop hadoop   80 Dec 10  2014 slaves.template
-rw-rw-r--. 1 hadoop hadoop  507 Dec 10  2014 spark-defaults.conf.template
-rwxrwxr-x. 1 hadoop hadoop 3361 Oct 13 09:36 spark-env.sh
[hadoop@blm conf]$ jps
4382 Jps
4027 Worker
3890 Master
[hadoop@blm conf]$ ll
total 28
-rw-rw-r--. 1 hadoop hadoop  303 Dec 10  2014 fairscheduler.xml.template
-rw-rw-r--. 1 hadoop hadoop  620 Dec 10  2014 log4j.properties.template
-rw-rw-r--. 1 hadoop hadoop 5308 Dec 10  2014 metrics.properties.template
-rw-rw-r--. 1 hadoop hadoop   80 Dec 10  2014 slaves.template
-rw-rw-r--. 1 hadoop hadoop  507 Dec 10  2014 spark-defaults.conf.template
-rwxrwxr-x. 1 hadoop hadoop 3361 Oct 13 09:36 spark-env.sh
[hadoop@blm conf]$ pwd
/home/hadoop/app/spark-1.2.0-bin-hadoop2.4/conf
[hadoop@blm conf]$ ll
total 28
-rw-rw-r--. 1 hadoop hadoop  303 Dec 10  2014 fairscheduler.xml.template
-rw-rw-r--. 1 hadoop hadoop  620 Dec 10  2014 log4j.properties.template
-rw-rw-r--. 1 hadoop hadoop 5308 Dec 10  2014 metrics.properties.template
-rw-rw-r--. 1 hadoop hadoop   80 Dec 10  2014 slaves.template
-rw-rw-r--. 1 hadoop hadoop  507 Dec 10  2014 spark-defaults.conf.template
-rwxrwxr-x. 1 hadoop hadoop 3361 Oct 13 09:36 spark-env.sh
[hadoop@blm conf]$ cd ..
[hadoop@blm spark-1.2.0-bin-hadoop2.4]$ ll
total 120
drwxrwxr-x. 2 hadoop hadoop  4096 Dec 10  2014 bin
drwxrwxr-x. 2 hadoop hadoop  4096 Oct 13 09:36 conf
drwxrwxr-x. 3 hadoop hadoop  4096 Dec 10  2014 data
drwxrwxr-x. 4 hadoop hadoop  4096 Dec 10  2014 ec2
drwxrwxr-x. 3 hadoop hadoop  4096 Dec 10  2014 examples
drwxrwxr-x. 2 hadoop hadoop  4096 Dec 10  2014 lib
-rw-rw-r--. 1 hadoop hadoop 45242 Dec 10  2014 LICENSE
drwxrwxr-x. 2 hadoop hadoop  4096 Oct 13 09:40 logs
-rw-rw-r--. 1 hadoop hadoop 22559 Dec 10  2014 NOTICE
drwxrwxr-x. 7 hadoop hadoop  4096 Dec 10  2014 python
-rw-rw-r--. 1 hadoop hadoop  3645 Dec 10  2014 README.md
-rw-rw-r--. 1 hadoop hadoop    35 Dec 10  2014 RELEASE
drwxrwxr-x. 2 hadoop hadoop  4096 Dec 10  2014 sbin
drwxrwxr-x. 2 hadoop hadoop  4096 Oct 13 09:40 work
[hadoop@blm spark-1.2.0-bin-hadoop2.4]$ cd bin
[hadoop@blm bin]$ ll
total 108
-rwxrwxr-x. 1 hadoop hadoop 1047 Dec 10  2014 beeline
-rw-rw-r--. 1 hadoop hadoop  953 Dec 10  2014 beeline.cmd
-rw-rw-r--. 1 hadoop hadoop 5374 Dec 10  2014 compute-classpath.cmd
-rwxrwxr-x. 1 hadoop hadoop 6377 Dec 10  2014 compute-classpath.sh
-rw-rw-r--. 1 hadoop hadoop 2065 Dec 10  2014 load-spark-env.sh
-rwxrwxr-x. 1 hadoop hadoop 5049 Dec 10  2014 pyspark
-rw-rw-r--. 1 hadoop hadoop 2412 Dec 10  2014 pyspark2.cmd
-rw-rw-r--. 1 hadoop hadoop 1023 Dec 10  2014 pyspark.cmd
-rwxrwxr-x. 1 hadoop hadoop 2131 Dec 10  2014 run-example
-rw-rw-r--. 1 hadoop hadoop 2869 Dec 10  2014 run-example2.cmd
-rw-rw-r--. 1 hadoop hadoop 1035 Dec 10  2014 run-example.cmd
-rwxrwxr-x. 1 hadoop hadoop 6750 Dec 10  2014 spark-class
-rw-rw-r--. 1 hadoop hadoop 6482 Dec 10  2014 spark-class2.cmd
-rw-rw-r--. 1 hadoop hadoop 1033 Dec 10  2014 spark-class.cmd
-rwxrwxr-x. 1 hadoop hadoop 2884 Dec 10  2014 spark-shell
-rw-rw-r--. 1 hadoop hadoop  971 Dec 10  2014 spark-shell2.cmd
-rwxrwxr-x. 1 hadoop hadoop 1031 Dec 10  2014 spark-shell.cmd
-rwxrwxr-x. 1 hadoop hadoop 1744 Dec 10  2014 spark-sql
-rwxrwxr-x. 1 hadoop hadoop 2562 Dec 10  2014 spark-submit
-rw-rw-r--. 1 hadoop hadoop 2603 Dec 10  2014 spark-submit2.cmd
-rw-rw-r--. 1 hadoop hadoop 1033 Dec 10  2014 spark-submit.cmd
-rwxrwxr-x. 1 hadoop hadoop 2058 Dec 10  2014 utils.sh
[hadoop@blm bin]$ spark-shell
-bash: spark-shell: command not found
[hadoop@blm bin]$ ./spark-shell
Spark assembly has been built with Hive, including Datanucleus jars on classpath
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/10/13 09:52:03 INFO SecurityManager: Changing view acls to: hadoop
17/10/13 09:52:03 INFO SecurityManager: Changing modify acls to: hadoop
17/10/13 09:52:03 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop); users with modify permissions: Set(hadoop)
17/10/13 09:52:03 INFO HttpServer: Starting HTTP Server
17/10/13 09:52:03 INFO Utils: Successfully started service 'HTTP class server' on port 40534.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.2.0
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) Client VM, Java 1.7.0_65)
Type in expressions to have them evaluated.
Type :help for more information.
17/10/13 09:52:29 INFO SecurityManager: Changing view acls to: hadoop
17/10/13 09:52:29 INFO SecurityManager: Changing modify acls to: hadoop
17/10/13 09:52:29 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop); users with modify permissions: Set(hadoop)
17/10/13 09:52:31 INFO Slf4jLogger: Slf4jLogger started
17/10/13 09:52:31 INFO Remoting: Starting remoting
17/10/13 09:52:33 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@blm:43213]
17/10/13 09:52:33 INFO Utils: Successfully started service 'sparkDriver' on port 43213.
17/10/13 09:52:34 INFO SparkEnv: Registering MapOutputTracker
17/10/13 09:52:34 INFO SparkEnv: Registering BlockManagerMaster
17/10/13 09:52:34 INFO DiskBlockManager: Created local directory at /tmp/spark-local-20171013095234-d91a
17/10/13 09:52:34 INFO MemoryStore: MemoryStore started with capacity 267.3 MB
17/10/13 09:52:36 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/10/13 09:52:37 INFO HttpFileServer: HTTP File server directory is /tmp/spark-a2325c17-1794-4c66-a240-6fecb4150ea1
17/10/13 09:52:37 INFO HttpServer: Starting HTTP Server
17/10/13 09:52:38 INFO Utils: Successfully started service 'HTTP file server' on port 41906.
17/10/13 09:52:49 INFO Utils: Successfully started service 'SparkUI' on port 4040.
17/10/13 09:52:49 INFO SparkUI: Started SparkUI at http://blm:4040
17/10/13 09:52:50 INFO Executor: Using REPL class URI: http://192.168.1.103:40534
17/10/13 09:52:50 INFO AkkaUtils: Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@blm:43213/user/HeartbeatReceiver
17/10/13 09:52:51 INFO NettyBlockTransferService: Server created on 46708
17/10/13 09:52:51 INFO BlockManagerMaster: Trying to register BlockManager
17/10/13 09:52:51 INFO BlockManagerMasterActor: Registering block manager localhost:46708 with 267.3 MB RAM, BlockManagerId(<driver>, localhost, 46708)
17/10/13 09:52:51 INFO BlockManagerMaster: Registered BlockManager
17/10/13 09:52:52 INFO SparkILoop: Created spark context..
Spark context available as sc.
