Exploring the Spark shell

Spark comes bundled with a REPL shell, which is a wrapper around the Scala REPL. Though the Spark shell looks like a command line for simple things, in reality many complex queries can also be executed using it.

1. create the words directory

mkdir words

2. go into the words directory

cd words

3. create a sh.txt file

echo "to be or not to be" > sh.txt

4. start the Spark shell

spark-shell

5. load the words directory as an RDD (Resilient Distributed Dataset); this assumes the words directory has been uploaded to HDFS

scala> val words = sc.textFile("hdfs://localhost:9000/user/hduser/words")

6. count the number of lines (result: 1)

scala> words.count

7. divide the line (or lines) into multiple words

scala> val wordsFlatMap = words.flatMap(_.split("\\W+"))

8. convert word to (word, 1)

scala> val wordsMap = wordsFlatMap.map(w => (w, 1))

9. add the number of occurrences for each word

scala> val wordCount = wordsMap.reduceByKey((a, b) => (a + b))

10. sort the results

scala> val wordCountSorted = wordCount.sortByKey(true)

11. print the RDD

scala> wordCountSorted.collect.foreach(println)

12. doing all operations in one step

scala> sc.textFile("hdfs://localhost:9000/user/hduser/words").flatMap(_.split("\\W+")).map(w => (w,1)).reduceByKey((a,b) => (a+b)).sortByKey(true).collect.foreach(println)

Since sortByKey(true) sorts the keys in ascending order, this gives us the following output:
(be,2)
(not,1)
(or,1)
(to,2)
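The RDD combinators used above mirror the standard Scala collections API, so the same pipeline can be sketched (and run without any cluster) against an in-memory sequence. This is plain Scala, not Spark; groupBy plus a sum stands in for reduceByKey:

```scala
// Word count over an in-memory "line", mirroring the RDD pipeline:
// flatMap -> map -> reduceByKey -> sortByKey.
val lines = Seq("to be or not to be")

val wordCount = lines
  .flatMap(_.split("\\W+"))                       // split each line into words
  .map(w => (w, 1))                               // word -> (word, 1)
  .groupBy(_._1)                                  // group the pairs by word
  .map { case (w, ps) => (w, ps.map(_._2).sum) }  // sum the 1s, like reduceByKey
  .toSeq
  .sortBy(_._1)                                   // like sortByKey(true): ascending

wordCount.foreach(println)
```

Running this prints the same four pairs as the Spark shell session, which makes it a convenient way to reason about each transformation before moving to real data.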

Developing Spark applications in Eclipse with Maven

Maven has two primary features:

1. Convention over configuration

/src/main/scala
/src/main/java
/src/main/resources
/src/test/scala
/src/test/java
/src/test/resources

2. Declarative dependency management

<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.12</version>
</dependency>
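The same mechanism is how a project pulls in Spark itself. A sketch of the dependency entry follows; the artifact and version numbers are illustrative for a Spark 1.x / Scala 2.10 setup and should match your cluster:

```xml
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.4.0</version>
</dependency>
```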

Install Maven plugin for Eclipse:

1. Open Eclipse and navigate to Help | Install New Software

2. Click on the Work with drop-down menu

3. Select the <eclipse version> update site

4. Click on Collaboration tools

5. Check Maven's integration with Eclipse

6. Click on Next and then click on Finish

Install the Scala plugin for Eclipse:

1. Open Eclipse and navigate to Help | Install New Software

2. Click on the Work with drop-down menu

3. Type http://download.scala-ide.org/sdk/helium/e38/scala210/stable/site

4. Press Enter

5. Select Scala IDE for Eclipse

6. Click on Next and then click on Finish

7. Navigate to Window | Open Perspective | Scala

Developing Spark applications in Eclipse with SBT

Simple Build Tool (SBT) is a build tool made especially for Scala-based development. SBT follows Maven's naming conventions and declarative dependency management.

SBT provides the following enhancements over Maven:

1. Dependencies are in the form of key-value pairs in the build.sbt file as opposed to pom.xml in Maven

2. It provides a shell that makes it very handy to perform build operations

3. For simple projects without dependencies, you do not even need the build.sbt file

In build.sbt, the first line is the project definition:

lazy val root = (project in file("."))

Each project has an immutable map of key-value pairs.

lazy val root = (project in file("."))
  .settings(
    name := "wordcount"
  )

Every change in the settings leads to a new map, as it's an immutable map
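A fuller build.sbt for the word-count project might chain several such settings; the Scala and Spark versions below are illustrative assumptions, not requirements:

```scala
lazy val root = (project in file("."))
  .settings(
    name := "wordcount",
    version := "1.0",
    scalaVersion := "2.10.4",
    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.0"
  )
```

Each := (and +=) call returns a new settings map, which is why the calls can be chained in this style.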

1. add to the global plugin file

mkdir -p /home/hduser/.sbt/0.13/plugins
echo 'addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "2.5.0")' > /home/hduser/.sbt/0.13/plugins/plugin.sbt

or add to specific project

cd <project-home>
echo 'addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "2.5.0")' > project/plugin.sbt

2. start the sbt shell

sbt

3. type eclipse and it will make an Eclipse-ready project

eclipse

4. navigate to File | Import | Existing Projects into Workspace to load the project into Eclipse
