Exploring the Spark shell

Spark comes bundled with a REPL shell, which is a wrapper around the Scala shell. Though the Spark shell looks like a command line for simple things, in reality many complex queries can also be executed using it.

1. create the words directory

mkdir words

2. go into the words directory

cd words

3. create a sh.txt file

echo "to be or not to be" > sh.txt

4. start the Spark shell

spark-shell

5. load the words directory as an RDD (Resilient Distributed Dataset), using the SparkContext that the shell automatically creates as sc

scala> val words = sc.textFile("hdfs://localhost:9000/user/hduser/words")
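
If the data has not been copied into HDFS, the same directory can also be read from the local filesystem; the path below is an assumption and should point at the words directory created in step 1:

scala> val words = sc.textFile("file:///home/hduser/words")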

6. count the number of lines (result: 1)

scala> words.count

7. divide the line (or lines) into multiple words

scala> val wordsFlatMap = words.flatMap(_.split("\\W+"))
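
To inspect an intermediate step, collect can be called on the RDD; with the single line created above, this should return Array(to, be, or, not, to, be):

scala> wordsFlatMap.collect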

8. convert each word to (word, 1)

scala> val wordsMap = wordsFlatMap.map(w => (w, 1))

9. add the number of occurrences for each word

scala> val wordCount = wordsMap.reduceByKey((a, b) => (a + b))

10. sort the results

scala> val wordCountSorted = wordCount.sortByKey(true)
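
sortByKey(true) sorts the pairs alphabetically by word. To sort by count instead (highest count first), one option is to swap key and value before sorting; this is only a sketch, not part of the original recipe, and the name wordCountByFreq is arbitrary:

scala> val wordCountByFreq = wordCount.map(_.swap).sortByKey(false)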

11. print the RDD

scala> wordCountSorted.collect.foreach(println)

12. perform all of the preceding operations in one step

scala> sc.textFile("hdfs://localhost:9000/user/hduser/words").flatMap(_.split("\\W+")).map(w => (w,1)).reduceByKey((a,b) => (a+b)).sortByKey(true).collect.foreach(println)

This gives us the following output (sorted alphabetically by word, since sortByKey(true) was used):
(be,2)
(not,1)
(or,1)
(to,2)
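
The same word count can also be packaged as a standalone application, which is what the Eclipse and SBT recipes below build toward. The following is a minimal sketch for Spark 1.x (1.3 or later), assuming the input is the same HDFS directory used above:

import org.apache.spark.{SparkConf, SparkContext}

object WordCount {
  def main(args: Array[String]): Unit = {
    // the master URL is normally supplied at submission time rather than hard-coded here
    val conf = new SparkConf().setAppName("wordcount")
    val sc = new SparkContext(conf)
    sc.textFile("hdfs://localhost:9000/user/hduser/words")
      .flatMap(_.split("\\W+"))
      .map(w => (w, 1))
      .reduceByKey(_ + _)
      .sortByKey(true)
      .collect()
      .foreach(println)
    sc.stop()
  }
}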

Developing Spark applications in Eclipse with Maven

Maven has two primary features:

1. Convention over configuration

/src/main/scala
/src/main/java
/src/main/resources
/src/test/scala
/src/test/java
/src/test/resources

2. Declarative dependency management

<dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <version>4.12</version>
</dependency>

Install Maven plugin for Eclipse:

1. Open Eclipse and navigate to Help | Install New Software

2. Click on the Work with drop-down menu

3. Select the <eclipse version> update site

4. Click on Collaboration tools

5. Check the Maven Integration for Eclipse option

6. Click on Next and then click on Finish

Install the Scala plugin for Eclipse:

1. Open Eclipse and navigate to Help | Install New Software

2. Click on the Work with drop-down menu

3. Type http://download.scala-ide.org/sdk/helium/e38/scala210/stable/site in the Work with field

4. Press Enter

5. Select Scala IDE for Eclipse

6. Click on Next and then click on Finish

7. Navigate to Window | Open Perspective | Scala

Developing Spark applications in Eclipse with SBT

Simple Build Tool (SBT) is a build tool made especially for Scala-based development. SBT follows Maven-based naming conventions and declarative dependency management.

SBT provides the following enhancements over Maven:

1. Dependencies and other build settings are key-value pairs in the build.sbt file, as opposed to XML in Maven's pom.xml

2. It provides a shell that makes it very handy to perform build operations

3. For simple projects without dependencies, you do not even need the build.sbt file

In build.sbt, the first line is the project definition:

lazy val root = (project in file("."))

Each project has an immutable map of key-value pairs.

lazy val root = (project in file(".")).
  settings(
    name := "wordcount"
  )

Every change in the settings produces a new map, as the map is immutable.
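
For the wordcount project used in this chapter, a slightly fuller build.sbt might look like the following sketch; the Scala and Spark versions below are assumptions and should be aligned with the installed Spark build:

lazy val root = (project in file(".")).
  settings(
    name := "wordcount",
    version := "1.0",
    // assumed versions; match them to the Spark installation
    scalaVersion := "2.10.5",
    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.0"
  )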

1. add the sbteclipse plugin (which generates Eclipse project files) to the global plugin file

mkdir -p /home/hduser/.sbt/0.13/plugins
echo 'addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "2.5.0")' > /home/hduser/.sbt/0.13/plugins/plugin.sbt

or add it to a specific project

cd <project-home>
echo 'addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "2.5.0")' > project/plugin.sbt

2. start the sbt shell

sbt

3. type eclipse and it will make an Eclipse-ready project

eclipse

4. navigate to File | Import | Existing Projects into Workspace to load the project into Eclipse
