Apache Spark 2.0: Faster, Easier, and Smarter

http://blog.madhukaraphatak.com/categories/spark-two/

https://amplab.cs.berkeley.edu/technical-preview-of-apache-spark-2-0-easier-faster-and-smarter/

 

 

Dataset - New Abstraction of Spark

For a long time, the RDD was Spark's standard abstraction.
Starting with Spark 2.0, Dataset becomes the new abstraction layer. The RDD API will still be available, but as a low-level API used mostly for runtime and library development. All user-land code will be written against the Dataset abstraction and its untyped view, the DataFrame API.

 

The key change in 2.0 is that a high-level Dataset abstraction layer has been added on top of the low-level RDD layer, making development much more convenient for users.

 

From the definition: “A Dataset is a strongly typed collection of domain-specific objects that can be transformed in parallel using functional or relational operations. Each Dataset also has an untyped view called a DataFrame, which is a Dataset of Row.”

This sounds similar to the definition of an RDD:

“An RDD represents an immutable, partitioned collection of elements that can be operated on in parallel.”

 

Dataset is a superset of the DataFrame API, which was released in Spark 1.3.

A DataFrame is a specialized Dataset: DataFrame = Dataset[Row].
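A minimal Scala sketch of the two views, assuming Spark 2.0's API; the Person case class and the sample data are made up for illustration, not from the original post:

```scala
import org.apache.spark.sql.{DataFrame, Dataset, Row, SparkSession}

// Hypothetical domain class for illustration.
case class Person(name: String, age: Long)

object DatasetDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("DatasetDemo")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Strongly typed view: transformations are checked at compile time.
    val people: Dataset[Person] = Seq(Person("Ann", 30), Person("Bob", 15)).toDS()
    val adults: Dataset[Person] = people.filter(p => p.age >= 18)

    // Untyped view: a DataFrame is literally Dataset[Row].
    val df: DataFrame = people.toDF()
    val rows: Dataset[Row] = df // compiles because DataFrame = Dataset[Row]

    adults.show()
    spark.stop()
  }
}
```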

 

SparkSession - New entry point of Spark

In earlier versions of Spark, SparkContext was the entry point. Since RDD was the main API, RDDs were created and manipulated through the context's APIs, and every other API required its own context: StreamingContext for streaming, SQLContext for SQL, and HiveContext for Hive. But as the Dataset and DataFrame APIs become the new standard, we need an entry point built for them. So Spark 2.0 introduces a new entry point for the Dataset and DataFrame APIs, called SparkSession.

Because there is a new abstraction layer, a new entry point, SparkSession, needs to be added,

just as the RDD corresponds to SparkContext.
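A short sketch of creating the new entry point, assuming a local test deployment:

```scala
import org.apache.spark.sql.SparkSession

// SparkSession subsumes the old SQLContext and HiveContext.
val spark = SparkSession.builder()
  .appName("Spark2Demo")
  .master("local[*]")  // assumption: local mode for a quick test
  .enableHiveSupport() // optional; replaces the old HiveContext (needs the Hive module)
  .getOrCreate()

// The old low-level context is still reachable for RDD code:
val sc = spark.sparkContext
```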

 

Stronger SQL Support

On the SQL side, we have significantly expanded the SQL capabilities of Spark, with the introduction of a new ANSI SQL parser and support for subqueries.

Spark 2.0 can run all 99 TPC-DS queries, which require many of the SQL:2003 features.
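A small illustration of the subquery support, assuming a SparkSession named spark; the salaries view and its data are made up for the example:

```scala
import spark.implicits._

Seq(("eng", 100L), ("eng", 80L), ("ops", 60L))
  .toDF("dept", "salary")
  .createOrReplaceTempView("salaries")

// An uncorrelated scalar subquery in the WHERE clause, handled by the new parser.
spark.sql(
  """SELECT dept, salary
    |FROM salaries
    |WHERE salary > (SELECT avg(salary) FROM salaries)""".stripMargin).show()
```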

 

Tungsten 2.0

Spark 2.0 ships with the second generation Tungsten engine.

This engine builds upon ideas from modern compilers and MPP databases and applies them to data processing.

The main idea is to emit optimized bytecode at runtime that collapses the entire query into a single function, eliminating virtual function calls and leveraging CPU registers for intermediate data. We call this technique “whole-stage code generation.”

Faster, through better memory management and CPU-oriented performance optimizations.
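One way to see whole-stage code generation at work is to inspect a physical plan; in 2.0, operators fused into a single generated function are marked with an asterisk. A sketch, assuming a SparkSession named spark:

```scala
// The filter and the aggregate collapse into one whole-stage-codegen function.
val q = spark.range(1000L).filter("id > 100").selectExpr("sum(id)")
q.explain() // look for '*'-prefixed operators such as *HashAggregate and *Filter
```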

 

Structured Streaming

Spark 2.0’s Structured Streaming API is a novel way to approach streaming.
It stems from the realization that the simplest way to compute answers on streams of data is not to have to reason about the fact that it is a stream.

 

In other words, you should not have to be aware of the stream at all: a stream is just an unbounded DataFrame.

This is typical Spark thinking: batch is the foundation of everything, the exact opposite of Flink.
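A minimal Structured Streaming sketch in that spirit, assuming a socket test source on localhost:9999; the word count below uses exactly the operators a batch DataFrame would:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("StreamDemo")
  .master("local[*]")
  .getOrCreate()
import spark.implicits._

// The stream is just an unbounded DataFrame read from a socket.
val lines = spark.readStream
  .format("socket")
  .option("host", "localhost")
  .option("port", 9999)
  .load()

// Batch-style word count over the unbounded input.
val counts = lines.as[String]
  .flatMap(_.split(" "))
  .groupBy("value")
  .count()

val query = counts.writeStream
  .outputMode("complete") // keep the full running counts
  .format("console")
  .start()

query.awaitTermination()
```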
