Spark source code study: withScope
The old Spark UI only showed how stages executed, so there was no way to see the concrete relationship between one RDD and the next. To surface more information on the Spark UI, Spark wraps every RDD-creating method and uses RDDOperationScope to record each RDD's operation history and parent/child associations. The DAG visualization that the Spark UI draws for a WordCount job is a good example of the result.
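Before looking at RDDOperationScope itself, it helps to see where the wrapping happens. The following sketch is lightly simplified from my reading of the Spark 1.x RDD.scala, so treat it as illustrative rather than verbatim source:

// Inside the RDD class: a thin helper that delegates to RDDOperationScope,
// passing this RDD's SparkContext.
private[spark] def withScope[U](body: => U): U = {
  RDDOperationScope.withScope[U](sc)(body)
}

// Every RDD-creating method then wraps its body in withScope, e.g. map:
def map[U: ClassTag](f: T => U): RDD[U] = withScope {
  val cleanF = sc.clean(f)
  new MapPartitionsRDD[U, T](this, (context, pid, iter) => iter.map(cleanF))
}

Because the no-name overload of withScope (shown below) pulls the caller's method name off the stack trace, the MapPartitionsRDD created here ends up tagged with the scope name "map", which is exactly the label the DAG visualization draws.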
The source of RDDOperationScope, which records these relationships, is as follows:
import java.util.concurrent.atomic.AtomicInteger

import com.fasterxml.jackson.annotation.{JsonIgnore, JsonInclude, JsonPropertyOrder}
import com.fasterxml.jackson.annotation.JsonInclude.Include
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule
import com.google.common.base.Objects

import org.apache.spark.{Logging, SparkContext}

/**
 * A general, named code block representing an operation that instantiates RDDs.
 *
 * All RDDs instantiated in the corresponding code block will store a pointer to this object.
 * Examples include, but will not be limited to, existing RDD operations, such as textFile,
 * reduceByKey, and treeAggregate.
 *
 * An operation scope may be nested in other scopes. For instance, a SQL query may enclose
 * scopes associated with the public RDD APIs it uses under the hood.
 *
 * There is no particular relationship between an operation scope and a stage or a job.
 * A scope may live inside one stage (e.g. map) or span across multiple jobs (e.g. take).
 */
@JsonInclude(Include.NON_NULL)
@JsonPropertyOrder(Array("id", "name", "parent"))
private[spark] class RDDOperationScope(
    val name: String,
    val parent: Option[RDDOperationScope] = None,
    val id: String = RDDOperationScope.nextScopeId().toString) {

  def toJson: String = {
    RDDOperationScope.jsonMapper.writeValueAsString(this)
  }

  /**
   * Return a list of scopes that this scope is a part of, including this scope itself.
   * The result is ordered from the outermost scope (eldest ancestor) to this scope.
   */
  @JsonIgnore
  def getAllScopes: Seq[RDDOperationScope] = {
    parent.map(_.getAllScopes).getOrElse(Seq.empty) ++ Seq(this)
  }

  override def equals(other: Any): Boolean = {
    other match {
      case s: RDDOperationScope =>
        id == s.id && name == s.name && parent == s.parent
      case _ => false
    }
  }

  override def hashCode(): Int = Objects.hashCode(id, name, parent)

  override def toString: String = toJson
}

/**
 * A collection of utility methods to construct a hierarchical representation of RDD scopes.
 * An RDD scope tracks the series of operations that created a given RDD.
 */
private[spark] object RDDOperationScope extends Logging {
  private val jsonMapper = new ObjectMapper().registerModule(DefaultScalaModule)
  private val scopeCounter = new AtomicInteger()

  def fromJson(s: String): RDDOperationScope = {
    jsonMapper.readValue(s, classOf[RDDOperationScope])
  }

  /** Return a globally unique operation scope ID. */
  def nextScopeId(): Int = scopeCounter.getAndIncrement

  /**
   * Execute the given body such that all RDDs created in this body will have the same scope.
   * The name of the scope will be the first method name in the stack trace that is not the
   * same as this method's.
   *
   * Note: Return statements are NOT allowed in body.
   */
  private[spark] def withScope[T](
      sc: SparkContext,
      allowNesting: Boolean = false)(body: => T): T = {
    val ourMethodName = "withScope"
    // Walk the stack trace: skip down to the first "withScope" frame, then take
    // the first frame after it that is NOT "withScope" -- that is the public RDD
    // API method (e.g. "map", "textFile") whose name becomes the scope name.
    val callerMethodName = Thread.currentThread.getStackTrace()
      .dropWhile(_.getMethodName != ourMethodName)
      .find(_.getMethodName != ourMethodName)
      .map(_.getMethodName)
      .getOrElse {
        // Log a warning just in case, but this should almost certainly never happen
        logWarning("No valid method name for this RDD operation scope!")
        "N/A"
      }
    withScope[T](sc, callerMethodName, allowNesting, ignoreParent = false)(body)
  }

  /**
   * Execute the given body such that all RDDs created in this body will have the same scope.
   *
   * If nesting is allowed, any subsequent calls to this method in the given body will instantiate
   * child scopes that are nested within our scope. Otherwise, these calls will take no effect.
   *
   * Additionally, the caller of this method may optionally ignore the configurations and scopes
   * set by the higher level caller. In this case, this method will ignore the parent caller's
   * intention to disallow nesting, and the new scope instantiated will not have a parent. This
   * is useful for scoping physical operations in Spark SQL, for instance.
   *
   * Note: Return statements are NOT allowed in body.
   */
  private[spark] def withScope[T](
      sc: SparkContext,
      name: String,
      allowNesting: Boolean,
      ignoreParent: Boolean)(body: => T): T = {
    // Save the old scope to restore it later
    val scopeKey = SparkContext.RDD_SCOPE_KEY
    val noOverrideKey = SparkContext.RDD_SCOPE_NO_OVERRIDE_KEY
    val oldScopeJson = sc.getLocalProperty(scopeKey)
    val oldScope = Option(oldScopeJson).map(RDDOperationScope.fromJson)
    val oldNoOverride = sc.getLocalProperty(noOverrideKey)
    try {
      if (ignoreParent) {
        // Ignore all parent settings and scopes and start afresh with our own root scope
        sc.setLocalProperty(scopeKey, new RDDOperationScope(name).toJson)
      } else if (sc.getLocalProperty(noOverrideKey) == null) {
        // Otherwise, set the scope only if the higher level caller allows us to do so
        sc.setLocalProperty(scopeKey, new RDDOperationScope(name, oldScope).toJson)
      }
      // Optionally disallow the child body to override our scope
      if (!allowNesting) {
        sc.setLocalProperty(noOverrideKey, "true")
        // Trace logging added by the author while experimenting; these lines are
        // not part of the upstream Spark source.
        log.info("this is textFile1")
        log.info("this is textFile2")
        // println("this is textFile3")
        log.error("this is textFile4err")
        log.warn("this is textFile5WARN")
        log.debug("this is textFile6debug")
      }
      body
    } finally {
      // Remember to restore any state that was modified before exiting
      sc.setLocalProperty(scopeKey, oldScopeJson)
      sc.setLocalProperty(noOverrideKey, oldNoOverride)
    }
  }
}
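The heart of withScope is the save/restore dance around the two SparkContext local properties. To make the noOverride logic concrete, here is a self-contained toy that mimics it; ScopeDemo, Scope, and the mutable fields are my own illustrative stand-ins for the local properties, not Spark API:

import java.util.concurrent.atomic.AtomicInteger

object ScopeDemo {
  final case class Scope(name: String, parent: Option[Scope], id: Int)

  private val counter = new AtomicInteger(0)
  // Stand-ins for sc.getLocalProperty / sc.setLocalProperty
  private var currentScope: Option[Scope] = None
  private var noOverride: Boolean = false

  def withScope[T](name: String, allowNesting: Boolean = false)(body: => T): T = {
    val oldScope = currentScope
    val oldNoOverride = noOverride
    try {
      if (!noOverride) {
        // Open a new scope only if an outer caller has not forbidden it
        currentScope = Some(Scope(name, oldScope, counter.getAndIncrement()))
      }
      if (!allowNesting) {
        noOverride = true
      }
      body
    } finally {
      // Restore the previous state, mirroring Spark's finally block
      currentScope = oldScope
      noOverride = oldNoOverride
    }
  }

  def main(args: Array[String]): Unit = {
    withScope("textFile") {
      withScope("map") {
        // Prints Some(textFile): the inner call did NOT open a child scope,
        // because the outer call defaulted to allowNesting = false.
        println(currentScope.map(_.name))
      }
    }
  }
}

This is why two RDDs created inside one public API call (say, the hadoopFile plus map inside textFile) collapse into a single "textFile" box on the UI. In the real implementation the scope is stored as JSON under SparkContext.RDD_SCOPE_KEY, so a nested scope serializes roughly as {"id":"1","name":"map","parent":{"id":"0","name":"textFile"}}, which the UI later deserializes to draw the nested boxes in the DAG visualization.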