Spark Review Notes (4): Analysis of the Spark Startup Scripts
1.[start-all.sh]
#!/usr/bin/env bash

#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# Start all spark daemons.
# Starts the master on this node.
# Starts a worker on each node specified in conf/slaves

if [ -z "${SPARK_HOME}" ]; then                        # check whether SPARK_HOME is already set
  export SPARK_HOME="$(cd "`dirname "$0"`"/..; pwd)"   # if not, derive it from this script's location
fi

# Load the Spark configuration
. "${SPARK_HOME}/sbin/spark-config.sh"                 # source the spark-config.sh script

# Start Master
"${SPARK_HOME}/sbin"/start-master.sh                   # launch the Master via start-master.sh

# Start Workers
"${SPARK_HOME}/sbin"/start-slaves.sh                   # launch the Workers via start-slaves.sh
2.[start-master.sh]
#!/usr/bin/env bash

#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# Starts the master on the machine this script is executed on.

if [ -z "${SPARK_HOME}" ]; then                      # as in start-all.sh, set SPARK_HOME first if it is missing
  export SPARK_HOME="$(cd "`dirname "$0"`"/..; pwd)"
fi

# NOTE: This exact class name is matched downstream by SparkSubmit.
# Any changes need to be reflected there.
CLASS="org.apache.spark.deploy.master.Master"        # the class that spark-daemon.sh will launch

if [[ "$@" = *--help ]] || [[ "$@" = *-h ]]; then    # print filtered usage text and exit on --help / -h
  echo "Usage: ./sbin/start-master.sh [options]"
  pattern="Usage:"
  pattern+="\|Using Spark's default log4j profile:"
  pattern+="\|Registered signal handlers for"

  "${SPARK_HOME}"/bin/spark-class $CLASS --help 2>&1 | grep -v "$pattern" 1>&2
  exit 1
fi

ORIGINAL_ARGS="$@"                                   # remember the original arguments

. "${SPARK_HOME}/sbin/spark-config.sh"               # source spark-config.sh
. "${SPARK_HOME}/bin/load-spark-env.sh"              # source load-spark-env.sh

if [ "$SPARK_MASTER_PORT" = "" ]; then               # default RPC port is 7077
  SPARK_MASTER_PORT=7077
fi

if [ "$SPARK_MASTER_HOST" = "" ]; then               # default host is this machine's fully qualified name
  SPARK_MASTER_HOST=`hostname -f`
fi

if [ "$SPARK_MASTER_WEBUI_PORT" = "" ]; then         # default web UI port is 8080
  SPARK_MASTER_WEBUI_PORT=8080
fi

"${SPARK_HOME}/sbin"/spark-daemon.sh start $CLASS 1 \
  --host $SPARK_MASTER_HOST --port $SPARK_MASTER_PORT --webui-port $SPARK_MASTER_WEBUI_PORT \
  $ORIGINAL_ARGS
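Because each if-block above only assigns a value when the variable is still empty, settings exported in the calling shell (or in conf/spark-env.sh, which load-spark-env.sh sources) take precedence over the built-in defaults, and everything captured in $ORIGINAL_ARGS is handed through to the Master class. A few usage sketches (the host name master1 and the alternate ports are assumptions, not values from the script):

# Start with the built-in defaults: RPC port 7077, web UI port 8080.
./sbin/start-master.sh

# Override the defaults via environment variables; the if-blocks leave
# non-empty values untouched.
SPARK_MASTER_HOST=master1 SPARK_MASTER_PORT=17077 SPARK_MASTER_WEBUI_PORT=18080 \
  ./sbin/start-master.sh

# Anything else travels through $ORIGINAL_ARGS to the Master class,
# e.g. a custom properties file.
./sbin/start-master.sh --properties-file conf/master.properties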