Fixing PySpark "JAVA_HOME is not set" (Windows IDE running against a remote Linux interpreter)
The fix: set JAVA_HOME by assigning to os.environ in the driver script, before the SparkSession/SparkContext is created.
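A minimal sketch of the fix, assuming typical install locations (the JAVA_HOME and SPARK_HOME paths below are placeholders, not from the original post; use the real paths on the remote Linux host). The key point is that the assignments must happen before getOrCreate(), because PySpark launches spark-submit with a copy of the driver's environment:

import os

# Placeholder paths -- adjust to the actual JDK / Spark locations on the remote machine.
os.environ["JAVA_HOME"] = "/usr/lib/jvm/java-8-openjdk-amd64"
# Only needed if pyspark cannot locate the Spark installation on its own.
os.environ["SPARK_HOME"] = "/opt/spark"

from pyspark.sql import SparkSession

spark = SparkSession.builder \
    .appName("PythonPi") \
    .getOrCreate()

print(spark.version)
spark.stop()

For reference, this is what the IDE printed before the fix: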
ssh://root@192.168.2.51:22/usr/bin/python -u /home/data/tmp_test/trunk/personas/tmp_spark_mongo_relset_javahome.py
JAVA_HOME is not set
Traceback (most recent call last):
File "/home/data/tmp_test/trunk/personas/tmp_spark_mongo_relset_javahome.py", line 16, in <module>
.appName("PythonPi")\
File "/root/.local/lib/python3.6/site-packages/pyspark/sql/session.py", line 169, in getOrCreate
sc = SparkContext.getOrCreate(sparkConf)
File "/root/.local/lib/python3.6/site-packages/pyspark/context.py", line 334, in getOrCreate
SparkContext(conf=conf or SparkConf())
File "/root/.local/lib/python3.6/site-packages/pyspark/context.py", line 115, in __init__
SparkContext._ensure_initialized(self, gateway=gateway, conf=conf)
File "/root/.local/lib/python3.6/site-packages/pyspark/context.py", line 283, in _ensure_initialized
SparkContext._gateway = gateway or launch_gateway(conf)
File "/root/.local/lib/python3.6/site-packages/pyspark/java_gateway.py", line 95, in launch_gateway
raise Exception("Java gateway process exited before sending the driver its port number")
Exception: Java gateway process exited before sending the driver its port number
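The first line, "JAVA_HOME is not set", is printed by Spark's launcher script (bin/spark-class) inside the spark-submit subprocess; the Python traceback only reports that this subprocess exited before sending back a gateway port. When the IDE starts the script over a non-login SSH session, the JAVA_HOME exported in /etc/profile or ~/.bashrc is typically not loaded, so the driver process, and therefore the subprocess it spawns, has no JAVA_HOME.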
The relevant part of pyspark/java_gateway.py:
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import atexit
import os
import sys
import select
import signal
import shlex
import socket
import platform
from subprocess import Popen, PIPE

if sys.version >= '3':
    xrange = range

from py4j.java_gateway import java_import, JavaGateway, GatewayClient
from pyspark.find_spark_home import _find_spark_home
from pyspark.serializers import read_int


def launch_gateway(conf=None):
    """
    launch jvm gateway
    :param conf: spark configuration passed to spark-submit
    :return:
    """
    if "PYSPARK_GATEWAY_PORT" in os.environ:
        gateway_port = int(os.environ["PYSPARK_GATEWAY_PORT"])
    else:
        SPARK_HOME = _find_spark_home()
        # Launch the Py4j gateway using Spark's run command so that we pick up the
        # proper classpath and settings from spark-env.sh
        on_windows = platform.system() == "Windows"
        script = "./bin/spark-submit.cmd" if on_windows else "./bin/spark-submit"
        command = [os.path.join(SPARK_HOME, script)]
        if conf:
            for k, v in conf.getAll():
                command += ['--conf', '%s=%s' % (k, v)]
        submit_args = os.environ.get("PYSPARK_SUBMIT_ARGS", "pyspark-shell")
        if os.environ.get("SPARK_TESTING"):
            submit_args = ' '.join([
                "--conf spark.ui.enabled=false",
                submit_args
            ])
        command = command + shlex.split(submit_args)

        # Start a socket that will be used by PythonGatewayServer to communicate its port to us
        callback_socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        callback_socket.bind(('127.0.0.1', 0))
        callback_socket.listen(1)
        callback_host, callback_port = callback_socket.getsockname()
        env = dict(os.environ)
        env['_PYSPARK_DRIVER_CALLBACK_HOST'] = callback_host
        env['_PYSPARK_DRIVER_CALLBACK_PORT'] = str(callback_port)

        # Launch the Java gateway.
        # We open a pipe to stdin so that the Java gateway can die when the pipe is broken
        if not on_windows:
            # Don't send ctrl-c / SIGINT to the Java gateway:
            def preexec_func():
                signal.signal(signal.SIGINT, signal.SIG_IGN)
            proc = Popen(command, stdin=PIPE, preexec_fn=preexec_func, env=env)
        else:
            # preexec_fn not supported on Windows
            proc = Popen(command, stdin=PIPE, env=env)

        gateway_port = None
        # We use select() here in order to avoid blocking indefinitely if the subprocess dies
        # before connecting
        while gateway_port is None and proc.poll() is None:
            timeout = 1  # (seconds)
            readable, _, _ = select.select([callback_socket], [], [], timeout)
            if callback_socket in readable:
                gateway_connection = callback_socket.accept()[0]
                # Determine which ephemeral port the server started on:
                gateway_port = read_int(gateway_connection.makefile(mode="rb"))
                gateway_connection.close()
                callback_socket.close()
        if gateway_port is None:
            raise Exception("Java gateway process exited before sending the driver its port number")

        # In Windows, ensure the Java child processes do not linger after Python has exited.
        # In UNIX-based systems, the child process can kill itself on broken pipe (i.e. when
        # the parent process' stdin sends an EOF). In Windows, however, this is not possible
        # because java.lang.Process reads directly from the parent process' stdin, contending
        # with any opportunity to read an EOF from the parent. Note that this is only best
        # effort and will not take effect if the python process is violently terminated.
        if on_windows:
            # In Windows, the child process here is "spark-submit.cmd", not the JVM itself
            # (because the UNIX "exec" command is not available). This means we cannot simply
            # call proc.kill(), which kills only the "spark-submit.cmd" process but not the
            # JVMs. Instead, we use "taskkill" with the tree-kill option "/t" to terminate all
            # child processes in the tree (http://technet.microsoft.com/en-us/library/bb491009.aspx)
            def killChild():
                Popen(["cmd", "/c", "taskkill", "/f", "/t", "/pid", str(proc.pid)])
            atexit.register(killChild)

    # Connect to the gateway
    gateway = JavaGateway(GatewayClient(port=gateway_port), auto_convert=True)

    # Import the classes used by PySpark
    java_import(gateway.jvm, "org.apache.spark.SparkConf")
    java_import(gateway.jvm, "org.apache.spark.api.java.*")
    java_import(gateway.jvm, "org.apache.spark.api.python.*")
    java_import(gateway.jvm, "org.apache.spark.ml.python.*")
    java_import(gateway.jvm, "org.apache.spark.mllib.api.python.*")
    # TODO(davies): move into sql
    java_import(gateway.jvm, "org.apache.spark.sql.*")
    java_import(gateway.jvm, "org.apache.spark.sql.hive.*")
    java_import(gateway.jvm, "scala.Tuple2")

    return gateway
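This shows where the failure comes from: launch_gateway() copies the driver's environment with env = dict(os.environ) and hands it to Popen, so the spark-submit subprocess sees exactly the variables the Python process has. If JAVA_HOME is missing there, the subprocess exits immediately, proc.poll() stops returning None, the select loop ends with gateway_port still None, and the exception from the traceback is raised. A quick diagnostic sketch (not part of the original script) to confirm what the remote interpreter actually provides:

import os

# Show what a spark-submit subprocess would inherit from this interpreter session.
for key in ("JAVA_HOME", "SPARK_HOME", "PATH"):
    print(key, "=", os.environ.get(key))

If JAVA_HOME comes back empty here, assigning it through os.environ as shown at the top resolves the gateway failure.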