Py4JError "does not exist in the JVM" when initializing SparkContext
Posted on 2021-01-29 16:30:44
I'm running Spark on EMR and wrote a PySpark script. When I try to execute it, I get an error at:
from pyspark import SparkContext
sc = SparkContext()
Here is the error:
File "pyex.py", line 5, in <module>
sc = SparkContext() File "/usr/local/lib/python3.4/site-packages/pyspark/context.py", line 118, in __init__
conf, jsc, profiler_cls) File "/usr/local/lib/python3.4/site-packages/pyspark/context.py", line 195, in _do_init
self._encryption_enabled = self._jvm.PythonUtils.getEncryptionEnabled(self._jsc) File "/usr/local/lib/python3.4/site-packages/py4j/java_gateway.py", line 1487, in __getattr__
"{0}.{1} does not exist in the JVM".format(self._fqn, name)) py4j.protocol.Py4JError: org.apache.spark.api.python.PythonUtils.getEncryptionEnabled does not exist in the JVM
I found an answer saying that I need to import SparkContext, but that didn't work either.
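For reference, here is a minimal self-contained sketch of what I am running (the app name "pyex" is just a placeholder; the sketch assumes the pip-installed pyspark package matches the version of the Spark installation on the cluster, since a mismatch between the two is a commonly reported cause of this Py4JError):

from pyspark import SparkConf, SparkContext

# Minimal SparkContext initialization; the Py4JError above typically
# surfaces here, when the Python side calls into a JVM method that the
# installed Spark version does not expose.
conf = SparkConf().setAppName("pyex")
sc = SparkContext(conf=conf)
print(sc.version)  # prints the Spark version the context is bound to
sc.stop()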