Nebula Spark Connector Compatibility Issue. #5652
Unanswered
xs2tarunkukreja
asked this question in Q&A
My system configuration:
Windows 10
Python 3.10.4
Spark 3.3.2
When I launch the PySpark shell on its own, it works fine.
But when I place the nebula-spark-connector-3.0.0 jar in the spark/jars directory, the PySpark shell fails with the error below. Please suggest a Nebula Spark Connector version that is compatible with my system.

Error output:
Python 3.10.4 (tags/v3.10.4:9d38120, Mar 23 2022, 23:13:41) [MSC v.1929 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
Exception in thread "main" java.lang.NoSuchMethodError: scala.Some.value()Ljava/lang/Object;
at org.apache.spark.internal.Logging.initializeLogging(Logging.scala:138)
at org.apache.spark.internal.Logging.initializeLogIfNecessary(Logging.scala:115)
at org.apache.spark.internal.Logging.initializeLogIfNecessary$(Logging.scala:109)
at org.apache.spark.deploy.SparkSubmit.initializeLogIfNecessary(SparkSubmit.scala:75)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:83)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1046)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1055)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Traceback (most recent call last):
File "c:\spark\spark33\python\pyspark\shell.py", line 36, in
SparkContext._ensure_initialized()
File "c:\spark\spark33\python\pyspark\context.py", line 417, in _ensure_initialized
SparkContext._gateway = gateway or launch_gateway(conf)
File "c:\spark\spark33\python\pyspark\java_gateway.py", line 106, in launch_gateway
raise RuntimeError("Java gateway process exited before sending its port number")
RuntimeError: Java gateway process exited before sending its port number
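For context, a `NoSuchMethodError` on a `scala.*` method like this usually indicates a Scala binary-version mismatch: Spark 3.3.2 bundles Scala 2.12, so a connector jar built (or shipping bundled classes) for Scala 2.11 can break the shell when dropped into spark/jars. A rough, hedged way to look for a Scala-version hint inside a jar is to scan its entry names for a `_2.11`/`_2.12` suffix, which often appears in bundled Maven metadata paths. The function below is a sketch under that assumption; whether such a hint exists depends on how the jar was assembled, and the jar path is a placeholder.

```python
import re
import zipfile

def detect_scala_binary_version(jar_path):
    """Scan a jar's entry names for a Scala binary-version suffix
    (e.g. '_2.11' or '_2.12'), as often seen in Maven metadata paths
    under META-INF/maven/ inside fat jars.

    Returns the first version string found (e.g. '2.11'), or None if
    no hint is present. A None result does not prove compatibility --
    it only means this heuristic found nothing.
    """
    pattern = re.compile(r"_(2\.1[123])(?=[/.]|$)")
    with zipfile.ZipFile(jar_path) as jar:
        for name in jar.namelist():
            match = pattern.search(name)
            if match:
                return match.group(1)
    return None
```

If the jar reports a version other than the `2.12` that Spark 3.3.x expects, that mismatch would explain the `scala.Some.value()` error above.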