Posted to users@zeppelin.apache.org by Anwar AliKhan <an...@gmail.com> on 2020/06/21 22:21:19 UTC

SPARK_HOME

The only change I am making is to set SPARK_HOME. I have set it both in my bash config file (.bashrc) and in the Zeppelin interpreter settings. I am trying to run the Scala notebooks that ship with Zeppelin so I can develop a Spark Scala app, but I keep getting the same error message. Any ideas?
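For reference, SPARK_HOME is usually set either in ~/.bashrc or in Zeppelin's conf/zeppelin-env.sh. The paths below are placeholders, not the poster's actual setup:

```shell
# Example only: /opt/spark and the JDK path are assumed locations.
# Put these lines in ~/.bashrc or conf/zeppelin-env.sh, then restart Zeppelin.
export SPARK_HOME=/opt/spark
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
echo "SPARK_HOME=$SPARK_HOME"
```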


org.apache.zeppelin.interpreter.InterpreterException: org.apache.zeppelin.interpreter.InterpreterException: Fail to open SparkInterpreter
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:76)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:668)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:577)
    at org.apache.zeppelin.scheduler.Job.run(Job.java:172)
    at org.apache.zeppelin.scheduler.AbstractScheduler.runJob(AbstractScheduler.java:130)
    at org.apache.zeppelin.scheduler.FIFOScheduler.lambda$runJobInScheduler$0(FIFOScheduler.java:39)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: org.apache.zeppelin.interpreter.InterpreterException: Fail to open SparkInterpreter
    at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:114)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
    ... 8 more
Caused by: java.lang.reflect.InvocationTargetException
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:566)
    at org.apache.zeppelin.spark.BaseSparkScalaInterpreter.spark2CreateContext(BaseSparkScalaInterpreter.scala:292)
    at org.apache.zeppelin.spark.BaseSparkScalaInterpreter.createSparkContext(BaseSparkScalaInterpreter.scala:223)
    at org.apache.zeppelin.spark.SparkScala212Interpreter.open(SparkScala212Interpreter.scala:90)
    at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:98)
    ... 9 more
Caused by: java.lang.ExceptionInInitializerError
    at org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:93)
    at org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:370)
    at org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:311)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:359)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:189)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:267)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:442)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2555)
    at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$1(SparkSession.scala:930)
    at scala.Option.getOrElse(Option.scala:189)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:921)
    ... 17 more
Caused by: java.lang.NullPointerException
    at org.apache.commons.lang3.SystemUtils.isJavaVersionAtLeast(SystemUtils.java:1654)
    at org.apache.spark.storage.StorageUtils$.<init>(StorageUtils.scala:207)
    at org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala)
    ... 28 more
<http://www.backbutton.co.uk/>

Re: SPARK_HOME

Posted by Jeff Zhang <zj...@gmail.com>.
awesome

Anwar AliKhan <an...@gmail.com> wrote on Tue, Jun 23, 2020 at 1:18 AM:

> I found the cause of the issue, kind of.
> I built Spark from source, and now all is good.
>
>
> On Sun, 21 Jun 2020, 23:49 Jeff Zhang, <zj...@gmail.com> wrote:
>
>> JDK issue ?

-- 
Best Regards

Jeff Zhang

Re: SPARK_HOME

Posted by Anwar AliKhan <an...@gmail.com>.
I found the cause of the issue, kind of.
I built Spark from source, and now all is good.


On Sun, 21 Jun 2020, 23:49 Jeff Zhang, <zj...@gmail.com> wrote:

> JDK issue ?

Re: SPARK_HOME

Posted by Anwar AliKhan <an...@gmail.com>.
Do you have an up-to-date set of mvn compile parameters? I want to
build from source.

Also, I need the build to be compatible with Spark 3.
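A sketch of what such a build command might look like. The profile names (-Pspark-3.0, -Pspark-scala-2.12, -Pbuild-distr) are taken from Zeppelin 0.9-era docs and are assumptions here; check the top-level pom.xml of your checkout for the profiles it actually defines:

```shell
# Hypothetical Zeppelin source build with Spark 3 support; verify the
# profile names against your checkout's pom.xml before running.
BUILD_CMD="./mvnw clean package -DskipTests -Pspark-3.0 -Pspark-scala-2.12 -Pbuild-distr"
echo "$BUILD_CMD"
```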



On Sun, 21 Jun 2020, 23:49 Jeff Zhang, <zj...@gmail.com> wrote:

> JDK issue ?

Re: SPARK_HOME

Posted by Jeff Zhang <zj...@gmail.com>.
JDK issue ?

    ... 17 more
Caused by: java.lang.NullPointerException
    at org.apache.commons.lang3.SystemUtils.isJavaVersionAtLeast(SystemUtils.java:1654)
    at org.apache.spark.storage.StorageUtils$.<init>(StorageUtils.scala:207)
    at org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala)
    ... 28 more
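For context, an NPE from SystemUtils.isJavaVersionAtLeast typically means the commons-lang3 version bundled with Spark does not recognize the running JDK (for example, Spark 2.x launched under a JDK newer than it supports; Spark 2.x targets Java 8, while Spark 3.x supports Java 8 and 11). A quick way to check which JDK will be picked up:

```shell
# Print the JDK that Zeppelin/Spark would launch with (falls back to a
# message if java is not on the PATH of this shell).
if command -v java >/dev/null 2>&1; then
  java -version 2>&1 | head -n 1
else
  echo "java not found on PATH"
fi
```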


-- 
Best Regards

Jeff Zhang