Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2016/07/29 11:55:20 UTC

[jira] [Commented] (SPARK-16794) Spark 2.0.0 with YARN

    [ https://issues.apache.org/jira/browse/SPARK-16794?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15399198#comment-15399198 ] 

Sean Owen commented on SPARK-16794:
-----------------------------------

The errors about binding are noisy but not a problem: Spark failed to bind to the port you requested, so it tries successive ports until one is free.
The ultimate failure here is due to something that went wrong in requesting YARN resources, and this output doesn't show the cause. You'd have to investigate why in the YARN application logs. So far nothing here points to a problem in Spark itself.
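If you want to silence the bind warnings, you can point the UI at a port you know is free up front; for example (spark.ui.port is the standard setting, 4050 is just an arbitrary free port):

{code}
spark-shell --master yarn --deploy-mode client --conf spark.ui.port=4050
{code}

For the real failure, the application master's container logs usually name the cause. With the application ID from the console output or the ResourceManager UI, something like this should retrieve them:

{code}
yarn logs -applicationId <applicationId>
{code}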

> Spark 2.0.0 with YARN
> -----------------------
>
>                 Key: SPARK-16794
>                 URL: https://issues.apache.org/jira/browse/SPARK-16794
>             Project: Spark
>          Issue Type: Question
>    Affects Versions: 2.0.0
>         Environment: AWS Cluster with Hortonworks 2.4
>            Reporter: Eliano Marques
>
> I'm trying to start Spark 2.0.0 with YARN. First I hit the issues described in https://issues.apache.org/jira/browse/SPARK-15343. I then set yarn.timeline-service.enabled to false (one way to pass it is sketched after the command below) and the behaviour changed, i.e. it started executing the application. However, I'm now facing the following error. This might be a silly config issue, but I'd appreciate any help.
> {code}
> spark-shell --master yarn --deploy-mode client
> {code}
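> For reference, the timeline-service setting can also be passed straight through spark-shell (Spark forwards spark.hadoop.* properties into the Hadoop configuration), roughly like this:
> {code}
> spark-shell --master yarn --deploy-mode client \
>   --conf spark.hadoop.yarn.timeline-service.enabled=false
> {code}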
> Log: 
> {code}
> 16/07/29 10:04:17 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> 16/07/29 10:04:17 WARN component.AbstractLifeCycle: FAILED ServerConnector@16da476c{HTTP/1.1}{0.0.0.0:4040}: java.net.BindException: Address already in use
> java.net.BindException: Address already in use
> 	at sun.nio.ch.Net.bind0(Native Method)
> 	at sun.nio.ch.Net.bind(Net.java:433)
> 	at sun.nio.ch.Net.bind(Net.java:425)
> 	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
> 	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
> 	at org.spark_project.jetty.server.ServerConnector.open(ServerConnector.java:321)
> 	at org.spark_project.jetty.server.AbstractNetworkConnector.doStart(AbstractNetworkConnector.java:80)
> 	at org.spark_project.jetty.server.ServerConnector.doStart(ServerConnector.java:236)
> 	at org.spark_project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
> 	at org.spark_project.jetty.server.Server.doStart(Server.java:366)
> 	at org.spark_project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
> 	at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:298)
> 	at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:308)
> 	at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:308)
> 	at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:2071)
> 	at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160)
> 	at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:2062)
> 	at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:308)
> 	at org.apache.spark.ui.WebUI.bind(WebUI.scala:139)
> 	at org.apache.spark.SparkContext$$anonfun$10.apply(SparkContext.scala:451)
> 	at org.apache.spark.SparkContext$$anonfun$10.apply(SparkContext.scala:451)
> 	at scala.Option.foreach(Option.scala:257)
> 	at org.apache.spark.SparkContext.<init>(SparkContext.scala:451)
> 	at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2256)
> 	at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:831)
> 	at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:823)
> 	at scala.Option.getOrElse(Option.scala:121)
> 	at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:823)
> 	at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
> 	at $line3.$read$$iw$$iw.<init>(<console>:15)
> 	at $line3.$read$$iw.<init>(<console>:31)
> 	at $line3.$read.<init>(<console>:33)
> 	at $line3.$read$.<init>(<console>:37)
> 	at $line3.$read$.<clinit>(<console>)
> 	at $line3.$eval$.$print$lzycompute(<console>:7)
> 	at $line3.$eval$.$print(<console>:6)
> 	at $line3.$eval.$print(<console>)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
> 	at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
> 	at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
> 	at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
> 	at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
> 	at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
> 	at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
> 	at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
> 	at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
> 	at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
> 	at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
> 	at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
> 	at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
> 	at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
> 	at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
> 	at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
> 	at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
> 	at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:94)
> 	at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
> 	at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
> 	at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
> 	at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
> 	at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
> 	at org.apache.spark.repl.Main$.doMain(Main.scala:68)
> 	at org.apache.spark.repl.Main$.main(Main.scala:51)
> 	at org.apache.spark.repl.Main.main(Main.scala)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:729)
> 	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
> 	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
> 	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
> 	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> 16/07/29 10:04:17 WARN component.AbstractLifeCycle: FAILED org.spark_project.jetty.server.Server@5503de1: java.net.BindException: Address already in use
> java.net.BindException: Address already in use
> 	at sun.nio.ch.Net.bind0(Native Method)
> 	at sun.nio.ch.Net.bind(Net.java:433)
> 	at sun.nio.ch.Net.bind(Net.java:425)
> 	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
> 	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
> 	at org.spark_project.jetty.server.ServerConnector.open(ServerConnector.java:321)
> 	at org.spark_project.jetty.server.AbstractNetworkConnector.doStart(AbstractNetworkConnector.java:80)
> 	at org.spark_project.jetty.server.ServerConnector.doStart(ServerConnector.java:236)
> 	at org.spark_project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
> 	at org.spark_project.jetty.server.Server.doStart(Server.java:366)
> 	at org.spark_project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
> 	at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:298)
> 	at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:308)
> 	at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:308)
> 	at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:2071)
> 	at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160)
> 	at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:2062)
> 	at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:308)
> 	at org.apache.spark.ui.WebUI.bind(WebUI.scala:139)
> 	at org.apache.spark.SparkContext$$anonfun$10.apply(SparkContext.scala:451)
> 	at org.apache.spark.SparkContext$$anonfun$10.apply(SparkContext.scala:451)
> 	at scala.Option.foreach(Option.scala:257)
> 	at org.apache.spark.SparkContext.<init>(SparkContext.scala:451)
> 	at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2256)
> 	at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:831)
> 	at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:823)
> 	at scala.Option.getOrElse(Option.scala:121)
> 	at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:823)
> 	at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
> 	at $line3.$read$$iw$$iw.<init>(<console>:15)
> 	at $line3.$read$$iw.<init>(<console>:31)
> 	at $line3.$read.<init>(<console>:33)
> 	at $line3.$read$.<init>(<console>:37)
> 	at $line3.$read$.<clinit>(<console>)
> 	at $line3.$eval$.$print$lzycompute(<console>:7)
> 	at $line3.$eval$.$print(<console>:6)
> 	at $line3.$eval.$print(<console>)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
> 	at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
> 	at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
> 	at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
> 	at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
> 	at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
> 	at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
> 	at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
> 	at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
> 	at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
> 	at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
> 	at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
> 	at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
> 	at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
> 	at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
> 	at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
> 	at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
> 	at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:94)
> 	at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
> 	at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
> 	at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
> 	at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
> 	at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
> 	at org.apache.spark.repl.Main$.doMain(Main.scala:68)
> 	at org.apache.spark.repl.Main$.main(Main.scala:51)
> 	at org.apache.spark.repl.Main.main(Main.scala)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:729)
> 	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
> 	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
> 	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
> 	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> 16/07/29 10:04:17 WARN util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
> 16/07/29 10:04:18 WARN shortcircuit.DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
> 16/07/29 10:04:19 WARN yarn.Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
> 16/07/29 10:04:26 ERROR spark.SparkContext: Error initializing SparkContext.
> org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
> 	at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:85)
> 	at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:62)
> 	at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:149)
> 	at org.apache.spark.SparkContext.<init>(SparkContext.scala:500)
> 	at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2256)
> 	at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:831)
> 	at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:823)
> 	at scala.Option.getOrElse(Option.scala:121)
> 	at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:823)
> 	at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
> 	at $line3.$read$$iw$$iw.<init>(<console>:15)
> 	at $line3.$read$$iw.<init>(<console>:31)
> 	at $line3.$read.<init>(<console>:33)
> 	at $line3.$read$.<init>(<console>:37)
> 	at $line3.$read$.<clinit>(<console>)
> 	at $line3.$eval$.$print$lzycompute(<console>:7)
> 	at $line3.$eval$.$print(<console>:6)
> 	at $line3.$eval.$print(<console>)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
> 	at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
> 	at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
> 	at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
> 	at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
> 	at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
> 	at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
> 	at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
> 	at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
> 	at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
> 	at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
> 	at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
> 	at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
> 	at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
> 	at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
> 	at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
> 	at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
> 	at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:94)
> 	at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
> 	at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
> 	at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
> 	at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
> 	at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
> 	at org.apache.spark.repl.Main$.doMain(Main.scala:68)
> 	at org.apache.spark.repl.Main$.main(Main.scala:51)
> 	at org.apache.spark.repl.Main.main(Main.scala)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:729)
> 	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
> 	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
> 	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
> 	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> 16/07/29 10:04:26 WARN cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: Attempted to request executors before the AM has registered!
> 16/07/29 10:04:26 WARN metrics.MetricsSystem: Stopping a MetricsSystem that is not running
> org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
>   at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:85)
>   at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:62)
>   at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:149)
>   at org.apache.spark.SparkContext.<init>(SparkContext.scala:500)
>   at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2256)
>   at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:831)
>   at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:823)
>   at scala.Option.getOrElse(Option.scala:121)
>   at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:823)
>   at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
>   ... 47 elided
> <console>:14: error: not found: value spark
>        import spark.implicits._
>               ^
> <console>:14: error: not found: value spark
>        import spark.sql 
> {code}
> Both SPARK_HOME and HADOOP_CONF_DIR are defined. 
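> For reference, they are exported before launching the shell, roughly like this (paths are illustrative, not the actual values):
> {code}
> export SPARK_HOME=/usr/hdp/current/spark2-client   # illustrative HDP path
> export HADOOP_CONF_DIR=/etc/hadoop/conf            # illustrative path
> {code}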
> Is there any additional config required with 2.0.0 that we are missing? 
> Thanks


