Posted to issues@spark.apache.org by "Burak Yavuz (JIRA)" <ji...@apache.org> on 2015/08/07 20:52:45 UTC

[jira] [Commented] (SPARK-9742) NullPointerException when using --packages

    [ https://issues.apache.org/jira/browse/SPARK-9742?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14662260#comment-14662260 ] 

Burak Yavuz commented on SPARK-9742:
------------------------------------

Did the behavior of {{Option}}s change for some reason? They used to return {{None}}; now they return null Strings. I'll submit a fix.
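For context, a minimal sketch of the failure mode described above (not the actual Spark code): in Scala, {{Option.apply}} normalizes a null argument to {{None}}, whereas {{Some.apply}} happily wraps the null, and code that receives a raw null String where it expected {{None}} will throw a NullPointerException on the first dereference.

```scala
object OptionNullSketch {
  def main(args: Array[String]): Unit = {
    val raw: String = null

    // Option.apply guards against null: it yields None.
    val guarded: Option[String] = Option(raw)
    assert(guarded == None)

    // Some.apply does NOT guard: it wraps the null as-is.
    val wrapped: Option[String] = Some(raw)
    assert(wrapped != None)

    // Dereferencing the wrapped null is where the NPE surfaces.
    try {
      wrapped.get.length
      assert(false, "expected a NullPointerException")
    } catch {
      case _: NullPointerException => println("NPE on dereference, as expected")
    }
  }
}
```

So if a resolver setting that previously came back as {{None}} now arrives as a null String, any downstream call on it (as in the {{createRepoResolvers}} frame below) would fail exactly this way.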

> NullPointerException when using --packages
> ------------------------------------------
>
>                 Key: SPARK-9742
>                 URL: https://issues.apache.org/jira/browse/SPARK-9742
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.5.0
>            Reporter: Herman van Hovell
>            Assignee: Burak Yavuz
>            Priority: Blocker
>
> I am getting a NullPointerException when I start the Spark Shell with the {{--packages}} option on the latest master.
> This is the command I use to start the shell:
> {noformat}
> ./spark-shell --packages com.databricks:spark-csv_2.10:1.2.0
> {noformat}
> This the error I get:
> {noformat}
> hvanhovell@QT02:/media/hvanhovell/Data/QT/IT/Software/spark/dist/bin$ ./spark-shell --packages com.databricks:spark-csv_2.10:1.2.0
> Ivy Default Cache set to: /home/hvanhovell/.ivy2/cache
> The jars for the packages stored in: /home/hvanhovell/.ivy2/jars
> Exception in thread "main" java.lang.NullPointerException
> 	at org.apache.spark.deploy.SparkSubmitUtils$.createRepoResolvers(SparkSubmit.scala:812)
> 	at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:962)
> 	at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:286)
> 	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:153)
> 	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
> 	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
> 15/08/07 10:32:05 INFO Utils: Shutdown hook called
> {noformat}
> It also happens with other dependencies.
> Can anybody confirm this? 



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org