Posted to issues@spark.apache.org by "Alex Baretta (JIRA)" <ji...@apache.org> on 2015/06/16 20:22:01 UTC
[jira] [Commented] (SPARK-7944) Spark-Shell 2.11 1.4.0-RC-03 does not add jars to class path
[ https://issues.apache.org/jira/browse/SPARK-7944?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14588506#comment-14588506 ]
Alex Baretta commented on SPARK-7944:
-------------------------------------
Bug confirmed on Spark 1.4.0 with Scala 2.11.6. The --jars option to spark-shell is correctly passed through to the SparkSubmit class, and the jars appear to be loaded (the SparkContext logs them as added), but the classes are not visible in the REPL.
spark-shell --jars commons-csv-1.0.jar
...
15/06/16 17:57:32 INFO SparkContext: Added JAR file:/home/alex/commons-csv-1.0.jar at http://10.240.57.53:38821/jars/commons-csv-1.0.jar with timestamp 1434477452978
...
scala> org.apache.commons.csv.CSVFormat.DEFAULT
<console>:21: error: object csv is not a member of package org.apache.commons
org.apache.commons.csv.CSVFormat.DEFAULT
^
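A possible interim workaround (a sketch only, not verified against the 1.4.0-RC-03 Scala 2.11 build): since the failure is in the REPL classloader rather than in jar distribution, putting the jar on the driver classpath at launch may make the classes resolvable in the shell. The jar name below is the one from the repro above.

```shell
# Workaround sketch: add the jar to the driver classpath explicitly,
# in addition to --jars (which still ships it to the executors).
spark-shell --driver-class-path commons-csv-1.0.jar --jars commons-csv-1.0.jar
```

Alternatively, the Scala 2.11 REPL's :require command can add a jar to the interpreter classpath from inside an already-running shell; whether spark-shell exposes it in this build is untested.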
> Spark-Shell 2.11 1.4.0-RC-03 does not add jars to class path
> ------------------------------------------------------------
>
> Key: SPARK-7944
> URL: https://issues.apache.org/jira/browse/SPARK-7944
> Project: Spark
> Issue Type: Bug
> Components: Spark Shell
> Affects Versions: 1.3.1, 1.4.0
> Environment: scala 2.11
> Reporter: Alexander Nakos
> Priority: Critical
> Attachments: spark_shell_output.txt, spark_shell_output_2.10.txt
>
>
> When I run the spark-shell with the --jars argument and supply a path to a single jar file, none of the classes in the jar are available in the REPL.
> I have encountered this same behaviour in both the 1.3.1 and 1.4.0-RC-03 builds for Scala 2.11. I have yet to do a 1.4.0-RC-03 build for Scala 2.10, but the contents of the jar are available in the 1.3.1 Scala 2.10 REPL.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org