Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2015/04/28 13:58:07 UTC

[jira] [Updated] (SPARK-6435) spark-shell --jars option does not add all jars to classpath

     [ https://issues.apache.org/jira/browse/SPARK-6435?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen updated SPARK-6435:
-----------------------------
    Assignee: Masayoshi TSUZUKI

> spark-shell --jars option does not add all jars to classpath
> ------------------------------------------------------------
>
>                 Key: SPARK-6435
>                 URL: https://issues.apache.org/jira/browse/SPARK-6435
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell, Windows
>    Affects Versions: 1.3.0
>         Environment: Win64
>            Reporter: vijay
>            Assignee: Masayoshi TSUZUKI
>             Fix For: 1.4.0
>
>
> Not all of the jars supplied via the --jars option are added to the driver (and presumably executor) classpath.  The first jar(s) are added, but the later ones are not.
> To reproduce, pass several jars (I tested 5) to the --jars option and then try to import a class from the last jar; the import fails.  A simple reproducer:
> Create a bunch of dummy jars:
> {code}
> jar cfM jar1.jar log.txt
> jar cfM jar2.jar log.txt
> jar cfM jar3.jar log.txt
> jar cfM jar4.jar log.txt
> {code}
> Start the spark-shell with the dummy jars and guava at the end:
> {code}
> %SPARK_HOME%\bin\spark-shell --master local --jars jar1.jar,jar2.jar,jar3.jar,jar4.jar,c:\code\lib\guava-14.0.1.jar
> {code}
> In the shell, try importing from guava; you'll get an error:
> {code}
> scala> import com.google.common.base.Strings
> <console>:19: error: object Strings is not a member of package com.google.common.base
>        import com.google.common.base.Strings
>               ^
> {code}
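> As a quick diagnostic (a sketch, assuming the standard spark-shell session where the SparkContext is bound to {{sc}}), the jars Spark actually registered from --jars can be listed inside the shell:
> {code}
> // Jars registered with the SparkContext (the spark.jars setting).
> // If the guava jar is missing or its path is mangled in this list,
> // the import above cannot succeed.
> sc.jars.foreach(println)
> {code}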



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org