Posted to issues@spark.apache.org by "Michel Lemay (JIRA)" <ji...@apache.org> on 2016/11/30 13:34:58 UTC
[jira] [Updated] (SPARK-18648) spark-shell --jars option does not add jars to classpath on windows
[ https://issues.apache.org/jira/browse/SPARK-18648?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Michel Lemay updated SPARK-18648:
---------------------------------
Description:
I can't import symbols from jars passed on the command line when in the shell:
Adding jars via --jars:
{code}
spark-shell --master local[*] --jars path\to\deeplearning4j-core-0.7.0.jar
{code}
The same result occurs if I add the dependency through Maven coordinates:
{code}
spark-shell --master local[*] --packages org.deeplearning4j:deeplearning4j-core:0.7.0
{code}
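As a side note (an illustration, not from the original report): the {{file:}} URI form that Spark records in {{spark.jars}} can be reproduced with plain Java I/O, which helps when comparing what was typed on the command line against what shows up in the verbose output. The jar name here is illustrative:

```scala
// Sketch (assumption, not the author's code): convert a jar path into
// the file: URI form that Spark records in spark.jars.
import java.io.File

def toJarUri(path: String): String =
  new File(path).getAbsoluteFile.toURI.toString

// On Windows this yields something like file:/C:/.../deeplearning4j-core-0.7.0.jar
println(toJarUri("deeplearning4j-core-0.7.0.jar"))
```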
I end up with:
{code}
scala> import org.deeplearning4j
<console>:23: error: object deeplearning4j is not a member of package org
import org.deeplearning4j
{code}
NOTE: It works as expected when running on Linux.
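To narrow down whether the jar ever reached the driver's classloader, a quick check like the following could be run in the shell (a diagnostic sketch; {{org.deeplearning4j.nn.api.Model}} is an assumed class name, any class known to live in the jar would do):

```scala
// Diagnostic sketch: report whether a given class is visible to the
// current classloader (intended to be pasted into spark-shell).
def isVisible(className: String): Boolean =
  try { Class.forName(className); true }
  catch { case _: ClassNotFoundException => false }

// The class name below is an assumption for illustration.
println(s"visible: ${isVisible("org.deeplearning4j.nn.api.Model")}")
```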
Sample output with --verbose:
{code}
Using properties file: null
Parsed arguments:
master local[*]
deployMode null
executorMemory null
executorCores null
totalExecutorCores null
propertiesFile null
driverMemory null
driverCores null
driverExtraClassPath null
driverExtraLibraryPath null
driverExtraJavaOptions null
supervise false
queue null
numExecutors null
files null
pyFiles null
archives null
mainClass org.apache.spark.repl.Main
primaryResource spark-shell
name Spark shell
childArgs []
jars file:/C:/Apps/Spark/spark-2.0.2-bin-hadoop2.4/bin/../deeplearning4j-core-0.6.0.jar
packages null
packagesExclusions null
repositories null
verbose true
Spark properties used, including those specified through
--conf and those from the properties file null:
Main class:
org.apache.spark.repl.Main
Arguments:
System properties:
SPARK_SUBMIT -> true
spark.app.name -> Spark shell
spark.jars -> file:/C:/Apps/Spark/spark-2.0.2-bin-hadoop2.4/bin/../deeplearning4j-core-0.6.0.jar
spark.submit.deployMode -> client
spark.master -> local[*]
Classpath elements:
file:/C:/Apps/Spark/spark-2.0.2-bin-hadoop2.4/bin/../deeplearning4j-core-0.6.0.jar
16/11/30 08:30:49 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/11/30 08:30:51 WARN SparkContext: Use an existing SparkContext, some configuration may not take effect.
Spark context Web UI available at http://192.168.70.164:4040
Spark context available as 'sc' (master = local[*], app id = local-1480512651325).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.0.2
      /_/
Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_101)
Type in expressions to have them evaluated.
Type :help for more information.
scala> import org.deeplearning4j
<console>:23: error: object deeplearning4j is not a member of package org
import org.deeplearning4j
^
scala>
{code}
> spark-shell --jars option does not add jars to classpath on windows
> -------------------------------------------------------------------
>
> Key: SPARK-18648
> URL: https://issues.apache.org/jira/browse/SPARK-18648
> Project: Spark
> Issue Type: Bug
> Components: Spark Shell, Windows
> Affects Versions: 2.0.2
> Environment: Windows 7 x64
> Reporter: Michel Lemay
> Labels: windows
>
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)