Posted to issues@spark.apache.org by "Josh Rosen (JIRA)" <ji...@apache.org> on 2016/01/05 20:40:39 UTC

[jira] [Commented] (SPARK-11798) Datanucleus jars is missing under lib_managed/jars

    [ https://issues.apache.org/jira/browse/SPARK-11798?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15083640#comment-15083640 ] 

Josh Rosen commented on SPARK-11798:
------------------------------------

Datanucleus is only added as a dependency when the Hive build profile is enabled. Are you sure that you enabled that flag?
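
For reference, the Hive profile is enabled with the -Phive build flag. A typical sbt invocation (shown here with -Phive-thriftserver as a common companion profile; adjust to your build) would be:

{code}
# enable the Hive profile so the Datanucleus jars end up on the classpath
build/sbt -Phive -Phive-thriftserver assembly
{code}

Without -Phive, no org.datanucleus artifacts reach the assembly classpath, so the classpath filter in the debug snippet quoted below would naturally come back empty.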

> Datanucleus jars is missing under lib_managed/jars
> --------------------------------------------------
>
>                 Key: SPARK-11798
>                 URL: https://issues.apache.org/jira/browse/SPARK-11798
>             Project: Spark
>          Issue Type: Bug
>          Components: Build, SQL
>            Reporter: Jeff Zhang
>
> I noticed the comments in https://github.com/apache/spark/pull/9575 saying that the Datanucleus-related jars will still be copied to lib_managed/jars, but I don't see any jars under lib_managed/jars. The weird thing is that I do see the jars on another machine, yet I could not see them on my laptop even after deleting the whole Spark project and starting from scratch. Could it be related to the environment? I added the following code to SparkBuild.scala to track the issue, and it shows that the jars list is empty.
> {code}
> deployDatanucleusJars := {
>   val jars: Seq[File] = (fullClasspath in assembly).value.map(_.data)
>     .filter(_.getPath.contains("org.datanucleus"))
>   // debug output added to trace the issue: dump the keys and the
>   // filtered Datanucleus jars
>   println("*********************************************")
>   println("fullClasspath: " + fullClasspath)
>   println("assembly: " + assembly)
>   println("jars: " + jars.map(_.getAbsolutePath).mkString(","))
>   // ... rest of the task body elided ...
> {code}
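
For context, here is a minimal sketch of what the elided copy step in a task like the one above might look like. This is not the actual SparkBuild.scala source: the lib_managed/jars destination comes from the issue description, while the use of baseDirectory and the overwrite policy are assumptions.

{code}
import java.io.File
import java.nio.file.{Files, StandardCopyOption}

deployDatanucleusJars := {
  // collect the Datanucleus jars from the assembly classpath
  val jars: Seq[File] = (fullClasspath in assembly).value.map(_.data)
    .filter(_.getPath.contains("org.datanucleus"))
  // copy them into lib_managed/jars under the build root (assumed location)
  val dest = new File(baseDirectory.value, "lib_managed/jars")
  dest.mkdirs()
  jars.foreach { jar =>
    Files.copy(jar.toPath, new File(dest, jar.getName).toPath,
      StandardCopyOption.REPLACE_EXISTING)
  }
}
{code}

If the jars sequence is empty, as the println output above suggests, the copy loop is a no-op, which would match the missing lib_managed/jars contents.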


