Posted to issues@spark.apache.org by "Mark Hamstra (JIRA)" <ji...@apache.org> on 2014/11/17 03:00:36 UTC
[jira] [Updated] (SPARK-4436) Debian packaging misses datanucleus jars
[ https://issues.apache.org/jira/browse/SPARK-4436?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Mark Hamstra updated SPARK-4436:
--------------------------------
Description:
If Spark is built with Hive support (i.e. -Phive), then the necessary datanucleus jars end up in lib_managed rather than in the uber-jar. The Debian packaging doesn't include anything from lib_managed. As a consequence, HiveContext et al. will fail with the packaged Spark even though it was built with -Phive.
see comment in bin/compute-classpath.sh
Packaging everything from lib_managed/jars into <spark>/lib is an adequate solution.
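As a sketch of the proposed fix (paths and the demo setup below are assumptions for illustration, not the actual debian/rules of the Spark build), the packaging step would copy the managed jars into the package's lib directory:

```shell
#!/bin/sh
set -e

# Hypothetical staging directory standing in for the unpacked package root.
SPARK_HOME="${SPARK_HOME:-/tmp/spark-pkg-demo}"

# Simulate a -Phive build: datanucleus jars land in lib_managed/jars,
# not in the assembly uber-jar. (Fake jar file for demonstration only.)
mkdir -p "$SPARK_HOME/lib_managed/jars" "$SPARK_HOME/lib"
touch "$SPARK_HOME/lib_managed/jars/datanucleus-core-3.2.2.jar"

# The proposed fix: ship everything from lib_managed/jars in <spark>/lib,
# where bin/compute-classpath.sh can pick the datanucleus jars up.
cp "$SPARK_HOME/lib_managed/jars/"*.jar "$SPARK_HOME/lib/"

ls "$SPARK_HOME/lib"
```

With the jars in <spark>/lib, a packaged Spark built with -Phive would find the datanucleus classes on the classpath and HiveContext would initialize normally.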
was:
If Spark is built with HIve support (i.e. -Phive), then the necessary datanucleus jars end up in lib_managed, not as part of the uber-jar. The debian packaging isn't including anything from lib_managed. As a consequence, HiveContext et al. will fail with the packaged Spark even though it was built with -Phive.
see comment in bin/compute-classpath.sh
Packaging everything from lib_managed/jars into <spark>/lib is an adequate solution.
> Debian packaging misses datanucleus jars
> ----------------------------------------
>
> Key: SPARK-4436
> URL: https://issues.apache.org/jira/browse/SPARK-4436
> Project: Spark
> Issue Type: Bug
> Components: Build
> Affects Versions: 1.0.0, 1.0.1, 1.1.0
> Reporter: Mark Hamstra
> Assignee: Mark Hamstra
> Priority: Minor
>
> If Spark is built with Hive support (i.e. -Phive), then the necessary datanucleus jars end up in lib_managed rather than in the uber-jar. The Debian packaging doesn't include anything from lib_managed. As a consequence, HiveContext et al. will fail with the packaged Spark even though it was built with -Phive.
> see comment in bin/compute-classpath.sh
> Packaging everything from lib_managed/jars into <spark>/lib is an adequate solution.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org