Posted to issues@spark.apache.org by "RJ Nowling (JIRA)" <ji...@apache.org> on 2015/12/09 22:25:11 UTC
[jira] [Reopened] (SPARK-4816) Maven profile netlib-lgpl does not work
[ https://issues.apache.org/jira/browse/SPARK-4816?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
RJ Nowling reopened SPARK-4816:
-------------------------------
I ran into the same issue with Spark 1.4. If I download the tarball from {{spark.apache.org}} and build with {{-Pnetlib-lgpl}}, the native libraries are excluded from the jar by the shader. However, if I check out branch-1.4 from GitHub and build from source, the appropriate libraries are included.
I don't know much about the source release process, but is it possible that something in that process results in a different Maven build?
> Maven profile netlib-lgpl does not work
> ---------------------------------------
>
> Key: SPARK-4816
> URL: https://issues.apache.org/jira/browse/SPARK-4816
> Project: Spark
> Issue Type: Bug
> Components: Build
> Affects Versions: 1.1.0
> Environment: maven 3.0.5 / Ubuntu
> Reporter: Guillaume Pitel
> Priority: Minor
> Fix For: 1.1.1
>
>
> When doing what the documentation recommends to recompile Spark with the netlib native system binding (i.e., to bind against OpenBLAS or, in my case, MKL):
> mvn -Pnetlib-lgpl -Pyarn -Phadoop-2.3 -Dhadoop.version=2.3.0 -DskipTests clean package
> the resulting assembly jar still lacks the netlib-system classes. (I checked the contents of spark-assembly...jar.)
> When forcing the netlib-lgpl profile in the MLlib module to be active, the jar is built correctly.
> So I guess it's a problem with the way Maven passes profile activations down to child modules.
> Also, despite the documentation claiming that if the job's jar contains netlib with the necessary bindings, it should work, it does not. Perhaps the classloader is unhappy with two occurrences of netlib?
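
The workaround described in the report (forcing the netlib-lgpl profile in the MLlib module to be active) can be sketched as a profile with a default activation in mllib/pom.xml. This is an illustration, not the exact upstream pom: the com.github.fommil.netlib:all coordinates match the dependency that netlib-java publishes for native bindings, but the activeByDefault element is the hypothetical "forcing" change, and the version shown is an assumption.

```xml
<!-- Sketch of forcing the profile on in mllib/pom.xml (illustrative).
     Normally the profile is only enabled via -Pnetlib-lgpl on the
     command line; activeByDefault makes the child module pick it up
     regardless of how the parent passes activations down. -->
<profile>
  <id>netlib-lgpl</id>
  <activation>
    <!-- Hypothetical change: activate unconditionally. -->
    <activeByDefault>true</activeByDefault>
  </activation>
  <dependencies>
    <!-- netlib-java "all" aggregator, which pulls in the LGPL native
         reference implementations; version is an assumption. -->
    <dependency>
      <groupId>com.github.fommil.netlib</groupId>
      <artifactId>all</artifactId>
      <version>1.1.2</version>
      <type>pom</type>
    </dependency>
  </dependencies>
</profile>
```

Note that `-P` activation on the command line is supposed to apply reactor-wide, so if the forced profile changes the result, that points at the profile-inheritance issue the report guesses at.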
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org