Posted to issues@spark.apache.org by "Aleksandr Ovcharenko (JIRA)" <ji...@apache.org> on 2017/09/11 13:16:00 UTC

[jira] [Commented] (SPARK-21974) SVD computation results in failure to load NativeSystemARPACK and NativeRefARPACK

    [ https://issues.apache.org/jira/browse/SPARK-21974?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16161217#comment-16161217 ] 

Aleksandr Ovcharenko commented on SPARK-21974:
----------------------------------------------

Ok, thanks for your reply. Is there any way to see what's under $LD_LIBRARY_PATH from within the Spark shell once it is already running? Also, wasn't netlib-java supposed to be part of Spark 2.2.0? I would imagine that at least the "NativeRefARPACK" issue wouldn't be there.

Thank you.
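
A minimal sketch (not from the original thread) of how this could be checked from an already-running pyspark session; sc is the SparkContext that the pyspark shell creates, and sc._jvm is an internal py4j handle rather than a supported API, so treat this purely as an illustration:

    import os

    # What the driver-side Python process sees:
    print(os.environ.get("LD_LIBRARY_PATH"))

    # What the driver JVM sees (the JVM is the process that loads ARPACK);
    # sc._jvm is an internal py4j gateway, used here only for illustration:
    print(sc._jvm.java.lang.System.getenv("LD_LIBRARY_PATH"))

    # Which netlib-java ARPACK implementation actually loaded, assuming the
    # com.github.fommil.netlib classes are on the classpath as the warnings imply:
    print(sc._jvm.com.github.fommil.netlib.ARPACK.getInstance().getClass().getName())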

> SVD computation results in failure to load NativeSystemARPACK and NativeRefARPACK
> ---------------------------------------------------------------------------------
>
>                 Key: SPARK-21974
>                 URL: https://issues.apache.org/jira/browse/SPARK-21974
>             Project: Spark
>          Issue Type: Wish
>          Components: MLlib
>    Affects Versions: 2.2.0
>            Reporter: Aleksandr Ovcharenko
>            Priority: Critical
>
> Hello guys,
> I am trying to compute an SVD with Spark 2.2.0 using the Python interface. However, I keep running into the following warnings:
> 17/09/11 11:41:26 WARN ARPACK: Failed to load implementation from: com.github.fommil.netlib.NativeSystemARPACK
> 17/09/11 11:41:26 WARN ARPACK: Failed to load implementation from: com.github.fommil.netlib.NativeRefARPACK
> I've tried to install ARPACK myself and set LD_LIBRARY_PATH accordingly before running pyspark. I've also tried passing "--packages com.github.fommil.netlib:all:1.1.2" so that netlib-java is picked up correctly.
> Is there anything I'm missing? Is it possible to see why native ARPACK is not being picked up even when LD_LIBRARY_PATH is set correctly? Could you suggest a way to get rid of either or both warnings? I'm taking a significant performance hit because of this.
> Thank you, I appreciate your response.
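
For reference, a minimal sketch (not part of the original report) of a PySpark SVD call along the lines described above; the matrix here is made up purely for illustration, and with data this small Spark may take a local LAPACK path instead of ARPACK, so a real-sized matrix may be needed to reproduce the warnings:

    from pyspark.mllib.linalg.distributed import RowMatrix

    # Tiny made-up dense matrix; real workloads would be far larger.
    rows = sc.parallelize([
        [1.0, 2.0, 3.0],
        [4.0, 5.0, 6.0],
        [7.0, 8.0, 9.0],
    ])
    mat = RowMatrix(rows)

    # computeSVD is available in the Python MLlib API as of Spark 2.2.0; the
    # NativeSystemARPACK / NativeRefARPACK warnings in the report were observed
    # around a call like this one.
    svd = mat.computeSVD(2, computeU=True)
    print(svd.s)   # singular values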



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org