Posted to issues@spark.apache.org by "Felix Cheung (JIRA)" <ji...@apache.org> on 2017/01/18 17:54:26 UTC

[jira] [Resolved] (SPARK-19231) SparkR hangs when there is download or untar failure

     [ https://issues.apache.org/jira/browse/SPARK-19231?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Felix Cheung resolved SPARK-19231.
----------------------------------
          Resolution: Fixed
       Fix Version/s: 2.2.0
                      2.1.1
    Target Version/s: 2.1.1, 2.2.0

> SparkR hangs when there is download or untar failure
> ----------------------------------------------------
>
>                 Key: SPARK-19231
>                 URL: https://issues.apache.org/jira/browse/SPARK-19231
>             Project: Spark
>          Issue Type: Bug
>          Components: SparkR
>    Affects Versions: 2.1.0
>            Reporter: Felix Cheung
>            Assignee: Felix Cheung
>             Fix For: 2.1.1, 2.2.0
>
>
> When a download is partial or otherwise fails, the partially downloaded tarball is not cleaned up, so a subsequent sparkR.session() picks up the broken install and hangs with no error message (see the sketch after the logs below).
> {code}
> > sparkR.session()
> Spark not found in SPARK_HOME:
> Spark not found in the cache directory. Installation will start.
> MirrorUrl not provided.
> Looking for preferred site from apache website...
> Preferred mirror site found: http://www-eu.apache.org/dist/spark
> Downloading spark-2.1.0 for Hadoop 2.7 from:
> - http://www-eu.apache.org/dist/spark/spark-2.1.0/spark-2.1.0-bin-hadoop2.7.tgz
> trying URL 'http://www-eu.apache.org/dist/spark/spark-2.1.0/spark-2.1.0-bin-hadoop2.7.tgz'
> Content type 'application/x-gzip' length 195636829 bytes (186.6 MB)
> downloaded 31.9 MB
>  
> Installing to C:\Users\felix\AppData\Local\spark\spark\Cache
> Error in untar2(tarfile, files, list, exdir) : incomplete block on file
> In addition: Warning message:
> In download.file(remotePath, localPath) :
>   downloaded length 33471940 != reported length 195636829
> > sparkR.session()
> Spark not found in SPARK_HOME:
> spark-2.1.0 for Hadoop 2.7 found, setting SPARK_HOME to C:\Users\felix\AppData\Local\spark\spark\Cache/spark-2.1.0-bin-hadoop2.7
> Launching java with spark-submit command C:\Users\felix\AppData\Local\spark\spark\Cache/spark-2.1.0-bin-hadoop2.7/bin/spark-submit2.cmd   sparkr-shell C:\Users\felix\AppData\Local\Temp\RtmpCqNdne\backend_port16d04191e7
> {code}
> {code}
> Directory of C:\Users\felix\AppData\Local\spark\spark\Cache
> 01/13/2017  11:25 AM    <DIR>          spark-2.1.0-bin-hadoop2.7
> 01/13/2017  11:25 AM        33,471,940 spark-2.1.0-bin-hadoop2.7.tgz
> {code}
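>
> A hedged sketch of the cleanup behavior the fix needs, illustrative only and not the actual install.spark() patch; the helper name robustDownload and its arguments are assumptions:
> {code}
> # Illustrative R sketch: wrap download and extraction so a failed or partial
> # download does not leave a corrupt tarball or directory behind in the cache.
> # robustDownload, packageLocalPath and packageLocalDir are assumed names,
> # not the actual SparkR implementation.
> robustDownload <- function(remotePath, packageLocalPath, packageLocalDir) {
>   tryCatch({
>     download.file(remotePath, packageLocalPath)
>     # A truncated download typically surfaces here as an untar error.
>     untar(tarfile = packageLocalPath, exdir = packageLocalDir)
>   }, error = function(e) {
>     # Remove partial artifacts so the next sparkR.session() retries the
>     # download instead of reusing a broken install and hanging.
>     unlink(packageLocalPath, force = TRUE)
>     unlink(packageLocalDir, recursive = TRUE, force = TRUE)
>     stop("Download or untar failed from ", remotePath, ": ", conditionMessage(e))
>   })
> }
> {code}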



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org