Posted to issues@spark.apache.org by "Bruce Robbins (JIRA)" <ji...@apache.org> on 2018/01/02 18:24:00 UTC

[jira] [Created] (SPARK-22940) Test suite HiveExternalCatalogVersionsSuite fails on platforms that don't have wget installed

Bruce Robbins created SPARK-22940:
-------------------------------------

             Summary: Test suite HiveExternalCatalogVersionsSuite fails on platforms that don't have wget installed
                 Key: SPARK-22940
                 URL: https://issues.apache.org/jira/browse/SPARK-22940
             Project: Spark
          Issue Type: Bug
          Components: Tests
    Affects Versions: 2.2.1
         Environment: MacOS Sierra 10.12.6
            Reporter: Bruce Robbins
            Priority: Minor


On platforms that don't have wget installed (e.g., Mac OS X), test suite org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite throws an exception and aborts:

java.io.IOException: Cannot run program "wget": error=2, No such file or directory

HiveExternalCatalogVersionsSuite uses wget to download older versions of Spark for compatibility testing. First it uses wget to find a suitable mirror, and then it uses wget to download a tar file from that mirror.
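
For reference, the failure comes from shelling out to a binary that may not be installed. A minimal sketch of the wget-based approach (not the suite's exact code; the helper name and flags below are illustrative):

    // Sketch only: invoke wget as an external process.
    // This throws java.io.IOException("Cannot run program \"wget\"...")
    // when wget is not on the PATH, which is the failure reported here.
    import scala.sys.process._
    import java.io.File

    def downloadWithWget(url: String, targetDir: File): Unit = {
      val exitCode = Seq("wget", url, "-P", targetDir.getCanonicalPath).!
      require(exitCode == 0, s"wget failed with exit code $exitCode for $url")
    }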

There are several ways to fix this, listed from easiest to hardest to implement:

1. Require Mac OS X users to install wget if they wish to run unit tests (or at the very least if they wish to run HiveExternalCatalogVersionsSuite). Also, update documentation to make this requirement explicit.
2. Fall back on curl when wget is not available.
3. Use an HTTP library to query for a suitable mirror and download the tar file.

Number 2 is easy to implement, and I did so to get the unit test to run, but it still relies on another external program.
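
A minimal sketch of the curl fallback (option 2). This is illustrative rather than the exact change I made; the helper names are hypothetical:

    import scala.sys.process._
    import java.io.File

    // Returns true if `cmd` is on the PATH (exit code 0 from `command -v`).
    def commandAvailable(cmd: String): Boolean =
      Seq("sh", "-c", s"command -v $cmd").! == 0

    // Prefer wget, fall back to curl, and fail clearly if neither exists.
    def downloadFile(url: String, targetDir: File): Unit = {
      val out = new File(targetDir, url.split("/").last)
      val cmd =
        if (commandAvailable("wget")) Seq("wget", url, "-P", targetDir.getCanonicalPath)
        else if (commandAvailable("curl")) Seq("curl", "-L", url, "-o", out.getCanonicalPath)
        else sys.error("Neither wget nor curl is installed")
      require(cmd.! == 0, s"Download of $url failed")
    }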

Number 3 is probably slightly more complex to implement and requires more corner-case checking (e.g., handling redirects).
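
A rough sketch of option 3, using only the JDK's HttpURLConnection so no external binary is needed. Redirect and error handling are simplified, and the method name is hypothetical:

    import java.io.File
    import java.net.{HttpURLConnection, URL}
    import java.nio.file.{Files, StandardCopyOption}

    def httpDownload(url: String, target: File, maxRedirects: Int = 5): Unit = {
      var currentUrl = url
      var redirects = 0
      var done = false
      while (!done) {
        val conn = new URL(currentUrl).openConnection().asInstanceOf[HttpURLConnection]
        conn.setInstanceFollowRedirects(false)  // handle redirects manually
        conn.getResponseCode match {
          case 301 | 302 | 303 | 307 | 308 =>
            redirects += 1
            require(redirects <= maxRedirects, s"Too many redirects for $url")
            currentUrl = conn.getHeaderField("Location")
          case 200 =>
            val in = conn.getInputStream
            try Files.copy(in, target.toPath, StandardCopyOption.REPLACE_EXISTING)
            finally in.close()
            done = true
          case code =>
            sys.error(s"Unexpected HTTP response $code for $currentUrl")
        }
      }
    }

This removes the external dependency entirely, at the cost of re-implementing behavior (mirror selection, redirects, retries) that wget or curl provide for free.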




