Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2020/09/10 03:54:00 UTC
[jira] [Updated] (SPARK-32837) Leverage pip and setup.py configurations to pass Hadoop and Hive options in pip installation
[ https://issues.apache.org/jira/browse/SPARK-32837?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon updated SPARK-32837:
---------------------------------
Component/s: Build
> Leverage pip and setup.py configurations to pass Hadoop and Hive options in pip installation
> --------------------------------------------------------------------------------------------
>
> Key: SPARK-32837
> URL: https://issues.apache.org/jira/browse/SPARK-32837
> Project: Spark
> Issue Type: Improvement
> Components: Build, PySpark
> Affects Versions: 3.1.0
> Reporter: Hyukjin Kwon
> Priority: Major
>
> Currently, pip itself allows a custom option to be set via {{--install-option}}. However, when you pass this option to {{pip}}, it forwards the option to the installation of every dependency as well. Please also see https://github.com/pypa/pip/issues/1883
> It is very hacky, or simply impossible, to use this option to target a specific package only. The pip maintainers are discussing a more general mechanism.
> Once such a mechanism is available, we should remove the environment variables from {{setup.py}} and switch to proper pip options (see the sketch after this description).
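For context, here is a minimal sketch of the environment-variable approach described above. The variable names PYSPARK_HADOOP_VERSION and PYSPARK_HIVE_VERSION, their default values, and the package name are assumptions for illustration only, not the exact contents of Spark's real setup.py; the sketch only shows the mechanism that would be replaced once pip offers a per-package option.

    # sketch_setup.py -- illustrative only; variable names and defaults are
    # assumptions, not the exact code in Spark's setup.py.
    import os
    from setuptools import setup

    # Read the desired Hadoop/Hive versions at pip-install time. An environment
    # variable only affects this package's build, unlike pip's --install-option.
    hadoop_version = os.environ.get("PYSPARK_HADOOP_VERSION", "3.2")
    hive_version = os.environ.get("PYSPARK_HIVE_VERSION", "2.3")

    setup(
        name="pyspark-sketch",  # hypothetical package name
        version="0.1.0",
        description="Built against Hadoop %s / Hive %s" % (hadoop_version, hive_version),
        # A real setup.py would use these values to decide which pre-built
        # Spark distribution to download and bundle into the package.
    )

For contrast, a usage sketch of the two approaches (the --hadoop-version flag is hypothetical):

    # Environment variable: scoped to this one package's setup.py.
    PYSPARK_HADOOP_VERSION=2.7 pip install pyspark

    # --install-option: pip forwards it to every dependency's setup.py as well,
    # which is the problem discussed in pypa/pip issue 1883 above.
    pip install pyspark --install-option="--hadoop-version=2.7"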
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org