Posted to issues@spark.apache.org by "Reynold Xin (JIRA)" <ji...@apache.org> on 2017/10/20 19:00:00 UTC
[jira] [Deleted] (SPARK-22325) SPARK_TESTING env variable breaking non-spark builds on amplab jenkins
[ https://issues.apache.org/jira/browse/SPARK-22325?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Reynold Xin deleted SPARK-22325:
--------------------------------
> SPARK_TESTING env variable breaking non-spark builds on amplab jenkins
> ----------------------------------------------------------------------
>
> Key: SPARK-22325
> URL: https://issues.apache.org/jira/browse/SPARK-22325
> Project: Spark
> Issue Type: Bug
> Environment: riselab jenkins, all workers (ubuntu & centos)
> Reporter: shane knapp
> Priority: Critical
>
> in the riselab jenkins master config, the SPARK_TESTING environment variable is set to 1 and applied to all workers.
> see: https://amplab.cs.berkeley.edu/jenkins/view/RISELab%20Infra/job/testing-foo/9/console (the 'echo 1' is actually 'echo $SPARK_TESTING')
> and: https://amplab.cs.berkeley.edu/jenkins/job/testing-foo/10/injectedEnvVars/
> this is problematic: some of our lab builds run pyspark as part of their build process, and the hard-coded checks for SPARK_TESTING in the setup scripts cause hard failures.
> see: https://amplab.cs.berkeley.edu/jenkins/job/ADAM-prb/2440/HADOOP_VERSION=2.6.2,SCALAVER=2.11,SPARK_VERSION=2.2.0,label=centos/consoleFull
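the failure mode can be sketched with a minimal shell snippet (the global injection of SPARK_TESTING=1 is real per this ticket; the branching script below is a hypothetical stand-in for the hard-coded checks in pyspark's setup scripts, not the actual spark code):

```shell
#!/bin/sh
# Simulate the jenkins master config injecting SPARK_TESTING=1
# into every build on every worker.
SPARK_TESTING=1
export SPARK_TESTING

# Any script on the worker that branches on SPARK_TESTING now takes
# the test-only path -- even in non-spark builds like ADAM.
if [ -n "$SPARK_TESTING" ]; then
    echo "spark-test-mode"
else
    echo "normal-mode"
fi
```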
> i would strongly suggest that we do the following:
> * remove the SPARK_TESTING environment variable declaration in the jenkins config
> * add the environment variable to each spark build config in github: https://github.com/databricks/spark-jenkins-configurations/
> * add the environment variable to SparkPullRequestBuilder and NewSparkPullRequestBuilder
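a minimal sketch of the per-job scoping suggested above, as a jenkins build-step config fragment (the build-step layout and the use of dev/run-tests as the entry point are assumptions about the spark job configs, not taken from this ticket):

```shell
# Hypothetical shell build step for a spark job's jenkins config.
# SPARK_TESTING is exported only inside this job's own build step,
# so non-spark builds on the same workers never inherit it.
export SPARK_TESTING=1
./dev/run-tests   # spark's test entry point in the source tree
```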
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org