Posted to issues@spark.apache.org by "Shane Knapp (Jira)" <ji...@apache.org> on 2020/01/09 22:01:00 UTC
[jira] [Updated] (SPARK-29988) Adjust Jenkins jobs for `hive-1.2/2.3` combination
[ https://issues.apache.org/jira/browse/SPARK-29988?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Shane Knapp updated SPARK-29988:
--------------------------------
Attachment: Screen Shot 2020-01-09 at 1.59.25 PM.png
> Adjust Jenkins jobs for `hive-1.2/2.3` combination
> --------------------------------------------------
>
> Key: SPARK-29988
> URL: https://issues.apache.org/jira/browse/SPARK-29988
> Project: Spark
> Issue Type: Sub-task
> Components: Project Infra
> Affects Versions: 3.0.0
> Reporter: Dongjoon Hyun
> Assignee: Shane Knapp
> Priority: Major
> Attachments: Screen Shot 2020-01-09 at 1.59.25 PM.png
>
>
> We need to rename the following Jenkins jobs first.
> spark-master-test-sbt-hadoop-2.7 -> spark-master-test-sbt-hadoop-2.7-hive-1.2
> spark-master-test-sbt-hadoop-3.2 -> spark-master-test-sbt-hadoop-3.2-hive-2.3
> spark-master-test-maven-hadoop-2.7 -> spark-master-test-maven-hadoop-2.7-hive-1.2
> spark-master-test-maven-hadoop-3.2 -> spark-master-test-maven-hadoop-3.2-hive-2.3
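A rename like the above can also be done from the Jenkins master's filesystem by renaming the job directory under `$JENKINS_HOME/jobs` and reloading the configuration (simulated below in a temporary directory; the filesystem approach and default layout are assumptions, and renaming through the web UI works just as well):

```shell
# Simulate $JENKINS_HOME/jobs in a temp dir; on a real master you would
# operate on the actual directory and then reload the configuration.
JOBS=$(mktemp -d)
mkdir "$JOBS/spark-master-test-sbt-hadoop-2.7"

# Renaming the job directory renames the job (after a config reload).
mv "$JOBS/spark-master-test-sbt-hadoop-2.7" \
   "$JOBS/spark-master-test-sbt-hadoop-2.7-hive-1.2"

ls "$JOBS"   # prints spark-master-test-sbt-hadoop-2.7-hive-1.2
```

The same `mv` pattern applies to the other three jobs listed above.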
> Also, we need to add `-Phive-1.2` for the existing `hadoop-2.7` jobs.
> {code}
> -Phive \
> + -Phive-1.2 \
> {code}
> And, we need to add `-Phive-2.3` for the existing `hadoop-3.2` jobs.
> {code}
> -Phive \
> + -Phive-2.3 \
> {code}
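The pairing described above, Hadoop profile to Hive profile, can be sketched as a small shell helper (the profile names come from this ticket; the helper function itself is hypothetical, not part of the Jenkins job configuration):

```shell
# Map a Hadoop build profile to the Hive profile flag this ticket adds.
# hadoop-2.7 jobs pin the Hive 1.2 fork; hadoop-3.2 jobs use Hive 2.3.
hive_profile_for() {
  case "$1" in
    hadoop-2.7) echo "-Phive-1.2" ;;
    hadoop-3.2) echo "-Phive-2.3" ;;
  esac
}

hive_profile_for hadoop-2.7   # prints -Phive-1.2
hive_profile_for hadoop-3.2   # prints -Phive-2.3
```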
> For now, I added the above `-Phive-1.2` and `-Phive-2.3` flags to the Jenkins jobs manually. (This should be added to the SCM of the AmpLab Jenkins configuration.)
> After SPARK-29981, we need to create two new jobs.
> - spark-master-test-sbt-hadoop-2.7-hive-2.3
> - spark-master-test-maven-hadoop-2.7-hive-2.3
> This is in preparation for Apache Spark 3.0.0.
> We may drop all `*-hive-1.2` jobs in Apache Spark 3.1.0.
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org