Posted to issues@hbase.apache.org by "Mate Szalay-Beko (Jira)" <ji...@apache.org> on 2021/01/28 10:54:00 UTC
[jira] [Commented] (HBASE-24815) hbase-connectors mvn install error
[ https://issues.apache.org/jira/browse/HBASE-24815?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17273496#comment-17273496 ]
Mate Szalay-Beko commented on HBASE-24815:
------------------------------------------
I just faced this build issue when trying to build hbase-connectors with Spark 3 and Scala 2.12.
I think the issue is caused by https://issues.apache.org/jira/browse/SPARK-26043
I'll try to fix this by copying these now-private utility functions from the Spark class. That way we could compile against both Spark 2 and Spark 3.
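A minimal sketch of that idea (not the actual patch; the object name and the specific Hadoop calls are assumptions for illustration) would replace the SparkHadoopUtil calls with plain Hadoop security APIs, which are public in every Spark version:
{code:scala}
package org.apache.hadoop.hbase.spark

import org.apache.hadoop.security.{Credentials, UserGroupInformation}

// Hypothetical local helper that copies the small pieces of functionality
// hbase-spark needs from the now-private org.apache.spark.deploy.SparkHadoopUtil.
private[spark] object HBaseSparkHadoopUtil {

  // Credentials (e.g. delegation tokens) of the user running the driver/executor.
  def getCurrentUserCredentials(): Credentials =
    UserGroupInformation.getCurrentUser.getCredentials

  // Merge additional credentials into the current user.
  def addCurrentUserCredentials(creds: Credentials): Unit =
    UserGroupInformation.getCurrentUser.addCredentials(creds)
}
{code}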
> hbase-connectors mvn install error
> ----------------------------------
>
> Key: HBASE-24815
> URL: https://issues.apache.org/jira/browse/HBASE-24815
> Project: HBase
> Issue Type: Bug
> Components: hbase-connectors
> Reporter: leookok
> Assignee: Mate Szalay-Beko
> Priority: Blocker
>
> *when running this maven command line*
> mvn -Dspark.version=2.2.2 -Dscala.version=2.11.7 -Dscala.binary.version=2.11 -Dcheckstyle.skip=true -Dmaven.test.skip=true clean install
> it returns this error:
> {color:red}[ERROR]{color} [Error] F:\hbase-connectors\spark\hbase-spark\src\main\scala\org\apache\hadoop\hbase\spark\datasources\HBaseTableScanRDD.scala:216: overloaded method value addTaskCompletionListener with alternatives:
> (f: org.apache.spark.TaskContext => Unit)org.apache.spark.TaskContext <and>
> (listener: org.apache.spark.util.TaskCompletionListener)org.apache.spark.TaskContext
> does not take type parameters
> {color:red}[ERROR] {color}one error found
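> For context, a minimal sketch of the incompatibility (the wrapper method below is hypothetical): Spark 2.4 turned the function overload into a generic addTaskCompletionListener[U](f: TaskContext => U), so a call that passes a type argument cannot compile against the 2.2.x signatures listed above.
> {code:scala}
> import org.apache.spark.TaskContext
>
> def closeOnTaskCompletion(context: TaskContext, closeable: AutoCloseable): Unit = {
>   // Compiles with Spark 2.4+, where the function overload is generic.
>   context.addTaskCompletionListener[Unit](_ => closeable.close())
>
>   // Spark 2.2.x only offers addTaskCompletionListener(f: TaskContext => Unit),
>   // which "does not take type parameters", hence the error above. There the
>   // call would have to be written without the type argument:
>   // context.addTaskCompletionListener((_: TaskContext) => closeable.close())
> }
> {code}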
> *but using spark.version=2.4.0 is ok:*
> mvn -Dspark.version=2.4.0 -Dscala.version=2.11.7 -Dscala.binary.version=2.11 -Dcheckstyle.skip=true -Dmaven.test.skip=true clean install
>
> *another try:*
> mvn -Dspark.version=3.0.0 -Dscala.version=2.12.12 -Dscala.binary.version=2.12 -Dcheckstyle.skip=true -Dmaven.test.skip=true clean install
> returns these errors:
> {color:red}[ERROR]{color} [Error] F:\hbase-connectors\spark\hbase-spark\src\main\scala\org\apache\hadoop\hbase\spark\HBaseContext.scala:439: object SparkHadoopUtil in package deploy cannot be accessed in package org.apache.spark.deploy
> [ERROR] [Error] F:\hbase-connectors\spark\hbase-spark\src\main\scala\org\apache\hadoop\hbase\spark\HBaseContext.scala:487: not found: value SparkHadoopUtil
> {color:red}[ERROR]{color} two errors found
> Looking at the [spark source @github|https://github.com/apache/spark/blob/e1ea806b3075d279b5f08a29fe4c1ad6d3c4191a/core/src/main/scala/org/apache/spark/deploy/SparkHadoopUtil.scala],
> SparkHadoopUtil is now defined as private[spark]:
> {code:java}
> private[spark] class SparkHadoopUtil extends Logging {}
> {code}
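> Since the hbase-spark classes live under org.apache.hadoop.hbase.spark rather than org.apache.spark, they can no longer reference it. A hypothetical call site showing the kind of reference that now fails:
> {code:scala}
> package org.apache.hadoop.hbase.spark
>
> import org.apache.spark.deploy.SparkHadoopUtil
>
> object Example {
>   // Fails with Spark 3.x: SparkHadoopUtil is private[spark], and this
>   // package is outside org.apache.spark, so the object cannot be accessed.
>   val hadoopConf = SparkHadoopUtil.get.conf
> }
> {code}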
--
This message was sent by Atlassian Jira
(v8.3.4#803005)