Posted to issues@hbase.apache.org by "leookok (Jira)" <ji...@apache.org> on 2020/08/04 08:57:00 UTC

[jira] [Created] (HBASE-24815) hbase-connectors mvn install error

leookok created HBASE-24815:
-------------------------------

             Summary: hbase-connectors mvn install error
                 Key: HBASE-24815
                 URL: https://issues.apache.org/jira/browse/HBASE-24815
             Project: HBase
          Issue Type: Bug
          Components: hbase-connectors
            Reporter: leookok


*When running the following Maven command line*

mvn -Dspark.version=2.2.2 -Dscala.version=2.11.7 -Dscala.binary.version=2.11 -Dcheckstyle.skip=true -Dmaven.test.skip=true clean install

the build fails with:

{color:red}[ERROR]{color} [Error] F:\hbase-connectors\spark\hbase-spark\src\main\scala\org\apache\hadoop\hbase\spark\datasources\HBaseTableScanRDD.scala:216: overloaded method value addTaskCompletionListener with alternatives:
  (f: org.apache.spark.TaskContext => Unit)org.apache.spark.TaskContext <and>
  (listener: org.apache.spark.util.TaskCompletionListener)org.apache.spark.TaskContext
 does not take type parameters
{color:red}[ERROR] {color}one error found
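The generic overload {{addTaskCompletionListener[U](f: TaskContext => U)}} was only added in later Spark releases; Spark 2.2's {{TaskContext}} exposes just the two non-generic alternatives shown in the error, so a call that passes an explicit type parameter cannot compile there. A minimal sketch of the kind of change that avoids the error (the surrounding names are hypothetical, not the actual code at HBaseTableScanRDD.scala:216):

{code:scala}
import org.apache.spark.TaskContext

// Fails against Spark 2.2 — the method takes no type parameters there:
//   context.addTaskCompletionListener[Unit] { ctx => closeResources() }

// Compiles against Spark 2.2 (and later), matching the
// (f: TaskContext => Unit) overload from the error message:
context.addTaskCompletionListener { ctx: TaskContext =>
  closeResources()  // hypothetical cleanup for this partition's scanner
}
{code}

In other words, building the current hbase-spark sources against an older Spark than the one the POM targets surfaces this API difference; either drop the type parameter in the source or build against a Spark version that provides the generic overload.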
 
*Another attempt*
mvn -Dspark.version=3.0.0 -Dscala.version=2.12.12 -Dscala.binary.version=2.12  -Dcheckstyle.skip=true -Dmaven.test.skip=true clean install

the build fails with:

{color:red}[ERROR]{color} [Error] F:\projects\git-hub\hbase-connectors\spark\hbase-spark\src\main\scala\org\apache\hadoop\hbase\spark\HBaseContext.scala:439: object SparkHadoopUtil in package deploy cannot be accessed in package org.apache.spark.deploy
[ERROR] [Error] F:\projects\git-hub\hbase-connectors\spark\hbase-spark\src\main\scala\org\apache\hadoop\hbase\spark\HBaseContext.scala:487: not found: value SparkHadoopUtil
{color:red}[ERROR]{color} two errors found
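These two errors come from the opposite direction: in Spark 3.0, {{org.apache.spark.deploy.SparkHadoopUtil}} is no longer accessible from outside the {{org.apache.spark}} package, so {{HBaseContext}} can no longer reach it. If the connector only needs the current user's Hadoop credentials, one possible workaround (a sketch, not the actual fix adopted upstream) is to query Hadoop security directly instead of going through Spark:

{code:scala}
import org.apache.hadoop.security.{Credentials, UserGroupInformation}

// Hypothetical replacement for a SparkHadoopUtil-based lookup:
// fetch the current user's delegation tokens straight from Hadoop,
// which stays public regardless of Spark's package-private changes.
val credentials: Credentials =
  UserGroupInformation.getCurrentUser.getCredentials
{code}

Whether this is sufficient depends on what HBaseContext.scala:439 and :487 actually use {{SparkHadoopUtil}} for; the general point is that hbase-connectors needs source-level changes before it can build against Spark 3.0.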

--
This message was sent by Atlassian Jira
(v8.3.4#803005)