Posted to issues@spark.apache.org by "abhinav (JIRA)" <ji...@apache.org> on 2019/01/07 09:32:00 UTC

[jira] [Commented] (SPARK-13928) Move org.apache.spark.Logging into org.apache.spark.internal.Logging

    [ https://issues.apache.org/jira/browse/SPARK-13928?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16735604#comment-16735604 ] 

abhinav commented on SPARK-13928:
---------------------------------

I am using HBase 1.2, Spark 2.3, and Scala 2.11.11. While initializing HBaseContext (val hbaseContext = new HBaseContext(spark.sparkContext, hbaseConf)), I get the compiler error below:

Symbol 'type org.apache.spark.Logging' is missing from the classpath. This symbol is required by 'class
org.apache.hadoop.hbase.spark.HBaseContext'. Make sure that type Logging is in your classpath and check
for conflicting dependencies with -Ylog-classpath. A full rebuild may help if 'HBaseContext.class' was
compiled against an incompatible version of org.apache.spark.
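The error occurs because hbase-spark 1.2 was compiled against Spark 1.x, where the Logging trait lived at org.apache.spark.Logging; in Spark 2.x it was moved to org.apache.spark.internal.Logging and made private. The workaround the issue description suggests is to recreate a trait at the old fully qualified name in your own project. Below is a hedged, self-contained sketch of that shim pattern; the trait body here just prints (so the snippet compiles on its own), and the class name Job is only an illustration. In a real build you would put the trait in a file declaring `package org.apache.spark` and have it extend org.apache.spark.internal.Logging instead:

```scala
// Hypothetical shim sketch. In a real project, declare `package
// org.apache.spark` at the top of the file so the trait's fully qualified
// name matches what HBaseContext was compiled against, and delegate to
// org.apache.spark.internal.Logging rather than printing directly.
trait Logging {
  // Minimal stand-in for the logging methods the old trait exposed.
  protected def logInfo(msg: => String): Unit =
    println(s"INFO ${getClass.getName}: $msg")
}

// Illustrative consumer: a class compiled against the old trait name
// resolves against the shim and links without a full rebuild.
class Job extends Logging {
  def run(): String = {
    logInfo("running")
    "ok"
  }
}
```

Because org.apache.spark.internal.Logging is package-private to org.apache.spark, placing the shim inside that package is what lets it extend the internal trait; that is also why this must be compiled as a regular source file rather than pasted into a REPL.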

> Move org.apache.spark.Logging into org.apache.spark.internal.Logging
> --------------------------------------------------------------------
>
>                 Key: SPARK-13928
>                 URL: https://issues.apache.org/jira/browse/SPARK-13928
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>            Reporter: Reynold Xin
>            Assignee: Wenchen Fan
>            Priority: Major
>             Fix For: 2.0.0
>
>
> Logging was made private in Spark 2.0. If we move it, users would be able to create a Logging trait themselves to avoid changing their own code. Alternatively, we could provide a compatibility package that adds Logging.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org