Posted to issues@spark.apache.org by "liyunzhang (JIRA)" <ji...@apache.org> on 2017/11/30 08:07:00 UTC

[jira] [Created] (SPARK-22661) Fix the putAll compile error when compiling with scala-2.12 and jdk9

liyunzhang created SPARK-22661:
----------------------------------

             Summary: Fix the putAll compile error when compiling with scala-2.12 and jdk9
                 Key: SPARK-22661
                 URL: https://issues.apache.org/jira/browse/SPARK-22661
             Project: Spark
          Issue Type: Improvement
          Components: Build
    Affects Versions: 2.2.0
            Reporter: liyunzhang


Building on SPARK-22660, compiling with Scala 2.12 and JDK 9 produces the following error:
{code}
[error] /home/zly/prj/oss/jdk9_HOS_SOURCE/spark_source/sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/ScriptTransformationExec.scala:415: ambiguous reference to overloaded definition,
[error] both method putAll in class Properties of type (x$1: java.util.Map[_, _])Unit
[error] and  method putAll in class Hashtable of type (x$1: java.util.Map[_ <: Object, _ <: Object])Unit
[error] match argument types (java.util.Map[String,String])
[error]     properties.putAll(propsMap.asJava)
[error]                ^
[error] /home/zly/prj/oss/jdk9_HOS_SOURCE/spark_source/sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/ScriptTransformationExec.scala:427: ambiguous reference to overloaded definition,
[error] both method putAll in class Properties of type (x$1: java.util.Map[_, _])Unit
[error] and  method putAll in class Hashtable of type (x$1: java.util.Map[_ <: Object, _ <: Object])Unit
[error] match argument types (java.util.Map[String,String])
[error]       props.putAll(outputSerdeProps.toMap.asJava)
[error]             ^

{code}
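
For context, a minimal standalone reproduction of the ambiguity (a hypothetical snippet, not from the Spark tree): on JDK 9, java.util.Properties adds its own putAll(Map[_, _]) override alongside the inherited Hashtable.putAll, and Scala 2.12's overload resolution cannot choose between the two.

{code}
import java.util.Properties
import scala.collection.JavaConverters._

object PutAllRepro {
  def main(args: Array[String]): Unit = {
    // hypothetical sample data standing in for the SerDe properties
    val propsMap: Map[String, String] = Map("field.delim" -> "\t")
    val properties = new Properties()
    // compiles on JDK 8, but fails on Scala 2.12 + JDK 9 with
    // "ambiguous reference to overloaded definition"
    properties.putAll(propsMap.asJava)
  }
}
{code}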

Resolving the call through the inherited Hashtable overload would treat the key type as Object instead of String, which is unsafe.
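
A minimal sketch of one possible fix, assuming the intent is to keep String-typed keys (this may differ from the patch that was ultimately merged): replace putAll with setProperty, which is declared only on Properties with a single String-only signature, so the call is unambiguous.

{code}
import java.util.Properties

object PutAllWorkaround {
  def main(args: Array[String]): Unit = {
    // hypothetical sample data standing in for the SerDe properties
    val propsMap: Map[String, String] = Map("field.delim" -> "\t")
    val properties = new Properties()
    // setProperty(String, String) exists only on Properties, so it compiles
    // unambiguously on Scala 2.12 + JDK 9 and keeps keys typed as String
    propsMap.foreach { case (k, v) => properties.setProperty(k, v) }
    println(properties)
  }
}
{code}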



