Posted to issues@spark.apache.org by "Sean R. Owen (Jira)" <ji...@apache.org> on 2021/08/27 19:31:00 UTC

[jira] [Resolved] (SPARK-35151) Suppress `symbol literal is deprecated` compilation warnings in Scala 2.13

     [ https://issues.apache.org/jira/browse/SPARK-35151?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean R. Owen resolved SPARK-35151.
----------------------------------
    Resolution: Duplicate

Not exactly a duplicate, but I think this is really one issue to resolve, one way or another.

> Suppress `symbol literal is deprecated` compilation warnings in Scala 2.13
> --------------------------------------------------------------------------
>
>                 Key: SPARK-35151
>                 URL: https://issues.apache.org/jira/browse/SPARK-35151
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Build
>    Affects Versions: 3.2.0
>            Reporter: Yang Jie
>            Priority: Minor
>
> Add compiler args to suppress compilation warnings such as the following:
>  
> {code:java}
> [warn] /home/kou/work/oss/spark-scala-2.13/examples/src/main/scala/org/apache/spark/examples/sql/SimpleTypedAggregator.scala:34:38: [deprecation @  | origin= | version=2.13.0] symbol literal is deprecated; use Symbol("id") instead
> [warn]     val ds = spark.range(20).select(('id % 3).as("key"), 'id).as[(Long, Long)]
> [warn]                                      ^
> [warn] /home/kou/work/oss/spark-scala-2.13/examples/src/main/scala/org/apache/spark/examples/sql/SimpleTypedAggregator.scala:34:58: [deprecation @  | origin= | version=2.13.0] symbol literal is deprecated; use Symbol("id") instead
> [warn]     val ds = spark.range(20).select(('id % 3).as("key"), 'id).as[(Long, Long)]
> {code}
>  
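> A minimal sketch of the kind of compiler option this could add (assuming an sbt build and Scala 2.13.2+, where the `-Wconf` flag is available; the exact filter below is illustrative, not the actual Spark build change):
>  
> {code:scala}
> // build.sbt -- silence only the symbol-literal deprecation warning,
> // leaving all other deprecation warnings visible.
> scalacOptions += "-Wconf:cat=deprecation&msg=symbol literal is deprecated:s"
> {code}
>  
> Alternatively, the call sites could be migrated instead of suppressed, e.g. rewriting `'id` as `Symbol("id")` (or Spark's `$"id"` column syntax) as the warning message suggests.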



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org