Posted to issues@spark.apache.org by "Sean R. Owen (Jira)" <ji...@apache.org> on 2022/04/02 16:44:00 UTC
[jira] [Resolved] (SPARK-38661) [TESTS] Replace 'abc & Symbol("abc") symbols with $"abc" in tests
[ https://issues.apache.org/jira/browse/SPARK-38661?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean R. Owen resolved SPARK-38661.
----------------------------------
Fix Version/s: 3.4.0
Resolution: Fixed
Issue resolved by pull request 35976
[https://github.com/apache/spark/pull/35976]
> [TESTS] Replace 'abc & Symbol("abc") symbols with $"abc" in tests
> -----------------------------------------------------------------
>
> Key: SPARK-38661
> URL: https://issues.apache.org/jira/browse/SPARK-38661
> Project: Spark
> Issue Type: Improvement
> Components: Tests
> Affects Versions: 3.2.1
> Reporter: Martin Tzvetanov Grigorov
> Assignee: Martin Tzvetanov Grigorov
> Priority: Minor
> Fix For: 3.4.0
>
>
> This ticket is a follow up of SPARK-38351.
>
> When building with Scala 2.13, many test classes produce warnings like:
> {code:java}
> [warn] /home/runner/work/spark/spark/sql/core/src/test/scala/org/apache/spark/sql/execution/BaseScriptTransformationSuite.scala:562:11: [deprecation @ | origin= | version=2.13.0] symbol literal is deprecated; use Symbol("d") instead
> [warn] 'd.cast("string"),
> [warn] ^
> [warn] /home/runner/work/spark/spark/sql/core/src/test/scala/org/apache/spark/sql/execution/BaseScriptTransformationSuite.scala:563:11: [deprecation @ | origin= | version=2.13.0] symbol literal is deprecated; use Symbol("e") instead
> [warn] 'e.cast("string")).collect())
> {code}
> For easier migration to Scala 3.x later, it would be good to fix these warnings!
>
> Also, as suggested by [https://github.com/HeartSaVioR], it would be good to use Spark's $"abc" syntax for columns.
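> A minimal sketch of the migration described above, using hypothetical column names (the Spark-specific forms are shown in comments, since they need a SparkSession and `import spark.implicits._`; the Symbol construction itself is plain Scala):
> {code:scala}
> object SymbolMigration {
>   def main(args: Array[String]): Unit = {
>     // Deprecated symbol literal (warns on Scala 2.13, removed in Scala 3):
>     //   df.select('d.cast("string"))
>     // Verbose replacement suggested by the compiler:
>     //   df.select(Symbol("d").cast("string"))
>     // Preferred Spark string interpolator:
>     //   df.select($"d".cast("string"))
>     val col = Symbol("d") // Symbol.apply builds the same value as the literal 'd
>     println(col.name)     // prints: d
>   }
> }
> {code}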
--
This message was sent by Atlassian Jira
(v8.20.1#820001)