Posted to issues@spark.apache.org by "ASF GitHub Bot (Jira)" <ji...@apache.org> on 2023/10/28 09:50:00 UTC

[jira] [Updated] (SPARK-45690) Clean up type use of `BufferedIterator/CanBuildFrom/Traversable`

     [ https://issues.apache.org/jira/browse/SPARK-45690?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

ASF GitHub Bot updated SPARK-45690:
-----------------------------------
    Labels: pull-request-available  (was: )

> Clean up type use of `BufferedIterator/CanBuildFrom/Traversable`
> ----------------------------------------------------------------
>
>                 Key: SPARK-45690
>                 URL: https://issues.apache.org/jira/browse/SPARK-45690
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Spark Core
>    Affects Versions: 4.0.0
>            Reporter: Yang Jie
>            Priority: Major
>              Labels: pull-request-available
>
> * type BufferedIterator in package scala is deprecated (since 2.13.0)
> * type CanBuildFrom in package generic is deprecated (since 2.13.0)
> * type Traversable in package scala is deprecated (since 2.13.0)
>  
> {code:java}
> [warn] /Users/yangjie01/SourceCode/git/spark-mine-sbt/sql/core/src/main/scala/org/apache/spark/sql/execution/GroupedIterator.scala:67:12: type BufferedIterator in package scala is deprecated (since 2.13.0): Use scala.collection.BufferedIterator instead of scala.BufferedIterator
> [warn] Applicable -Wconf / @nowarn filters for this warning: msg=<part of the message>, cat=deprecation, site=org.apache.spark.sql.execution.GroupedIterator.input, origin=scala.BufferedIterator, version=2.13.0
> [warn]     input: BufferedIterator[InternalRow],
> [warn]            ^
> {code}
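> For reference, a minimal sketch of the replacements the deprecation messages point to (illustrative only; the object and method names below — DeprecatedTypeCleanupSketch, firstOrElse, total, duplicate — are hypothetical and not taken from the actual Spark patch):
> {code:scala}
> // Scala 2.13 replacements for the deprecated aliases (hypothetical example).
> import scala.collection.{BufferedIterator, BuildFrom}
>
> object DeprecatedTypeCleanupSketch {
>   // scala.BufferedIterator is deprecated; use the scala.collection alias instead.
>   def firstOrElse(input: BufferedIterator[Int], default: Int): Int =
>     if (input.hasNext) input.head else default
>
>   // scala.Traversable is deprecated; Iterable is the usual replacement.
>   def total(xs: Iterable[Int]): Int = xs.sum
>
>   // scala.collection.generic.CanBuildFrom is deprecated;
>   // scala.collection.BuildFrom plays the same role in 2.13.
>   def duplicate[C](xs: List[Int])(implicit bf: BuildFrom[List[Int], Int, C]): C =
>     bf.fromSpecific(xs)(xs.iterator ++ xs.iterator)
>
>   def main(args: Array[String]): Unit = {
>     println(firstOrElse(Iterator(1, 2, 3).buffered, 0)) // 1
>     println(total(Seq(4, 5, 6)))                        // 15
>     println(duplicate(List(7, 8)))                      // List(7, 8, 7, 8)
>   }
> }
> {code}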



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org