Posted to issues@spark.apache.org by "Yang Jie (Jira)" <ji...@apache.org> on 2023/10/26 13:36:00 UTC
[jira] [Resolved] (SPARK-45659) Add `since` field to Java API marked as `@Deprecated`.
[ https://issues.apache.org/jira/browse/SPARK-45659?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Yang Jie resolved SPARK-45659.
------------------------------
Fix Version/s: 4.0.0
Resolution: Fixed
Issue resolved by pull request 43522
[https://github.com/apache/spark/pull/43522]
> Add `since` field to Java API marked as `@Deprecated`.
> ------------------------------------------------------
>
> Key: SPARK-45659
> URL: https://issues.apache.org/jira/browse/SPARK-45659
> Project: Spark
> Issue Type: Improvement
> Components: Spark Core, SQL, SS
> Affects Versions: 4.0.0
> Reporter: Yang Jie
> Assignee: Yang Jie
> Priority: Minor
> Labels: pull-request-available
> Fix For: 4.0.0
>
>
> Spark 3.0.0:
> - SPARK-26861
>   - org.apache.spark.sql.expressions.javalang.typed
> - SPARK-27606
>   - org.apache.spark.sql.catalyst.expressions.ExpressionDescription#extended
>   - org.apache.spark.sql.catalyst.expressions.ExpressionInfo#ExpressionInfo(String, String, String, String, String)
> Spark 3.2.0:
> - SPARK-33717
>   - org.apache.spark.launcher.SparkLauncher#DEPRECATED_CHILD_CONNECTION_TIMEOUT
> - SPARK-33779
>   - org.apache.spark.sql.connector.write.WriteBuilder#buildForBatch
>   - org.apache.spark.sql.connector.write.WriteBuilder#buildForStreaming
> Spark 3.4.0:
> - SPARK-39805
>   - org.apache.spark.sql.streaming.Trigger
> - SPARK-42398
>   - org.apache.spark.sql.connector.catalog.TableCatalog#createTable(Identifier, StructType, Transform[], Map<String,String>)
>   - org.apache.spark.sql.connector.catalog.StagingTableCatalog#stageCreate(Identifier, StructType, Transform[], Map<String,String>)
>   - org.apache.spark.sql.connector.catalog.Table#schema
>
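For context, Java 9 added the `since` and `forRemoval` elements to `java.lang.Deprecated`, and the change tracked here fills in `since` on the deprecated Spark APIs listed above. The sketch below is a standalone illustration (the class and method names are hypothetical, not Spark source) of declaring the element and reading it back via reflection:

```java
// Illustrative sketch: the Java 9+ @Deprecated `since` element, declared on a
// placeholder method and recovered at runtime. @Deprecated has RUNTIME
// retention, so the value is visible via reflection.
public class DeprecatedSinceDemo {

    // Hypothetical API; the version string mirrors how SPARK-45659 records
    // the release in which an API was deprecated.
    @Deprecated(since = "3.4.0")
    public static void oldApi() {}

    // Returns the `since` value recorded on the named public method.
    public static String sinceOf(String methodName) throws NoSuchMethodException {
        Deprecated d = DeprecatedSinceDemo.class
                .getMethod(methodName)
                .getAnnotation(Deprecated.class);
        return d.since();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(sinceOf("oldApi")); // prints "3.4.0"
    }
}
```

Before Java 9, `@Deprecated` carried no version information, which is why older deprecations like those from Spark 3.0.0 had nothing machine-readable to say when the deprecation happened.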
--
This message was sent by Atlassian Jira
(v8.20.10#820010)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org