Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2018/11/01 14:18:00 UTC

[jira] [Created] (SPARK-25908) Remove old deprecated items in Spark 3

Sean Owen created SPARK-25908:
---------------------------------

             Summary: Remove old deprecated items in Spark 3
                 Key: SPARK-25908
                 URL: https://issues.apache.org/jira/browse/SPARK-25908
             Project: Spark
          Issue Type: Task
          Components: Spark Core, SQL
    Affects Versions: 3.0.0
            Reporter: Sean Owen
            Assignee: Sean Owen


There are many deprecated methods and classes in Spark. They _can_ be removed in Spark 3, and those that have been deprecated for a long time (i.e. since Spark <= 2.0) probably should be. This issue addresses the easiest of those cases, items that are old and straightforward to remove (a short migration sketch for several of the user-facing items follows the list):

- Remove some AccumulableInfo.apply() methods
- Remove non-label-specific multiclass precision/recall/fScore in favor of accuracy
- Remove toDegrees/toRadians in favor of degrees/radians
- Remove approxCountDistinct in favor of approx_count_distinct
- Remove unused Python StorageLevel constants
- Remove Dataset unionAll in favor of union
- Remove unused multiclass option in libsvm parsing
- Remove references to deprecated Spark configs like spark.yarn.am.port
- Remove TaskContext.isRunningLocally
- Remove ShuffleMetrics.shuffle* methods
- Remove BaseReadWrite.context in favor of session
- Remove Column.!== in favor of =!=
- Remove Dataset.explode
- Remove Dataset.registerTempTable
- Remove SQLContext.getOrCreate, setActive, clearActive, constructors
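
For illustration, here is a minimal sketch (not part of the ticket itself) of how user code can migrate off several of the Dataset/SQL items above. The data and column/view names are made up; createOrReplaceTempView is the documented replacement for registerTempTable, and the rest are the renamings listed above.

{code:scala}
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{approx_count_distinct, degrees, radians}

object DeprecationMigrationSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("migration-sketch")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    val df = Seq((1, 0.5), (2, 1.0), (2, 1.5)).toDF("id", "angle")

    // Dataset.unionAll is removed; union behaves the same way
    // (resolves columns by position, keeps duplicates).
    val doubled = df.union(df)

    // approxCountDistinct is removed; use approx_count_distinct.
    doubled.agg(approx_count_distinct($"id")).show()

    // toDegrees/toRadians are removed; use degrees/radians.
    df.select(degrees($"angle"), radians($"angle")).show()

    // Column.!== is removed; use =!= for inequality tests.
    df.filter($"id" =!= 2).show()

    // Dataset.registerTempTable is removed; use createOrReplaceTempView.
    df.createOrReplaceTempView("angles")
    spark.sql("SELECT id, degrees(angle) AS deg FROM angles").show()

    spark.stop()
  }
}
{code}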

Not touched yet:

- everything else in MLlib
- HiveContext
- generally, anything deprecated more recently than 2.0.0



