Posted to issues@spark.apache.org by "holdenk (JIRA)" <ji...@apache.org> on 2016/07/28 18:03:20 UTC
[jira] [Created] (SPARK-16773) Post Spark 2.0 deprecation cleanup
holdenk created SPARK-16773:
-------------------------------
Summary: Post Spark 2.0 deprecation cleanup
Key: SPARK-16773
URL: https://issues.apache.org/jira/browse/SPARK-16773
Project: Spark
Issue Type: Improvement
Components: ML, MLlib, PySpark, Spark Core, SQL
Reporter: holdenk
As part of the 2.0 release we deprecated a number of internal components (one of the largest being the old accumulator API), and also upgraded our default build to Scala 2.11.
This has added a large number of deprecation warnings, both internal and external. Some of these can be worked around, while others cannot (mostly those arising from the Scala 2.10 -> 2.11 reflection API changes and various tests).
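For context, the old accumulator API was superseded by the AccumulatorV2-based helpers in 2.0. A minimal migration sketch (assuming a live SparkContext `sc`, e.g. in spark-shell; the variable names are illustrative only):

```scala
// Sketch only: assumes `sc` is an existing SparkContext.

// Deprecated in 2.0: the old Accumulator API.
val oldAcc = sc.accumulator(0)                      // emits a deprecation warning
sc.parallelize(1 to 10).foreach(_ => oldAcc += 1)

// Replacement: the AccumulatorV2-based built-in accumulators.
val newAcc = sc.longAccumulator("counter")
sc.parallelize(1 to 10).foreach(_ => newAcc.add(1))
println(newAcc.value)                               // 10
```

Internal call sites like the first form are what generate many of the warnings described above.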
We should attempt to limit the number of warnings in our build so that we can notice new ones and thoughtfully consider whether they are warranted.
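One way to keep the warning count visible (a sketch, not the project's actual build configuration) is to enable full deprecation reporting in the Scala compiler options, e.g. in an sbt build definition:

```scala
// Hypothetical sbt fragment: surfaces full details for each warning at compile time.
scalacOptions ++= Seq(
  "-deprecation",  // report every use of a deprecated API with its location
  "-unchecked",    // warn on unchecked type operations (e.g. erased pattern matches)
  "-feature"       // warn on uses of advanced language features without the import
)
```

With the count driven down, a stricter flag such as `-Xfatal-warnings` could then turn any new warning into a build failure, though that only becomes practical once the unavoidable warnings are suppressed or isolated.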
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)