Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2017/05/11 09:34:04 UTC
[jira] [Assigned] (SPARK-20554) Remove usage of scala.language.reflectiveCalls
[ https://issues.apache.org/jira/browse/SPARK-20554?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-20554:
------------------------------------
Assignee: Apache Spark
> Remove usage of scala.language.reflectiveCalls
> ----------------------------------------------
>
> Key: SPARK-20554
> URL: https://issues.apache.org/jira/browse/SPARK-20554
> Project: Spark
> Issue Type: Improvement
> Components: ML, Spark Core, SQL, Structured Streaming
> Affects Versions: 2.1.0
> Reporter: Sean Owen
> Assignee: Apache Spark
> Priority: Minor
>
> In several parts of the code we have imported {{scala.language.reflectiveCalls}} to suppress a warning about, well, reflective calls. From cleaning up build warnings in 2.2, I know that almost all of these cases are inadvertent and are masking a type problem.
> Example, in HiveDDLSuite:
> {code}
> val expectedTablePath =
>   if (dbPath.isEmpty) {
>     hiveContext.sessionState.catalog.defaultTablePath(tableIdentifier)
>   } else {
>     new Path(new Path(dbPath.get), tableIdentifier.table)
>   }
> val filesystemPath = new Path(expectedTablePath.toString)
> {code}
> This shouldn't really work, because one branch returns a URI and the other a Path. It compiles only because the expression merely needs an object with a {{toString}} method, so the compiler infers a structural type and makes the call work through reflection.
> Obviously, the intent was to add {{.toURI}} to the second branch so that both branches return a URI!
> I think we should probably clean this up by taking out all imports of reflectiveCalls, and re-evaluating all of the warnings. There may be a few legit usages.
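The mechanism described above can be seen in a minimal, self-contained sketch (not Spark code; the object and method names here are invented for illustration). When the two branches of an if/else return unrelated anonymous types that happen to share a method, Scala 2 infers a structural (refinement) type, and calling the shared method compiles only with {{scala.language.reflectiveCalls}} in scope; the call is then dispatched via java.lang.reflect at runtime:

```scala
object ReflectiveCallsDemo {
  // Permits the reflective call below; without this import, scalac
  // emits a feature warning (an error under -Xfatal-warnings).
  import scala.language.reflectiveCalls

  def pick(useA: Boolean): String = {
    // Both branches are anonymous AnyRef subclasses defining render(),
    // so the inferred type of v is the structural type
    // AnyRef { def render(): String }.
    val v = if (useA) new { def render() = "a" } else new { def render() = "b" }
    // Dispatched through reflection at runtime, not a normal virtual call.
    v.render()
  }

  def main(args: Array[String]): Unit =
    println(pick(useA = true))
}
```

This mirrors the HiveDDLSuite case: the code type-checks, but only because reflection papers over the fact that the branches do not share a meaningful common type.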
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org