Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2014/09/27 22:03:34 UTC
[jira] [Commented] (SPARK-2517) Remove as many compilation warning messages as possible
[ https://issues.apache.org/jira/browse/SPARK-2517?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14150753#comment-14150753 ]
Sean Owen commented on SPARK-2517:
----------------------------------
[~rxin] I think you resolved this? I don't see these warnings anymore. (Hurrah.)
> Remove as many compilation warning messages as possible
> -------------------------------------------------------
>
> Key: SPARK-2517
> URL: https://issues.apache.org/jira/browse/SPARK-2517
> Project: Spark
> Issue Type: Improvement
> Reporter: Reynold Xin
> Assignee: Yin Huai
> Priority: Minor
>
> We should probably treat warnings as failures in Jenkins.
> Some examples:
> {code}
> [warn] /scratch/rxin/spark/core/src/test/scala/org/apache/spark/util/FileAppenderSuite.scala:138: abstract type ExpectedAppender is unchecked since it is eliminated by erasure
> [warn] assert(appender.isInstanceOf[ExpectedAppender])
> [warn] ^
> [warn] /scratch/rxin/spark/core/src/test/scala/org/apache/spark/util/FileAppenderSuite.scala:143: abstract type ExpectedRollingPolicy is unchecked since it is eliminated by erasure
> [warn] rollingPolicy.isInstanceOf[ExpectedRollingPolicy]
> [warn] ^
> {code}
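The two warnings above fire because the abstract types `ExpectedAppender` and `ExpectedRollingPolicy` are erased at runtime, so `isInstanceOf` cannot actually verify them. A common way such checks are made explicit is to capture a `ClassTag` and test against the runtime class; this is an illustrative sketch, not necessarily the fix applied in Spark, and `isExpected` is a hypothetical helper name:

{code}
import scala.reflect.ClassTag

// Instead of the unchecked `value.isInstanceOf[T]` on an erased type T,
// capture a ClassTag so the runtime class is available for the check:
def isExpected[T: ClassTag](value: Any): Boolean =
  implicitly[ClassTag[T]].runtimeClass.isInstance(value)

isExpected[String]("abc")  // true: "abc" is a String at runtime
{code}

The same pattern works inside a test helper: thread the `ClassTag` through as a context bound and assert on `isExpected` rather than on an erased `isInstanceOf`.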
> {code}
> [warn] /scratch/rxin/spark/streaming/src/test/scala/org/apache/spark/streaming/InputStreamsSuite.scala:386: method connect in class IOManager is deprecated: use the new implementation in package akka.io instead
> [warn] override def preStart = IOManager(context.system).connect(new InetSocketAddress(port))
> [warn] ^
> [warn] /scratch/rxin/spark/sql/core/src/main/scala/org/apache/spark/sql/json/JsonRDD.scala:207: non-variable type argument String in type pattern Map[String,Any] is unchecked since it is eliminated by erasure
> [warn] case (key: String, struct: Map[String, Any]) => {
> [warn] ^
> [warn] /scratch/rxin/spark/sql/core/src/main/scala/org/apache/spark/sql/json/JsonRDD.scala:238: non-variable type argument String in type pattern java.util.Map[String,Object] is unchecked since it is eliminated by erasure
> [warn] case map: java.util.Map[String, Object] =>
> [warn] ^
> [warn] /scratch/rxin/spark/sql/core/src/main/scala/org/apache/spark/sql/json/JsonRDD.scala:243: non-variable type argument Object in type pattern java.util.List[Object] is unchecked since it is eliminated by erasure
> [warn] case list: java.util.List[Object] =>
> [warn] ^
> [warn] /scratch/rxin/spark/sql/core/src/main/scala/org/apache/spark/sql/json/JsonRDD.scala:323: non-variable type argument String in type pattern Map[String,Any] is unchecked since it is eliminated by erasure
> [warn] case value: Map[String, Any] => toJsonObjectString(value)
> [warn] ^
> [info] Compiling 2 Scala sources to /scratch/rxin/spark/repl/target/scala-2.10/test-classes...
> [warn] /scratch/rxin/spark/core/src/test/scala/org/apache/spark/rdd/RDDSuite.scala:382: method mapWith in class RDD is deprecated: use mapPartitionsWithIndex
> [warn] val randoms = ones.mapWith(
> [warn] ^
> [warn] /scratch/rxin/spark/core/src/test/scala/org/apache/spark/rdd/RDDSuite.scala:400: method flatMapWith in class RDD is deprecated: use mapPartitionsWithIndex and flatMap
> [warn] val randoms = ones.flatMapWith(
> [warn] ^
> [warn] /scratch/rxin/spark/core/src/test/scala/org/apache/spark/rdd/RDDSuite.scala:421: method filterWith in class RDD is deprecated: use mapPartitionsWithIndex and filter
> [warn] val sample = ints.filterWith(
> [warn] ^
> [warn] /scratch/rxin/spark/core/src/test/scala/org/apache/spark/serializer/ProactiveClosureSerializationSuite.scala:76: method mapWith in class RDD is deprecated: use mapPartitionsWithIndex
> [warn] x.mapWith(x => x.toString)((x,y)=>x + uc.op(y))
> [warn] ^
> [warn] /scratch/rxin/spark/core/src/test/scala/org/apache/spark/serializer/ProactiveClosureSerializationSuite.scala:82: method filterWith in class RDD is deprecated: use mapPartitionsWithIndex and filter
> [warn] x.filterWith(x => x.toString)((x,y)=>uc.pred(y))
> [warn] ^
> [warn] /scratch/rxin/spark/core/src/test/scala/org/apache/spark/util/VectorSuite.scala:29: class Vector in package util is deprecated: Use Vectors.dense from Spark's mllib.linalg package instead.
> [warn] def verifyVector(vector: Vector, expectedLength: Int) = {
> [warn] ^
> [warn] one warning found
> {code}
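The deprecation messages above each name their replacement: `mapWith`, `flatMapWith`, and `filterWith` are superseded by `mapPartitionsWithIndex` (combined with `map`, `flatMap`, or `filter`). A hedged sketch of the migration for the `mapWith` case, with per-partition state built once from the partition index (the RDD `ones` is from the snippet above; the `Random` seeding is illustrative):

{code}
import scala.util.Random

// Deprecated form:
//   val randoms = ones.mapWith(index => new Random(index))((t, prng) => prng.nextDouble)
// Replacement suggested by the deprecation message:
val randoms = ones.mapPartitionsWithIndex { (index, iter) =>
  val prng = new Random(index)      // constructed once per partition
  iter.map(_ => prng.nextDouble)    // applied to each element
}
{code}

`flatMapWith` and `filterWith` follow the same shape, ending in `iter.flatMap(...)` or `iter.filter(...)` respectively.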
> {code}
> [warn] /scratch/rxin/spark/sql/core/src/main/scala/org/apache/spark/sql/json/JsonRDD.scala:238: non-variable type argument String in type pattern java.util.Map[String,Object] is unchecked since it is eliminated by erasure
> [warn] case map: java.util.Map[String, Object] =>
> [warn] ^
> [warn] /scratch/rxin/spark/sql/core/src/main/scala/org/apache/spark/sql/json/JsonRDD.scala:243: non-variable type argument Object in type pattern java.util.List[Object] is unchecked since it is eliminated by erasure
> [warn] case list: java.util.List[Object] =>
> [warn] ^
> [warn] /scratch/rxin/spark/sql/core/src/main/scala/org/apache/spark/sql/json/JsonRDD.scala:323: non-variable type argument String in type pattern Map[String,Any] is unchecked since it is eliminated by erasure
> [warn] case value: Map[String, Any] => toJsonObjectString(value)
> [warn] ^
> {code}
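On the "treat warnings as failures in Jenkins" point: the standard scalac flag for this is `-Xfatal-warnings`, which promotes every warning to a compilation error. A minimal sbt sketch; whether Spark's build enabled exactly this flag is an assumption, not something stated in this issue:

{code}
// build.sbt (or the project's shared settings):
scalacOptions += "-Xfatal-warnings"  // any scalac warning now fails the build
{code}

With this in place, a CI build like Jenkins fails as soon as any of the warnings quoted above reappears, rather than silently accumulating them.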
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)