Posted to issues@spark.apache.org by "Sean Wong (JIRA)" <ji...@apache.org> on 2017/08/04 00:20:00 UTC

[jira] [Created] (SPARK-21631) Building Spark with SBT fails when source code in MLlib is modified, but with MVN it is OK

Sean Wong created SPARK-21631:
---------------------------------

             Summary: Building Spark with SBT fails when source code in MLlib is modified, but with MVN it is OK
                 Key: SPARK-21631
                 URL: https://issues.apache.org/jira/browse/SPARK-21631
             Project: Spark
          Issue Type: Bug
          Components: Build, MLlib
    Affects Versions: 2.1.1
         Environment: Ubuntu 14.04, Spark 2.1.1, Maven 3.3.9, Scala 2.11.8
            Reporter: Sean Wong


I added the line

import org.apache.spark.internal.Logging

at the head of the LinearRegression.scala file. Then I tried to build Spark with SBT, but it failed with the following error:
[info] Done packaging.
java.lang.RuntimeException: errors exist
        at scala.sys.package$.error(package.scala:27)
        at org.scalastyle.sbt.Tasks$.onHasErrors$1(Plugin.scala:132)
        at org.scalastyle.sbt.Tasks$.doScalastyleWithConfig$1(Plugin.scala:187)
        at org.scalastyle.sbt.Tasks$.doScalastyle(Plugin.scala:195)
        at SparkBuild$$anonfun$cachedScalaStyle$1$$anonfun$17.apply(SparkBuild.scala:205)
        at SparkBuild$$anonfun$cachedScalaStyle$1$$anonfun$17.apply(SparkBuild.scala:192)
        at sbt.FileFunction$$anonfun$cached$1.apply(Tracked.scala:235)
        at sbt.FileFunction$$anonfun$cached$1.apply(Tracked.scala:235)
        at sbt.FileFunction$$anonfun$cached$2$$anonfun$apply$3$$anonfun$apply$4.apply(Tracked.scala:249)
        at sbt.FileFunction$$anonfun$cached$2$$anonfun$apply$3$$anonfun$apply$4.apply(Tracked.scala:245)
        at sbt.Difference.apply(Tracked.scala:224)
        at sbt.Difference.apply(Tracked.scala:206)
        at sbt.FileFunction$$anonfun$cached$2$$anonfun$apply$3.apply(Tracked.scala:245)
        at sbt.FileFunction$$anonfun$cached$2$$anonfun$apply$3.apply(Tracked.scala:244)
        at sbt.Difference.apply(Tracked.scala:224)
        at sbt.Difference.apply(Tracked.scala:200)
        at sbt.FileFunction$$anonfun$cached$2.apply(Tracked.scala:244)
        at sbt.FileFunction$$anonfun$cached$2.apply(Tracked.scala:242)
        at SparkBuild$$anonfun$cachedScalaStyle$1.apply(SparkBuild.scala:212)
        at SparkBuild$$anonfun$cachedScalaStyle$1.apply(SparkBuild.scala:187)
        at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
        at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
        at sbt.std.Transform$$anon$4.work(System.scala:63)
        at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228)
        at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228)
        at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
        at sbt.Execute.work(Execute.scala:237)
        at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228)
        at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228)
        at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
        at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
[error] (mllib/*:scalaStyleOnCompile) errors exist

After this, I switched to building Spark with Maven; everything was fine and the build succeeded.

So, is this a bug in the SBT build?
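For what it's worth, the stack trace shows the failure coming from the Scalastyle check (mllib/*:scalaStyleOnCompile), not from the Scala compiler: the SBT build wires Scalastyle into compilation, while the default Maven build does not run it, so a style violation (for example, an import placed outside the import grouping that Spark's style rules expect) can fail the SBT build while the Maven build passes. A possible way to confirm, assuming a Spark source checkout (commands are a hedged sketch, not verified against every Spark version):

```shell
# Run the project's standalone Scalastyle check script, if present,
# to see the specific violations being reported:
./dev/scalastyle

# Or run the Scalastyle task for just the mllib module through SBT:
./build/sbt mllib/scalastyle
```

If either command reports a violation on the added import line, the SBT failure is the style check doing its job rather than a bug in the SBT build itself.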



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org