Posted to dev@spark.apache.org by 申毅杰 <he...@gmail.com> on 2014/06/11 04:39:45 UTC
Run ScalaTest inside Intellij IDEA
Hi All,
I want to run a ScalaTest suite in IDEA directly, but it seems the make phase fails before the tests run.
The problems are as follows:
/Users/yijie/code/apache.spark.master/core/src/main/scala/org/apache/spark/executor/MesosExecutorBackend.scala
Error:(44, 35) type mismatch;
found : org.apache.mesos.protobuf.ByteString
required: com.google.protobuf.ByteString
.setData(ByteString.copyFrom(data))
^
/Users/yijie/code/apache.spark.master/core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerBackend.scala
Error:(119, 35) type mismatch;
found : org.apache.mesos.protobuf.ByteString
required: com.google.protobuf.ByteString
.setData(ByteString.copyFrom(createExecArg()))
^
Error:(257, 35) type mismatch;
found : org.apache.mesos.protobuf.ByteString
required: com.google.protobuf.ByteString
.setData(ByteString.copyFrom(task.serializedTask))
^
Before running the tests in IDEA, I built Spark with 'sbt/sbt assembly',
imported the projects into IDEA after 'sbt/sbt gen-idea',
and I am able to run the tests in a terminal with 'sbt/sbt test'.
Is there anything I left out in order to run/debug a test suite inside IDEA?
Best regards,
Yijie
Re: Run ScalaTest inside Intellij IDEA
Posted by Doris Xin <do...@gmail.com>.
Here's the JIRA on this known issue:
https://issues.apache.org/jira/browse/SPARK-1835
tl;dr: manually delete mesos-0.18.1.jar from lib_managed/jars after running
sbt/sbt gen-idea. You should be able to run unit tests inside IntelliJ after
doing so.
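In shell terms, the workaround is a single deletion after project generation. The sketch below mocks up the lib_managed layout with an empty stand-in jar so the commands are self-contained; in a real checkout only the rm step is needed, run from the Spark root after sbt/sbt gen-idea:

```shell
# Illustrative mock of the layout that sbt/sbt gen-idea leaves behind in
# a real Spark checkout; the jar here is an empty placeholder file.
mkdir -p lib_managed/jars
touch lib_managed/jars/mesos-0.18.1.jar

# The SPARK-1835 workaround: remove the conflicting Mesos jar so the IDE
# stops resolving ByteString against the wrong protobuf package.
rm lib_managed/jars/mesos-0.18.1.jar
```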
Doris
On Tue, Jun 17, 2014 at 6:10 PM, Henry Saputra <he...@gmail.com>
wrote:
> I got stuck on this one too after doing a git pull from master.
>
> Have not been able to resolve it yet =(
>
>
> - Henry
Re: Run ScalaTest inside Intellij IDEA
Posted by Henry Saputra <he...@gmail.com>.
I got stuck on this one too after doing a git pull from master.
Have not been able to resolve it yet =(
- Henry
On Wed, Jun 11, 2014 at 6:51 AM, Yijie Shen <he...@gmail.com> wrote:
> Thanks Qiuzhuang, the problems disappeared after I added the assembly jar at the head of the dependency list in *.iml, but while running a test in Spark SQL (SQLQuerySuite in sql-core), another two errors occur:
>
> Error 1:
> Error:scalac:
> while compiling: /Users/yijie/code/apache.spark.master/sql/core/src/main/scala/org/apache/spark/sql/test/TestSQLContext.scala
> during phase: jvm
> library version: version 2.10.4
> compiler version: version 2.10.4
> reconstructed args: -Xmax-classfile-name 120 -deprecation -P:genjavadoc:out=/Users/yijie/code/apache.spark.master/sql/core/target/java -feature -classpath /Library/Java/JavaVirtualMachines/jdk1.7.0_51.jdk/Contents/Home/lib/ant-javafx.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_51.jdk/Contents/Home/lib/dt.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_51.jdk/Contents/Home/lib/javafx-doclet.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_51.jdk/Contents/Home/lib/javafx-mx.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_51.jdk/Contents/Home/lib/jconsole.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_51.jdk/Contents/Home/lib/sa-jdi.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_51.jdk/Contents/Home/lib/tools.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_51.jdk/Contents/Home/jre/lib/charsets.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_51.jdk/Conte…
> …
> ...
> /Java/JavaVirtualMachines/jdk1.7.0_51.jdk/Contents/Home/jre/lib/charsets.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_51.jdk/Contents/Home/jre/lib/jfr.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_51.jdk/Contents/Home/jre/classes:/Users/yijie/code/apache.spark.master/lib_managed/jars/scala-library-2.10.4.jar -Xplugin:/Users/yijie/code/apache.spark.master/lib_managed/jars/genjavadoc-plugin_2.10.4-0.5.jar -Xplugin:/Users/yijie/code/apache.spark.master/lib_managed/jars/genjavadoc-plugin_2.10.4-0.5.jar
> last tree to typer: Literal(Constant(parquet.io.api.Converter))
> symbol: null
> symbol definition: null
> tpe: Class(classOf[parquet.io.api.Converter])
> symbol owners:
> context owners: object TestSQLContext -> package test
> == Enclosing template or block ==
> Template( // val <local TestSQLContext>: <notype> in object TestSQLContext, tree.tpe=org.apache.spark.sql.test.TestSQLContext.type
> "org.apache.spark.sql.SQLContext" // parents
> ValDef(
> private
> "_"
> <tpt>
> <empty>
> )
> // 2 statements
> DefDef( // private def readResolve(): Object in object TestSQLContext
> <method> private <synthetic>
> "readResolve"
> []
> List(Nil)
> <tpt> // tree.tpe=Object
> test.this."TestSQLContext" // object TestSQLContext in package test, tree.tpe=org.apache.spark.sql.test.TestSQLContext.type
> )
> DefDef( // def <init>(): org.apache.spark.sql.test.TestSQLContext.type in object TestSQLContext
> <method>
> "<init>"
> []
> List(Nil)
> <tpt> // tree.tpe=org.apache.spark.sql.test.TestSQLContext.type
> Block( // tree.tpe=Unit
> Apply( // def <init>(sparkContext: org.apache.spark.SparkContext): org.apache.spark.sql.SQLContext in class SQLContext, tree.tpe=org.apache.spark.sql.SQLContext
> TestSQLContext.super."<init>" // def <init>(sparkContext: org.apache.spark.SparkContext): org.apache.spark.sql.SQLContext in class SQLContext, tree.tpe=(sparkContext: org.apache.spark.SparkContext)org.apache.spark.sql.SQLContext
> Apply( // def <init>(master: String,appName: String,conf: org.apache.spark.SparkConf): org.apache.spark.SparkContext in class SparkContext, tree.tpe=org.apache.spark.SparkContext
> new org.apache.spark.SparkContext."<init>" // def <init>(master: String,appName: String,conf: org.apache.spark.SparkConf): org.apache.spark.SparkContext in class SparkContext, tree.tpe=(master: String, appName: String, conf: org.apache.spark.SparkConf)org.apache.spark.SparkContext
> // 3 arguments
> "local"
> "TestSQLContext"
> Apply( // def <init>(): org.apache.spark.SparkConf in class SparkConf, tree.tpe=org.apache.spark.SparkConf
> new org.apache.spark.SparkConf."<init>" // def <init>(): org.apache.spark.SparkConf in class SparkConf, tree.tpe=()org.apache.spark.SparkConf
> Nil
> )
> )
> )
> ()
> )
> )
> )
> == Expanded type of tree ==
> ConstantType(value = Constant(parquet.io.api.Converter))
> uncaught exception during compilation: java.lang.AssertionError
>
> Error 2:
>
> Error:scalac: Error: assertion failed: List(object package$DebugNode, object package$DebugNode)
> java.lang.AssertionError: assertion failed: List(object package$DebugNode, object package$DebugNode)
> at scala.reflect.internal.Symbols$Symbol.suchThat(Symbols.scala:1678)
> at scala.reflect.internal.Symbols$ClassSymbol.companionModule0(Symbols.scala:2988)
> at scala.reflect.internal.Symbols$ClassSymbol.companionModule(Symbols.scala:2991)
> at scala.tools.nsc.backend.jvm.GenASM$JPlainBuilder.genClass(GenASM.scala:1371)
> at scala.tools.nsc.backend.jvm.GenASM$AsmPhase.run(GenASM.scala:120)
> at scala.tools.nsc.Global$Run.compileUnitsInternal(Global.scala:1583)
> at scala.tools.nsc.Global$Run.compileUnits(Global.scala:1557)
> at scala.tools.nsc.Global$Run.compileSources(Global.scala:1553)
> at scala.tools.nsc.Global$Run.compile(Global.scala:1662)
> at xsbt.CachedCompiler0.run(CompilerInterface.scala:126)
> at xsbt.CachedCompiler0.run(CompilerInterface.scala:102)
> at xsbt.CompilerInterface.run(CompilerInterface.scala:27)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at sbt.compiler.AnalyzingCompiler.call(AnalyzingCompiler.scala:102)
> at sbt.compiler.AnalyzingCompiler.compile(AnalyzingCompiler.scala:48)
> at sbt.compiler.AnalyzingCompiler.compile(AnalyzingCompiler.scala:41)
> at org.jetbrains.jps.incremental.scala.local.IdeaIncrementalCompiler.compile(IdeaIncrementalCompiler.scala:28)
> at org.jetbrains.jps.incremental.scala.local.LocalServer.compile(LocalServer.scala:25)
> at org.jetbrains.jps.incremental.scala.remote.Main$.make(Main.scala:64)
> at org.jetbrains.jps.incremental.scala.remote.Main$.nailMain(Main.scala:22)
> at org.jetbrains.jps.incremental.scala.remote.Main.nailMain(Main.scala)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at com.martiansoftware.nailgun.NGSession.run(NGSession.java:319)
>
>
> On Jun 11, 2014, at 11:17 AM, Qiuzhuang Lian <qi...@gmail.com> wrote:
>
>> I also ran into this problem when running examples in IDEA. It looks like the project depends on too many jars and the classpath seems to hit a length limit, so I imported the assembly jar, put it at the head of the dependency list, and it works.
>>
>> Thanks,
>> Qiuzhuang
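For illustration, "putting the assembly jar at the head of the list" means reordering the library entries IDEA generates in the module's .iml file. A hypothetical fragment is sketched below; the exact library name and Spark/Hadoop versions are placeholders and will differ per checkout:

```xml
<!-- Hypothetical .iml fragment: the spark-assembly entry is listed before
     the other orderEntry elements so the IDE resolves classes from the
     assembly jar first. The library name below is a placeholder. -->
<component name="NewModuleRootManager">
  <orderEntry type="library" name="spark-assembly-hadoop2.2.0" level="project" />
  <!-- ...remaining orderEntry elements produced by sbt/sbt gen-idea... -->
</component>
```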
Re: Run ScalaTest inside Intellij IDEA
Posted by Yijie Shen <he...@gmail.com>.
I got a clean version of the master branch and did the following steps:
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"
mvn -U -Dhadoop.version=2.2.0 -DskipTests clean package
After these steps, I opened the project in IDEA through the pom.xml in the root folder, but when running the same test (SQLQuerySuite in sql-core), the two errors above still occur. Any ideas?
On Jun 11, 2014, at 10:16 PM, Qiuzhuang Lian <qi...@gmail.com> wrote:
> I ran into this issue today via the 'mvn install -DskipTests' command; then I issued a mvn clean, rebuilt, and it works.
>
> Thanks,
> Qiuzhuang
Re: Run ScalaTest inside Intellij IDEA
Posted by Qiuzhuang Lian <qi...@gmail.com>.
I ran into this issue today via the 'mvn install -DskipTests' command; then I
issued a mvn clean, rebuilt, and it works.
Thanks,
Qiuzhuang
> at sbt.compiler.AnalyzingCompiler.compile(AnalyzingCompiler.scala:41)
> at
> org.jetbrains.jps.incremental.scala.local.IdeaIncrementalCompiler.compile(IdeaIncrementalCompiler.scala:28)
> at
> org.jetbrains.jps.incremental.scala.local.LocalServer.compile(LocalServer.scala:25)
> at org.jetbrains.jps.incremental.scala.remote.Main$.make(Main.scala:64)
> at org.jetbrains.jps.incremental.scala.remote.Main$.nailMain(Main.scala:22)
> at org.jetbrains.jps.incremental.scala.remote.Main.nailMain(Main.scala)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at com.martiansoftware.nailgun.NGSession.run(NGSession.java:319)
>
>
> On Jun 11, 2014, at 11:17 AM, Qiuzhuang Lian <qi...@gmail.com>
> wrote:
>
> I also run into this problem when running examples in IDEA. The issue
> looks that it uses depends on too many jars and that the classpath seems to
> have length limit. So I import the assembly jar and put the head of the
> list dependent path and it works.
>
> Thanks,
> Qiuzhuang
>
>
> On Wed, Jun 11, 2014 at 10:39 AM, 申毅杰 <he...@gmail.com> wrote:
>
>> Hi All,
>>
>> I want to run ScalaTest Suite in IDEA directly, but it seems didn’t pass
>> the make phase before test running.
>> The problems are as follows:
>>
>>
>> /Users/yijie/code/apache.spark.master/core/src/main/scala/org/apache/spark/executor/MesosExecutorBackend.scala
>> Error:(44, 35) type mismatch;
>> found : org.apache.mesos.protobuf.ByteString
>> required: com.google.protobuf.ByteString
>> .setData(ByteString.copyFrom(data))
>> ^
>>
>> /Users/yijie/code/apache.spark.master/core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerBackend.scala
>> Error:(119, 35) type mismatch;
>> found : org.apache.mesos.protobuf.ByteString
>> required: com.google.protobuf.ByteString
>> .setData(ByteString.copyFrom(createExecArg()))
>> ^
>> Error:(257, 35) type mismatch;
>> found : org.apache.mesos.protobuf.ByteString
>> required: com.google.protobuf.ByteString
>> .setData(ByteString.copyFrom(task.serializedTask))
>> ^
>>
>> Before I run test in IDEA, I build spark through ’sbt/sbt assembly’,
>> import projects into IDEA after ’sbt/sbt gen-idea’,
>> and able to run test in Terminal ’sbt/sbt test’
>>
>> Are there anything I leave out in order to run/debug testsuite inside
>> IDEA?
>>
>> Best regards,
>> Yijie
>
>
>
>
Re: Run ScalaTest inside Intellij IDEA
Posted by Yijie Shen <he...@gmail.com>.
Thanks Qiuzhuang, the problems disappeared after I added the assembly jar at the head of the dependency list in *.iml. However, while running tests in Spark SQL (SQLQuerySuite in sql-core), two more errors occur:
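For reference, the *.iml change described above looks roughly like this: the assembly jar's entry is moved ahead of the other library entries so its classes win during compilation in IDEA. This is only a sketch; the module path and assembly jar name are illustrative and will differ per checkout and Spark version.

```xml
<!-- Illustrative orderEntry placed at the head of the dependency list
     in the module's *.iml file; adjust the jar path to your build. -->
<orderEntry type="module-library">
  <library>
    <CLASSES>
      <root url="jar://$MODULE_DIR$/../assembly/target/scala-2.10/spark-assembly-1.0.0-hadoop1.0.4.jar!/" />
    </CLASSES>
  </library>
</orderEntry>
```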
Error 1:
Error:scalac:
while compiling: /Users/yijie/code/apache.spark.master/sql/core/src/main/scala/org/apache/spark/sql/test/TestSQLContext.scala
during phase: jvm
library version: version 2.10.4
compiler version: version 2.10.4
reconstructed args: -Xmax-classfile-name 120 -deprecation -P:genjavadoc:out=/Users/yijie/code/apache.spark.master/sql/core/target/java -feature -classpath /Library/Java/JavaVirtualMachines/jdk1.7.0_51.jdk/Contents/Home/lib/ant-javafx.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_51.jdk/Contents/Home/lib/dt.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_51.jdk/Contents/Home/lib/javafx-doclet.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_51.jdk/Contents/Home/lib/javafx-mx.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_51.jdk/Contents/Home/lib/jconsole.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_51.jdk/Contents/Home/lib/sa-jdi.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_51.jdk/Contents/Home/lib/tools.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_51.jdk/Contents/Home/jre/lib/charsets.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_51.jdk/Conte…
…
...
/Java/JavaVirtualMachines/jdk1.7.0_51.jdk/Contents/Home/jre/lib/charsets.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_51.jdk/Contents/Home/jre/lib/jfr.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_51.jdk/Contents/Home/jre/classes:/Users/yijie/code/apache.spark.master/lib_managed/jars/scala-library-2.10.4.jar -Xplugin:/Users/yijie/code/apache.spark.master/lib_managed/jars/genjavadoc-plugin_2.10.4-0.5.jar -Xplugin:/Users/yijie/code/apache.spark.master/lib_managed/jars/genjavadoc-plugin_2.10.4-0.5.jar
last tree to typer: Literal(Constant(parquet.io.api.Converter))
symbol: null
symbol definition: null
tpe: Class(classOf[parquet.io.api.Converter])
symbol owners:
context owners: object TestSQLContext -> package test
== Enclosing template or block ==
Template( // val <local TestSQLContext>: <notype> in object TestSQLContext, tree.tpe=org.apache.spark.sql.test.TestSQLContext.type
"org.apache.spark.sql.SQLContext" // parents
ValDef(
private
"_"
<tpt>
<empty>
)
// 2 statements
DefDef( // private def readResolve(): Object in object TestSQLContext
<method> private <synthetic>
"readResolve"
[]
List(Nil)
<tpt> // tree.tpe=Object
test.this."TestSQLContext" // object TestSQLContext in package test, tree.tpe=org.apache.spark.sql.test.TestSQLContext.type
)
DefDef( // def <init>(): org.apache.spark.sql.test.TestSQLContext.type in object TestSQLContext
<method>
"<init>"
[]
List(Nil)
<tpt> // tree.tpe=org.apache.spark.sql.test.TestSQLContext.type
Block( // tree.tpe=Unit
Apply( // def <init>(sparkContext: org.apache.spark.SparkContext): org.apache.spark.sql.SQLContext in class SQLContext, tree.tpe=org.apache.spark.sql.SQLContext
TestSQLContext.super."<init>" // def <init>(sparkContext: org.apache.spark.SparkContext): org.apache.spark.sql.SQLContext in class SQLContext, tree.tpe=(sparkContext: org.apache.spark.SparkContext)org.apache.spark.sql.SQLContext
Apply( // def <init>(master: String,appName: String,conf: org.apache.spark.SparkConf): org.apache.spark.SparkContext in class SparkContext, tree.tpe=org.apache.spark.SparkContext
new org.apache.spark.SparkContext."<init>" // def <init>(master: String,appName: String,conf: org.apache.spark.SparkConf): org.apache.spark.SparkContext in class SparkContext, tree.tpe=(master: String, appName: String, conf: org.apache.spark.SparkConf)org.apache.spark.SparkContext
// 3 arguments
"local"
"TestSQLContext"
Apply( // def <init>(): org.apache.spark.SparkConf in class SparkConf, tree.tpe=org.apache.spark.SparkConf
new org.apache.spark.SparkConf."<init>" // def <init>(): org.apache.spark.SparkConf in class SparkConf, tree.tpe=()org.apache.spark.SparkConf
Nil
)
)
)
()
)
)
)
== Expanded type of tree ==
ConstantType(value = Constant(parquet.io.api.Converter))
uncaught exception during compilation: java.lang.AssertionError
Error 2:
Error:scalac: Error: assertion failed: List(object package$DebugNode, object package$DebugNode)
java.lang.AssertionError: assertion failed: List(object package$DebugNode, object package$DebugNode)
at scala.reflect.internal.Symbols$Symbol.suchThat(Symbols.scala:1678)
at scala.reflect.internal.Symbols$ClassSymbol.companionModule0(Symbols.scala:2988)
at scala.reflect.internal.Symbols$ClassSymbol.companionModule(Symbols.scala:2991)
at scala.tools.nsc.backend.jvm.GenASM$JPlainBuilder.genClass(GenASM.scala:1371)
at scala.tools.nsc.backend.jvm.GenASM$AsmPhase.run(GenASM.scala:120)
at scala.tools.nsc.Global$Run.compileUnitsInternal(Global.scala:1583)
at scala.tools.nsc.Global$Run.compileUnits(Global.scala:1557)
at scala.tools.nsc.Global$Run.compileSources(Global.scala:1553)
at scala.tools.nsc.Global$Run.compile(Global.scala:1662)
at xsbt.CachedCompiler0.run(CompilerInterface.scala:126)
at xsbt.CachedCompiler0.run(CompilerInterface.scala:102)
at xsbt.CompilerInterface.run(CompilerInterface.scala:27)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at sbt.compiler.AnalyzingCompiler.call(AnalyzingCompiler.scala:102)
at sbt.compiler.AnalyzingCompiler.compile(AnalyzingCompiler.scala:48)
at sbt.compiler.AnalyzingCompiler.compile(AnalyzingCompiler.scala:41)
at org.jetbrains.jps.incremental.scala.local.IdeaIncrementalCompiler.compile(IdeaIncrementalCompiler.scala:28)
at org.jetbrains.jps.incremental.scala.local.LocalServer.compile(LocalServer.scala:25)
at org.jetbrains.jps.incremental.scala.remote.Main$.make(Main.scala:64)
at org.jetbrains.jps.incremental.scala.remote.Main$.nailMain(Main.scala:22)
at org.jetbrains.jps.incremental.scala.remote.Main.nailMain(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.martiansoftware.nailgun.NGSession.run(NGSession.java:319)
Re: Run ScalaTest inside Intellij IDEA
Posted by Qiuzhuang Lian <qi...@gmail.com>.
I also ran into this problem when running examples in IDEA. The issue looks like the project depends on too many jars, and the classpath seems to have a length limit. So I imported the assembly jar, put it at the head of the dependency list, and it works.
Thanks,
Qiuzhuang
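Combining the advice in this thread, the sequence is: build the assembly, regenerate the IDEA project, then remove the conflicting shaded mesos jar (the SPARK-1835 workaround) so `com.google.protobuf.ByteString` resolves instead of `org.apache.mesos.protobuf.ByteString`. The snippet below is a sketch; SPARK_HOME and the jar version are assumptions taken from the thread, not fixed values.

```shell
# Run sbt/sbt assembly and sbt/sbt gen-idea from the checkout root first,
# then remove the shaded mesos jar pulled into lib_managed (SPARK-1835).
SPARK_HOME="${SPARK_HOME:-$HOME/code/spark}"
JAR="$SPARK_HOME/lib_managed/jars/mesos-0.18.1.jar"
if [ -f "$JAR" ]; then
  rm "$JAR"
  echo "removed $JAR"
else
  echo "no conflicting mesos jar found at $JAR"
fi
```

After deleting the jar, reopen the project in IDEA and the suites should compile and run.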