Posted to user@spark.apache.org by Yanzhe Chen <ya...@gmail.com> on 2014/02/26 17:59:00 UTC

Build Spark in IntelliJ IDEA 13

Hi all,

I'm trying to build Spark in IntelliJ IDEA 13.

I cloned the latest repo and ran sbt/sbt gen-idea in the root folder, then
imported the project into IntelliJ IDEA. The Scala plugin for IntelliJ IDEA
is installed.
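
Concretely, the steps were roughly the following (repo URL elided; any
up-to-date Spark checkout should behave the same):

  $ git clone <spark-repo-url> spark   # clone the latest repo (URL elided)
  $ cd spark
  $ sbt/sbt gen-idea                   # generate IntelliJ project files
  # then import the generated project in IntelliJ IDEA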

Everything seemed OK until I ran Build > Make Project:

Information: Using javac 1.7.0_51 to compile java sources
Information: java: Errors occurred while compiling module 'spark-core'
Information: Modules "spark-streaming-flume-build", "spark-repl-build",
"spark-graphx-build", "spark-tools-build", "spark-streaming-kafka-build"
and 9 others were fully rebuilt due to project configuration/dependencies
changes
Information: Compilation completed with 1 error and 1 warning in 23 sec
Information: 1 error
Information: 1 warning
Error: java: javacTask: source release 1.7 requires target release 1.7
Warning: scalac: there were 56 feature warning(s); re-run with -feature for
details

I have only JDK 1.7 installed, and the Java Compiler settings all
target 1.7. So what does this error mean?
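
If I understand the flags correctly, this is the same complaint javac itself
makes when -source and -target disagree, e.g. this hand-run reproduction with
a hypothetical Foo.java (not taken from the Spark build):

  $ javac -source 1.7 -target 1.6 Foo.java
  javac: source release 1.7 requires target release 1.7

So presumably something in the generated project is still asking for 1.6
bytecode, even though javac 1.7 is being used.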

Besides, the project compiles correctly from the console, and the examples
also run smoothly there. The reason I want to build from IntelliJ IDEA is
that I want to do some debugging. Can anyone show me a better way to debug
Spark (one where I can step into and out of functions and inspect variables
in real time)?
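
For example, would attaching IntelliJ's Remote debugger over the standard
JDWP agent be the usual approach? Something like the following, where the
classpath, example class, and port are only an illustration:

  $ java -agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=5005 \
      -cp <spark-and-examples-jars> org.apache.spark.examples.SparkPi local

with suspend=y so the JVM waits, and then connecting a Remote run
configuration in IntelliJ to port 5005?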

Best,
Yanzhe

Re: Build Spark in IntelliJ IDEA 13

Posted by moxiecui <mo...@gmail.com>.
Hi Chen,

Check File > Settings > Compiler > Java Compiler and see whether any of the
compiler settings need changing; in particular, the project and per-module
bytecode version fields on that page are what map to javac's -target flag.

Hope that helps.

Re: Build Spark in IntelliJ IDEA 13

Posted by Sean Owen <so...@cloudera.com>.
I also use IntelliJ 13 on a Mac, with only Java 7, and have never seen this.

If you look at the Spark build, you will see that it specifies Java 6, not 7.
Even if you changed java.version in the build, you would not get this
error, since the build sets source and target to the same value.
In fact, it would be fine to specify source/target 7 as well, if you
wanted to for your own purposes.
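
For illustration, the parent pom wires this together roughly as follows
(paraphrased from memory, so check the actual pom.xml for the exact values):

  <properties>
    <java.version>1.6</java.version>
  </properties>
  ...
  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-compiler-plugin</artifactId>
    <configuration>
      <source>${java.version}</source>
      <target>${java.version}</target>
    </configuration>
  </plugin>

This keeps -source and -target moving together.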

The error means the module's source level is set to Java 7, but the
compiler is being asked to emit Java 6 bytecode.

My guess is that you inadvertently set the source language level to 7 in
IntelliJ; check that first. If so, try telling IntelliJ to reimport the
Maven project from the top-level pom.xml, which should override any stray
per-module setting.
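
Roughly, from memory of the IDEA UI (exact labels may differ by version):

  View > Tool Windows > Maven Projects, then the "Reimport All Maven
  Projects" button (or right-click the top-level pom.xml > Maven > Reimport)
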
--
Sean Owen | Director, Data Science | London


Re: Build Spark in IntelliJ IDEA 13

Posted by Bryn Keller <xo...@xoltar.org>.
Hi Yanzhe,

With IntelliJ 13, I don't think you need to use gen-idea; it should be able
to import the sbt project directly:

http://blog.jetbrains.com/scala/2013/11/18/built-in-sbt-support-in-intellij-idea-13/#comment-2742
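
Roughly, from memory of the IDEA 13 import flow (exact labels may differ):

  File > Import Project... > select the Spark checkout
    > Import project from external model > SBT > Finish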

Hope that helps,
Bryn

