Posted to issues@spark.apache.org by "Christian Kadner (JIRA)" <ji...@apache.org> on 2015/05/12 19:03:01 UTC

[jira] [Comment Edited] (SPARK-4128) Create instructions on fully building Spark in Intellij

    [ https://issues.apache.org/jira/browse/SPARK-4128?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14540217#comment-14540217 ] 

Christian Kadner edited comment on SPARK-4128 at 5/12/15 5:02 PM:
------------------------------------------------------------------

Hi Sean,

while there is still a section covering the IntelliJ setup, it is missing these steps (or an updated version of them), which I had to perform for 1.3.0, 1.3.1, and 1.4.0 in order to get a successful Make of the project.

<<part of Patrick's deleted paragraph - start>>
    ...
    At the top of the leftmost pane, make sure the "Project/Packages" selector is set to "Packages".
    Right click on any package and click “Open Module Settings” - you will be able to modify any of the modules here.
    A few of the modules need to be modified slightly from the default import.
        Add sources to the following modules (under the "Sources" tab, add a source root on the right):
            spark-hive: add v0.13.1/src/main/scala
            spark-hive-thriftserver: add v0.13.1/src/main/scala
            spark-repl: add scala-2.10/src/main/scala and scala-2.10/src/test/scala
        For spark-yarn, click "Add content root" and navigate in the filesystem to the yarn/common directory of Spark
    ...
<<part of Patrick's deleted paragraph - end>>
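For reference, what "add a source" does under the covers: IntelliJ records each source root as a sourceFolder entry in the module's .iml file. A minimal sketch of what the spark-hive module file might look like after the step above (element names follow IntelliJ's module file format; the exact file location, the surrounding entries, and the $MODULE_DIR$-relative paths are assumptions that depend on how your import laid things out):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Sketch of an IntelliJ module file after manually adding the extra
     source root. Only the sourceFolder lines matter; everything else
     a real import generates is elided here. -->
<module type="JAVA_MODULE" version="4">
  <component name="NewModuleRootManager">
    <content url="file://$MODULE_DIR$">
      <sourceFolder url="file://$MODULE_DIR$/src/main/scala" isTestSource="false" />
      <!-- the manually added root from the step above -->
      <sourceFolder url="file://$MODULE_DIR$/v0.13.1/src/main/scala" isTestSource="false" />
    </content>
  </component>
</module>
```

Checking the module's .iml file is also a quick way to verify whether the manual step survived a re-import.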


I suggest adding an updated version of that to the wiki, since some of the modules are set up in a way that requires similar non-obvious manual steps before they will compile.
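One extra note: the import tends to go more smoothly if the tree has been built once from the command line with the matching profiles, so that generated sources already exist when IntelliJ indexes the project. A sketch, run from the root of the Spark checkout (profile names taken from Spark's build documentation of that era; verify against your branch's pom.xml):

```shell
# Build once with the Hive and YARN profiles before (re)importing into
# IntelliJ. Profile names are assumed -- check pom.xml for your branch.
build/mvn -Pyarn -Phive -Phive-thriftserver -DskipTests clean package
```

The same profiles should then be enabled in IntelliJ's Maven Projects tool window so the IDE's module model matches the command-line build.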



> Create instructions on fully building Spark in Intellij
> -------------------------------------------------------
>
>                 Key: SPARK-4128
>                 URL: https://issues.apache.org/jira/browse/SPARK-4128
>             Project: Spark
>          Issue Type: Improvement
>          Components: Documentation
>            Reporter: Patrick Wendell
>            Assignee: Patrick Wendell
>            Priority: Blocker
>             Fix For: 1.2.0
>
>
> With some of our more complicated modules, I'm not sure whether Intellij correctly understands all source locations. Also, we might require specifying some profiles for the build to work directly. We should document clearly how to start with vanilla Spark master and get the entire thing building in Intellij.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org