Posted to user@spark.apache.org by MEETHU MATHEW <me...@yahoo.co.in> on 2014/11/10 11:38:24 UTC

Is there a step-by-step instruction on how to build Spark App with IntelliJ IDEA?

Hi,
This question was asked earlier and I followed the approach specified there, but I am getting java.lang.ClassNotFoundException.
Can somebody explain all the steps required to build a Spark app using IntelliJ (latest version), starting from creating the project through running it? I searched a lot but couldn't find appropriate documentation.
Re: Is there a step-by-step instruction on how to build Spark App with IntelliJ IDEA?

"Don't try to use spark-core as an archetype. Instead just create a plain Scala project (no archetype) and add a Maven dependency on spark-core. That should be all you need."
(View on mail-archives.apache.org)

   Thanks & Regards,
Meethu M

Re: Is there a step-by-step instruction on how to build Spark App with IntelliJ IDEA?

Posted by MEETHU MATHEW <me...@yahoo.co.in>.
Hi,
I found the solution and would like to share it. I added the following dependencies to the pom.xml and it worked:

    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-compiler</artifactId>
      <version>${scala.version}</version>
    </dependency>
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-reflect</artifactId>
      <version>${scala.version}</version>
    </dependency>
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>jline</artifactId>
      <version>${scala.version}</version>
    </dependency>
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-actors</artifactId>
      <version>${scala.version}</version>
    </dependency>
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scalap</artifactId>
      <version>${scala.version}</version>
    </dependency>

Thanks & Regards,
Meethu M 
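For context, the ${scala.version} property these entries reference would be defined in the pom's <properties> block; a minimal sketch (the 2.10.3 value is taken from the Scala version mentioned later in this thread, so adjust it to your install):

```xml
<properties>
  <!-- Must match the Scala binary suffix of spark-core_2.10 -->
  <scala.version>2.10.3</scala.version>
</properties>
```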

     On Tuesday, 11 November 2014 4:47 PM, MEETHU MATHEW <me...@yahoo.co.in> wrote:
   



Re: Is there a step-by-step instruction on how to build Spark App with IntelliJ IDEA?

Posted by MEETHU MATHEW <me...@yahoo.co.in>.
Hi,
I was able to build a Spark app in IntelliJ using sbt. Now I am trying to build it using Maven and the build fails. I created a Maven project adding the following, because the simple archetype was using Scala 2.8 while I am running 2.10.3:

    <groupId>net.alchim31.maven</groupId>
    <artifactId>scala-maven-plugin</artifactId>
    <version>3.1.3</version>

Then I added the Spark dependency as:

    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>1.1.0</version>
    </dependency>
The build is failing. The error message is:

[INFO] Building SampleMVN 1.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-resources-plugin:2.3:resources (default-resources) @ SampleMVN ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/meethu/Intellij/SampleMVN/src/main/resources
[INFO]
[INFO] --- maven-compiler-plugin:2.0.2:compile (default-compile) @ SampleMVN ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- scala-maven-plugin:3.1.3:compile (default) @ SampleMVN ---
[WARNING]  Expected all dependencies to require Scala version: 2.10.3
[WARNING]  mvn.SampleMVN:SampleMVN:1.0-SNAPSHOT requires scala version: 2.10.3
[WARNING]  com.twitter:chill_2.10:0.3.6 requires scala version: 2.10.3
[WARNING]  org.spark-project.akka:akka-actor_2.10:2.2.3-shaded-protobuf requires scala version: 2.10.2
[WARNING] Multiple versions of scala libraries detected!
[INFO] /home/meethu/Intellij/SampleMVN/src/main/scala:-1: info: compiling
[INFO] Compiling 2 source files to /home/meethu/Intellij/SampleMVN/target/classes at 1415700093124
[ERROR] error: error while loading <root>, error in opening zip file
[ERROR] error: scala.reflect.internal.MissingRequirementError: object scala.runtime in compiler mirror not found.
[ERROR]  at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:16)
[ERROR]  at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:17)
[ERROR]  at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:48)
[ERROR]  at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:40)
[ERROR]  at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:61)
[ERROR]  at scala.reflect.internal.Mirrors$RootsBase.getPackage(Mirrors.scala:172)
[ERROR]  at scala.reflect.internal.Mirrors$RootsBase.getRequiredPackage(Mirrors.scala:175)
[ERROR]  at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackage$lzycompute(Definitions.scala:183)
[ERROR]  at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackage(Definitions.scala:183)
[ERROR]  at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackageClass$lzycompute(Definitions.scala:184)
[ERROR]  at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackageClass(Definitions.scala:184)
[ERROR]  at scala.reflect.internal.Definitions$DefinitionsClass.AnnotationDefaultAttr$lzycompute(Definitions.scala:1024)
[ERROR]  at scala.reflect.internal.Definitions$DefinitionsClass.AnnotationDefaultAttr(Definitions.scala:1023)
[ERROR]  at scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreClasses$lzycompute(Definitions.scala:1153)
[ERROR]  at scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreClasses(Definitions.scala:1152)
[ERROR]  at scala.reflect.internal.Definitions$DefinitionsClass.symbolsNotPresentInBytecode$lzycompute(Definitions.scala:1196)
[ERROR]  at scala.reflect.internal.Definitions$DefinitionsClass.symbolsNotPresentInBytecode(Definitions.scala:1196)
[ERROR]  at scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1261)
[ERROR]  at scala.tools.nsc.Global$Run.<init>(Global.scala:1290)
[ERROR]  at scala.tools.nsc.Driver.doCompile(Driver.scala:32)
[ERROR]  at scala.tools.nsc.Main$.doCompile(Main.scala:79)
[ERROR]  at scala.tools.nsc.Driver.process(Driver.scala:54)
[ERROR]  at scala.tools.nsc.Driver.main(Driver.scala:67)
[ERROR]  at scala.tools.nsc.Main.main(Main.scala)
[ERROR]  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[ERROR]  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
[ERROR]  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[ERROR]  at java.lang.reflect.Method.invoke(Method.java:606)
[ERROR]  at scala_maven_executions.MainHelper.runMain(MainHelper.java:164)
[ERROR]  at scala_maven_executions.MainWithArgsInFile.main(MainWithArgsInFile.java:26)
Can anyone help resolve this issue?
Thanks & Regards,
Meethu M 

     On Tuesday, 11 November 2014 1:50 PM, MEETHU MATHEW <me...@yahoo.co.in> wrote:
   


Re: Is there a step-by-step instruction on how to build Spark App with IntelliJ IDEA?

Posted by MEETHU MATHEW <me...@yahoo.co.in>.
Hi Akhil,
It worked. The tip "You might want to restart IntelliJ sometime to get the dependencies pulled from the build.sbt file" is what really helped.
Thanks & Regards,
Meethu M 

     On Monday, 10 November 2014 4:58 PM, Akhil Das <ak...@sigmoidanalytics.com> wrote:
   


Re: Is there a step-by-step instruction on how to build Spark App with IntelliJ IDEA?

Posted by Ilya Ganelin <il...@gmail.com>.
Hey Akhil. This is a great guide. I stumbled through this myself at some
point. Any chance we could get this on the website as part of the
documentation?
On Nov 11, 2014 8:13 AM, "Akhil Das" <ak...@sigmoidanalytics.com> wrote:


Re: Is there a step-by-step instruction on how to build Spark App with IntelliJ IDEA?

Posted by Akhil Das <ak...@sigmoidanalytics.com>.
Hi Meethu,

You can install the IntelliJ IDEA IDE
<http://www.jetbrains.com/idea/features/scala.html> rather than Eclipse;
IntelliJ has a lot more community support and features. Once you install it,
follow this simple post
<http://confluence.jetbrains.com/display/IntelliJIDEA/Scala> to get started
with Scala. Now that you have an IDE which works with Scala, follow the steps
below for Spark:

*1. Install the sbt plugin: *Go to File -> Settings -> Plugins -> Install
IntelliJ Plugins -> search for sbt and install it
[image: Inline image 1]


*2. After the sbt plugin is installed, restart IntelliJ and start a new Scala
sbt project (File -> New Project -> Scala -> SBT)*

*[image: Inline image 2]*

*[image: Inline image 3]*

*3. Now open up the build.sbt file and add all the dependencies (here I'm
adding the Spark 1.1.0 with Hadoop 2.4.0 dependency)*

*[image: Inline image 4]*
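The build.sbt screenshot for this step did not survive the plain-text archive; as a sketch of what it likely contained (the project name and exact Scala patch version are assumptions, and the versions match the Spark 1.1.0 / Hadoop 2.4.0 example above):

```scala
name := "SparkExample"   // illustrative project name

version := "1.0"

scalaVersion := "2.10.4"

// spark-core already pulls in a Hadoop client; listing hadoop-client
// explicitly pins it to 2.4.0 as in the example above.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.1.0",
  "org.apache.hadoop" % "hadoop-client" % "2.4.0"
)
```

The `%%` operator appends the Scala binary suffix (_2.10) to the artifact name, which keeps the dependency in sync with scalaVersion.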

*4. Now create a new Scala class in src -> main -> scala and type in your
code.*

*[image: Inline image 6]*
*5. Right click and hit Run :)*

*[image: Inline image 7]*

*[image: Inline image 8]*
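Since the screenshots for steps 4 and 5 were lost in the plain-text archive, here is a sketch of the kind of minimal class step 4 describes (the object name, sample data, and local[2] master are illustrative; it assumes spark-core is already on the classpath from build.sbt):

```scala
import org.apache.spark.{SparkConf, SparkContext}
// In Spark 1.x (before 1.3), this import provides the implicit
// conversion that enables reduceByKey on pair RDDs.
import org.apache.spark.SparkContext._

object SimpleApp {
  def main(args: Array[String]): Unit = {
    // "local[2]" runs Spark inside the IDE with two worker threads,
    // so right-click -> Run works without a cluster.
    val conf = new SparkConf().setAppName("SimpleApp").setMaster("local[2]")
    val sc = new SparkContext(conf)

    // A tiny word count over an in-memory collection.
    val counts = sc.parallelize(Seq("spark", "intellij", "spark"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)

    counts.collect().foreach(println)
    sc.stop()
  }
}
```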

Let me know how it goes. You might want to restart intelliJ sometime to get
the dependencies pulled from the build.sbt file.


Thanks
Best Regards

On Mon, Nov 10, 2014 at 4:08 PM, MEETHU MATHEW <me...@yahoo.co.in>
wrote:
