Posted to user@spark.apache.org by Stephen Boesch <ja...@gmail.com> on 2016/06/23 02:25:18 UTC

Building Spark 2.X in IntelliJ

Building inside IntelliJ is an ever-moving target. Does anyone have the
magical procedure to get it going for 2.X?

There are numerous library references that, although included in the
pom.xml build, are for some reason not found when processed within
IntelliJ.

Re: Building Spark 2.X in IntelliJ

Posted by Praveen R <pr...@sigmoidanalytics.com>.
I had some errors, like the SqlBaseParser class missing, and figured out I
needed to generate these classes from SqlBase.g4 using antlr4. It works fine now.
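
(A hedged sketch of that ANTLR step, since the thread doesn't spell it out;
it assumes the antlr4-maven-plugin is bound to generate-sources in the
catalyst module and writes to its default output directory:)

  # regenerate SqlBaseParser/SqlBaseLexer from SqlBase.g4 on the command line
  build/mvn -pl sql/catalyst generate-sources
  # then, in IntelliJ, mark sql/catalyst/target/generated-sources/antlr4
  # as a generated-sources root so the classes resolve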


Re: Building Spark 2.X in IntelliJ

Posted by Stephen Boesch <ja...@gmail.com>.
I just checked out a completely fresh directory and created a new IJ
project, then followed your tip for adding the avro source.

Here is the additional set of errors:

Error:(31, 12) object razorvine is not a member of package net
import net.razorvine.pickle.{IObjectPickler, Opcodes, Pickler}
           ^
Error:(779, 49) not found: type IObjectPickler
  class PythonMessageAndMetadataPickler extends IObjectPickler {
                                                ^
Error:(783, 7) not found: value Pickler
      Pickler.registerCustomPickler(classOf[PythonMessageAndMetadata], this)
      ^
Error:(784, 7) not found: value Pickler
      Pickler.registerCustomPickler(this.getClass, this)
      ^
Error:(787, 57) not found: type Pickler
    def pickle(obj: Object, out: OutputStream, pickler: Pickler) {
                                                        ^
Error:(789, 19) not found: value Opcodes
        out.write(Opcodes.GLOBAL)
                  ^
Error:(794, 19) not found: value Opcodes
        out.write(Opcodes.MARK)
                  ^
Error:(800, 19) not found: value Opcodes
        out.write(Opcodes.TUPLE)
                  ^
Error:(801, 19) not found: value Opcodes
        out.write(Opcodes.REDUCE)
                  ^

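(For what it's worth, net.razorvine.pickle ships in the net.razorvine:pyrolite
artifact. A hedged first check - the module path below is a guess for where
PythonMessageAndMetadataPickler lives in 2.x:)

  # confirm pyrolite resolves on the command line before blaming IntelliJ
  build/mvn -pl external/kafka-0-8 dependency:tree | grep pyrolite
  # if it resolves there, a Maven reimport in IntelliJ usually picks it up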

Re: Building Spark 2.X in IntelliJ

Posted by Stephen Boesch <ja...@gmail.com>.
Thanks Jeff - I remember that now from a long time ago. After making that
change, the next errors are:

Error:scalac: missing or invalid dependency detected while loading class
file 'RDDOperationScope.class'.
Could not access term fasterxml in package com,
because it (or its dependencies) are missing. Check your build definition
for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see
the problematic classpath.)
A full rebuild may help if 'RDDOperationScope.class' was compiled against
an incompatible version of com.
Error:scalac: missing or invalid dependency detected while loading class
file 'RDDOperationScope.class'.
Could not access term jackson in value com.fasterxml,
because it (or its dependencies) are missing. Check your build definition
for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see
the problematic classpath.)
A full rebuild may help if 'RDDOperationScope.class' was compiled against
an incompatible version of com.fasterxml.
Error:scalac: missing or invalid dependency detected while loading class
file 'RDDOperationScope.class'.
Could not access term annotation in value com.jackson,
because it (or its dependencies) are missing. Check your build definition
for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see
the problematic classpath.)
A full rebuild may help if 'RDDOperationScope.class' was compiled against
an incompatible version of com.jackson.
Error:scalac: missing or invalid dependency detected while loading class
file 'RDDOperationScope.class'.
Could not access term JsonInclude in value com.annotation,
because it (or its dependencies) are missing. Check your build definition
for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see
the problematic classpath.)
A full rebuild may help if 'RDDOperationScope.class' was compiled against
an incompatible version of com.annotation.
Warning:scalac: Class org.jboss.netty.channel.ChannelFactory not found -
continuing with a stub.
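
(These com.fasterxml.jackson errors usually mean the Jackson jars are missing
from the IDE module's compile classpath rather than from the pom. A hedged way
to rule out a stale incremental build:)

  # full command-line build so every module's dependencies land in ~/.m2,
  # then reimport the Maven project in IntelliJ and rebuild
  build/mvn -DskipTests clean package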



Re: Building Spark 2.X in IntelliJ

Posted by Jeff Zhang <zj...@gmail.com>.
You need to add
spark/external/flume-sink/target/scala-2.11/src_managed/main/compiled_avro
to the build path; this is the only thing you need to do manually, if I
remember correctly.
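
(A minimal sketch of that step, assuming the avro classes are produced during
the Maven build of the flume-sink module:)

  # generate the avro sources first
  build/mvn -pl external/flume-sink -am -DskipTests compile
  # then in IntelliJ: Project Structure | Modules | <flume-sink module> |
  # Sources, and mark
  #   external/flume-sink/target/scala-2.11/src_managed/main/compiled_avro
  # as a source root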



-- 
Best Regards

Jeff Zhang

Re: Building Spark 2.X in IntelliJ

Posted by Stephen Boesch <ja...@gmail.com>.
Hi Jeff,
  I'd like to understand what may be different. I have rebuilt and
reimported many times. Just now I blew away .idea/* and the *.iml files to
start from scratch, then opened the $SPARK_HOME directory from IntelliJ via
File | Open. After it finished the initial import I tried to run one of the
examples, and it fails in the build:

Here are the errors I see:

Error:(45, 66) not found: type SparkFlumeProtocol
  val transactionTimeout: Int, val backOffInterval: Int) extends
SparkFlumeProtocol with Logging {
                                                                 ^
Error:(70, 39) not found: type EventBatch
  override def getEventBatch(n: Int): EventBatch = {
                                      ^
Error:(85, 13) not found: type EventBatch
        new EventBatch("Spark sink has been stopped!", "",
java.util.Collections.emptyList())
            ^
/git/spark/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/TransactionProcessor.scala
Error:(80, 22) not found: type EventBatch
  def getEventBatch: EventBatch = {
                     ^
Error:(48, 37) not found: type EventBatch
  @volatile private var eventBatch: EventBatch = new EventBatch("Unknown
Error", "",
                                    ^
Error:(48, 54) not found: type EventBatch
  @volatile private var eventBatch: EventBatch = new EventBatch("Unknown
Error", "",
                                                     ^
Error:(115, 41) not found: type SparkSinkEvent
        val events = new util.ArrayList[SparkSinkEvent](maxBatchSize)
                                        ^
Error:(146, 28) not found: type EventBatch
          eventBatch = new EventBatch("", seqNum, events)
                           ^
/git/spark/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/SparkSinkUtils.scala
Error:(25, 27) not found: type EventBatch
  def isErrorBatch(batch: EventBatch): Boolean = {
                          ^
/git/spark/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/SparkSink.scala
Error:(86, 51) not found: type SparkFlumeProtocol
    val responder = new SpecificResponder(classOf[SparkFlumeProtocol],
handler.get)
                                                  ^


Note: this is just the first batch of errors.





Re: Building Spark 2.X in IntelliJ

Posted by Jeff Zhang <zj...@gmail.com>.
It works well for me. You can try reimporting it into IntelliJ.

-- 
Best Regards

Jeff Zhang