Posted to user@spark.apache.org by zgalic <zd...@fer.hr> on 2014/02/06 15:04:18 UTC
Errors occurred while compiling module 'spark-streaming-zeromq'
(IntelliJ IDEA 13.0.2)
Hi Spark users,
Trying to run any of the examples fails due to JavaZeroMQStreamSuite.java
compilation errors:
Information:Using javac 1.7.0_21 to compile java sources
Information:java: Errors occurred while compiling module
'spark-streaming-zeromq'
Information:Compilation completed with 7 errors and 12 warnings in 13 min 8
sec
Information:7 errors
Information:12 warnings
Warning:scalac: there were 56 feature warning(s); re-run with -feature for
details
Warning:scalac: there were 35 feature warning(s); re-run with -feature for
details
Warning:scalac: there were 6 feature warning(s); re-run with -feature for
details
Warning:scalac: there were 4 feature warning(s); re-run with -feature for
details
/home/zgalic/spark-0.9.0-incubating/core/src/main/scala/org/apache/spark/api/java/function/PairFlatMapFunction.java
Warning:Warning:java: Some input files use unchecked or unsafe
operations.
Warning:Warning:java: Recompile with -Xlint:unchecked for details.
/home/zgalic/spark-0.9.0-incubating/core/src/test/scala/org/apache/spark/rdd/RDDSuite.scala
Warning:Warning:line (51)method mapPartitionsWithSplit in class RDD is
deprecated: use mapPartitionsWithIndex
val partitionSumsWithSplit = nums.mapPartitionsWithSplit {
^
/home/zgalic/spark-0.9.0-incubating/core/src/test/scala/org/apache/spark/JavaAPISuite.java
Warning:Warning:java:
/home/zgalic/spark-0.9.0-incubating/core/src/test/scala/org/apache/spark/JavaAPISuite.java
uses unchecked or unsafe operations.
Warning:Warning:java: Recompile with -Xlint:unchecked for details.
/home/zgalic/spark-0.9.0-incubating/streaming/src/test/scala/org/apache/spark/streaming/InputStreamsSuite.scala
Warning:Warning:line (305)method connect in class IOManager is
deprecated: use the new implementation in package akka.io instead
override def preStart = IOManager(context.system).connect(new
InetSocketAddress(port))
^
/home/zgalic/spark-0.9.0-incubating/streaming/src/test/java/org/apache/spark/streaming/JavaAPISuite.java
Warning:Warning:java:
/home/zgalic/spark-0.9.0-incubating/streaming/src/test/java/org/apache/spark/streaming/JavaAPISuite.java
uses unchecked or unsafe operations.
Warning:Warning:java: Recompile with -Xlint:unchecked for details.
/home/zgalic/spark-0.9.0-incubating/external/zeromq/src/test/java/org/apache/spark/streaming/zeromq/JavaZeroMQStreamSuite.java
Error:Error:line (24)java: package org.apache.spark.api.java.function
does not exist
Error:Error:line (25)java: package org.apache.spark.storage does not
exist
Error:Error:line (35)java: cannot find symbol
symbol: class Function
location: class org.apache.spark.streaming.zeromq.JavaZeroMQStreamSuite
Error:Error:line (35)java: cannot find symbol
symbol: class Function
location: class org.apache.spark.streaming.zeromq.JavaZeroMQStreamSuite
Error:Error:line (42)java: cannot access
org.apache.spark.api.java.function.Function
class file for org.apache.spark.api.java.function.Function not found
Error:Error:line (45)java: cannot find symbol
symbol: variable StorageLevel
location: class org.apache.spark.streaming.zeromq.JavaZeroMQStreamSuite
Error:Error:line (47)java: cannot find symbol
symbol: variable StorageLevel
location: class org.apache.spark.streaming.zeromq.JavaZeroMQStreamSuite
--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Errors-occurred-while-compiling-module-spark-streaming-zeromq-IntelliJ-IDEA-13-0-2-tp1282.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
Re: Errors occurred while compiling module 'spark-streaming-zeromq'
(IntelliJ IDEA 13.0.2)
Posted by Tathagata Das <ta...@gmail.com>.
If you are using an sbt project file to link to Spark Streaming, it is
actually simpler. Here is an example sbt file that links to Spark Streaming
and Spark Streaming's Twitter functionality (for Spark 0.9).
https://github.com/amplab/training/blob/ampcamp4/streaming/scala/build.sbt
Instead of spark-streaming-twitter, in your case it will be
spark-streaming-flume or spark-streaming-zeromq.
I hope this sbt configuration works with your IDE.
TD
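For reference, a minimal build.sbt along those lines might look like the sketch below. The artifact names, versions, and the Akka resolver are assumptions based on the Spark 0.9.0-incubating release and the linked example, not a verbatim copy of it:

```scala
// Hypothetical minimal build.sbt for an app using spark-streaming-zeromq.
// Versions and the resolver are assumptions for Spark 0.9.0-incubating
// (Scala 2.10); swap in spark-streaming-flume etc. as needed.
name := "streaming-zeromq-app"

version := "0.1"

scalaVersion := "2.10.3"

resolvers += "Akka Repository" at "http://repo.akka.io/releases/"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "0.9.0-incubating",
  "org.apache.spark" %% "spark-streaming" % "0.9.0-incubating",
  "org.apache.spark" %% "spark-streaming-zeromq" % "0.9.0-incubating"
)
```

With a build file like this, the IDE project is generated from the published Spark artifacts rather than from Spark's own multi-module source build, which sidesteps inter-module classpath wiring entirely.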
On Mon, Feb 10, 2014 at 5:21 AM, zgalic <zd...@fer.hr> wrote:
> Hm, I'm not sbt expert :-(
problem running multiple executors on large machine
Posted by Yadid Ayzenberg <ya...@media.mit.edu>.
Hi All,
I have a setup consisting of 8 small machines (1 core, 8 GB RAM each) and
1 large machine (8 cores, 100 GB RAM). Is there a way to enable Spark to
run multiple executors on the large machine, and a single executor on each
of the small machines?
Alternatively, is it possible to run a single executor that utilizes all
cores and available memory on the large machine, as well as executors with
less memory on the smaller machines?
I tried configuring spark-env.sh on the large machine, but java -Xmx is
configured uniformly for the entire cluster.
Is there any way to configure -Xmx separately for each machine?
Thanks,
Yadid
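One hedged sketch of a per-machine configuration, assuming Spark 0.9 standalone mode: the worker settings in conf/spark-env.sh apply only to the machine they are set on, so the large machine can run several workers (and therefore several executors per application) while the small machines keep a single worker each. The sizes below are illustrative assumptions, not recommendations:

```shell
# Hypothetical conf/spark-env.sh on the large (8-core, 100 GB) machine only.
# Assumes Spark 0.9 standalone mode; the small machines keep their own
# single-worker configuration.
SPARK_WORKER_INSTANCES=8   # 8 workers -> up to 8 executors per application here
SPARK_WORKER_CORES=1       # each worker offers a single core
SPARK_WORKER_MEMORY=12g    # each worker offers 12 GB (8 * 12 GB fits in 100 GB)
```

The idea is that since the executor heap (spark.executor.memory, i.e. the uniform -Xmx) is a per-application setting, heterogeneity is expressed by sizing the workers per machine rather than the executors.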
Re: Errors occurred while compiling module 'spark-streaming-zeromq'
(IntelliJ IDEA 13.0.2)
Posted by zgalic <zd...@fer.hr>.
Hm, I'm not an sbt expert :-(
Do you mean something like this
lazy val streaming-flume = Project("streaming-flume",
file("streaming-flume"), settings = flumeSettings) dependsOn(core)
in BuildScala.sbt?
Thanks.
ZG
--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Errors-occurred-while-compiling-module-spark-streaming-zeromq-IntelliJ-IDEA-13-0-2-tp1282p1354.html
Re: Errors occurred while compiling module 'spark-streaming-zeromq'
(IntelliJ IDEA 13.0.2)
Posted by Tathagata Das <ta...@gmail.com>.
Maybe somehow it's not finding the core Spark classes. Can you try adding
the spark-core project as an explicit dependency of the spark-streaming-XYZ
projects?
TD
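Concretely, that suggestion might look like the following hypothetical fragment of Spark's project/SparkBuild.scala; the project names and settings vals are assumptions, and note that a hyphenated name like streaming-flume is not a valid Scala identifier, so the val itself needs a camelCase name:

```scala
// Hypothetical fragment of project/SparkBuild.scala; names are assumptions.
// `streaming-flume` is not a legal Scala identifier, hence `streamingFlume`.
lazy val streamingFlume = Project(
  "spark-streaming-flume",   // module ID as seen by sbt/the IDE
  file("external/flume"),    // module source directory
  settings = flumeSettings
) dependsOn(
  streaming % "compile->compile;test->test",
  core // explicit dependency on spark-core, as suggested above
)
```

Re-importing the sbt project into IntelliJ after such a change should make the IDE module for the external connector see the spark-core classes on its compile classpath.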
On Fri, Feb 7, 2014 at 8:29 AM, zgalic <zd...@fer.hr> wrote:
> Thanks !
Re: Errors occurred while compiling module 'spark-streaming-zeromq'
(IntelliJ IDEA 13.0.2)
Posted by zgalic <zd...@fer.hr>.
Thanks!
Unfortunately, "sbt/sbt clean" didn't help.
The steps are trivial:
sbt/sbt clean
sbt/sbt assembly
Import Project in IntelliJ (first time with auto-import enabled, second time
auto-import is disabled)
I've deleted the module 'spark-streaming-zeromq' and tried to run LocalPi,
but got the same errors, this time in 'JavaTwitterStreamSuite.java'.
Also, there is an error tooltip:
Cannot resolve symbol StorageLevel
--------------------------------------------------------
Unused import statement: import org.apache.spark.storage.StorageLevel
In the previous case, compiling 'JavaZeroMQStreamSuite.java' generates
similar errors for
import org.apache.spark.api.java.function.Function;
import org.apache.spark.storage.StorageLevel;
Any idea?
--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Errors-occurred-while-compiling-module-spark-streaming-zeromq-IntelliJ-IDEA-13-0-2-tp1282p1307.html
Re: Errors occurred while compiling module 'spark-streaming-zeromq'
(IntelliJ IDEA 13.0.2)
Posted by Josh Rosen <ro...@gmail.com>.
This sounds like a classpath issue in IntelliJ, so I'd also try refreshing
the SBT project in IntelliJ, or re-importing it.
On Thu, Feb 6, 2014 at 11:02 AM, Tathagata Das
<ta...@gmail.com>wrote:
> Does a "sbt/sbt clean" help? If it doesnt and the problem occurs
> repeatedely, can you tell us what is the sequence of commands you are using
> (from a clean github clone) so that we can reproduce the problem?
> TD
Re: Errors occurred while compiling module 'spark-streaming-zeromq'
(IntelliJ IDEA 13.0.2)
Posted by Tathagata Das <ta...@gmail.com>.
Does a "sbt/sbt clean" help? If it doesn't and the problem occurs
repeatedly, can you tell us the sequence of commands you are using
(from a clean GitHub clone) so that we can reproduce the problem?
TD
On Thu, Feb 6, 2014 at 6:04 AM, zgalic <zd...@fer.hr> wrote:
> Hi Spark users,
>
> Trying to run any of examples failed due to JavaZeroMQStreamSuite.java
> compilation errors: