Posted to dev@spark.apache.org by Richard Hillegas <rh...@us.ibm.com> on 2015/10/15 18:47:22 UTC

Network-related environmental problem when running JDBCSuite


I am seeing what look like environmental errors when I try to run a test on
a clean local branch which has been sync'd to the head of the development
trunk. I would appreciate advice about how to debug or hack around this
problem. For the record, the test ran cleanly last week. This is the
experiment I am running:

# build
mvn -Pyarn -Phadoop-2.3 -DskipTests -Phive -Phive-thriftserver clean
package

# run one suite
mvn -Dhadoop.version=2.4.0 -DwildcardSuites=JDBCSuite
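
# possibly: scope the run to just the SQL module and skip the upstream
# suites -- untested, and I am assuming sql/core is the right module
mvn -pl sql/core -Dhadoop.version=2.4.0 -DwildcardSuites=JDBCSuite test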

The test bombs out before getting to JDBCSuite. I see this summary at the
end...

[INFO]
------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM ........................... SUCCESS [  2.023 s]
[INFO] Spark Project Test Tags ............................ SUCCESS [  1.924 s]
[INFO] Spark Project Launcher ............................. SUCCESS [  5.837 s]
[INFO] Spark Project Networking ........................... SUCCESS [ 12.498 s]
[INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [01:28 min]
[INFO] Spark Project Unsafe ............................... SUCCESS [01:09 min]
[INFO] Spark Project Core ................................. SUCCESS [02:45 min]
[INFO] Spark Project Bagel ................................ SUCCESS [ 30.182 s]
[INFO] Spark Project GraphX ............................... SUCCESS [ 59.002 s]
[INFO] Spark Project Streaming ............................ FAILURE [06:21 min]
[INFO] Spark Project Catalyst ............................. SKIPPED
[INFO] Spark Project SQL .................................. SKIPPED
[INFO] Spark Project ML Library ........................... SKIPPED
[INFO] Spark Project Tools ................................ SKIPPED
[INFO] Spark Project Hive ................................. SKIPPED
[INFO] Spark Project REPL ................................. SKIPPED
[INFO] Spark Project Assembly ............................. SKIPPED
[INFO] Spark Project External Twitter ..................... SKIPPED
[INFO] Spark Project External Flume Sink .................. SKIPPED
[INFO] Spark Project External Flume ....................... SKIPPED
[INFO] Spark Project External Flume Assembly .............. SKIPPED
[INFO] Spark Project External MQTT ........................ SKIPPED
[INFO] Spark Project External MQTT Assembly ............... SKIPPED
[INFO] Spark Project External ZeroMQ ...................... SKIPPED
[INFO] Spark Project External Kafka ....................... SKIPPED
[INFO] Spark Project Examples ............................. SKIPPED
[INFO] Spark Project External Kafka Assembly .............. SKIPPED
[INFO]
------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO]
------------------------------------------------------------------------
[INFO] Total time: 13:37 min
[INFO] Finished at: 2015-10-15T09:03:06-07:00
[INFO] Final Memory: 69M/793M
[INFO]
------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.18.1:test (default-test) on project spark-streaming_2.10: There are test failures.
[ERROR]
[ERROR] Please refer to /Users/rhillegas/spark/spark/streaming/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :spark-streaming_2.10



From the logs in streaming/target/surefire-reports, it appears that the
following tests failed...

org.apache.spark.streaming.JavaAPISuite.txt
org.apache.spark.streaming.JavaReceiverAPISuite.txt

...with this error:

java.net.BindException: Failed to bind to: /9.52.158.156:0: Service 'sparkDriver' failed after 100 retries!
	at org.jboss.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:272)
	at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:393)
	at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:389)
	at scala.util.Success$$anonfun$map$1.apply(Try.scala:206)
	at scala.util.Try$.apply(Try.scala:161)
	at scala.util.Success.map(Try.scala:206)
	at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
	at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
	at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:55)
	at akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:91)
	at akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply(BatchingExecutor.scala:91)
	at akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply(BatchingExecutor.scala:91)
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
	at akka.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:90)
	at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:40)
	at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
	at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)


This Stack Overflow post suggests that the problem might be with my /etc/hosts:
http://stackoverflow.com/questions/29906686/failed-to-bind-to-spark-master-using-a-remote-cluster-with-two-workers
But /etc/hosts looks fine to me:

bash-3.2$ cat /etc/hosts
##
# Host Database
#
# localhost is used to configure the loopback interface
# when the system is booting.  Do not change this entry.
##
127.0.0.1	localhost
255.255.255.255	broadcasthost
::1             localhost
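
For what it's worth, 9.52.158.156 is presumably just the first non-loopback
address the JVM finds for this machine. A quick way to list the candidates
from a Scala REPL (a sketch of my own, not Spark code):

import java.net.NetworkInterface
import scala.collection.JavaConverters._

// print every address on every interface; the driver is evidently
// picking one of the non-loopback entries to bind to
for (ni <- NetworkInterface.getNetworkInterfaces.asScala;
     addr <- ni.getInetAddresses.asScala)
  println(s"${ni.getName} -> ${addr.getHostAddress}")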

Is there some environment variable, config file setting, or JVM system
property which will hack around this problem? Any advice would be
appreciated.


Thanks,
-Rick

Re: Network-related environmental problem when running JDBCSuite

Posted by sg...@cfl.rr.com.
Rick,

Try setting the environment variable SPARK_LOCAL_IP=127.0.0.1 in your conf/spark-env.sh (if you haven't already).
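
For example, add this line to conf/spark-env.sh (create the file from
conf/spark-env.sh.template if it does not exist yet):

export SPARK_LOCAL_IP=127.0.0.1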

Regards,

- Steve

From: Richard Hillegas 
Sent: Thursday, October 15, 2015 1:50 PM
To: Richard Hillegas 
Cc: Dev 
Subject: Re: Network-related environmental problem when running JDBCSuite

For the record, I get the same error when I simply try to boot the spark shell:

bash-3.2$ bin/spark-shell
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's repl log4j profile: org/apache/spark/log4j-defaults-repl.properties
To adjust logging level use sc.setLogLevel("INFO")
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.0-SNAPSHOT
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_60)
Type in expressions to have them evaluated.
Type :help for more information.
15/10/15 10:49:09 ERROR NettyTransport: failed to bind to /9.52.158.156:0, shutting down Netty transport
15/10/15 10:49:09 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
15/10/15 10:49:09 ERROR Remoting: Remoting system has been terminated abrubtly. Attempting to shut down transports
[... the same three lines repeat while Spark retries the bind, 16 times in all ...]
15/10/15 10:49:09 ERROR NettyTransport: failed to bind to /9.52.158.156:0, shutting down Netty transport
15/10/15 10:49:09 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: Failed to bind to: /9.52.158.156:0: Service 'sparkDriver' failed after 16 retries!
at org.jboss.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:272)
at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:393)
at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:389)
at scala.util.Success$$anonfun$map$1.apply(Try.scala:206)
at scala.util.Try$.apply(Try.scala:161)
at scala.util.Success.map(Try.scala:206)
at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:55)
at akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:91)
at akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply(BatchingExecutor.scala:91)
at akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply(BatchingExecutor.scala:91)
at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
at akka.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:90)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:40)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
15/10/15 10:49:09 ERROR Remoting: Remoting system has been terminated abrubtly. Attempting to shut down transports
java.net.BindException: Failed to bind to: /9.52.158.156:0: Service 'sparkDriver' failed after 16 retries!
at org.jboss.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:272)
at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:393)
at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:389)
at scala.util.Success$$anonfun$map$1.apply(Try.scala:206)
at scala.util.Try$.apply(Try.scala:161)
at scala.util.Success.map(Try.scala:206)
at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:55)
at akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:91)
at akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply(BatchingExecutor.scala:91)
at akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply(BatchingExecutor.scala:91)
at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
at akka.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:90)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:40)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

java.lang.NullPointerException
at org.apache.spark.sql.SQLContext$.createListenerAndUI(SQLContext.scala:1323)
at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:100)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
at $iwC$$iwC.<init>(<console>:9)
at $iwC.<init>(<console>:18)
at <init>(<console>:20)
at .<init>(<console>:24)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:132)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:680)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

<console>:10: error: not found: value sqlContext
       import sqlContext.implicits._
              ^
<console>:10: error: not found: value sqlContext
       import sqlContext.sql

Thanks,
Rick Hillegas

Re: Network-related environmental problem when running JDBCSuite

Posted by Richard Hillegas <rh...@us.ibm.com>.
Thanks for everyone's patience with this email thread. I have fixed my
environmental problem and my tests run cleanly now. This seems to be a
problem which afflicts modern JVMs on Mac OS X (and maybe other Unix
variants). The following can happen on these platforms:

  InetAddress.getLocalHost().isReachable( 2000 ) == false

If this happens to you, the fix is to add the following line to /etc/hosts:

127.0.0.1	localhost $yourMachineName

where $yourMachineName is the result of the hostname command. For more
information, see
http://stackoverflow.com/questions/1881546/inetaddress-getlocalhost-throws-unknownhostexception
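
For anyone who wants to verify the condition on their own machine, this is
roughly the check I ran from a Scala REPL (a sketch; the 2000 ms timeout is
arbitrary):

import java.net.InetAddress

// on an afflicted machine this prints false until /etc/hosts maps the
// machine name to 127.0.0.1
println(InetAddress.getLocalHost.isReachable(2000))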

Thanks,
-Rick

Re: Network-related environmental problem when running JDBCSuite

Posted by Richard Hillegas <rh...@us.ibm.com>.
Continuing this lively conversation with myself (hopefully this archived
thread may be useful to someone else in the future):

I set the following environment variable as recommended by this page:
http://stackoverflow.com/questions/29906686/failed-to-bind-to-spark-master-using-a-remote-cluster-with-two-workers

export SPARK_LOCAL_IP=127.0.0.1

Then I got errors related to booting the metastore_db. So I deleted that
directory. After that I was able to run spark-shell again.
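
(Concretely, that was just an "rm -rf metastore_db" in the directory from
which I launch spark-shell, assuming that is where Derby created it.)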

Now let's see if this hack fixes the tests...


Thanks,
Rick Hillegas



Richard Hillegas/San Francisco/IBM@IBMUS wrote on 10/15/2015 10:50:55 AM:

> From: Richard Hillegas/San Francisco/IBM@IBMUS
> To: Richard Hillegas/San Francisco/IBM@IBMUS
> Cc: Dev <de...@spark.apache.org>
> Date: 10/15/2015 10:51 AM
> Subject: Re: Network-related environemental problem when running
JDBCSuite

>
> For the record, I get the same error when I simply try to boot the
> spark shell:
>
> bash-3.2$ bin/spark-shell
> log4j:WARN No appenders could be found for logger
> (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
> log4j:WARN Please initialize the log4j system properly.
> log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig
> for more info.
> [spark-shell startup banner, bind-failure log, and stack traces
> trimmed; they appear in full in the message below]
>
> Thanks,
> Rick Hillegas
>
>
>
> Richard Hillegas/San Francisco/IBM@IBMUS wrote on 10/15/2015 09:47:22 AM:
>
> > [original message quoted in full; trimmed]

Re: Network-related environemental problem when running JDBCSuite

Posted by Richard Hillegas <rh...@us.ibm.com>.
For the record, I get the same error when I simply try to boot the spark
shell:

bash-3.2$ bin/spark-shell
log4j:WARN No appenders could be found for logger
(org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for
more info.
Using Spark's repl log4j profile:
org/apache/spark/log4j-defaults-repl.properties
To adjust logging level use sc.setLogLevel("INFO")
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.0-SNAPSHOT
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java
1.8.0_60)
Type in expressions to have them evaluated.
Type :help for more information.
15/10/15 10:49:09 ERROR NettyTransport: failed to bind to /9.52.158.156:0,
shutting down Netty transport
15/10/15 10:49:09 WARN Utils: Service 'sparkDriver' could not bind on port
0. Attempting port 1.
15/10/15 10:49:09 ERROR Remoting: Remoting system has been terminated
abrubtly. Attempting to shut down transports
[the three log lines above repeat for each of the 16 retry attempts]
15/10/15 10:49:09 ERROR NettyTransport: failed to bind to /9.52.158.156:0,
shutting down Netty transport
15/10/15 10:49:09 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: Failed to bind to: /9.52.158.156:0: Service
'sparkDriver' failed after 16 retries!
	at org.jboss.netty.bootstrap.ServerBootstrap.bind
(ServerBootstrap.java:272)
	at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply
(NettyTransport.scala:393)
	at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply
(NettyTransport.scala:389)
	at scala.util.Success$$anonfun$map$1.apply(Try.scala:206)
	at scala.util.Try$.apply(Try.scala:161)
	at scala.util.Success.map(Try.scala:206)
	at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
	at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
	at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch
(BatchingExecutor.scala:55)
	at akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply
$mcV$sp(BatchingExecutor.scala:91)
	at akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply
(BatchingExecutor.scala:91)
	at akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply
(BatchingExecutor.scala:91)
	at scala.concurrent.BlockContext$.withBlockContext
(BlockContext.scala:72)
	at akka.dispatch.BatchingExecutor$BlockableBatch.run
(BatchingExecutor.scala:90)
	at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:40)
	at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec
(AbstractDispatcher.scala:397)
	at scala.concurrent.forkjoin.ForkJoinTask.doExec
(ForkJoinTask.java:260)
	at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask
(ForkJoinPool.java:1339)
	at scala.concurrent.forkjoin.ForkJoinPool.runWorker
(ForkJoinPool.java:1979)
	at scala.concurrent.forkjoin.ForkJoinWorkerThread.run
(ForkJoinWorkerThread.java:107)
15/10/15 10:49:09 ERROR Remoting: Remoting system has been terminated
abrubtly. Attempting to shut down transports
java.net.BindException: Failed to bind to: /9.52.158.156:0: Service
'sparkDriver' failed after 16 retries!
	[stack trace identical to the one above]

java.lang.NullPointerException
	at org.apache.spark.sql.SQLContext$.createListenerAndUI
(SQLContext.scala:1323)
	at org.apache.spark.sql.hive.HiveContext.<init>
(HiveContext.scala:100)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance
(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance
(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
	at org.apache.spark.repl.SparkILoop.createSQLContext
(SparkILoop.scala:1028)
	at $iwC$$iwC.<init>(<console>:9)
	at $iwC.<init>(<console>:18)
	at <init>(<console>:20)
	at .<init>(<console>:24)
	at .<clinit>(<console>)
	at .<init>(<console>:7)
	at .<clinit>(<console>)
	at $print(<console>)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke
(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke
(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call
(SparkIMain.scala:1065)
	at org.apache.spark.repl.SparkIMain$Request.loadAndRun
(SparkIMain.scala:1340)
	at org.apache.spark.repl.SparkIMain.loadAndRunReq$1
(SparkIMain.scala:840)
	at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
	at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
	at org.apache.spark.repl.SparkILoop.reallyInterpret$1
(SparkILoop.scala:857)
	at org.apache.spark.repl.SparkILoop.interpretStartingWith
(SparkILoop.scala:902)
	at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
	at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark
$1.apply(SparkILoopInit.scala:132)
	at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark
$1.apply(SparkILoopInit.scala:124)
	at org.apache.spark.repl.SparkIMain.beQuietDuring
(SparkIMain.scala:324)
	at org.apache.spark.repl.SparkILoopInit$class.initializeSpark
(SparkILoopInit.scala:124)
	at org.apache.spark.repl.SparkILoop.initializeSpark
(SparkILoop.scala:64)
	at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl
$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp
(SparkILoop.scala:974)
	at org.apache.spark.repl.SparkILoopInit$class.runThunks
(SparkILoopInit.scala:159)
	at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
	at org.apache.spark.repl.SparkILoopInit$class.postInitialization
(SparkILoopInit.scala:108)
	at org.apache.spark.repl.SparkILoop.postInitialization
(SparkILoop.scala:64)
	at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl
$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
	at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl
$SparkILoop$$process$1.apply(SparkILoop.scala:945)
	at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl
$SparkILoop$$process$1.apply(SparkILoop.scala:945)
	at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader
(ScalaClassLoader.scala:135)
	at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$
$process(SparkILoop.scala:945)
	at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
	at org.apache.spark.repl.Main$.main(Main.scala:31)
	at org.apache.spark.repl.Main.main(Main.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke
(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke
(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy
$SparkSubmit$$runMain(SparkSubmit.scala:680)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1
(SparkSubmit.scala:180)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

<console>:10: error: not found: value sqlContext
       import sqlContext.implicits._
              ^
<console>:10: error: not found: value sqlContext
       import sqlContext.sql

Thanks,
Rick Hillegas



Richard Hillegas/San Francisco/IBM@IBMUS wrote on 10/15/2015 09:47:22 AM:

> From: Richard Hillegas/San Francisco/IBM@IBMUS
> To: Dev <de...@spark.apache.org>
> Date: 10/15/2015 09:47 AM
> Subject: Network-related environemental problem when running JDBCSuite
>
> I am seeing what look like environmental errors when I try to run a
> test on a clean local branch which has been sync'd to the head of
> the development trunk. I would appreciate advice about how to debug
> or hack around this problem. For the record, the test ran cleanly
> last week. This is the experiment I am running:
>
> # build
> mvn -Pyarn -Phadoop-2.3 -DskipTests -Phive -Phive-thriftserver clean
package
>
> # run one suite
> mvn -Dhadoop.version=2.4.0 -DwildcardSuites=JDBCSuite
>
> The test bombs out before getting to JDBCSuite. I see this summary
> at the end...
>
> [INFO]
> ------------------------------------------------------------------------
> [INFO] Reactor Summary:
> [INFO]
> [INFO] Spark Project Parent POM ........................... SUCCESS
> [  2.023 s]
> [INFO] Spark Project Test Tags ............................ SUCCESS
> [  1.924 s]
> [INFO] Spark Project Launcher ............................. SUCCESS
> [  5.837 s]
> [INFO] Spark Project Networking ........................... SUCCESS
> [ 12.498 s]
> [INFO] Spark Project Shuffle Streaming Service ............ SUCCESS
> [01:28 min]
> [INFO] Spark Project Unsafe ............................... SUCCESS
> [01:09 min]
> [INFO] Spark Project Core ................................. SUCCESS
> [02:45 min]
> [INFO] Spark Project Bagel ................................ SUCCESS
> [ 30.182 s]
> [INFO] Spark Project GraphX ............................... SUCCESS
> [ 59.002 s]
> [INFO] Spark Project Streaming ............................ FAILURE
> [06:21 min]
> [INFO] Spark Project Catalyst ............................. SKIPPED
> [INFO] Spark Project SQL .................................. SKIPPED
> [INFO] Spark Project ML Library ........................... SKIPPED
> [INFO] Spark Project Tools ................................ SKIPPED
> [INFO] Spark Project Hive ................................. SKIPPED
> [INFO] Spark Project REPL ................................. SKIPPED
> [INFO] Spark Project Assembly ............................. SKIPPED
> [INFO] Spark Project External Twitter ..................... SKIPPED
> [INFO] Spark Project External Flume Sink .................. SKIPPED
> [INFO] Spark Project External Flume ....................... SKIPPED
> [INFO] Spark Project External Flume Assembly .............. SKIPPED
> [INFO] Spark Project External MQTT ........................ SKIPPED
> [INFO] Spark Project External MQTT Assembly ............... SKIPPED
> [INFO] Spark Project External ZeroMQ ...................... SKIPPED
> [INFO] Spark Project External Kafka ....................... SKIPPED
> [INFO] Spark Project Examples ............................. SKIPPED
> [INFO] Spark Project External Kafka Assembly .............. SKIPPED
> [INFO]
> ------------------------------------------------------------------------
> [INFO] BUILD FAILURE
> [INFO]
> ------------------------------------------------------------------------
> [INFO] Total time: 13:37 min
> [INFO] Finished at: 2015-10-15T09:03:06-07:00
> [INFO] Final Memory: 69M/793M
> [INFO]
> ------------------------------------------------------------------------
> [ERROR] Failed to execute goal org.apache.maven.plugins:maven-
> surefire-plugin:2.18.1:test (default-test) on project spark-
> streaming_2.10: There are test failures.
> [ERROR]
> [ERROR] Please refer to /Users/rhillegas/spark/spark/streaming/
> target/surefire-reports for the individual test results.
> [ERROR] -> [Help 1]
> [ERROR]
> [ERROR] To see the full stack trace of the errors, re-run Maven with
> the -e switch.
> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> [ERROR]
> [ERROR] For more information about the errors and possible
> solutions, please read the following articles:
> [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/
> MojoFailureException
> [ERROR]
> [ERROR] After correcting the problems, you can resume the build with
> the command
> [ERROR]   mvn <goals> -rf :spark-streaming_2.10
>
>
>
> From the logs in streaming/target/surefire-reports, it appears that
> the following tests failed...
>
> org.apache.spark.streaming.JavaAPISuite.txt
> org.apache.spark.streaming.JavaReceiverAPISuite.txt
>
> ...with this error:
>
> java.net.BindException: Failed to bind to: /9.52.158.156:0: Service
> 'sparkDriver' failed after 100 retries!
> > [stack trace identical to those above]
>
>
> > It is suggested that there might be a problem with my /etc/hosts,
> > according to http://stackoverflow.com/questions/29906686/failed-to-
> > bind-to-spark-master-using-a-remote-cluster-with-two-workers.
> > But /etc/hosts looks fine to me:
>
> bash-3.2$ cat /etc/hosts
> ##
> # Host Database
> #
> # localhost is used to configure the loopback interface
> # when the system is booting.  Do not change this entry.
> ##
> 127.0.0.1 localhost
> 255.255.255.255 broadcasthost
> ::1             localhost
>
> Is there some environmental variable, config file setting, or JVM
> system property which will hack around this problem? Any advice
> would be appreciated.
>
>
> Thanks,
> -Rick
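
A workaround that often gets past this class of bind failure is to tell
Spark explicitly which address to bind to, rather than letting it resolve
the machine's primary hostname (here 9.52.158.156, which the JVM evidently
cannot bind). A minimal sketch, assuming the loopback interface is
acceptable because nothing remote needs to reach the driver:

# pin the driver to the loopback interface before launching the shell
export SPARK_LOCAL_IP=127.0.0.1
bin/spark-shell

# the same setting applies when running the test suites
SPARK_LOCAL_IP=127.0.0.1 mvn -Dhadoop.version=2.4.0 -DwildcardSuites=JDBCSuite

SPARK_LOCAL_IP is the documented spark-env variable for the IP address a
Spark process binds to; passing --conf spark.driver.host=127.0.0.1 to
spark-shell should have a similar effect if an environment variable is
inconvenient.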