Posted to issues@spark.apache.org by "Mark Grover (JIRA)" <ji...@apache.org> on 2016/01/05 19:06:39 UTC

[jira] [Commented] (SPARK-12426) Docker JDBC integration tests are failing again

    [ https://issues.apache.org/jira/browse/SPARK-12426?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15083482#comment-15083482 ] 

Mark Grover commented on SPARK-12426:
-------------------------------------

Sean and Josh,
I got to the bottom of this. The problem is that docker does a poor job of bubbling up the real error: the docker engine is not running on the machine running the unit tests. The instructions for installing the docker engine on various OSs are at https://docs.docker.com/engine/installation/
Once installed, the docker service needs to be started if it's not already running. On Linux, that's simply {{sudo service docker start}}, and then our docker integration tests pass.
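
For context, the opaque {{java.io.IOException: No such file or directory}} in the trace below is jnr.unixsocket failing to connect to {{/var/run/docker.sock}}, which doesn't exist when the daemon isn't running. Here's a minimal sketch (not the suite's actual code) of a friendlier pre-check that could go in {{DockerJDBCIntegrationSuite.beforeAll}}:

{code}
// Sketch only: turn the opaque unix-socket error into an actionable message.
import com.spotify.docker.client.{DefaultDockerClient, DockerException}

val docker = DefaultDockerClient.fromEnv().build()
try {
  // On Linux this pings the daemon over unix:///var/run/docker.sock and
  // fails when the socket file doesn't exist (i.e. the daemon is down).
  docker.ping()
} catch {
  case e: DockerException =>
    throw new IllegalStateException(
      "Could not reach the Docker daemon; is docker-engine installed and " +
        "running? On Linux, try `sudo service docker start`.", e)
}
{code}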

Sorry that I didn't get a chance to look into it around 1.6 RC time; holidays got in the way.

I am thinking of adding this info to [this wiki page|https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools#UsefulDeveloperTools-IntelliJ]. Please let me know if you think there is a better place; that's the best I could find. Also, I don't seem to have access to edit that page, so could one of you please give me access?

Also, I was searching the code for any puppet recipes we maintain for setting up build slaves. In other words, if our Jenkins infra were wiped out, how would we make sure docker-engine is installed and running? How do we keep track of build dependencies? Thanks in advance!

> Docker JDBC integration tests are failing again
> -----------------------------------------------
>
>                 Key: SPARK-12426
>                 URL: https://issues.apache.org/jira/browse/SPARK-12426
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL, Tests
>    Affects Versions: 1.6.0
>            Reporter: Mark Grover
>
> The Docker JDBC integration tests were fixed in SPARK-11796, but they seem to be failing again on my machine (Ubuntu Precise). This is the same box I tested my previous commit on. Also, I am not confident this failure has much to do with Spark, since a known-good commit where the tests were passing now fails in the same environment.
> [~sowen] mentioned on the Spark 1.6 voting thread that the tests were failing on his Ubuntu 15 box as well.
> Here's the error, fyi:
> {code}
> 15/12/18 10:12:50 INFO SparkContext: Successfully stopped SparkContext
> 15/12/18 10:12:50 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
> 15/12/18 10:12:50 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
> *** RUN ABORTED ***
>   com.spotify.docker.client.DockerException: java.util.concurrent.ExecutionException: com.spotify.docker.client.shaded.javax.ws.rs.ProcessingException: java.io.IOException: No such file or directory
>   at com.spotify.docker.client.DefaultDockerClient.propagate(DefaultDockerClient.java:1141)
>   at com.spotify.docker.client.DefaultDockerClient.request(DefaultDockerClient.java:1082)
>   at com.spotify.docker.client.DefaultDockerClient.ping(DefaultDockerClient.java:281)
>   at org.apache.spark.sql.jdbc.DockerJDBCIntegrationSuite.beforeAll(DockerJDBCIntegrationSuite.scala:76)
>   at org.scalatest.BeforeAndAfterAll$class.beforeAll(BeforeAndAfterAll.scala:187)
>   at org.apache.spark.sql.jdbc.DockerJDBCIntegrationSuite.beforeAll(DockerJDBCIntegrationSuite.scala:58)
>   at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:253)
>   at org.apache.spark.sql.jdbc.DockerJDBCIntegrationSuite.run(DockerJDBCIntegrationSuite.scala:58)
>   at org.scalatest.Suite$class.callExecuteOnSuite$1(Suite.scala:1492)
>   at org.scalatest.Suite$$anonfun$runNestedSuites$1.apply(Suite.scala:1528)
>   ...
>   Cause: java.util.concurrent.ExecutionException: com.spotify.docker.client.shaded.javax.ws.rs.ProcessingException: java.io.IOException: No such file or directory
>   at jersey.repackaged.com.google.common.util.concurrent.AbstractFuture$Sync.getValue(AbstractFuture.java:299)
>   at jersey.repackaged.com.google.common.util.concurrent.AbstractFuture$Sync.get(AbstractFuture.java:286)
>   at jersey.repackaged.com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:116)
>   at com.spotify.docker.client.DefaultDockerClient.request(DefaultDockerClient.java:1080)
>   at com.spotify.docker.client.DefaultDockerClient.ping(DefaultDockerClient.java:281)
>   at org.apache.spark.sql.jdbc.DockerJDBCIntegrationSuite.beforeAll(DockerJDBCIntegrationSuite.scala:76)
>   at org.scalatest.BeforeAndAfterAll$class.beforeAll(BeforeAndAfterAll.scala:187)
>   at org.apache.spark.sql.jdbc.DockerJDBCIntegrationSuite.beforeAll(DockerJDBCIntegrationSuite.scala:58)
>   at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:253)
>   at org.apache.spark.sql.jdbc.DockerJDBCIntegrationSuite.run(DockerJDBCIntegrationSuite.scala:58)
>   ...
>   Cause: com.spotify.docker.client.shaded.javax.ws.rs.ProcessingException: java.io.IOException: No such file or directory
>   at org.glassfish.jersey.apache.connector.ApacheConnector.apply(ApacheConnector.java:481)
>   at org.glassfish.jersey.apache.connector.ApacheConnector$1.run(ApacheConnector.java:491)
>   at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>   at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>   at jersey.repackaged.com.google.common.util.concurrent.MoreExecutors$DirectExecutorService.execute(MoreExecutors.java:299)
>   at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:110)
>   at jersey.repackaged.com.google.common.util.concurrent.AbstractListeningExecutorService.submit(AbstractListeningExecutorService.java:50)
>   at jersey.repackaged.com.google.common.util.concurrent.AbstractListeningExecutorService.submit(AbstractListeningExecutorService.java:37)
>   at org.glassfish.jersey.apache.connector.ApacheConnector.apply(ApacheConnector.java:487)
> 15/12/18 10:12:50 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
>   at org.glassfish.jersey.client.ClientRuntime$2.run(ClientRuntime.java:177)
>   ...
>   Cause: java.io.IOException: No such file or directory
>   at jnr.unixsocket.UnixSocketChannel.doConnect(UnixSocketChannel.java:94)
>   at jnr.unixsocket.UnixSocketChannel.connect(UnixSocketChannel.java:102)
>   at com.spotify.docker.client.ApacheUnixSocket.connect(ApacheUnixSocket.java:73)
>   at com.spotify.docker.client.UnixConnectionSocketFactory.connectSocket(UnixConnectionSocketFactory.java:74)
>   at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:134)
>   at org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(PoolingHttpClientConnectionManager.java:353)
>   at org.apache.http.impl.execchain.MainClientExec.establishRoute(MainClientExec.java:380)
>   at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:236)
>   at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:184)
>   at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:88)
>   ...
> 15/12/18 10:12:50 INFO ShutdownHookManager: Shutdown hook called
> 15/12/18 10:12:50 INFO ShutdownHookManager: Deleting directory /root/spark/docker-integration-tests/target/tmp/spark-43a76284-b0c5-4570-9576-d65b21f45058
> [INFO] ------------------------------------------------------------------------
> [INFO] BUILD FAILURE
> [INFO] ------------------------------------------------------------------------
> [INFO] Total time: 26.787 s
> [INFO] Finished at: 2015-12-18T10:12:50-08:00
> [INFO] Final Memory: 57M/579M
> [INFO] ------------------------------------------------------------------------
> [ERROR] Failed to execute goal org.scalatest:scalatest-maven-plugin:1.0:test (test) on project spark-docker-integration-tests_2.10: There are test failures -> [Help 1]
> [ERROR] 
> [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> [ERROR] 
> [ERROR] For more information about the errors and possible solutions, please read the following articles:
> [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
> {code}
> And here's how to reproduce:
> {code}
> build/mvn -pl docker-integration-tests package
> {code}


