Posted to dev@phoenix.apache.org by "Josh Mahonin (JIRA)" <ji...@apache.org> on 2017/09/05 19:29:00 UTC

[jira] [Commented] (PHOENIX-4159) phoenix-spark tests are failing

    [ https://issues.apache.org/jira/browse/PHOENIX-4159?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16154162#comment-16154162 ] 

Josh Mahonin commented on PHOENIX-4159:
---------------------------------------

[~samarthjain] I can confirm that I see the same issue locally when running 'mvn verify' within the phoenix-spark folder (macOS Sierra).

I suspect the issue is due to how the phoenix-spark integration tests extend the underlying base integration test classes:

https://github.com/apache/phoenix/blob/master/phoenix-spark/src/it/scala/org/apache/phoenix/spark/AbstractPhoenixSparkIT.scala#L26-L39

I recall there was some impedance mismatch between Scala and Java abstract classes, so the wrapper code above was put in place to allow the various phoenix-spark integration tests to properly extend those base test classes. This worked for quite a while, but it's prone to breaking if 'BaseTest' or 'BaseHBaseManagedTimeIT' changes, which is what I expect has happened here.
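
For context, the wrapper looks roughly like the following. This is a minimal sketch rather than the actual source; only 'BaseHBaseManagedTimeIT.doSetup()' is confirmed by the stack trace below, and the teardown delegation and the example class name here are illustrative:

{code}
import org.apache.phoenix.end2end.BaseHBaseManagedTimeIT
import org.scalatest.{BeforeAndAfterAll, FunSuite}

// Scala suites can't pick up the static @BeforeClass/@AfterClass hooks on the
// Java base classes directly, so a helper object re-exposes them for ScalaTest.
object PhoenixSparkITHelper extends BaseHBaseManagedTimeIT {
  def doSetup = BaseHBaseManagedTimeIT.doSetup()       // starts the mini HBase/HDFS cluster via BaseTest
  def doTeardown = BaseHBaseManagedTimeIT.doTeardown() // assumed symmetric teardown
}

// Each phoenix-spark IT then drives the cluster lifecycle from beforeAll/afterAll.
class ExamplePhoenixSparkIT extends FunSuite with BeforeAndAfterAll {
  override def beforeAll(): Unit = PhoenixSparkITHelper.doSetup
  override def afterAll(): Unit = PhoenixSparkITHelper.doTeardown
}
{code}

The fragile part is that the helper object hard-wires 'BaseHBaseManagedTimeIT', so any change to how phoenix-core expects its integration tests to perform cluster setup/teardown isn't reflected on the Scala side automatically.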

I can try to spend a bit of time digging into it this week, but if you or anyone else has any guidance as to which classes should be invoked instead, that would be wonderful.

> phoenix-spark tests are failing
> -------------------------------
>
>                 Key: PHOENIX-4159
>                 URL: https://issues.apache.org/jira/browse/PHOENIX-4159
>             Project: Phoenix
>          Issue Type: Bug
>            Reporter: Samarth Jain
>
> In a few of the runs where we were able to get successful test runs for phoenix-core, we ran into failures in the phoenix-spark module.
> Sample run - https://builds.apache.org/job/Phoenix-master/1762/console
> [~jmahonin] - would you mind taking a look? Copy-pasting a possibly relevant stack trace here in case the link stops working:
> {code}
> Formatting using clusterid: testClusterID
> 1    [ScalaTest-4] ERROR org.apache.hadoop.hdfs.MiniDFSCluster  - IOE creating namenodes. Permissions dump:
> path '/home/jenkins/jenkins-slave/workspace/Phoenix-master/phoenix-spark/target/test-data/fa615cb3-a0d9-4c9e-90eb-acd0c7d46d9b/dfscluster_1ce5f5c4-f355-4111-a763-4ab777941386/dfs/data': 
> 	absolute:/home/jenkins/jenkins-slave/workspace/Phoenix-master/phoenix-spark/target/test-data/fa615cb3-a0d9-4c9e-90eb-acd0c7d46d9b/dfscluster_1ce5f5c4-f355-4111-a763-4ab777941386/dfs/data
> 	permissions: ----
> path '/home/jenkins/jenkins-slave/workspace/Phoenix-master/phoenix-spark/target/test-data/fa615cb3-a0d9-4c9e-90eb-acd0c7d46d9b/dfscluster_1ce5f5c4-f355-4111-a763-4ab777941386/dfs': 
> 	absolute:/home/jenkins/jenkins-slave/workspace/Phoenix-master/phoenix-spark/target/test-data/fa615cb3-a0d9-4c9e-90eb-acd0c7d46d9b/dfscluster_1ce5f5c4-f355-4111-a763-4ab777941386/dfs
> 	permissions: drwx
> path '/home/jenkins/jenkins-slave/workspace/Phoenix-master/phoenix-spark/target/test-data/fa615cb3-a0d9-4c9e-90eb-acd0c7d46d9b/dfscluster_1ce5f5c4-f355-4111-a763-4ab777941386': 
> 	absolute:/home/jenkins/jenkins-slave/workspace/Phoenix-master/phoenix-spark/target/test-data/fa615cb3-a0d9-4c9e-90eb-acd0c7d46d9b/dfscluster_1ce5f5c4-f355-4111-a763-4ab777941386
> 	permissions: drwx
> path '/home/jenkins/jenkins-slave/workspace/Phoenix-master/phoenix-spark/target/test-data/fa615cb3-a0d9-4c9e-90eb-acd0c7d46d9b': 
> 	absolute:/home/jenkins/jenkins-slave/workspace/Phoenix-master/phoenix-spark/target/test-data/fa615cb3-a0d9-4c9e-90eb-acd0c7d46d9b
> 	permissions: drwx
> path '/home/jenkins/jenkins-slave/workspace/Phoenix-master/phoenix-spark/target/test-data': 
> 	absolute:/home/jenkins/jenkins-slave/workspace/Phoenix-master/phoenix-spark/target/test-data
> 	permissions: drwx
> path '/home/jenkins/jenkins-slave/workspace/Phoenix-master/phoenix-spark/target': 
> 	absolute:/home/jenkins/jenkins-slave/workspace/Phoenix-master/phoenix-spark/target
> 	permissions: drwx
> path '/home/jenkins/jenkins-slave/workspace/Phoenix-master/phoenix-spark': 
> 	absolute:/home/jenkins/jenkins-slave/workspace/Phoenix-master/phoenix-spark
> 	permissions: drwx
> path '/home/jenkins/jenkins-slave/workspace/Phoenix-master': 
> 	absolute:/home/jenkins/jenkins-slave/workspace/Phoenix-master
> 	permissions: drwx
> path '/home/jenkins/jenkins-slave/workspace': 
> 	absolute:/home/jenkins/jenkins-slave/workspace
> 	permissions: drwx
> path '/home/jenkins/jenkins-slave': 
> 	absolute:/home/jenkins/jenkins-slave
> 	permissions: drwx
> path '/home/jenkins': 
> 	absolute:/home/jenkins
> 	permissions: drwx
> path '/home': 
> 	absolute:/home
> 	permissions: dr-x
> path '/': 
> 	absolute:/
> 	permissions: dr-x
> java.io.IOException: Cannot create directory /home/jenkins/jenkins-slave/workspace/Phoenix-master/phoenix-spark/target/test-data/fa615cb3-a0d9-4c9e-90eb-acd0c7d46d9b/dfscluster_1ce5f5c4-f355-4111-a763-4ab777941386/dfs/name1/current
> 	at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:337)
> 	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:548)
> 	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:569)
> 	at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:161)
> 	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:991)
> 	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:342)
> 	at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:176)
> 	at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNodesAndSetConf(MiniDFSCluster.java:973)
> 	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:811)
> 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:742)
> 	at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniDFSCluster(HBaseTestingUtility.java:625)
> 	at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:1022)
> 	at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:903)
> 	at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:885)
> 	at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:872)
> 	at org.apache.phoenix.query.BaseTest.initMiniCluster(BaseTest.java:522)
> 	at org.apache.phoenix.query.BaseTest.setUpTestCluster(BaseTest.java:442)
> 	at org.apache.phoenix.query.BaseTest.checkClusterInitialized(BaseTest.java:424)
> 	at org.apache.phoenix.query.BaseTest.setUpTestDriver(BaseTest.java:495)
> 	at org.apache.phoenix.query.BaseTest.setUpTestDriver(BaseTest.java:490)
> 	at org.apache.phoenix.end2end.BaseHBaseManagedTimeIT.doSetup(BaseHBaseManagedTimeIT.java:57)
> 	at org.apache.phoenix.spark.PhoenixSparkITHelper$.doSetup(AbstractPhoenixSparkIT.scala:33)
> 	at org.apache.phoenix.spark.AbstractPhoenixSparkIT.beforeAll(AbstractPhoenixSparkIT.scala:88)
> 	at org.scalatest.BeforeAndAfterAll$class.beforeAll(BeforeAndAfterAll.scala:187)
> 	at org.apache.phoenix.spark.AbstractPhoenixSparkIT.beforeAll(AbstractPhoenixSparkIT.scala:44)
> 	at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:253)
> 	at org.apache.phoenix.spark.AbstractPhoenixSparkIT.run(AbstractPhoenixSparkIT.scala:44)
> 	at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:55)
> 	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
> 	at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> 	at java.lang.Thread.run(Thread.java:745)
> Exception encountered when invoking run on a nested suite - java.io.IOException: Cannot create directory /home/jenkins/jenkins-slave/workspace/Phoenix-master/phoenix-spark/target/test-data/fa615cb3-a0d9-4c9e-90eb-acd0c7d46d9b/dfscluster_1ce5f5c4-f355-4111-a763-4ab777941386/dfs/name1/current *** ABORTED ***
>   java.lang.RuntimeException: java.io.IOException: Cannot create directory /home/jenkins/jenkins-slave/workspace/Phoenix-master/phoenix-spark/target/test-data/fa615cb3-a0d9-4c9e-90eb-acd0c7d46d9b/dfscluster_1ce5f5c4-f355-4111-a763-4ab777941386/dfs/name1/current
>   at org.apache.phoenix.query.BaseTest.initMiniCluster(BaseTest.java:525)
>   at org.apache.phoenix.query.BaseTest.setUpTestCluster(BaseTest.java:442)
>   at org.apache.phoenix.query.BaseTest.checkClusterInitialized(BaseTest.java:424)
>   at org.apache.phoenix.query.BaseTest.setUpTestDriver(BaseTest.java:495)
>   at org.apache.phoenix.query.BaseTest.setUpTestDriver(BaseTest.java:490)
>   at org.apache.phoenix.end2end.BaseHBaseManagedTimeIT.doSetup(BaseHBaseManagedTimeIT.java:57)
>   at org.apache.phoenix.spark.PhoenixSparkITHelper$.doSetup(AbstractPhoenixSparkIT.scala:33)
>   at org.apache.phoenix.spark.AbstractPhoenixSparkIT.beforeAll(AbstractPhoenixSparkIT.scala:88)
>   at org.scalatest.BeforeAndAfterAll$class.beforeAll(BeforeAndAfterAll.scala:187)
>   at org.apache.phoenix.spark.AbstractPhoenixSparkIT.beforeAll(AbstractPhoenixSparkIT.scala:44)
>   ...
>   Cause: java.io.IOException: Cannot create directory /home/jenkins/jenkins-slave/workspace/Phoenix-master/phoenix-spark/target/test-data/fa615cb3-a0d9-4c9e-90eb-acd0c7d46d9b/dfscluster_1ce5f5c4-f355-4111-a763-4ab777941386/dfs/name1/current
>   at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:337)
>   at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:548)
>   at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:569)
>   at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:161)
>   at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:991)
>   at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:342)
>   at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:176)
>   at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNodesAndSetConf(MiniDFSCluster.java:973)
>   at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:811)
>   at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:742)
>   ...
> {code}



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)