Posted to commits@sentry.apache.org by "Dapeng Sun (JIRA)" <ji...@apache.org> on 2015/09/01 07:50:48 UTC

[jira] [Updated] (SENTRY-522) [Unit Test] TestExportImportPrivileges failed due to error "Couldn't access new HiveServer: "

     [ https://issues.apache.org/jira/browse/SENTRY-522?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dapeng Sun updated SENTRY-522:
------------------------------
    Fix Version/s:     (was: 1.6.0)
                   1.7.0

> [Unit Test] TestExportImportPrivileges failed due to error "Couldn't access new HiveServer: "
> ---------------------------------------------------------------------------------------------
>
>                 Key: SENTRY-522
>                 URL: https://issues.apache.org/jira/browse/SENTRY-522
>             Project: Sentry
>          Issue Type: Bug
>    Affects Versions: 1.5.0
>            Reporter: Lenni Kuff
>            Priority: Blocker
>             Fix For: 1.7.0
>
>
> Failure:
> https://builds.apache.org/view/S-Z/view/Sentry/job/Sentry-jdk-1.7/97/org.apache.sentry$sentry-tests-hive/testReport/junit/org.apache.sentry.tests.e2e.hive/TestExportImportPrivileges/org_apache_sentry_tests_e2e_hive_TestExportImportPrivileges/
> {code}
> Error Message
> Couldn't access new HiveServer: jdbc:hive2://localhost:34410/default
> Stacktrace
> java.util.concurrent.TimeoutException: Couldn't access new HiveServer: jdbc:hive2://localhost:34410/default
> 	at org.apache.sentry.tests.e2e.hive.hiveserver.AbstractHiveServer.waitForStartup(AbstractHiveServer.java:80)
> 	at org.apache.sentry.tests.e2e.hive.hiveserver.InternalHiveServer.start(InternalHiveServer.java:47)
> 	at org.apache.sentry.tests.e2e.hive.AbstractTestWithStaticConfiguration.setupTestStaticConfiguration(AbstractTestWithStaticConfiguration.java:229)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:606)
> 	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
> 	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
> 	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
> 	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:27)
> 	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31)
> 	at org.junit.runners.ParentRunner.run(ParentRunner.java:292)
> 	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
> 	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
> 	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
> 	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
> 	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
> 	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
> {code}
> It looks like there may have been an underlying issue with HDFS:
> {code}
> java.lang.IllegalStateException: Unable to finalize edits file /tmp/1415867520531-0/dfs/name2/current/edits_inprogress_0000000000000000001
> 	at org.apache.hadoop.hdfs.server.namenode.FileJournalManager.finalizeLogSegment(FileJournalManager.java:141)
> 	at org.apache.hadoop.hdfs.server.namenode.JournalSet$4.apply(JournalSet.java:236)
> 	at org.apache.hadoop.hdfs.server.namenode.JournalSet.mapJournalsAndReportErrors(JournalSet.java:393)
> 	at org.apache.hadoop.hdfs.server.namenode.JournalSet.finalizeLogSegment(JournalSet.java:231)
> 	at org.apache.hadoop.hdfs.server.namenode.FSEditLog.endCurrentLogSegment(FSEditLog.java:1216)
> 	at org.apache.hadoop.hdfs.server.namenode.FSEditLog.close(FSEditLog.java:356)
> 	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.stopActiveServices(FSNamesystem.java:1224)
> 	at org.apache.hadoop.hdfs.server.namenode.NameNode$NameNodeHAContext.stopActiveServices(NameNode.java:1638)
> 	at org.apache.hadoop.hdfs.server.namenode.ha.ActiveState.exitState(ActiveState.java:70)
> 	at org.apache.hadoop.hdfs.server.namenode.NameNode.stop(NameNode.java:803)
> 	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1592)
> 	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1571)
> 	at org.apache.sentry.tests.e2e.hive.fs.MiniDFS.tearDown(MiniDFS.java:76)
> 	at org.apache.sentry.tests.e2e.hive.AbstractTestWithStaticConfiguration.tearDownTestStaticConfiguration(AbstractTestWithStaticConfiguration.java:478)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:606)
> 	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
> 	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
> 	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
> 	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:37)
> 	at org.junit.runners.ParentRunner.run(ParentRunner.java:292)
> 	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
> 	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
> 	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
> 	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
> 	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
> 	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
> Caused by: java.io.IOException: renameTo(src=/tmp/1415867520531-0/dfs/name2/current/edits_inprogress_0000000000000000001, dst=/tmp/1415867520531-0/dfs/name2/current/edits_0000000000000000001-0000000000000000007) failed.
> 	at org.apache.hadoop.io.nativeio.NativeIO.renameTo(NativeIO.java:826)
> 	at org.apache.hadoop.hdfs.server.namenode.FileJournalManager.finalizeLogSegment(FileJournalManager.java:138)
> 	... 28 more
> {code}
> Marking as a blocker since this is a unit test failure.
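The TimeoutException in the first trace is thrown by a poll-until-ready loop in the test harness. A rough, standalone illustration of that pattern follows; the class name, timings, and the BooleanSupplier probe are assumptions for this sketch, not Sentry's actual AbstractHiveServer implementation:

```java
import java.util.concurrent.TimeoutException;
import java.util.function.BooleanSupplier;

// Hypothetical sketch of a poll-until-ready helper in the style of
// AbstractHiveServer.waitForStartup. In the real test the probe would be a
// JDBC connection attempt against the hive2 URL; here a counter stands in so
// the sketch runs without a HiveServer.
public class StartupWait {
    public static void waitForStartup(BooleanSupplier probe, long timeoutMs, String url)
            throws TimeoutException, InterruptedException {
        long deadline = System.currentTimeMillis() + timeoutMs;
        while (System.currentTimeMillis() < deadline) {
            if (probe.getAsBoolean()) {
                return; // server answered, startup complete
            }
            Thread.sleep(500); // back off before the next probe
        }
        // This is the exception surfacing in the test report above.
        throw new TimeoutException("Couldn't access new HiveServer: " + url);
    }

    public static void main(String[] args) throws Exception {
        // Simulate a server that becomes reachable on the third probe.
        int[] attempts = {0};
        waitForStartup(() -> ++attempts[0] >= 3, 10_000,
                "jdbc:hive2://localhost:34410/default");
        System.out.println("ready after " + attempts[0] + " probes");
    }
}
```

The second stack trace hints at why the server never came up: the MiniDFSCluster's edit log lived under /tmp, and finalizing an edit log segment is essentially a file rename, which fails if something (e.g. a /tmp cleaner) removed the directory mid-run.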



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)