Posted to reviews@spark.apache.org by felixcheung <gi...@git.apache.org> on 2016/07/13 08:16:28 UTC

[GitHub] spark pull request #14177: [SPARK-16027][SPARKR] Fix R tests SparkSession init/stop

GitHub user felixcheung opened a pull request:

    https://github.com/apache/spark/pull/14177

    [SPARK-16027][SPARKR] Fix R tests SparkSession init/stop

    ## What changes were proposed in this pull request?
    
    Fix R SparkSession init/stop, and address warnings about reusing an existing Spark Context
    
    
    ## How was this patch tested?
    
    unit tests
    
    @shivaram 
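    
    Roughly, the per-file pattern this touches looks like the sketch below, in R
    (file and test names here are illustrative, not the exact diff):
    
    ```
    # top of a testthat file, e.g. test_example.R (hypothetical name)
    library(testthat)
    library(SparkR)
    
    # start one session for this file; without an explicit stop in the previous
    # file, the existing Spark Context gets reused and emits the warning
    sparkR.session(master = "local[2]")
    
    test_that("createDataFrame works", {
      df <- createDataFrame(data.frame(x = 1:3))
      expect_equal(count(df), 3)
    })
    
    # stop the session at the end of the file so the next test file starts clean
    sparkR.session.stop()
    ```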

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/felixcheung/spark rsessiontest

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/14177.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #14177
    
----
commit 72fffbb593de289fb4434c730c592e04b50fb13f
Author: Felix Cheung <fe...@hotmail.com>
Date:   2016-07-13T05:42:01Z

    fix session start/stop in tests

commit 614a63e091a8164696a4316564bdae53257953de
Author: Felix Cheung <fe...@hotmail.com>
Date:   2016-07-13T06:56:56Z

    fix test

commit 1a86e857ab954620fb33dde8667f3a2a7d5138dc
Author: Felix Cheung <fe...@hotmail.com>
Date:   2016-07-13T07:56:09Z

    fix style

----




[GitHub] spark issue #14177: [SPARK-16027][SPARKR] Fix R tests SparkSession init/stop

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/14177
  
    Merged build finished. Test PASSed.




[GitHub] spark issue #14177: [SPARK-16027][SPARKR] Fix R tests SparkSession init/stop

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/14177
  
    **[Test build #62419 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/62419/consoleFull)** for PR 14177 at commit [`bec4b33`](https://github.com/apache/spark/commit/bec4b3372d8e861d8b3f7c04cf4675a02918808f).
     * This patch **fails SparkR unit tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark issue #14177: [SPARK-16027][SPARKR] Fix R tests SparkSession init/stop

Posted by shivaram <gi...@git.apache.org>.
Github user shivaram commented on the issue:

    https://github.com/apache/spark/pull/14177
  
    Hmm, ok - the only difference in the patch I tried out locally is that I had the `sleep` in the loop test case. Did you remove that for some other reason?
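    
    (For reference, the loop test case being discussed - the one in test_context.R
    that repeatedly starts and stops a session - would look roughly like this with
    the `sleep` in place; the delay and iteration count are assumptions, not the
    committed code:)
    
    ```
    test_that("repeatedly starting and stopping SparkSession", {
      for (i in 1:4) {
        sparkR.session(enableHiveSupport = FALSE)
        df <- createDataFrame(data.frame(dummy = 1:i))
        expect_equal(count(df), i)
        sparkR.session.stop()
        Sys.sleep(1)  # assumed pause to let the backend shut down before the next start
      }
    })
    ```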




[GitHub] spark issue #14177: [SPARK-16027][SPARKR] Fix R tests SparkSession init/stop

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/14177
  
    Merged build finished. Test PASSed.




[GitHub] spark issue #14177: [SPARK-16027][SPARKR] Fix R tests SparkSession init/stop

Posted by felixcheung <gi...@git.apache.org>.
Github user felixcheung commented on the issue:

    https://github.com/apache/spark/pull/14177
  
    I didn't think that should be needed since SparkSession was created with enableHiveSupport = F. Let me try that too.





[GitHub] spark issue #14177: [SPARK-16027][SPARKR] Fix R tests SparkSession init/stop

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/14177
  
    Merged build finished. Test FAILed.




[GitHub] spark issue #14177: [SPARK-16027][SPARKR] Fix R tests SparkSession init/stop

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/14177
  
    **[Test build #62417 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/62417/consoleFull)** for PR 14177 at commit [`03f163a`](https://github.com/apache/spark/commit/03f163a660f7a4abd9d99449524396ad91830e24).




[GitHub] spark issue #14177: [SPARK-16027][SPARKR] Fix R tests SparkSession init/stop

Posted by felixcheung <gi...@git.apache.org>.
Github user felixcheung commented on the issue:

    https://github.com/apache/spark/pull/14177
  
    I'd hit these errors fairly randomly if hive = T (Hive support enabled), even when stop is called:
    ```
    java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@1522765a, see the next exception for details.
    	at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    	at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    	at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
    	at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
    	at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
    	at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
    	at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
    	at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
    	at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
    	at java.sql.DriverManager.getConnection(DriverManager.java:664)
    	at java.sql.DriverManager.getConnection(DriverManager.java:208)
    	at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:349)
    	at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
    	at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
    	at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
    	at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
    	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    	at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
    	at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
    	at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
    	at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
    	at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
    	at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
    	at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
    	at java.security.AccessController.doPrivileged(Native Method)
    	at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
    	at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
    	at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
    	at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
    	at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
    	at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
    	at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
    	at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
    	at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
    	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
    	at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
    	at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
    	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
    	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
    	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
    	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
    	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
    	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
    	at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
    	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
    	at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
    	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    	at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
    	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
    	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
    	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
    	at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
    	at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
    	at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
    	at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
    	at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
    	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
    	at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:171)
    	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    	at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
    	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:358)
    	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:262)
    	at org.apache.spark.sql.hive.HiveSharedState.metadataHive$lzycompute(HiveSharedState.scala:39)
    	at org.apache.spark.sql.hive.HiveSharedState.metadataHive(HiveSharedState.scala:38)
    	at org.apache.spark.sql.hive.HiveSharedState.externalCatalog$lzycompute(HiveSharedState.scala:46)
    	at org.apache.spark.sql.hive.HiveSharedState.externalCatalog(HiveSharedState.scala:45)
    	at org.apache.spark.sql.hive.HiveSessionState.catalog$lzycompute(HiveSessionState.scala:50)
    	at org.apache.spark.sql.hive.HiveSessionState.catalog(HiveSessionState.scala:48)
    	at org.apache.spark.sql.hive.HiveSessionState$$anon$1.<init>(HiveSessionState.scala:63)
    	at org.apache.spark.sql.hive.HiveSessionState.analyzer$lzycompute(HiveSessionState.scala:63)
    	at org.apache.spark.sql.hive.HiveSessionState.analyzer(HiveSessionState.scala:62)
    	at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:49)
    	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:64)
    	at org.apache.spark.sql.SparkSession.baseRelationToDataFrame(SparkSession.scala:371)
    	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:143)
    	at org.apache.spark.sql.DataFrameReader.json(DataFrameReader.scala:287)
    	at org.apache.spark.sql.DataFrameReader.json(DataFrameReader.scala:287)
    	at sun.reflect.GeneratedMethodAccessor1348.invoke(Unknown Source)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.apache.spark.api.r.RBackendHandler.handleMethodCall(RBackendHandler.scala:141)
    	at org.apache.spark.api.r.RBackendHandler.channelRead0(RBackendHandler.scala:86)
    	at org.apache.spark.api.r.RBackendHandler.channelRead0(RBackendHandler.scala:38)
    	at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
    	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
    	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:244)
    	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846)
    	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
    	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
    	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
    	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
    	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
    	at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
    	at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137)
    	at java.lang.Thread.run(Thread.java:745)
    Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@1522765a, see the next exception for details.
    	at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
    	at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
    	... 115 more
    Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /opt/spark-2.0.0-bin-hadoop2.6/R/metastore_db.
    	at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
    	at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
    	at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
    	at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
    	at java.security.AccessController.doPrivileged(Native Method)
    	at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
    	at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
    	at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
    	at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
    	at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
    	at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
    	at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
    	at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
    	at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
    	at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
    	at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
    	at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
    	at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
    	at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
    	at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
    	at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
    	at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
    	at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
    	at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
    	at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
    	at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
    	at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
    	at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
    	at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
    	at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
    	at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
    	at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
    	at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
    	... 112 more
    ============= begin nested exception, level (1) ===========
    java.sql.SQLException: Another instance of Derby may have already booted the database /opt/spark-2.0.0-bin-hadoop2.6/R/metastore_db.
    	at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    	at org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown Source)
    	at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
    	at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
    	at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
    	at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
    	at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
    	at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
    	at java.sql.DriverManager.getConnection(DriverManager.java:664)
    	at java.sql.DriverManager.getConnection(DriverManager.java:208)
    	at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:349)
    	at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
    	at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
    	at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
    	at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
    	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    	at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
    	at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
    	at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
    	at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
    	at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
    	at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
    	at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
    	at java.security.AccessController.doPrivileged(Native Method)
    	at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
    	at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
    	at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
    	at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
    	at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
    	at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
    	at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
    	at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
    	at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
    	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
    	at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
    	at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
    	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
    	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
    	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
    	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
    	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
    	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
    	at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
    	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
    	at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
    	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    	at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
    	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
    	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
    	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
    	at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
    	at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
    	at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
    	at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
    	at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
    	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
    	at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:171)
    	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    	at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
    	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:358)
    	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:262)
    	at org.apache.spark.sql.hive.HiveSharedState.metadataHive$lzycompute(HiveSharedState.scala:39)
    	at org.apache.spark.sql.hive.HiveSharedState.metadataHive(HiveSharedState.scala:38)
    	at org.apache.spark.sql.hive.HiveSharedState.externalCatalog$lzycompute(HiveSharedState.scala:46)
    	at org.apache.spark.sql.hive.HiveSharedState.externalCatalog(HiveSharedState.scala:45)
    	at org.apache.spark.sql.hive.HiveSessionState.catalog$lzycompute(HiveSessionState.scala:50)
    	at org.apache.spark.sql.hive.HiveSessionState.catalog(HiveSessionState.scala:48)
    	at org.apache.spark.sql.hive.HiveSessionState$$anon$1.<init>(HiveSessionState.scala:63)
    	at org.apache.spark.sql.hive.HiveSessionState.analyzer$lzycompute(HiveSessionState.scala:63)
    	at org.apache.spark.sql.hive.HiveSessionState.analyzer(HiveSessionState.scala:62)
    	at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:49)
    	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:64)
    	at org.apache.spark.sql.SparkSession.baseRelationToDataFrame(SparkSession.scala:371)
    	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:143)
    	at org.apache.spark.sql.DataFrameReader.json(DataFrameReader.scala:287)
    	at org.apache.spark.sql.DataFrameReader.json(DataFrameReader.scala:287)
    	at sun.reflect.GeneratedMethodAccessor1348.invoke(Unknown Source)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.apache.spark.api.r.RBackendHandler.handleMethodCall(RBackendHandler.scala:141)
    	at org.apache.spark.api.r.RBackendHandler.channelRead0(RBackendHandler.scala:86)
    	at org.apache.spark.api.r.RBackendHandler.channelRead0(RBackendHandler.scala:38)
    	at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
    	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
    	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:244)
    	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846)
    	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
    	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
    	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
    	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
    	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
    	at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
    	at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137)
    	at java.lang.Thread.run(Thread.java:745)
    Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /opt/spark-2.0.0-bin-hadoop2.6/R/metastore_db.
    	at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
    	at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
    	at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
    	at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
    	at java.security.AccessController.doPrivileged(Native Method)
    	at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
    	at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
    	at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
    	at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
    	at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
    	at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
    	at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
    	at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
    	at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
    	at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
    	at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
    	at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
    	at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
    	at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
    	at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
    	at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
    	at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
    	at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
    	at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
    	at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
    	at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
    	at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
    	at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
    	at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
    	at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
    	at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
    	at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
    	at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
    	... 112 more
    ```
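    
    A minimal R sketch of the sequence that seems to trigger this (my reading of the
    trace, with Hive support left at its default; not a confirmed repro):
    
    ```
    library(SparkR)
    
    sparkR.session()                            # enableHiveSupport is TRUE by default
    df <- createDataFrame(data.frame(x = 1))    # first Hive use boots Derby under ./metastore_db
    head(df)
    sparkR.session.stop()
    
    # a second Hive-enabled session in the same process / working directory can
    # intermittently hit "Another instance of Derby may have already booted the
    # database .../metastore_db" if the previous metastore has not released its lock
    sparkR.session()
    df2 <- createDataFrame(data.frame(x = 2))
    head(df2)
    ```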




[GitHub] spark issue #14177: [SPARK-16027][SPARKR] Fix R tests SparkSession init/stop

Posted by shivaram <gi...@git.apache.org>.
Github user shivaram commented on the issue:

    https://github.com/apache/spark/pull/14177
  
    LGTM. Merging into master




[GitHub] spark issue #14177: [SPARK-16027][SPARKR] Fix R tests SparkSession init/stop

Posted by shivaram <gi...@git.apache.org>.
Github user shivaram commented on the issue:

    https://github.com/apache/spark/pull/14177
  
    So I just tried the setup where I only have `enableHiveSupport = F` for the test case we are uncommenting, plus the `sparkR.session.stop` added to the other test files as in this PR. That seems to work across 3-4 test runs. Is that the setup that leads to the errors listed above?
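    
    In other words, the setup under test combines two pieces (sketched below; the
    exact placement within the files is an assumption on my part):
    
    ```
    # test_context.R: only the uncommented start/stop loop test opts out of Hive support
    sparkR.session(enableHiveSupport = FALSE)
    
    # test_sparkSQL.R and the other suites: keep the default (Hive-enabled) session,
    # and add an explicit teardown at the very end of each file
    sparkR.session.stop()
    ```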




[GitHub] spark issue #14177: [SPARK-16027][SPARKR] Fix R tests SparkSession init/stop

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/14177
  
    **[Test build #62422 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/62422/consoleFull)** for PR 14177 at commit [`12899c5`](https://github.com/apache/spark/commit/12899c516547a2f5064639386c5a42530a345ec6).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark issue #14177: [SPARK-16027][SPARKR] Fix R tests SparkSession init/stop

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/14177
  
    Merged build finished. Test FAILed.




[GitHub] spark issue #14177: [SPARK-16027][SPARKR] Fix R tests SparkSession init/stop

Posted by felixcheung <gi...@git.apache.org>.
Github user felixcheung commented on the issue:

    https://github.com/apache/spark/pull/14177
  
    No luck, but I pushed that change to see if it works better in Jenkins - is that what you are referring to?
    
    I'm consistently getting these errors:
    ```
    Error in invokeJava(isStatic = TRUE, className, methodName, ...) :
      java.lang.reflect.InvocationTargetException
    	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    	at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
    	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:358)
    	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:262)
    	at org.apache.spark.sql.hive.HiveSharedState.metadataHive$lzycompute(HiveSharedState.scala:39)
    	at org.apache.spark.sql.hive.HiveSharedState.metadataHive(HiveSharedState.scala:38)
    	at org.apache.spark.sql.hive.HiveSharedState.externalCatalog$lzycompute(HiveSharedState.scala:46)
    	at org.apache.spark.sql.hive.HiveSharedState.externa
    Calls: test_package ... with_reporter -> force -> source_file -> eval -> eval
    
    java.lang.reflect.InvocationTargetException
    	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    	at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
    	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:358)
    	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:262)
    	at org.apache.spark.sql.hive.HiveSharedState.metadataHive$lzycompute(HiveSharedState.scala:39)
    	at org.apache.spark.sql.hive.HiveSharedState.metadataHive(HiveSharedState.scala:38)
    	at org.apache.spark.sql.hive.HiveSharedState.externalCatalog$lzycompute(HiveSharedState.scala:46)
    	at org.apache.spark.sql.hive.HiveSharedState.externalCatalog(HiveSharedState.scala:45)
    	at org.apache.spark.sql.hive.HiveSessionState.catalog$lzycompute(HiveSessionState.scala:50)
    	at org.apache.spark.sql.hive.HiveSessionState.catalog(HiveSessionState.scala:48)
    	at org.apache.spark.sql.hive.HiveSessionState$$anon$1.<init>(HiveSessionState.scala:63)
    	at org.apache.spark.sql.hive.HiveSessionState.analyzer$lzycompute(HiveSessionState.scala:63)
    	at org.apache.spark.sql.hive.HiveSessionState.analyzer(HiveSessionState.scala:62)
    	at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:49)
    	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:64)
    	at org.apache.spark.sql.SparkSession.createDataFrame(SparkSession.scala:527)
    	at org.apache.spark.sql.SparkSession.createDataFrame(SparkSession.scala:291)
    	at org.apache.spark.sql.api.r.SQLUtils$.createDF(SQLUtils.scala:139)
    	at org.apache.spark.sql.api.r.SQLUtils.createDF(SQLUtils.scala)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.apache.spark.api.r.RBackendHandler.handleMethodCall(RBackendHandler.scala:141)
    	at org.apache.spark.api.r.RBackendHandler.channelRead0(RBackendHandler.scala:86)
    	at org.apache.spark.api.r.RBackendHandler.channelRead0(RBackendHandler.scala:38)
    	at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
    	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
    	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:244)
    	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846)
    	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
    	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
    	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
    	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
    	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
    	at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
    	at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137)
    	at java.lang.Thread.run(Thread.java:745)
    Caused by: java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
    	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
    	at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:171)
    	... 47 more
    Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
    	at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1523)
    	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
    	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
    	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
    	at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
    	at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
    	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
    	... 48 more
    Caused by: java.lang.reflect.InvocationTargetException
    	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    	at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
    	... 54 more
    Caused by: javax.jdo.JDOFatalDataStoreException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
    java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@2435f119, see the next exception for details.
    	at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    	at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    	at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
    	at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
    	at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
    	at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
    	at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
    	at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
    	at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
    	at java.sql.DriverManager.getConnection(DriverManager.java:664)
    	at java.sql.DriverManager.getConnection(DriverManager.java:208)
    	at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
    	at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
    	at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
    	at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
    	at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
    	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    	at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
    	at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
    	at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
    	at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
    	at org.datanucleus.api.jdo.JDOPersiste
    1: createDataFrame(data.frame(dummy = 1:i)) at /opt/spark-2.0.0-bin-hadoop2.6/R/lib/SparkR/tests/testthat/test_context.R:69
    2: dispatchFunc("createDataFrame(data, schema = NULL, samplingRatio = 1.0)", x, ...)
    3: f(x, ...)
    4: callJStatic("org.apache.spark.sql.api.r.SQLUtils", "createDF", srdd, schema$jobj,
           sparkSession)
    5: invokeJava(isStatic = TRUE, className, methodName, ...)
    6: stop(readString(conn))
    ```




[GitHub] spark issue #14177: [SPARK-16027][SPARKR] Fix R tests SparkSession init/stop

Posted by felixcheung <gi...@git.apache.org>.
Github user felixcheung commented on the issue:

    https://github.com/apache/spark/pull/14177
  
    Sounds good to me! Thanks!
    
    
    





[GitHub] spark issue #14177: [SPARK-16027][SPARKR] Fix R tests SparkSession init/stop

Posted by felixcheung <gi...@git.apache.org>.
Github user felixcheung commented on the issue:

    https://github.com/apache/spark/pull/14177
  
    The original problem was that the Hive metastore didn't shut down and restart cleanly. It seemed to work better when the SparkSession is started with enableHiveSupport off. We do still have one in test_sparkSQL.R, though.





[GitHub] spark issue #14177: [SPARK-16027][SPARKR] Fix R tests SparkSession init/stop

Posted by felixcheung <gi...@git.apache.org>.
Github user felixcheung commented on the issue:

    https://github.com/apache/spark/pull/14177
  
    Jenkins failed with:
    ```
    123456789a.bcdefghijklmnopqrstuvwxyzABCDS...EFGHIJKLMNOPQRSTUVW
    [Stage 61:>                                                         (0 + 0) / 2]
                                                                                    
    XYZSFFFFFFFFFFFFError in invokeJava(isStatic = TRUE, className, methodName, ...) : 
      java.lang.reflect.InvocationTargetException
    	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    	at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
    	at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
    	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:358)
    	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:262)
    	at org.apache.spark.sql.hive.HiveSharedState.metadataHive$lzycompute(HiveSharedState.scala:39)
    	at org.apache.spark.sql.hive.HiveSharedState.metadataHive(HiveSharedState.scala:38)
    	at org.apache.spark.sql.hive.HiveSharedState.externalCatalog$lzycompute(HiveSharedState.scala:46)
    	at org.apache.spark.sql.hive.HiveSharedState.externa
    ```




[GitHub] spark issue #14177: [SPARK-16027][SPARKR] Fix R tests SparkSession init/stop

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/14177
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/62419/
    Test FAILed.




[GitHub] spark issue #14177: [SPARK-16027][SPARKR] Fix R tests SparkSession init/stop

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/14177
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/62422/
    Test PASSed.




[GitHub] spark issue #14177: [SPARK-16027][SPARKR] Fix R tests SparkSession init/stop

Posted by shivaram <gi...@git.apache.org>.
Github user shivaram commented on the issue:

    https://github.com/apache/spark/pull/14177
  
    Does the Hive metastore not shut down properly even if we call `sparkR.session.stop()` in all the test files? The reason I'm trying to avoid having `enableHiveSupport = F` in most test files is that Hive support is enabled by default, so keeping it on is closer to what users will see.
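
For context, here is a hedged repro sketch of the stop/restart behavior being asked about — hypothetical, not taken from the PR or the Jenkins logs, and the suggested Derby-lock explanation is an assumption:

```r
# Hypothetical repro of the concern above (not from the PR): stop a
# Hive-enabled session and then start a second one in the same JVM.
library(SparkR)

sparkR.session(master = "local[2]")   # enableHiveSupport defaults to TRUE
sparkR.session.stop()

# If metastore state (for example an embedded Derby lock held by the first
# session) is not fully released on stop, creating this second Hive-enabled
# session can fail with the InvocationTargetException quoted earlier.
sparkR.session(master = "local[2]")
```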


[GitHub] spark issue #14177: [SPARK-16027][SPARKR] Fix R tests SparkSession init/stop

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/14177
  
    **[Test build #62417 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/62417/consoleFull)** for PR 14177 at commit [`03f163a`](https://github.com/apache/spark/commit/03f163a660f7a4abd9d99449524396ad91830e24).
     * This patch **fails SparkR unit tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.


[GitHub] spark issue #14177: [SPARK-16027][SPARKR] Fix R tests SparkSession init/stop

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/14177
  
    **[Test build #62418 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/62418/consoleFull)** for PR 14177 at commit [`c7e7592`](https://github.com/apache/spark/commit/c7e7592090a2e3ef84029bc9112ce27abf085e40).
     * This patch **fails SparkR unit tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.


[GitHub] spark issue #14177: [SPARK-16027][SPARKR] Fix R tests SparkSession init/stop

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/14177
  
    **[Test build #62422 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/62422/consoleFull)** for PR 14177 at commit [`12899c5`](https://github.com/apache/spark/commit/12899c516547a2f5064639386c5a42530a345ec6).


[GitHub] spark issue #14177: [SPARK-16027][SPARKR] Fix R tests SparkSession init/stop

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/14177
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/62417/
    Test FAILed.


[GitHub] spark issue #14177: [SPARK-16027][SPARKR] Fix R tests SparkSession init/stop

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/14177
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/62418/
    Test FAILed.


[GitHub] spark issue #14177: [SPARK-16027][SPARKR] Fix R tests SparkSession init/stop

Posted by felixcheung <gi...@git.apache.org>.
Github user felixcheung commented on the issue:

    https://github.com/apache/spark/pull/14177
  
    Agreed - this could be a bug in SQL/Hive; I'd be interested in digging into it a bit more later next week.


[GitHub] spark issue #14177: [SPARK-16027][SPARKR] Fix R tests SparkSession init/stop

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/14177
  
    **[Test build #62226 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/62226/consoleFull)** for PR 14177 at commit [`1a86e85`](https://github.com/apache/spark/commit/1a86e857ab954620fb33dde8667f3a2a7d5138dc).


[GitHub] spark pull request #14177: [SPARK-16027][SPARKR] Fix R tests SparkSession in...

Posted by asfgit <gi...@git.apache.org>.
Github user asfgit closed the pull request at:

    https://github.com/apache/spark/pull/14177


[GitHub] spark issue #14177: [SPARK-16027][SPARKR] Fix R tests SparkSession init/stop

Posted by shivaram <gi...@git.apache.org>.
Github user shivaram commented on the issue:

    https://github.com/apache/spark/pull/14177
  
    I just realized that my local build was not using the Hive profile. If this fails on Jenkins, let's just go back to the original PR. Also, I wonder if this is something we should notify the SQL committers about.


[GitHub] spark issue #14177: [SPARK-16027][SPARKR] Fix R tests SparkSession init/stop

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/14177
  
    Merged build finished. Test FAILed.


[GitHub] spark issue #14177: [SPARK-16027][SPARKR] Fix R tests SparkSession init/stop

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/14177
  
    **[Test build #62418 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/62418/consoleFull)** for PR 14177 at commit [`c7e7592`](https://github.com/apache/spark/commit/c7e7592090a2e3ef84029bc9112ce27abf085e40).


[GitHub] spark issue #14177: [SPARK-16027][SPARKR] Fix R tests SparkSession init/stop

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/14177
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/62226/
    Test PASSed.


[GitHub] spark issue #14177: [SPARK-16027][SPARKR] Fix R tests SparkSession init/stop

Posted by shivaram <gi...@git.apache.org>.
Github user shivaram commented on the issue:

    https://github.com/apache/spark/pull/14177
  
    Is there a reason why `enableHiveSupport = F` is required in all the test cases?


[GitHub] spark issue #14177: [SPARK-16027][SPARKR] Fix R tests SparkSession init/stop

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/14177
  
    **[Test build #62226 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/62226/consoleFull)** for PR 14177 at commit [`1a86e85`](https://github.com/apache/spark/commit/1a86e857ab954620fb33dde8667f3a2a7d5138dc).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.


[GitHub] spark issue #14177: [SPARK-16027][SPARKR] Fix R tests SparkSession init/stop

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/14177
  
    **[Test build #62419 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/62419/consoleFull)** for PR 14177 at commit [`bec4b33`](https://github.com/apache/spark/commit/bec4b3372d8e861d8b3f7c04cf4675a02918808f).


[GitHub] spark issue #14177: [SPARK-16027][SPARKR] Fix R tests SparkSession init/stop

Posted by shivaram <gi...@git.apache.org>.
Github user shivaram commented on the issue:

    https://github.com/apache/spark/pull/14177
  
    @felixcheung I plan to merge this into master but skip branch-2.0, as I don't want to introduce new test errors if we have another RC. Let me know if that sounds good.

