Posted to issues@ambari.apache.org by "Hadoop QA (JIRA)" <ji...@apache.org> on 2017/09/13 14:21:00 UTC
[jira] [Commented] (AMBARI-21937) Ambari server schema upgrade failed while creating DRUID_SUPERSET component
[ https://issues.apache.org/jira/browse/AMBARI-21937?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16164718#comment-16164718 ]
Hadoop QA commented on AMBARI-21937:
------------------------------------
{color:red}-1 overall{color}. Here are the results of testing the latest attachment
http://issues.apache.org/jira/secure/attachment/12886865/AMBARI-21937.patch
against trunk revision .
{color:red}-1 patch{color}. The patch command could not apply the patch.
Console output: https://builds.apache.org/job/Ambari-trunk-test-patch/12210//console
This message is automatically generated.
> Ambari server schema upgrade failed while creating DRUID_SUPERSET component
> ---------------------------------------------------------------------------
>
> Key: AMBARI-21937
> URL: https://issues.apache.org/jira/browse/AMBARI-21937
> Project: Ambari
> Issue Type: Bug
> Components: ambari-server
> Affects Versions: 2.6.0
> Reporter: Vivek Sharma
> Assignee: Nishant Bangarwa
> Priority: Blocker
> Labels: upgrade
> Fix For: trunk, 2.6.0
>
> Attachments: AMBARI-21937.patch
>
>
> *STR*
> # Deployed cluster with Ambari version: 2.5.2.0-298 and HDP version: 2.6.2.0-205
> # Upgraded Ambari packages and then the schema (ambari-server upgrade) to target Version: 2.6.0.0-77 | Hash: 450aae3dceabb270e0e267c16e2bb198db809541
> *Result*
> Schema upgrade failed with the error below:
> {code}
> 12 Sep 2017 01:09:19,339 INFO [Stack Version Loading Thread] LatestRepoCallable:106 - Loaded uri http://s3.amazonaws.com/dev.hortonworks.com/HDP/hdp_urlinfo.json in 177ms
> 12 Sep 2017 01:09:19,361 ERROR [main] SchemaUpgradeHelper:230 - Upgrade failed.
> com.google.inject.ProvisionException: Guice provision errors:
> 1) Error injecting method, java.lang.RuntimeException: Trying to create a ServiceComponent not recognized in stack info, clusterName=cl1, serviceName=DRUID, componentName=DRUID_SUPERSET, stackInfo=HDP-2.6
> at org.apache.ambari.server.state.cluster.ClustersImpl.loadClustersAndHosts(ClustersImpl.java:173)
> at org.apache.ambari.server.state.cluster.ClustersImpl.class(ClustersImpl.java:95)
> while locating org.apache.ambari.server.state.cluster.ClustersImpl
> while locating org.apache.ambari.server.state.Clusters
> 1 error
> at com.google.inject.internal.InjectorImpl$4.get(InjectorImpl.java:987)
> at com.google.inject.internal.InjectorImpl.getInstance(InjectorImpl.java:1013)
> at org.apache.ambari.server.upgrade.AbstractUpgradeCatalog.addNewConfigurationsFromXml(AbstractUpgradeCatalog.java:365)
> at org.apache.ambari.server.upgrade.UpgradeCatalog260.executeDMLUpdates(UpgradeCatalog260.java:392)
> at org.apache.ambari.server.upgrade.AbstractUpgradeCatalog.upgradeData(AbstractUpgradeCatalog.java:938)
> at org.apache.ambari.server.upgrade.SchemaUpgradeHelper.executeDMLUpdates(SchemaUpgradeHelper.java:228)
> at org.apache.ambari.server.upgrade.SchemaUpgradeHelper.main(SchemaUpgradeHelper.java:421)
> Caused by: java.lang.RuntimeException: Trying to create a ServiceComponent not recognized in stack info, clusterName=cl1, serviceName=DRUID, componentName=DRUID_SUPERSET, stackInfo=HDP-2.6
> at org.apache.ambari.server.state.ServiceComponentImpl.updateComponentInfo(ServiceComponentImpl.java:141)
> at org.apache.ambari.server.state.ServiceComponentImpl.<init>(ServiceComponentImpl.java:170)
> at com.google.inject.internal.cglib.reflect.$FastConstructor.newInstance(FastConstructor.java:40)
> at com.google.inject.internal.ProxyFactory$ProxyConstructor.newInstance(ProxyFactory.java:260)
> at com.google.inject.internal.ConstructorInjector.construct(ConstructorInjector.java:85)
> at com.google.inject.internal.ConstructorBindingImpl$Factory.get(ConstructorBindingImpl.java:254)
> at com.google.inject.internal.InjectorImpl$4$1.call(InjectorImpl.java:978)
> at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1031)
> at com.google.inject.internal.InjectorImpl$4.get(InjectorImpl.java:974)
> at com.google.inject.assistedinject.FactoryProvider2.invoke(FactoryProvider2.java:632)
> at com.sun.proxy.$Proxy19.createExisting(Unknown Source)
> at org.apache.ambari.server.state.ServiceImpl.<init>(ServiceImpl.java:162)
> at com.google.inject.internal.cglib.reflect.$FastConstructor.newInstance(FastConstructor.java:40)
> at com.google.inject.internal.ProxyFactory$ProxyConstructor.newInstance(ProxyFactory.java:260)
> at com.google.inject.internal.ConstructorInjector.construct(ConstructorInjector.java:85)
> at com.google.inject.internal.ConstructorBindingImpl$Factory.get(ConstructorBindingImpl.java:254)
> at com.google.inject.internal.InjectorImpl$4$1.call(InjectorImpl.java:978)
> at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1031)
> at com.google.inject.internal.InjectorImpl$4.get(InjectorImpl.java:974)
> at com.google.inject.assistedinject.FactoryProvider2.invoke(FactoryProvider2.java:632)
> at com.sun.proxy.$Proxy15.createExisting(Unknown Source)
> at org.apache.ambari.server.state.cluster.ClusterImpl.loadServices(ClusterImpl.java:427)
> at org.apache.ambari.server.state.cluster.ClusterImpl.<init>(ClusterImpl.java:318)
> at com.google.inject.internal.cglib.reflect.$FastConstructor.newInstance(FastConstructor.java:40)
> at com.google.inject.internal.ProxyFactory$ProxyConstructor.newInstance(ProxyFactory.java:260)
> at com.google.inject.internal.ConstructorInjector.construct(ConstructorInjector.java:85)
> at com.google.inject.internal.ConstructorBindingImpl$Factory.get(ConstructorBindingImpl.java:254)
> {code}
> Looks like the issue is due to the newly introduced DRUID_SUPERSET component, which was earlier part of the Druid service itself.
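For context, the failure in the trace above comes from a guard in ServiceComponentImpl.updateComponentInfo that rejects any component absent from the loaded stack definition. A minimal sketch of that kind of check (class, map contents, and method names here are hypothetical illustrations, not the actual Ambari source):

```java
import java.util.Map;
import java.util.Set;

public class StackInfoCheck {
    // Hypothetical subset of components the HDP-2.6 stack definition
    // knows for the DRUID service; DRUID_SUPERSET is deliberately absent,
    // mirroring the state that caused the upgrade failure.
    static final Map<String, Set<String>> STACK = Map.of(
        "DRUID", Set.of("DRUID_BROKER", "DRUID_COORDINATOR", "DRUID_OVERLORD"));

    // Sketch of the guard: reject a component the stack metadata does not list.
    static void updateComponentInfo(String cluster, String service, String component) {
        Set<String> known = STACK.getOrDefault(service, Set.of());
        if (!known.contains(component)) {
            throw new RuntimeException(
                "Trying to create a ServiceComponent not recognized in stack info"
                + ", clusterName=" + cluster + ", serviceName=" + service
                + ", componentName=" + component + ", stackInfo=HDP-2.6");
        }
    }

    public static void main(String[] args) {
        try {
            // Reproduces the failing lookup from the trace.
            updateComponentInfo("cl1", "DRUID", "DRUID_SUPERSET");
        } catch (RuntimeException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

Under this reading, the fix is to make the target stack definition include DRUID_SUPERSET (or migrate existing component rows) before ClustersImpl.loadClustersAndHosts replays the persisted components during schema upgrade.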
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)