Posted to commits@druid.apache.org by GitBox <gi...@apache.org> on 2019/04/02 09:47:53 UTC
[GitHub] [incubator-druid] hueiyuanSu opened a new issue #7399: druid-google-extension dependencies problem
URL: https://github.com/apache/incubator-druid/issues/7399
stderr:
Traceback (most recent call last):
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 995, in restart
    self.status(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/DRUID/package/scripts/druid_node.py", line 120, in status
    check_process_status(pid_file)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/check_process_status.py", line 43, in check_process_status
    raise ComponentIsNotRunning()
ComponentIsNotRunning

The above exception was the cause of the following exception:

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/DRUID/package/scripts/historical.py", line 28, in <module>
    DruidHistorical().execute()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 352, in execute
    method(env)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 1006, in restart
    self.start(env, upgrade_type=upgrade_type)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/DRUID/package/scripts/druid_node.py", line 60, in start
    self.configure(env, upgrade_type=upgrade_type)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/DRUID/package/scripts/druid_node.py", line 45, in configure
    druid(upgrade_type=upgrade_type, nodeType=self.nodeType)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/DRUID/package/scripts/druid.py", line 147, in druid
    pulldeps()
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/DRUID/package/scripts/druid.py", line 290, in pulldeps
    user=params.druid_user
  File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 263, in action_run
    returns=self.resource.returns)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner
    result = function(command, **kwargs)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call
    tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy, returns=returns)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 314, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of 'source /usr/hdp/current/druid-historical/conf/druid-env.sh ; java -classpath '/usr/hdp/current/druid-historical/lib/*' -Ddruid.extensions.loadList=[] -Ddruid.extensions.directory=/usr/hdp/current/druid-historical/extensions -Ddruid.extensions.hadoopDependenciesDir=/usr/hdp/current/druid-historical/hadoop-dependencies io.druid.cli.Main tools pull-deps -c io.druid.extensions.contrib:druid-google-extensions:0.12.1 --no-default-hadoop -r http://repo.hortonworks.com/content/repositories/releases/' returned 1. su: warning: cannot change directory to /var/lib/druid: No such file or directory
Apr 02, 2019 9:45:17 AM org.hibernate.validator.internal.util.Version <clinit>
INFO: HV000001: Hibernate Validator 5.1.3.Final
2019-04-02T09:45:17,995 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.common.config.NullValueHandlingConfig] from props[druid.generic.] as [io.druid.common.config.NullValueHandlingConfig@69653e16]
2019-04-02T09:45:18,049 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.guice.ExtensionsConfig] from props[druid.extensions.] as [ExtensionsConfig{searchCurrentClassloader=true, directory='/usr/hdp/current/druid-historical/extensions', hadoopDependenciesDir='/usr/hdp/current/druid-historical/hadoop-dependencies', hadoopContainerDruidClasspath='null', addExtensionsToHadoopContainer=false, loadList=[]}]
2019-04-02T09:45:18,353 INFO [main] io.druid.cli.PullDependencies - Start pull-deps with local repository [/var/lib/druid/.m2/repository] and remote repositories [[http://repo.hortonworks.com/content/repositories/releases/]]
2019-04-02T09:45:18,353 INFO [main] io.druid.cli.PullDependencies - Start downloading dependencies for extension coordinates: [[io.druid.extensions.contrib:druid-google-extensions:0.12.1]]
2019-04-02T09:45:18,355 INFO [main] io.druid.cli.PullDependencies - Directory [/usr/hdp/current/druid-historical/extensions/druid-google-extensions] already exists, skipping creating a directory
2019-04-02T09:45:18,358 INFO [main] io.druid.cli.PullDependencies - Start downloading extension [io.druid.extensions.contrib:druid-google-extensions:jar:0.12.1]
2019-04-02T09:45:18,395 ERROR [main] io.druid.cli.PullDependencies - Unable to resolve artifacts for [io.druid.extensions.contrib:druid-google-extensions:jar:0.12.1 (runtime) -> [] < [ (https://repo1.maven.org/maven2/, releases+snapshots), (http://172.27.44.203/nexus/content/groups/public, releases+snapshots), (http://repo.hortonworks.com/content/repositories/releases/, releases+snapshots)]].
java.lang.NullPointerException
    at org.eclipse.aether.internal.impl.DefaultRepositorySystem.resolveDependencies(DefaultRepositorySystem.java:361) ~[aether-impl-0.9.0.M2.jar:?]
    at io.tesla.aether.internal.DefaultTeslaAether.resolveArtifacts(DefaultTeslaAether.java:289) ~[tesla-aether-0.0.5.jar:0.0.5]
    at io.druid.cli.PullDependencies.downloadExtension(PullDependencies.java:349) [druid-services-0.12.1.3.1.0.0-78.jar:0.12.1.3.1.0.0-78]
    at io.druid.cli.PullDependencies.run(PullDependencies.java:248) [druid-services-0.12.1.3.1.0.0-78.jar:0.12.1.3.1.0.0-78]
    at io.druid.cli.Main.main(Main.java:116) [druid-services-0.12.1.3.1.0.0-78.jar:0.12.1.3.1.0.0-78]
Exception in thread "main" java.lang.NullPointerException
    at org.eclipse.aether.internal.impl.DefaultRepositorySystem.resolveDependencies(DefaultRepositorySystem.java:361)
    at io.tesla.aether.internal.DefaultTeslaAether.resolveArtifacts(DefaultTeslaAether.java:289)
    at io.druid.cli.PullDependencies.downloadExtension(PullDependencies.java:349)
    at io.druid.cli.PullDependencies.run(PullDependencies.java:248)
    at io.druid.cli.Main.main(Main.java:116)
stdout:
2019-04-02 09:45:14,716 - Stack Feature Version Info: Cluster Stack=3.1, Command Stack=None, Command Version=3.1.0.0-78 -> 3.1.0.0-78
2019-04-02 09:45:14,736 - Using hadoop conf dir: /usr/hdp/3.1.0.0-78/hadoop/conf
2019-04-02 09:45:14,975 - Stack Feature Version Info: Cluster Stack=3.1, Command Stack=None, Command Version=3.1.0.0-78 -> 3.1.0.0-78
2019-04-02 09:45:14,980 - Using hadoop conf dir: /usr/hdp/3.1.0.0-78/hadoop/conf
2019-04-02 09:45:14,982 - Group['hdfs'] {}
2019-04-02 09:45:14,983 - Group['hadoop'] {}
2019-04-02 09:45:14,983 - Group['users'] {}
2019-04-02 09:45:14,983 - Group['knox'] {}
2019-04-02 09:45:14,983 - User['yarn-ats'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-04-02 09:45:14,984 - User['druid'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-04-02 09:45:14,985 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-04-02 09:45:14,986 - User['superset'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-04-02 09:45:14,987 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-04-02 09:45:14,988 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2019-04-02 09:45:14,988 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2019-04-02 09:45:14,989 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop'], 'uid': None}
2019-04-02 09:45:14,990 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-04-02 09:45:14,990 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-04-02 09:45:14,991 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'knox'], 'uid': None}
2019-04-02 09:45:14,992 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2019-04-02 09:45:14,993 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2019-04-02 09:45:15,001 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2019-04-02 09:45:15,002 - Group['hdfs'] {}
2019-04-02 09:45:15,002 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop', u'hdfs']}
2019-04-02 09:45:15,003 - FS Type: HDFS
2019-04-02 09:45:15,003 - Directory['/etc/hadoop'] {'mode': 0755}
2019-04-02 09:45:15,018 - File['/usr/hdp/3.1.0.0-78/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2019-04-02 09:45:15,019 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2019-04-02 09:45:15,048 - Execute[('setenforce', '0')] {'not_if': '(! which getenforce ) || (which getenforce && getenforce | grep -q Disabled)', 'sudo': True, 'only_if': 'test -f /selinux/enforce'}
2019-04-02 09:45:15,058 - Skipping Execute[('setenforce', '0')] due to not_if
2019-04-02 09:45:15,059 - Directory['/var/log/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'hadoop', 'mode': 0775, 'cd_access': 'a'}
2019-04-02 09:45:15,061 - Directory['/var/run/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'root', 'cd_access': 'a'}
2019-04-02 09:45:15,061 - Directory['/var/run/hadoop/hdfs'] {'owner': 'hdfs', 'cd_access': 'a'}
2019-04-02 09:45:15,062 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'create_parents': True, 'cd_access': 'a'}
2019-04-02 09:45:15,065 - File['/usr/hdp/3.1.0.0-78/hadoop/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'hdfs'}
2019-04-02 09:45:15,067 - File['/usr/hdp/3.1.0.0-78/hadoop/conf/health_check'] {'content': Template('health_check.j2'), 'owner': 'hdfs'}
2019-04-02 09:45:15,073 - File['/usr/hdp/3.1.0.0-78/hadoop/conf/log4j.properties'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
2019-04-02 09:45:15,083 - File['/usr/hdp/3.1.0.0-78/hadoop/conf/hadoop-metrics2.properties'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2019-04-02 09:45:15,084 - File['/usr/hdp/3.1.0.0-78/hadoop/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
2019-04-02 09:45:15,085 - File['/usr/hdp/3.1.0.0-78/hadoop/conf/configuration.xsl'] {'owner': 'hdfs', 'group': 'hadoop'}
2019-04-02 09:45:15,089 - File['/etc/hadoop/conf/topology_mappings.data'] {'owner': 'hdfs', 'content': Template('topology_mappings.data.j2'), 'only_if': 'test -d /etc/hadoop/conf', 'group': 'hadoop', 'mode': 0644}
2019-04-02 09:45:15,095 - File['/etc/hadoop/conf/topology_script.py'] {'content': StaticFile('topology_script.py'), 'only_if': 'test -d /etc/hadoop/conf', 'mode': 0755}
2019-04-02 09:45:15,100 - Skipping unlimited key JCE policy check and setup since the Java VM is not managed by Ambari
2019-04-02 09:45:15,569 - Using hadoop conf dir: /usr/hdp/3.1.0.0-78/hadoop/conf
2019-04-02 09:45:15,571 - Execute['source /usr/hdp/current/druid-historical/conf/druid-env.sh ; /usr/hdp/current/druid-historical/bin/node.sh historical stop'] {'only_if': 'ambari-sudo.sh -H -E test -f /var/run/druid/historical.pid && ambari-sudo.sh -H -E pgrep -F /var/run/druid/historical.pid', 'user': 'druid'}
2019-04-02 09:45:15,578 - Skipping Execute['source /usr/hdp/current/druid-historical/conf/druid-env.sh ; /usr/hdp/current/druid-historical/bin/node.sh historical stop'] due to only_if
2019-04-02 09:45:15,578 - Pid file /var/run/druid/historical.pid is empty or does not exist
2019-04-02 09:45:15,581 - Directory['/var/log/druid'] {'owner': 'druid', 'create_parents': True, 'group': 'hadoop', 'recursive_ownership': True, 'mode': 0755}
2019-04-02 09:45:15,583 - Directory['/var/run/druid'] {'owner': 'druid', 'group': 'hadoop', 'create_parents': True, 'recursive_ownership': True, 'mode': 0755}
2019-04-02 09:45:15,583 - Directory['/usr/hdp/current/druid-historical/conf'] {'group': 'hadoop', 'cd_access': 'a', 'create_parents': True, 'recursive_ownership': True, 'owner': 'druid', 'mode': 0700}
2019-04-02 09:45:15,584 - Changing permission for /usr/hdp/current/druid-historical/conf from 755 to 700
2019-04-02 09:45:15,585 - Directory['/usr/hdp/current/druid-historical/conf/_common'] {'group': 'hadoop', 'cd_access': 'a', 'create_parents': True, 'recursive_ownership': True, 'owner': 'druid', 'mode': 0700}
2019-04-02 09:45:15,585 - Changing permission for /usr/hdp/current/druid-historical/conf/_common from 755 to 700
2019-04-02 09:45:15,586 - Directory['/usr/hdp/current/druid-historical/conf/coordinator'] {'group': 'hadoop', 'cd_access': 'a', 'create_parents': True, 'recursive_ownership': True, 'owner': 'druid', 'mode': 0700}
2019-04-02 09:45:15,586 - Changing permission for /usr/hdp/current/druid-historical/conf/coordinator from 755 to 700
2019-04-02 09:45:15,586 - Directory['/usr/hdp/current/druid-historical/conf/broker'] {'group': 'hadoop', 'cd_access': 'a', 'create_parents': True, 'recursive_ownership': True, 'owner': 'druid', 'mode': 0700}
2019-04-02 09:45:15,587 - Changing permission for /usr/hdp/current/druid-historical/conf/broker from 755 to 700
2019-04-02 09:45:15,587 - Directory['/usr/hdp/current/druid-historical/conf/middleManager'] {'group': 'hadoop', 'cd_access': 'a', 'create_parents': True, 'recursive_ownership': True, 'owner': 'druid', 'mode': 0700}
2019-04-02 09:45:15,587 - Changing permission for /usr/hdp/current/druid-historical/conf/middleManager from 755 to 700
2019-04-02 09:45:15,588 - Directory['/usr/hdp/current/druid-historical/conf/historical'] {'group': 'hadoop', 'cd_access': 'a', 'create_parents': True, 'recursive_ownership': True, 'owner': 'druid', 'mode': 0700}
2019-04-02 09:45:15,588 - Changing permission for /usr/hdp/current/druid-historical/conf/historical from 755 to 700
2019-04-02 09:45:15,588 - Directory['/usr/hdp/current/druid-historical/conf/overlord'] {'group': 'hadoop', 'cd_access': 'a', 'create_parents': True, 'recursive_ownership': True, 'owner': 'druid', 'mode': 0700}
2019-04-02 09:45:15,588 - Changing permission for /usr/hdp/current/druid-historical/conf/overlord from 755 to 700
2019-04-02 09:45:15,589 - Directory['/usr/hdp/current/druid-historical/conf/router'] {'group': 'hadoop', 'cd_access': 'a', 'create_parents': True, 'recursive_ownership': True, 'owner': 'druid', 'mode': 0700}
2019-04-02 09:45:15,589 - Changing permission for /usr/hdp/current/druid-historical/conf/router from 755 to 700
2019-04-02 09:45:15,589 - Directory['/apps/druid/segmentCache/info_dir'] {'group': 'hadoop', 'cd_access': 'a', 'create_parents': True, 'recursive_ownership': True, 'owner': 'druid', 'mode': 0700}
2019-04-02 09:45:15,590 - Changing permission for /apps/druid/segmentCache/info_dir from 755 to 700
2019-04-02 09:45:15,590 - Directory['/apps/druid/tasks'] {'group': 'hadoop', 'cd_access': 'a', 'create_parents': True, 'mode': 0700, 'owner': 'druid', 'recursive_ownership': True}
2019-04-02 09:45:15,590 - Changing permission for /apps/druid/tasks from 755 to 700
2019-04-02 09:45:15,591 - Directory['/apps/druid/segmentCache'] {'group': 'hadoop', 'cd_access': 'a', 'create_parents': True, 'mode': 0700, 'owner': 'druid', 'recursive_ownership': True}
2019-04-02 09:45:15,591 - Changing permission for /apps/druid/segmentCache from 755 to 700
2019-04-02 09:45:15,595 - File['/usr/hdp/current/druid-historical/conf/druid-env.sh'] {'content': InlineTemplate(...), 'owner': 'druid', 'mode': 0700}
2019-04-02 09:45:15,596 - PropertiesFile['common.runtime.properties'] {'owner': 'druid', 'group': 'hadoop', 'mode': 0600, 'dir': '/usr/hdp/current/druid-historical/conf/_common', 'properties': ...}
2019-04-02 09:45:15,599 - Generating properties file: /usr/hdp/current/druid-historical/conf/_common/common.runtime.properties
2019-04-02 09:45:15,600 - File['/usr/hdp/current/druid-historical/conf/_common/common.runtime.properties'] {'owner': 'druid', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0600, 'encoding': 'UTF-8'}
2019-04-02 09:45:15,617 - Writing File['/usr/hdp/current/druid-historical/conf/_common/common.runtime.properties'] because contents don't match
2019-04-02 09:45:15,618 - Created common.runtime.properties
2019-04-02 09:45:15,620 - File['/usr/hdp/current/druid-historical/conf/_common/druid-log4j.xml'] {'content': InlineTemplate(...), 'owner': 'druid', 'group': 'hadoop', 'mode': 0644}
2019-04-02 09:45:15,620 - Created log4j file
2019-04-02 09:45:15,621 - File['/etc/logrotate.d/druid'] {'content': InlineTemplate(...), 'owner': 'root', 'group': 'root', 'mode': 0644}
2019-04-02 09:45:15,622 - Created log rotate file
2019-04-02 09:45:15,622 - PropertiesFile['runtime.properties'] {'owner': 'druid', 'group': 'hadoop', 'mode': 0600, 'dir': '/usr/hdp/current/druid-historical/conf/coordinator', 'properties': {u'druid.coordinator.merge.on': u'false', u'druid.service': u'druid/coordinator', u'druid.port': u'8081'}}
2019-04-02 09:45:15,626 - Generating properties file: /usr/hdp/current/druid-historical/conf/coordinator/runtime.properties
2019-04-02 09:45:15,626 - File['/usr/hdp/current/druid-historical/conf/coordinator/runtime.properties'] {'owner': 'druid', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0600, 'encoding': 'UTF-8'}
2019-04-02 09:45:15,628 - Writing File['/usr/hdp/current/druid-historical/conf/coordinator/runtime.properties'] because contents don't match
2019-04-02 09:45:15,628 - Created druid-coordinator runtime.properties
2019-04-02 09:45:15,630 - File['/usr/hdp/current/druid-historical/conf/coordinator/jvm.config'] {'content': InlineTemplate(...), 'owner': 'druid', 'group': 'hadoop'}
2019-04-02 09:45:15,630 - Created druid-coordinator jvm.config
2019-04-02 09:45:15,630 - PropertiesFile['runtime.properties'] {'owner': 'druid', 'group': 'hadoop', 'mode': 0600, 'dir': '/usr/hdp/current/druid-historical/conf/overlord', 'properties': {u'druid.indexer.storage.type': u'metadata', u'druid.indexer.runner.type': u'remote', u'druid.port': u'8090', u'druid.service': u'druid/overlord'}}
2019-04-02 09:45:15,634 - Generating properties file: /usr/hdp/current/druid-historical/conf/overlord/runtime.properties
2019-04-02 09:45:15,634 - File['/usr/hdp/current/druid-historical/conf/overlord/runtime.properties'] {'owner': 'druid', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0600, 'encoding': 'UTF-8'}
2019-04-02 09:45:15,636 - Writing File['/usr/hdp/current/druid-historical/conf/overlord/runtime.properties'] because contents don't match
2019-04-02 09:45:15,637 - Created druid-overlord runtime.properties
2019-04-02 09:45:15,638 - File['/usr/hdp/current/druid-historical/conf/overlord/jvm.config'] {'content': InlineTemplate(...), 'owner': 'druid', 'group': 'hadoop'}
2019-04-02 09:45:15,639 - Created druid-overlord jvm.config
2019-04-02 09:45:15,639 - PropertiesFile['runtime.properties'] {'owner': 'druid', 'group': 'hadoop', 'mode': 0600, 'dir': '/usr/hdp/current/druid-historical/conf/historical', 'properties': ...}
2019-04-02 09:45:15,643 - Generating properties file: /usr/hdp/current/druid-historical/conf/historical/runtime.properties
2019-04-02 09:45:15,643 - File['/usr/hdp/current/druid-historical/conf/historical/runtime.properties'] {'owner': 'druid', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0600, 'encoding': 'UTF-8'}
2019-04-02 09:45:15,648 - Writing File['/usr/hdp/current/druid-historical/conf/historical/runtime.properties'] because contents don't match
2019-04-02 09:45:15,648 - Created druid-historical runtime.properties
2019-04-02 09:45:15,650 - File['/usr/hdp/current/druid-historical/conf/historical/jvm.config'] {'content': InlineTemplate(...), 'owner': 'druid', 'group': 'hadoop'}
2019-04-02 09:45:15,651 - Created druid-historical jvm.config
2019-04-02 09:45:15,651 - PropertiesFile['runtime.properties'] {'owner': 'druid', 'group': 'hadoop', 'mode': 0600, 'dir': '/usr/hdp/current/druid-historical/conf/broker', 'properties': ...}
2019-04-02 09:45:15,654 - Generating properties file: /usr/hdp/current/druid-historical/conf/broker/runtime.properties
2019-04-02 09:45:15,654 - File['/usr/hdp/current/druid-historical/conf/broker/runtime.properties'] {'owner': 'druid', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0600, 'encoding': 'UTF-8'}
2019-04-02 09:45:15,661 - Writing File['/usr/hdp/current/druid-historical/conf/broker/runtime.properties'] because contents don't match
2019-04-02 09:45:15,661 - Created druid-broker runtime.properties
2019-04-02 09:45:15,663 - File['/usr/hdp/current/druid-historical/conf/broker/jvm.config'] {'content': InlineTemplate(...), 'owner': 'druid', 'group': 'hadoop'}
2019-04-02 09:45:15,664 - Created druid-broker jvm.config
2019-04-02 09:45:15,664 - PropertiesFile['runtime.properties'] {'owner': 'druid', 'group': 'hadoop', 'mode': 0600, 'dir': '/usr/hdp/current/druid-historical/conf/middleManager', 'properties': ...}
2019-04-02 09:45:15,667 - Generating properties file: /usr/hdp/current/druid-historical/conf/middleManager/runtime.properties
2019-04-02 09:45:15,667 - File['/usr/hdp/current/druid-historical/conf/middleManager/runtime.properties'] {'owner': 'druid', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0600, 'encoding': 'UTF-8'}
2019-04-02 09:45:15,674 - Writing File['/usr/hdp/current/druid-historical/conf/middleManager/runtime.properties'] because contents don't match
2019-04-02 09:45:15,674 - Created druid-middlemanager runtime.properties
2019-04-02 09:45:15,676 - File['/usr/hdp/current/druid-historical/conf/middleManager/jvm.config'] {'content': InlineTemplate(...), 'owner': 'druid', 'group': 'hadoop'}
2019-04-02 09:45:15,677 - Created druid-middlemanager jvm.config
2019-04-02 09:45:15,677 - PropertiesFile['runtime.properties'] {'owner': 'druid', 'group': 'hadoop', 'mode': 0600, 'dir': '/usr/hdp/current/druid-historical/conf/router', 'properties': {u'druid.router.http.numConnections': u'20', u'druid.router.tierToBrokerMap': u'{"_default_tier":"druid/broker"}', u'druid.server.http.numThreads': u'50', u'druid.port': u'8888', u'druid.service': u'druid/router'}}
2019-04-02 09:45:15,680 - Generating properties file: /usr/hdp/current/druid-historical/conf/router/runtime.properties
2019-04-02 09:45:15,680 - File['/usr/hdp/current/druid-historical/conf/router/runtime.properties'] {'owner': 'druid', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0600, 'encoding': 'UTF-8'}
2019-04-02 09:45:15,683 - Writing File['/usr/hdp/current/druid-historical/conf/router/runtime.properties'] because contents don't match
2019-04-02 09:45:15,684 - Created druid-router runtime.properties
2019-04-02 09:45:15,685 - File['/usr/hdp/current/druid-historical/conf/router/jvm.config'] {'content': InlineTemplate(...), 'owner': 'druid', 'group': 'hadoop'}
2019-04-02 09:45:15,686 - Created druid-router jvm.config
2019-04-02 09:45:15,691 - Directory['/usr/lib/ambari-logsearch-logfeeder/conf'] {'create_parents': True, 'mode': 0755, 'cd_access': 'a'}
2019-04-02 09:45:15,691 - Generate Log Feeder config file: /usr/lib/ambari-logsearch-logfeeder/conf/input.config-druid.json
2019-04-02 09:45:15,691 - File['/usr/lib/ambari-logsearch-logfeeder/conf/input.config-druid.json'] {'content': Template('input.config-druid.json.j2'), 'mode': 0644}
2019-04-02 09:45:15,692 - HdfsResource['/user/druid'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.1.0.0-78/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://druidproduction-m-0-20190401103852.c.adgeek-prod.internal:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'recursive_chown': True, 'owner': 'druid', 'group': 'hadoop', 'hadoop_conf_dir': '/usr/hdp/3.1.0.0-78/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/app-logs', u'/tmp'], 'recursive_chmod': True}
2019-04-02 09:45:15,695 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://druidproduction-m-0-20190401103852.c.adgeek-prod.internal:50070/webhdfs/v1/user/druid?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpnWEnu0 2>/tmp/tmp01Dn34''] {'logoutput': None, 'quiet': False}
2019-04-02 09:45:15,772 - call returned (0, '')
2019-04-02 09:45:15,772 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":0,"blockSize":0,"childrenNum":1,"fileId":17850,"group":"hadoop","length":0,"modificationTime":1554119995153,"owner":"druid","pathSuffix":"","permission":"755","replication":0,"storagePolicy":0,"type":"DIRECTORY"}}200', u'')
2019-04-02 09:45:15,773 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://druidproduction-m-0-20190401103852.c.adgeek-prod.internal:50070/webhdfs/v1/user/druid?op=SETOWNER&owner=druid&group=hadoop&user.name=hdfs'"'"' 1>/tmp/tmpOp3ugd 2>/tmp/tmpCp5BZd''] {'logoutput': None, 'quiet': False}
2019-04-02 09:45:15,849 - call returned (0, '')
2019-04-02 09:45:15,849 - get_user_call_output returned (0, u'200', u'')
2019-04-02 09:45:15,851 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://druidproduction-m-0-20190401103852.c.adgeek-prod.internal:50070/webhdfs/v1/user/druid?op=GETCONTENTSUMMARY&user.name=hdfs'"'"' 1>/tmp/tmpgiu12J 2>/tmp/tmpOCBVoL''] {'logoutput': None, 'quiet': False}
2019-04-02 09:45:15,928 - call returned (0, '')
2019-04-02 09:45:15,929 - get_user_call_output returned (0, u'{"ContentSummary":{"directoryCount":2,"fileCount":0,"length":0,"quota":-1,"spaceConsumed":0,"spaceQuota":-1,"typeQuota":{}}}200', u'')
2019-04-02 09:45:15,930 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://druidproduction-m-0-20190401103852.c.adgeek-prod.internal:50070/webhdfs/v1/user/druid?op=LISTSTATUS&user.name=hdfs'"'"' 1>/tmp/tmp_WdQy0 2>/tmp/tmpxnKwVf''] {'logoutput': None, 'quiet': False}
2019-04-02 09:45:16,008 - call returned (0, '')
2019-04-02 09:45:16,008 - get_user_call_output returned (0, u'{"FileStatuses":{"FileStatus":[\n{"accessTime":0,"blockSize":0,"childrenNum":0,"fileId":17854,"group":"hadoop","length":0,"modificationTime":1554119995153,"owner":"druid","pathSuffix":"logs","permission":"755","replication":0,"storagePolicy":0,"type":"DIRECTORY"}\n]}}\n200', u'')
2019-04-02 09:45:16,009 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://druidproduction-m-0-20190401103852.c.adgeek-prod.internal:50070/webhdfs/v1/user/druid/logs?op=LISTSTATUS&user.name=hdfs'"'"' 1>/tmp/tmpGUPSo9 2>/tmp/tmp7iPv8g''] {'logoutput': None, 'quiet': False}
2019-04-02 09:45:16,105 - call returned (0, '')
2019-04-02 09:45:16,106 - get_user_call_output returned (0, u'{"FileStatuses":{"FileStatus":[\n\n]}}\n200', u'')
2019-04-02 09:45:16,107 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://druidproduction-m-0-20190401103852.c.adgeek-prod.internal:50070/webhdfs/v1/user/druid/logs?op=SETOWNER&owner=druid&group=hadoop&user.name=hdfs'"'"' 1>/tmp/tmpsA1gyK 2>/tmp/tmpPGupEr''] {'logoutput': None, 'quiet': False}
2019-04-02 09:45:16,184 - call returned (0, '')
2019-04-02 09:45:16,185 - get_user_call_output returned (0, u'200', u'')
2019-04-02 09:45:16,186 - HdfsResource['/apps/druid/warehouse'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.1.0.0-78/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://druidproduction-m-0-20190401103852.c.adgeek-prod.internal:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'druid', 'group': 'hadoop', 'hadoop_conf_dir': '/usr/hdp/3.1.0.0-78/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/app-logs', u'/tmp'], 'mode': 0775}
2019-04-02 09:45:16,187 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://druidproduction-m-0-20190401103852.c.adgeek-prod.internal:50070/webhdfs/v1/apps/druid/warehouse?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmp4QOBth 2>/tmp/tmppxIbnc''] {'logoutput': None, 'quiet': False}
2019-04-02 09:45:16,264 - call returned (0, '')
2019-04-02 09:45:16,265 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":0,"blockSize":0,"childrenNum":0,"fileId":17852,"group":"hadoop","length":0,"modificationTime":1554119994446,"owner":"druid","pathSuffix":"","permission":"775","replication":0,"storagePolicy":0,"type":"DIRECTORY"}}200', u'')
2019-04-02 09:45:16,266 - Created Hadoop Directory [/apps/druid/warehouse], with mode [509]
2019-04-02 09:45:16,266 - HdfsResource['/tmp'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.1.0.0-78/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://druidproduction-m-0-20190401103852.c.adgeek-prod.internal:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hdfs', 'hadoop_conf_dir': '/usr/hdp/3.1.0.0-78/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/app-logs', u'/tmp'], 'mode': 0777}
2019-04-02 09:45:16,267 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://druidproduction-m-0-20190401103852.c.adgeek-prod.internal:50070/webhdfs/v1/tmp?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpc645zO 2>/tmp/tmpvUbAQe''] {'logoutput': None, 'quiet': False}
2019-04-02 09:45:16,347 - call returned (0, '')
2019-04-02 09:45:16,348 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":0,"blockSize":0,"childrenNum":3,"fileId":16386,"group":"hdfs","length":0,"modificationTime":1554119994845,"owner":"hdfs","pathSuffix":"","permission":"777","replication":0,"storagePolicy":0,"type":"DIRECTORY"}}200', u'')
2019-04-02 09:45:16,349 - Skipping the operation for not managed DFS directory /tmp since immutable_paths contains it.
2019-04-02 09:45:16,349 - HdfsResource['/tmp/druid-indexing'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.1.0.0-78/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://druidproduction-m-0-20190401103852.c.adgeek-prod.internal:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'druid', 'group': 'hadoop', 'hadoop_conf_dir': '/usr/hdp/3.1.0.0-78/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/app-logs', u'/tmp'], 'mode': 0775}
2019-04-02 09:45:16,351 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://druidproduction-m-0-20190401103852.c.adgeek-prod.internal:50070/webhdfs/v1/tmp/druid-indexing?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmprPATB8 2>/tmp/tmpeotUzU''] {'logoutput': None, 'quiet': False}
2019-04-02 09:45:16,428 - call returned (0, '')
2019-04-02 09:45:16,429 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":0,"blockSize":0,"childrenNum":0,"fileId":17853,"group":"hadoop","length":0,"modificationTime":1554119994845,"owner":"druid","pathSuffix":"","permission":"775","replication":0,"storagePolicy":0,"type":"DIRECTORY"}}200', u'')
2019-04-02 09:45:16,430 - Created Hadoop Directory [/tmp/druid-indexing], with mode [509]
2019-04-02 09:45:16,430 - HdfsResource['/user/druid/logs'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/3.1.0.0-78/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': 'HDFS', 'default_fs': 'hdfs://druidproduction-m-0-20190401103852.c.adgeek-prod.internal:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'druid', 'group': 'hadoop', 'hadoop_conf_dir': '/usr/hdp/3.1.0.0-78/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/mr-history/done', u'/app-logs', u'/tmp'], 'mode': 0755}
2019-04-02 09:45:16,431 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET -d '"'"''"'"' -H '"'"'Content-Length: 0'"'"' '"'"'http://druidproduction-m-0-20190401103852.c.adgeek-prod.internal:50070/webhdfs/v1/user/druid/logs?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpw5JhUS 2>/tmp/tmpeLbuI9''] {'logoutput': None, 'quiet': False}
2019-04-02 09:45:16,507 - call returned (0, '')
2019-04-02 09:45:16,507 - get_user_call_output returned (0, u'{"FileStatus":{"accessTime":0,"blockSize":0,"childrenNum":0,"fileId":17854,"group":"hadoop","length":0,"modificationTime":1554119995153,"owner":"druid","pathSuffix":"","permission":"755","replication":0,"storagePolicy":0,"type":"DIRECTORY"}}200', u'')
2019-04-02 09:45:16,508 - Created Hadoop Directory [/user/druid/logs], with mode [493]
2019-04-02 09:45:16,509 - Directory['/usr/hdp/current/druid-historical/extensions'] {'group': 'hadoop', 'cd_access': 'a', 'create_parents': True, 'recursive_ownership': True, 'owner': 'druid', 'mode': 0755}
2019-04-02 09:45:16,514 - Directory['/usr/hdp/current/druid-historical/hadoop-dependencies'] {'group': 'hadoop', 'cd_access': 'a', 'create_parents': True, 'mode': 0755, 'owner': 'druid', 'recursive_ownership': True}
2019-04-02 09:45:16,516 - Execute['source /usr/hdp/current/druid-historical/conf/druid-env.sh ; java -classpath '/usr/hdp/current/druid-historical/lib/*' -Ddruid.extensions.loadList=[] -Ddruid.extensions.directory=/usr/hdp/current/druid-historical/extensions -Ddruid.extensions.hadoopDependenciesDir=/usr/hdp/current/druid-historical/hadoop-dependencies io.druid.cli.Main tools pull-deps -c io.druid.extensions.contrib:druid-google-extensions:0.12.1 --no-default-hadoop -r http://repo.hortonworks.com/content/repositories/releases/'] {'user': 'druid'}
2019-04-02 09:45:18,431 - Execute['find /var/log/druid -maxdepth 1 -type f -name '*' -exec echo '==> {} <==' \; -exec tail -n 40 {} \;'] {'logoutput': True, 'ignore_failures': True, 'user': 'druid'}
su: warning: cannot change directory to /var/lib/druid: No such file or directory
==> /var/log/druid/historical.log <==
2019-04-02T09:29:39,783 INFO [Thread-54] io.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void io.druid.server.coordination.SegmentLoadDropHandler.stop()] on object[io.druid.server.coordination.SegmentLoadDropHandler@6f8667bb].
2019-04-02T09:29:39,783 INFO [Thread-54] io.druid.server.coordination.SegmentLoadDropHandler - Stopping...
2019-04-02T09:29:39,783 INFO [Thread-54] io.druid.server.coordination.CuratorDataSegmentServerAnnouncer - Unannouncing self[DruidServerMetadata{name='druidproduction-w-1-20190401103922.c.adgeek-prod.internal:8083', hostAndPort='druidproduction-w-1-20190401103922.c.adgeek-prod.internal:8083', hostAndTlsPort='null', maxSize=300000000000, tier='_default_tier', type=historical, priority=0}] at [/druid/announcements/druidproduction-w-1-20190401103922.c.adgeek-prod.internal:8083]
2019-04-02T09:29:39,783 INFO [Thread-54] io.druid.curator.announcement.Announcer - unannouncing [/druid/announcements/druidproduction-w-1-20190401103922.c.adgeek-prod.internal:8083]
2019-04-02T09:29:39,787 INFO [Thread-54] io.druid.server.coordination.SegmentLoadDropHandler - Stopped.
2019-04-02T09:29:39,787 INFO [Thread-54] io.druid.server.initialization.jetty.JettyServerModule - Stopping Jetty Server...
2019-04-02T09:29:39,804 INFO [Thread-54] io.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void io.druid.storage.hdfs.HdfsStorageAuthentication.stop()] on object[io.druid.storage.hdfs.HdfsStorageAuthentication@4ffced4e].
2019-04-02T09:29:39,804 INFO [Thread-54] io.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void io.druid.java.util.metrics.MonitorScheduler.stop()] on object[io.druid.java.util.metrics.MonitorScheduler@34279b8a].
2019-04-02T09:29:39,805 INFO [Thread-54] io.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void io.druid.java.util.emitter.service.ServiceEmitter.close() throws java.io.IOException] on object[ServiceEmitter{serviceDimensions={service=druid/historical, host=druidproduction-w-1-20190401103922.c.adgeek-prod.internal:8083, version=0.12.1.3.1.0.0-78}, emitter=io.druid.emitter.ambari.metrics.AmbariMetricsEmitter@a7cf42f}].
2019-04-02T09:29:39,819 INFO [Thread-54] io.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void io.druid.initialization.Log4jShutterDownerModule$Log4jShutterDowner.stop()] on object[io.druid.initialization.Log4jShutterDownerModule$Log4jShutterDowner@46f73ffa].
2019-04-02 09:29:39,841 Thread-54 ERROR Unable to register shutdown hook because JVM is shutting down. java.lang.IllegalStateException: Not started
at io.druid.common.config.Log4jShutdown.addShutdownCallback(Log4jShutdown.java:47)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.addShutdownCallback(Log4jContextFactory.java:273)
at org.apache.logging.log4j.core.LoggerContext.setUpShutdownHook(LoggerContext.java:256)
at org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:216)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:145)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:41)
at org.apache.logging.log4j.LogManager.getContext(LogManager.java:182)
at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:103)
at org.apache.logging.slf4j.Log4jLoggerFactory.getContext(Log4jLoggerFactory.java:43)
at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:42)
at org.apache.logging.slf4j.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:29)
at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:253)
at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:265)
at org.apache.curator.RetryLoop.<init>(RetryLoop.java:65)
at org.apache.curator.CuratorZookeeperClient.newRetryLoop(CuratorZookeeperClient.java:151)
at org.apache.curator.connection.StandardConnectionHandlingPolicy.callWithRetry(StandardConnectionHandlingPolicy.java:59)
at org.apache.curator.RetryLoop.callWithRetry(RetryLoop.java:100)
at org.apache.curator.framework.imps.CuratorTransactionImpl.commit(CuratorTransactionImpl.java:123)
at io.druid.curator.announcement.Announcer.unannounce(Announcer.java:403)
at io.druid.curator.announcement.Announcer.stop(Announcer.java:155)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at io.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler.stop(Lifecycle.java:434)
at io.druid.java.util.common.lifecycle.Lifecycle.stop(Lifecycle.java:335)
at io.druid.java.util.common.lifecycle.Lifecycle$1.run(Lifecycle.java:366)
at java.lang.Thread.run(Thread.java:748)
==> /var/log/druid/middleManager.log <==
2019-04-02T09:25:52,095 INFO [main] io.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void io.druid.indexing.worker.WorkerTaskMonitor.start() throws java.lang.Exception] on object[io.druid.indexing.worker.WorkerTaskMonitor@29bcf51d].
2019-04-02T09:25:52,100 INFO [main] io.druid.indexing.overlord.ForkingTaskRunner - Registered listener [WorkerTaskMonitor]
2019-04-02T09:25:52,101 INFO [main] io.druid.indexing.worker.WorkerTaskMonitor - Started WorkerTaskMonitor.
2019-04-02T09:25:52,102 INFO [main] io.druid.server.initialization.jetty.JettyServerModule - Starting Jetty Server...
2019-04-02T09:25:52,685 INFO [main] io.druid.curator.discovery.CuratorDruidNodeAnnouncer - Announcing [DiscoveryDruidNode{druidNode=DruidNode{serviceName='druid/middlemanager', host='druidproduction-w-1-20190401103922.c.adgeek-prod.internal', port=-1, plaintextPort=8091, enablePlaintextPort=true, tlsPort=-1, enableTlsPort=false}, nodeType='middleManager', services={workerNodeService=WorkerNodeService{ip='druidproduction-w-1-20190401103922.c.adgeek-prod.internal', capacity=3, version='0'}}}].
2019-04-02T09:25:52,701 INFO [main] io.druid.curator.discovery.CuratorDruidNodeAnnouncer - Announced [DiscoveryDruidNode{druidNode=DruidNode{serviceName='druid/middlemanager', host='druidproduction-w-1-20190401103922.c.adgeek-prod.internal', port=-1, plaintextPort=8091, enablePlaintextPort=true, tlsPort=-1, enableTlsPort=false}, nodeType='middleManager', services={workerNodeService=WorkerNodeService{ip='druidproduction-w-1-20190401103922.c.adgeek-prod.internal', capacity=3, version='0'}}}].
2019-04-02T09:27:52,128 INFO [AmbariMetricsEmitter-0] io.druid.emitter.ambari.metrics.AmbariMetricsEmitter - Unable to connect to collector, http://d:6188/ws/v1/timeline/metrics
This exceptions will be ignored for next 100 times
2019-04-02T09:27:52,128 ERROR [AmbariMetricsEmitter-0] io.druid.emitter.ambari.metrics.AmbariMetricsEmitter - java.net.UnknownHostException: d
org.apache.hadoop.metrics2.sink.timeline.UnableToConnectException: java.net.UnknownHostException: d
at org.apache.hadoop.metrics2.sink.timeline.AbstractTimelineMetricsSink.emitMetricsJson(AbstractTimelineMetricsSink.java:137) ~[ambari-metrics-common-2.4.1.0.22.jar:?]
at org.apache.hadoop.metrics2.sink.timeline.AbstractTimelineMetricsSink.emitMetrics(AbstractTimelineMetricsSink.java:157) ~[ambari-metrics-common-2.4.1.0.22.jar:?]
at io.druid.emitter.ambari.metrics.AmbariMetricsEmitter.access$600(AmbariMetricsEmitter.java:48) ~[ambari-metrics-emitter-0.12.1.3.1.0.0-78.jar:0.12.1.3.1.0.0-78]
at io.druid.emitter.ambari.metrics.AmbariMetricsEmitter$ConsumerRunnable.run(AmbariMetricsEmitter.java:187) [ambari-metrics-emitter-0.12.1.3.1.0.0-78.jar:0.12.1.3.1.0.0-78]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_191]
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) [?:1.8.0_191]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) [?:1.8.0_191]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) [?:1.8.0_191]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_191]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_191]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_191]
Caused by: java.net.UnknownHostException: d
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:184) ~[?:1.8.0_191]
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392) ~[?:1.8.0_191]
at java.net.Socket.connect(Socket.java:589) ~[?:1.8.0_191]
at sun.net.NetworkClient.doConnect(NetworkClient.java:175) ~[?:1.8.0_191]
at sun.net.www.http.HttpClient.openServer(HttpClient.java:463) ~[?:1.8.0_191]
at sun.net.www.http.HttpClient.openServer(HttpClient.java:558) ~[?:1.8.0_191]
at sun.net.www.http.HttpClient.<init>(HttpClient.java:242) ~[?:1.8.0_191]
at sun.net.www.http.HttpClient.New(HttpClient.java:339) ~[?:1.8.0_191]
at sun.net.www.http.HttpClient.New(HttpClient.java:357) ~[?:1.8.0_191]
at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:1220) ~[?:1.8.0_191]
at sun.net.www.protocol.http.HttpURLConnection.plainConnect0(HttpURLConnection.java:1156) ~[?:1.8.0_191]
at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:1050) ~[?:1.8.0_191]
at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:984) ~[?:1.8.0_191]
at sun.net.www.protocol.http.HttpURLConnection.getOutputStream0(HttpURLConnection.java:1334) ~[?:1.8.0_191]
at sun.net.www.protocol.http.HttpURLConnection.getOutputStream(HttpURLConnection.java:1309) ~[?:1.8.0_191]
at org.apache.hadoop.metrics2.sink.timeline.AbstractTimelineMetricsSink.emitMetricsJson(AbstractTimelineMetricsSink.java:100) ~[?:?]
... 10 more
Command failed after 1 tries
I don't know how to solve this problem!
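For context, the middleManager trace above fails with `java.net.UnknownHostException: d` while posting to `http://d:6188/ws/v1/timeline/metrics`, which suggests the Ambari metrics collector hostname got truncated to just `d`. A quick resolution check (a sketch; `d` below stands in for whatever collector host is actually configured for the ambari-metrics emitter):

```shell
# Sanity-check the metrics collector hostname seen in the failing URL.
# "d" is the truncated host from the UnknownHostException in the log;
# substitute the host actually configured for the ambari-metrics emitter.
host="d"
if getent hosts "$host" >/dev/null 2>&1; then
  echo "$host resolves"
else
  echo "$host does not resolve"
fi
```

If the host does not resolve, the emitter configuration (not the druid-google-extensions pull-deps step itself) is the first thing worth fixing.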