Posted to issues@ambari.apache.org by "Andrew Onischuk (JIRA)" <ji...@apache.org> on 2016/07/06 11:37:11 UTC

[jira] [Created] (AMBARI-17582) HIVE_SERVER_INTERACTIVE STOP failed with error "Python script has been killed due to timeout after waiting 900 secs"

Andrew Onischuk created AMBARI-17582:
----------------------------------------

             Summary: HIVE_SERVER_INTERACTIVE STOP failed with error "Python script has been killed due to timeout after waiting 900 secs"
                 Key: AMBARI-17582
                 URL: https://issues.apache.org/jira/browse/AMBARI-17582
             Project: Ambari
          Issue Type: Bug
            Reporter: Andrew Onischuk
            Assignee: Andrew Onischuk
             Fix For: 2.4.0
         Attachments: AMBARI-17582.patch

HIVE_SERVER_INTERACTIVE STOP failed with error "Python script has been killed
due to timeout after waiting 900 secs"

    
    
    
    {
      "href" : "http://172.22.117.57:8080/api/v1/clusters/cl1/requests/8/tasks/198",
      "Tasks" : {
        "attempt_cnt" : 1,
        "cluster_name" : "cl1",
        "command" : "STOP",
        "command_detail" : "HIVE_SERVER_INTERACTIVE STOP",
        "end_time" : 1467691652833,
        "error_log" : "/var/lib/ambari-agent/data/errors-198.txt",
        "exit_code" : 999,
        "host_name" : "nat-u14-dvys-ambari-logsearch-1-3.openstacklocal",
        "id" : 198,
        "output_log" : "/var/lib/ambari-agent/data/output-198.txt",
        "request_id" : 8,
        "role" : "HIVE_SERVER_INTERACTIVE",
        "stage_id" : 0,
        "start_time" : 1467690695556,
        "status" : "FAILED",
        "stderr" : "Python script has been killed due to timeout after waiting 900 secs",
        "stdout" : "2016-07-05 03:52:27,679 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.5.0.0-874\n2016-07-05 03:52:27,683 - Checking if need to create versioned conf dir /etc/hadoop/2.5.0.0-874/0\n2016-07-05 03:52:27,686 - call[('ambari-python-wrap', u'/usr/bin/conf-select', 'create-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.0.0-874', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}\n2016-07-05 03:52:27,726 - call returned (1, '/etc/hadoop/2.5.0.0-874/0 exist already', '')\n2016-07-05 03:52:27,727 - checked_call[('ambari-python-wrap', u'/usr/bin/conf-select', 'set-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.0.0-874', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False}\n2016-07-05 03:52:27,788 - checked_call returned (0, '')\n2016-07-05 03:52:27,789 - Ensuring that hadoop has the correct symlink structure\n2016-07-05 03:52:27,789 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf\n2016-07-05 03:52:27,810 - call['ambari-python-wrap /usr/bin/hdp-select status hive-server2'] {'timeout': 20}\n2016-07-05 03:52:27,853 - call returned (0, 'hive-server2 - 2.5.0.0-874')\n2016-07-05 03:52:27,880 - call['ambari-sudo.sh su hive -l -s /bin/bash -c 'cat /var/run/hive/hive-interactive.pid 1>/tmp/tmpnftynk 2>/tmp/tmpRKnLIa''] {'quiet': False}\n2016-07-05 03:52:27,911 - call returned (0, '######## Hortonworks #############\\nThis is MOTD message, added for testing in qe infra')\n2016-07-05 03:52:27,912 - Execute['ambari-sudo.sh kill 21297'] {'not_if': '! (ls /var/run/hive/hive-interactive.pid >/dev/null 2>&1 && ps -p 21297 >/dev/null 2>&1)'}\n2016-07-05 03:52:27,936 - Execute['ambari-sudo.sh kill -9 21297'] {'not_if': '! (ls /var/run/hive/hive-interactive.pid >/dev/null 2>&1 && ps -p 21297 >/dev/null 2>&1) || ( sleep 5 && ! 
(ls /var/run/hive/hive-interactive.pid >/dev/null 2>&1 && ps -p 21297 >/dev/null 2>&1) )'}\n2016-07-05 03:52:32,975 - Execute['! (ls /var/run/hive/hive-interactive.pid >/dev/null 2>&1 && ps -p 21297 >/dev/null 2>&1)'] {'tries': 20, 'try_sleep': 3}\n2016-07-05 03:52:33,036 - Retrying after 3 seconds. Reason: Execution of '! (ls /var/run/hive/hive-interactive.pid >/dev/null 2>&1 && ps -p 21297 >/dev/null 2>&1)' returned 1. \n2016-07-05 03:52:36,062 - File['/var/run/hive/hive-interactive.pid'] {'action': ['delete']}\n2016-07-05 03:52:36,063 - Deleting File['/var/run/hive/hive-interactive.pid']\n2016-07-05 03:52:36,063 - Stopping LLAP\n2016-07-05 03:52:36,063 - Command: ['slider', 'stop', 'llap0']\n2016-07-05 03:52:36,063 - call[['slider', 'stop', 'llap0']] {'logoutput': True, 'user': 'hive', 'stderr': -1}\n######## Hortonworks #############\nThis is MOTD message, added for testing in qe infra\n2016-07-05 03:52:41,508 [main] INFO  impl.TimelineClientImpl - Timeline service address: http://nat-u14-dvys-ambari-logsearch-1-4.openstacklocal:8188/ws/v1/timeline/\n2016-07-05 03:52:42,856 [main] WARN  shortcircuit.DomainSocketFactory - The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.\n2016-07-05 03:52:42,873 [main] INFO  client.RMProxy - Connecting to ResourceManager at nat-u14-dvys-ambari-logsearch-1-4.openstacklocal/172.22.117.203:8050\n2016-07-05 03:52:43,829 [main] INFO  util.ExitUtil - Exiting with status 0\n2016-07-05 03:52:44,225 - call returned (0, '######## Hortonworks #############\\nThis is MOTD message, added for testing in qe infra\\n2016-07-05 03:52:41,508 [main] INFO  impl.TimelineClientImpl - Timeline service address: http://nat-u14-dvys-ambari-logsearch-1-4.openstacklocal:8188/ws/v1/timeline/\\n2016-07-05 03:52:42,856 [main] WARN  shortcircuit.DomainSocketFactory - The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.\\n2016-07-05 03:52:42,873 [main] INFO  client.RMProxy - Connecting 
to ResourceManager at nat-u14-dvys-ambari-logsearch-1-4.openstacklocal/172.22.117.203:8050\\n2016-07-05 03:52:43,829 [main] INFO  util.ExitUtil - Exiting with status 0', '')\n2016-07-05 03:52:44,225 - Stopped llap0 application on Slider successfully\n2016-07-05 03:52:44,225 - call[['slider', 'destroy', 'llap0', '--force']] {'user': 'hive', 'stderr': -1}\n\nCommand failed after 1 tries\n",
        "structured_out" : { }
      }
    }
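The process-stop phase in the stdout above (`kill 21297`, then `kill -9` behind a guard, then a check run with `{'tries': 20, 'try_sleep': 3}`) amounts to polling until the pid disappears. A minimal Python sketch of that polling step, with hypothetical helper names rather than Ambari's actual `Execute` resource:

```python
import errno
import os
import time

def wait_for_exit(pid, tries=20, try_sleep=3):
    """Poll until the process with the given pid is gone, mirroring the
    agent's retry loop seen in the log ('tries': 20, 'try_sleep': 3).
    Illustrative sketch only; not Ambari's implementation."""
    for _ in range(tries):
        try:
            os.kill(pid, 0)  # signal 0 checks existence without killing
        except OSError as e:
            if e.errno == errno.ESRCH:
                return True  # process has exited
        time.sleep(try_sleep)
    return False  # still running after all tries, as in the failed check
```

In the log this loop succeeds (the pid file is deleted at 03:52:36), so the timeout comes from the later Slider call, not from killing HiveServer2 Interactive.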
    
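The stdout trace ends inside `call[['slider', 'destroy', 'llap0', '--force']]`, which never returns, so the agent's 900-second watchdog kills the whole script. One way to avoid tripping the global watchdog is to bound the external call with its own timeout; a hedged Python 3 sketch (hypothetical helper, not the actual AMBARI-17582 patch):

```python
import subprocess

def run_with_timeout(cmd, timeout_secs):
    """Run an external command but give up after timeout_secs, so a hung
    child (like the 'slider destroy llap0 --force' call in this report)
    cannot exhaust the agent's overall script budget.
    Returns the exit code, or None if the command timed out.
    Illustrative sketch only."""
    try:
        proc = subprocess.run(cmd, capture_output=True, timeout=timeout_secs)
        return proc.returncode
    except subprocess.TimeoutExpired:
        # Child was killed after the deadline; surface the failure
        # instead of blocking until the watchdog fires.
        return None
```

With such a guard, `run_with_timeout(['slider', 'destroy', 'llap0', '--force'], 600)` would fail fast and let the STOP command report a meaningful error.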

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)