Posted to commits@airflow.apache.org by "Emanuele Palese (JIRA)" <ji...@apache.org> on 2016/12/05 16:20:58 UTC

[jira] [Comment Edited] (AIRFLOW-671) Travis build failure

    [ https://issues.apache.org/jira/browse/AIRFLOW-671?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15722623#comment-15722623 ] 

Emanuele Palese edited comment on AIRFLOW-671 at 12/5/16 4:20 PM:
------------------------------------------------------------------

[~bolke] fixed the issue:
{quote}
I just pushed a fix for this. Travis has changed the way it makes the JDK available. PRs need a force push or an empty commit to trigger a new build.
{quote}

{code}
+#
 sudo: true
 dist: trusty
 language: python
-jdk: oraclejdk7
+jdk:
+  - oraclejdk8
 services:
   - mysql
   - postgresql
@@ -30,6 +31,7 @@ addons:
       - krb5-user
       - krb5-kdc
       - krb5-admin-server
+      - oracle-java8-installer
       - python-selinux
   postgresql: "9.2"
 python:
@@ -91,6 +93,7 @@ before_install:
   - cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
   - ln -s ~/.ssh/authorized_keys ~/.ssh/authorized_keys2
   - chmod 600 ~/.ssh/*
+  - jdk_switcher use oraclejdk8
...
{code}
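
As [~bolke] notes above, Travis only picks up the new .travis.yml on a fresh push, so existing PRs need a force push or an empty commit to retrigger the build. A minimal sketch of the empty-commit route (the branch name is a placeholder):
{code}
# Retrigger Travis CI without changing any files
git commit --allow-empty -m "Retrigger build after .travis.yml JDK fix"
git push origin my-feature-branch
{code}
A force push of an amended commit (git commit --amend followed by git push --force-with-lease) works as well.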


was (Author: epalese):
[~bolke] fixed the issue:
{quote}
I just pushed a fix for this. Travis has changed the way it makes the JDK available. PRs need a force push or an empty commit to trigger a new build.
{quote}

The new .travis.yml includes the following step under the "before_install" section:
{code}
before_install:
...
  - jdk_switcher use oraclejdk8
{code}


> Travis build failure
> --------------------
>
>                 Key: AIRFLOW-671
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-671
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: travis
>            Reporter: Emanuele Palese
>            Priority: Minor
>              Labels: beginner, build
>
> I tried to fork the repository and build it with TravisCI but I got this error:
> {code}
> ======================================================================
> ERROR: test_mysql_to_hive (tests.TransferTests)
> ----------------------------------------------------------------------
> Traceback (most recent call last):
>   File "/home/travis/build/epalese/incubator-airflow/tests/operators/operators.py", line 192, in test_mysql_to_hive
>     t.run(start_date=DEFAULT_DATE, end_date=DEFAULT_DATE, ignore_ti_state=True)
>   File "/home/travis/build/epalese/incubator-airflow/airflow/models.py", line 2302, in run
>     ignore_ti_state=ignore_ti_state)
>   File "/home/travis/build/epalese/incubator-airflow/airflow/utils/db.py", line 54, in wrapper
>     result = func(*args, **kwargs)
>   File "/home/travis/build/epalese/incubator-airflow/airflow/models.py", line 1299, in run
>     result = task_copy.execute(context=context)
>   File "/home/travis/build/epalese/incubator-airflow/airflow/operators/mysql_to_hive.py", line 131, in execute
>     recreate=self.recreate)
>   File "/home/travis/build/epalese/incubator-airflow/airflow/hooks/hive_hooks.py", line 389, in load_file
>     self.run_cli(hql)
>   File "/home/travis/build/epalese/incubator-airflow/airflow/hooks/hive_hooks.py", line 225, in run_cli
>     raise AirflowException(stdout)
> AirflowException: scan complete in 2ms
> Connecting to jdbc:hive2://localhost:10000/default
> HS2 may be unavailable, check server status
> Error: Could not open client transport with JDBC Uri: jdbc:hive2://localhost:10000/default: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)
> beeline> USE default;
> No current connection
> -------------------- >> begin captured logging << --------------------
> root: DEBUG: <TaskInstance: unit_test_dag.test_m2h 2015-01-01 00:00:00 [None]> dependency 'Task Instance State' PASSED: Context specified that state should be ignored.
> root: DEBUG: <TaskInstance: unit_test_dag.test_m2h 2015-01-01 00:00:00 [None]> dependency 'Previous Dagrun State' PASSED: The task did not have depends_on_past set.
> root: DEBUG: <TaskInstance: unit_test_dag.test_m2h 2015-01-01 00:00:00 [None]> dependency 'Not In Retry Period' PASSED: The task instance was not marked for retrying.
> root: DEBUG: <TaskInstance: unit_test_dag.test_m2h 2015-01-01 00:00:00 [None]> dependency 'Trigger Rule' PASSED: The task instance did not have any upstream tasks.
> root: INFO: Dependencies all met for <TaskInstance: unit_test_dag.test_m2h 2015-01-01 00:00:00 [None]>
> root: DEBUG: <TaskInstance: unit_test_dag.test_m2h 2015-01-01 00:00:00 [None]> dependency 'Previous Dagrun State' PASSED: The task did not have depends_on_past set.
> root: DEBUG: <TaskInstance: unit_test_dag.test_m2h 2015-01-01 00:00:00 [None]> dependency 'Trigger Rule' PASSED: The task instance did not have any upstream tasks.
> root: DEBUG: <TaskInstance: unit_test_dag.test_m2h 2015-01-01 00:00:00 [None]> dependency 'Not In Retry Period' PASSED: The task instance was not marked for retrying.
> root: INFO: Dependencies all met for <TaskInstance: unit_test_dag.test_m2h 2015-01-01 00:00:00 [None]>
> root: INFO: 
> --------------------------------------------------------------------------------
> Starting attempt 1 of 1
> --------------------------------------------------------------------------------
> root: INFO: Executing <Task(MySqlToHiveTransfer): test_m2h> on 2015-01-01 00:00:00
> root: INFO: Using connection to: localhost
> root: INFO: Dumping MySQL query results to local file
> root: INFO: Using connection to: localhost
> root: INFO: Loading file into Hive
> root: INFO: DROP TABLE IF EXISTS test_mysql_to_hive;
> CREATE TABLE IF NOT EXISTS test_mysql_to_hive (
> org_year INT,
>     baby_name STRING,
>     rate DOUBLE,
>     sex STRING)
> ROW FORMAT DELIMITED
> FIELDS TERMINATED BY ','
> STORED AS textfile;
> root: INFO: beeline -u jdbc:hive2://localhost:10000/default -f /tmp/airflow_hiveop_VLclaH/tmpfskBTk
> root: INFO: scan complete in 2ms
> root: INFO: Connecting to jdbc:hive2://localhost:10000/default
> root: INFO: HS2 may be unavailable, check server status
> root: INFO: Error: Could not open client transport with JDBC Uri: jdbc:hive2://localhost:10000/default: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)
> root: INFO: beeline> USE default;
> root: INFO: No current connection
> root: INFO: 
> root: ERROR: scan complete in 2ms
> Connecting to jdbc:hive2://localhost:10000/default
> HS2 may be unavailable, check server status
> Error: Could not open client transport with JDBC Uri: jdbc:hive2://localhost:10000/default: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)
> beeline> USE default;
> No current connection
> Traceback (most recent call last):
>   File "/home/travis/build/epalese/incubator-airflow/airflow/models.py", line 1299, in run
>     result = task_copy.execute(context=context)
>   File "/home/travis/build/epalese/incubator-airflow/airflow/operators/mysql_to_hive.py", line 131, in execute
>     recreate=self.recreate)
>   File "/home/travis/build/epalese/incubator-airflow/airflow/hooks/hive_hooks.py", line 389, in load_file
>     self.run_cli(hql)
>   File "/home/travis/build/epalese/incubator-airflow/airflow/hooks/hive_hooks.py", line 225, in run_cli
>     raise AirflowException(stdout)
> AirflowException: scan complete in 2ms
> Connecting to jdbc:hive2://localhost:10000/default
> HS2 may be unavailable, check server status
> Error: Could not open client transport with JDBC Uri: jdbc:hive2://localhost:10000/default: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)
> beeline> USE default;
> No current connection
> root: INFO: Marking task as FAILED.
> root: ERROR: scan complete in 2ms
> Connecting to jdbc:hive2://localhost:10000/default
> HS2 may be unavailable, check server status
> Error: Could not open client transport with JDBC Uri: jdbc:hive2://localhost:10000/default: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)
> beeline> USE default;
> No current connection
> --------------------- >> end captured logging << ---------------------
> {code}
> Full log: https://api.travis-ci.org/jobs/181036808/log.txt?deansi=true 


