Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2018/09/04 17:58:34 UTC

[GitHub] r39132 commented on a change in pull request #3834: [AIRFLOW-2965] CLI tool to show the next execution datetime

URL: https://github.com/apache/incubator-airflow/pull/3834#discussion_r215010936
 
 

 ##########
 File path: airflow/bin/cli.py
 ##########
 @@ -551,6 +551,17 @@ def dag_state(args):
     print(dr[0].state if len(dr) > 0 else None)
 
 
+@cli_utils.action_logging
 
 Review comment:
  @XD-DENG I just tested this with some of the example DAGs in https://github.com/apache/incubator-airflow/tree/master/airflow/example_dags. Can you test your code with other schedule types as well, including `@once`, `@daily`/`@weekly`, and `timedelta(hours=1)`, in addition to the cron-expression case you provided? Also, can you add tests for these? A rough sketch of the guard I have in mind follows the traceback below.
   
   ```
   (venv) sianand@LM-SJN-21002367:~/Projects/airflow_incubator $ airflow next_execution latest_only
   [2018-09-04 10:52:19,613] {__init__.py:51} INFO - Using executor SequentialExecutor
   /Users/sianand/miniconda3/lib/python3.6/site-packages/apache_airflow-2.0.0.dev0+incubating-py3.6.egg/airflow/bin/cli.py:1724: DeprecationWarning: The celeryd_concurrency option in [celery] has been renamed to worker_concurrency - the old setting has been used, but please update your config.
     default=conf.get('celery', 'worker_concurrency')),
   [2018-09-04 10:52:19,822] {models.py:260} INFO - Filling up the DagBag from /Users/sianand/Projects/airflow_incubator/dags
   [2018-09-04 10:52:19,882] {example_kubernetes_operator.py:55} WARNING - Could not import KubernetesPodOperator: No module named 'kubernetes'
   [2018-09-04 10:52:19,882] {example_kubernetes_operator.py:56} WARNING - Install kubernetes dependencies with:     pip install apache-airflow[kubernetes]
   Traceback (most recent call last):
     File "/Users/sianand/miniconda3/bin/airflow", line 4, in <module>
       __import__('pkg_resources').run_script('apache-airflow==2.0.0.dev0+incubating', 'airflow')
     File "/Users/sianand/miniconda3/lib/python3.6/site-packages/pkg_resources/__init__.py", line 654, in run_script
       self.require(requires)[0].run_script(script_name, ns)
     File "/Users/sianand/miniconda3/lib/python3.6/site-packages/pkg_resources/__init__.py", line 1434, in run_script
       exec(code, namespace, namespace)
     File "/Users/sianand/miniconda3/lib/python3.6/site-packages/apache_airflow-2.0.0.dev0+incubating-py3.6.egg/EGG-INFO/scripts/airflow", line 32, in <module>
       args.func(args)
     File "/Users/sianand/miniconda3/lib/python3.6/site-packages/apache_airflow-2.0.0.dev0+incubating-py3.6.egg/airflow/utils/cli.py", line 74, in wrapper
       return f(*args, **kwargs)
     File "/Users/sianand/miniconda3/lib/python3.6/site-packages/apache_airflow-2.0.0.dev0+incubating-py3.6.egg/airflow/bin/cli.py", line 562, in next_execution
       print(dag.following_schedule(dag.latest_execution_date))
     File "/Users/sianand/miniconda3/lib/python3.6/site-packages/apache_airflow-2.0.0.dev0+incubating-py3.6.egg/airflow/models.py", line 3371, in following_schedule
       return dttm + self._schedule_interval
   TypeError: unsupported operand type(s) for +: 'NoneType' and 'datetime.timedelta'
   
   
   ```
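   For reference, the kind of None-guard I have in mind is sketched below. This is only a rough illustration, not the code in this PR; it assumes the existing `get_dag()` helper and `cli_utils` import in `airflow/bin/cli.py`, plus the `DAG.latest_execution_date` and `DAG.following_schedule()` APIs:
   
   ```
   # Sketch for airflow/bin/cli.py -- get_dag() and cli_utils are already
   # available in that module, so no extra imports are needed here.
   @cli_utils.action_logging
   def next_execution(args):
       """Print the next execution datetime of the given DAG, or None."""
       dag = get_dag(args)
   
       latest = dag.latest_execution_date
       if latest is None:
           # The DAG has never run, so there is no execution date to step
           # forward from; dag.following_schedule(None) would raise the
           # TypeError shown in the traceback above.
           print(None)
       else:
           # following_schedule() handles cron expressions, presets such as
           # @daily/@weekly, and timedelta schedule_intervals; for @once it
           # returns None, which prints as "None".
           print(dag.following_schedule(latest))
   ```
   
   With something like this, `airflow next_execution latest_only` would print `None` instead of crashing when the DAG has no runs yet, and tests could call the handler once per schedule type (`@once`, a preset, a cron string, a `timedelta`) to cover each branch.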

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services