Posted to dev@amaterasu.apache.org by GitBox <gi...@apache.org> on 2018/09/15 08:48:07 UTC

[GitHub] nadav-har-tzvi closed pull request #17: Amaterasu CLI for V0.2.1

URL: https://github.com/apache/incubator-amaterasu/pull/17
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:


diff --git a/README.md b/README.md
index bbc54e1..52a1d99 100755
--- a/README.md
+++ b/README.md
@@ -26,36 +26,131 @@
                                                         
 
 Apache Amaterasu is an open-source deployment tool for data pipelines. Amaterasu allows developers to write and easily deploy data pipelines, and clusters manage their configuration and dependencies.
+ 
 
-## Download
+### Supported cluster managers
 
-For this preview version, we have packaged amaterasu nicely for you to just [download](https://s3-ap-southeast-2.amazonaws.com/amaterasu/amaterasu.tgz) and extract.
-Once you do that, you are just a couple of easy steps away from running your first job.
+Currently Apache Amaterasu supports the following cluster managers:
+1. Apache Mesos
+    > Important! DC/OS is not supported at the moment; only standalone Mesos deployments are supported.
+2. Apache Hadoop YARN
+    > Due to a bug, we do not support Microsoft Azure HDInsight at the moment.
+
+
+### Requirements
+
+Here is the list of requirements you will need to have installed on your cluster's master node (a quick way to verify them is shown after the list).
+
+1. Java 1.8 (Oracle or OpenJDK, either is OK)
+2. Python 3.3+
+3. [pip](https://pip.pypa.io/en/stable/installing/)
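+
+A minimal sketch for checking these prerequisites from a shell (assuming `java`, `python3` and `pip` are already on your `PATH`):
+
+```bash
+java -version      # should report version 1.8.x
+python3 --version  # should report Python 3.3 or newer
+pip --version      # confirms pip is installed
+```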
+
+### Installation
+
+First, on your cluster's master node, download the latest Apache Amaterasu [distributable](https://github.com/apache/incubator-amaterasu/releases/latest).
+
+Next, unpack the tarball:
+
+```bash
+tar -xzf apache-amaterasu-<version>.tar amaterasu
+```
+
+Next, install the CLI:
+
+```
+cd amaterasu
+sudo pip install ./cli
+```
+
+After installation you will need to set up Amaterasu. You can do this using the `ama setup` command.
+
+The setup phase is dictated by the cluster manager you use.
+For example, to set up Amaterasu for a Mesos cluster, run:
+```bash
+ama setup mesos
+```
+
+During this step you will be asked to configure a few parameters. The setup results in the following:
+1. The Amaterasu distributable will be downloaded and deployed.
+2. An `amaterasu.conf` file will be created, by default in `/etc/amaterasu`. This is the configuration file that feeds the distributable (a sketch of it follows this list).
+3. Dependencies will be downloaded (Miniconda and Spark for a Mesos cluster; only Miniconda for a Hadoop cluster).
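+
+As a rough sketch, the generated `amaterasu.conf` mirrors the `ama-base.conf` template that ships with the CLI (included later in this diff), together with the cluster-manager-specific entries from `ama-mesos.conf` or `ama-yarn.conf`; the actual values depend on your answers during setup:
+
+```
+zk=localhost
+version=0.2.1-incubating
+user=hadoop
+amaterasu.home=/path/to/amaterasu
+cluster.manager=mesos
+```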
 
 ## Creating a dev/test Mesos cluster
 
 We have also created a Mesos cluster you can use to test Amaterasu or use for development purposes.
 For more details, visit the [amaterasu-vagrant](https://github.com/shintoio/amaterasu-vagrant) repo
 
-## Configuration
+## Creating a job repository
+
+Amaterasu has a very specific definition of how an Amaterasu-compliant job repository should look. For this, we supply the `ama init` command.
+
+Here are the steps to properly create an Apache Amaterasu job repository (shown as a combined snippet after the list):
+1. Start by creating a new directory, for example "ExampleJobRepo"
+2. `cd ExampleJobRepo`
+3. Run `ama init`
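+
+A minimal sketch of these steps as shell commands (using the example directory name from above):
+
+```bash
+mkdir ExampleJobRepo
+cd ExampleJobRepo
+ama init
+```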
 
-Configuring amaterasu is very simple. Before running amaterasu, open the `amaterasu.properties` file in the top-level amaterasu directory, and verify the following properties:
+This will create a new repository inside the ExampleJobRepo directory. The new repository will include the following structure:
 
-| property   | Description                | Default value  |
-| ---------- | -------------------------- | -------------- |
-| zk         | The ZooKeeper connection<br> string to be used by<br> amaterasu | 192.168.33.11  |
-| master     | The clusters' Mesos master | 192.168.33.11  |
-| user       | The user that will be used<br> to run amaterasu | root           |
+1. **maki.yml** - This is the job instructions file; more on this below (a short sketch follows this list).
+2. **src** - A directory for source files that are used as part of the job. Currently we support Spark with Scala, Python, R and SQL bindings.
+3. **env** - A directory for environment-specific configuration files. You may want to have different configurations for development and production environments.
+4. **deps** - A directory for files that list the software dependencies the job requires, e.g. numpy, pandas.
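+
+As a sketch, a minimal `maki.yml` (its structure matches the validation rules in the CLI's `MakiMixin`, shown later in this diff; the job and file names here are hypothetical) could look like:
+
+```yaml
+job-name: amaterasu-sample
+flow:
+    - name: start
+      runner:
+          group: spark
+          type: scala
+      file: file.scala
+      exports:        # optional
+          odd: parquet
+```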
+
+After you've created ExampleJobRepo and filled it with your source code and environment configurations, push it to a remote Git host that your cluster has access to.
+That's it, you are set to run your first job.
 
 ## Running a Job
 
-To run an amaterasu job, run the following command in the top-level amaterasu directory:
+To run a job using Amaterasu, we have prepared a nifty little CLI command:
 
+```bash
+ama run
 ```
-ama-start.sh --repo="https://github.com/shintoio/amaterasu-job-sample.git" --branch="master" --env="test" --report="code" 
+
+The `ama run` command receives a mandatory job repository URL.
+
+```bash
+ama run <repository_url>
+
+# e.g.
+
+ama run https://github.com/shintoio/amaterasu-job-sample.git
+```
+
+Unless specified otherwise, we use the master branch. If you want to change this, you can add the `-b` flag.
+
+```bash
+ama run <repository_url> -b <your_branch>
+
+# e.g.
+
+ama run https://github.com/shintoio/amaterasu-job-sample.git -b python-support
+```
+
+
+#### Execution Environments
+
+So you are running Apache Amaterasu, that's great! But maybe you'd want to specify some job- or environment-related arguments for Apache Amaterasu to pass on to the underlying cluster manager.
+For example, for production you may want to set the Spark executor memory on HDP to 1g; you can do that by adding it to an `env/hdp-prod/spark.yml` file.
+During the testing phase, you might want the Spark driver to use only 2 cores; this is why you'd want a separate environment for testing.
+
+So let's assume that you ended up creating two environments: `hdp-prod` and `hdp-test`.
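+
+Assuming those two environments, the `env` directory of your job repository would look roughly like this (the per-environment files follow the repository structure described above; a `default` environment is still required):
+
+```
+env/
+|__ default/
+|   |__ job.yml
+|   |__ spark.yml
+|
+|__ hdp-prod/
+|   |__ job.yml
+|   |__ spark.yml
+|
+|__ hdp-test/
+    |__ job.yml
+    |__ spark.yml
+```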
+
+To use the `hdp-prod` environment, simply run with the `-e` flag like this:
+
 ```
+ama run <repository_url> -e <environment>
+
+# e.g. 
+ama run https://github.com/shintoio/amaterasu-job-sample.git -e hdp-prod
+```
+
+> Note - If you don't specify any environment, Amaterasu will use the "default" environment.
+
+For more CLI options, use the built-in help (`ama -h`).
+
+It is highly recommended that you take a peek at our [sample job repository](https://github.com/shintoio/amaterasu-job-sample.git) before using Amaterasu.
 
-We recommend you either fork or clone the job sample repo and use that as a starting point for creating your first job.
 
 # Apache Amaterasu Developers Information 
 
diff --git a/build.gradle b/build.gradle
index 0f11347..3ec6bac 100644
--- a/build.gradle
+++ b/build.gradle
@@ -28,6 +28,7 @@ allprojects {
 project(':leader')
 project(':common')
 project(':executor')
+project(':cli')
 
 task copyLeagalFiles(type: Copy) {
     from "./DISCLAIMER", "./LICENSE", "./NOTICE"
diff --git a/cli/build.gradle b/cli/build.gradle
new file mode 100644
index 0000000..24af84d
--- /dev/null
+++ b/cli/build.gradle
@@ -0,0 +1,21 @@
+/*
+Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements.  See the NOTICE file distributed with
+this work for additional information regarding copyright ownership.
+The ASF licenses this file to You under the Apache License, Version 2.0
+(the "License"); you may not use this file except in compliance with
+the License.  You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+*/
+
+task copyToHome(type: Copy) {
+    from 'src'
+    into '../build/amaterasu/cli'
+}
diff --git a/cli/src/LICENSE b/cli/src/LICENSE
new file mode 100644
index 0000000..13f2054
--- /dev/null
+++ b/cli/src/LICENSE
@@ -0,0 +1,225 @@
+                                 Apache License
+                           Version 2.0, January 2004
+                        http://www.apache.org/licenses/
+
+   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
+
+   1. Definitions.
+
+      "License" shall mean the terms and conditions for use, reproduction,
+      and distribution as defined by Sections 1 through 9 of this document.
+
+      "Licensor" shall mean the copyright owner or entity authorized by
+      the copyright owner that is granting the License.
+
+      "Legal Entity" shall mean the union of the acting entity and all
+      other entities that control, are controlled by, or are under common
+      control with that entity. For the purposes of this definition,
+      "control" means (i) the power, direct or indirect, to cause the
+      direction or management of such entity, whether by contract or
+      otherwise, or (ii) ownership of fifty percent (50%) or more of the
+      outstanding shares, or (iii) beneficial ownership of such entity.
+
+      "You" (or "Your") shall mean an individual or Legal Entity
+      exercising permissions granted by this License.
+
+      "Source" form shall mean the preferred form for making modifications,
+      including but not limited to software source code, documentation
+      source, and configuration files.
+
+      "Object" form shall mean any form resulting from mechanical
+      transformation or translation of a Source form, including but
+      not limited to compiled object code, generated documentation,
+      and conversions to other media types.
+
+      "Work" shall mean the work of authorship, whether in Source or
+      Object form, made available under the License, as indicated by a
+      copyright notice that is included in or attached to the work
+      (an example is provided in the Appendix below).
+
+      "Derivative Works" shall mean any work, whether in Source or Object
+      form, that is based on (or derived from) the Work and for which the
+      editorial revisions, annotations, elaborations, or other modifications
+      represent, as a whole, an original work of authorship. For the purposes
+      of this License, Derivative Works shall not include works that remain
+      separable from, or merely link (or bind by name) to the interfaces of,
+      the Work and Derivative Works thereof.
+
+      "Contribution" shall mean any work of authorship, including
+      the original version of the Work and any modifications or additions
+      to that Work or Derivative Works thereof, that is intentionally
+      submitted to Licensor for inclusion in the Work by the copyright owner
+      or by an individual or Legal Entity authorized to submit on behalf of
+      the copyright owner. For the purposes of this definition, "submitted"
+      means any form of electronic, verbal, or written communication sent
+      to the Licensor or its representatives, including but not limited to
+      communication on electronic mailing lists, source code control systems,
+      and issue tracking systems that are managed by, or on behalf of, the
+      Licensor for the purpose of discussing and improving the Work, but
+      excluding communication that is conspicuously marked or otherwise
+      designated in writing by the copyright owner as "Not a Contribution."
+
+      "Contributor" shall mean Licensor and any individual or Legal Entity
+      on behalf of whom a Contribution has been received by Licensor and
+      subsequently incorporated within the Work.
+
+   2. Grant of Copyright License. Subject to the terms and conditions of
+      this License, each Contributor hereby grants to You a perpetual,
+      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+      copyright license to reproduce, prepare Derivative Works of,
+      publicly display, publicly perform, sublicense, and distribute the
+      Work and such Derivative Works in Source or Object form.
+
+   3. Grant of Patent License. Subject to the terms and conditions of
+      this License, each Contributor hereby grants to You a perpetual,
+      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+      (except as stated in this section) patent license to make, have made,
+      use, offer to sell, sell, import, and otherwise transfer the Work,
+      where such license applies only to those patent claims licensable
+      by such Contributor that are necessarily infringed by their
+      Contribution(s) alone or by combination of their Contribution(s)
+      with the Work to which such Contribution(s) was submitted. If You
+      institute patent litigation against any entity (including a
+      cross-claim or counterclaim in a lawsuit) alleging that the Work
+      or a Contribution incorporated within the Work constitutes direct
+      or contributory patent infringement, then any patent licenses
+      granted to You under this License for that Work shall terminate
+      as of the date such litigation is filed.
+
+   4. Redistribution. You may reproduce and distribute copies of the
+      Work or Derivative Works thereof in any medium, with or without
+      modifications, and in Source or Object form, provided that You
+      meet the following conditions:
+
+      (a) You must give any other recipients of the Work or
+          Derivative Works a copy of this License; and
+
+      (b) You must cause any modified files to carry prominent notices
+          stating that You changed the files; and
+
+      (c) You must retain, in the Source form of any Derivative Works
+          that You distribute, all copyright, patent, trademark, and
+          attribution notices from the Source form of the Work,
+          excluding those notices that do not pertain to any part of
+          the Derivative Works; and
+
+      (d) If the Work includes a "NOTICE" text file as part of its
+          distribution, then any Derivative Works that You distribute must
+          include a readable copy of the attribution notices contained
+          within such NOTICE file, excluding those notices that do not
+          pertain to any part of the Derivative Works, in at least one
+          of the following places: within a NOTICE text file distributed
+          as part of the Derivative Works; within the Source form or
+          documentation, if provided along with the Derivative Works; or,
+          within a display generated by the Derivative Works, if and
+          wherever such third-party notices normally appear. The contents
+          of the NOTICE file are for informational purposes only and
+          do not modify the License. You may add Your own attribution
+          notices within Derivative Works that You distribute, alongside
+          or as an addendum to the NOTICE text from the Work, provided
+          that such additional attribution notices cannot be construed
+          as modifying the License.
+
+      You may add Your own copyright statement to Your modifications and
+      may provide additional or different license terms and conditions
+      for use, reproduction, or distribution of Your modifications, or
+      for any such Derivative Works as a whole, provided Your use,
+      reproduction, and distribution of the Work otherwise complies with
+      the conditions stated in this License.
+
+   5. Submission of Contributions. Unless You explicitly state otherwise,
+      any Contribution intentionally submitted for inclusion in the Work
+      by You to the Licensor shall be under the terms and conditions of
+      this License, without any additional terms or conditions.
+      Notwithstanding the above, nothing herein shall supersede or modify
+      the terms of any separate license agreement you may have executed
+      with Licensor regarding such Contributions.
+
+   6. Trademarks. This License does not grant permission to use the trade
+      names, trademarks, service marks, or product names of the Licensor,
+      except as required for reasonable and customary use in describing the
+      origin of the Work and reproducing the content of the NOTICE file.
+
+   7. Disclaimer of Warranty. Unless required by applicable law or
+      agreed to in writing, Licensor provides the Work (and each
+      Contributor provides its Contributions) on an "AS IS" BASIS,
+      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
+      implied, including, without limitation, any warranties or conditions
+      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
+      PARTICULAR PURPOSE. You are solely responsible for determining the
+      appropriateness of using or redistributing the Work and assume any
+      risks associated with Your exercise of permissions under this License.
+
+   8. Limitation of Liability. In no event and under no legal theory,
+      whether in tort (including negligence), contract, or otherwise,
+      unless required by applicable law (such as deliberate and grossly
+      negligent acts) or agreed to in writing, shall any Contributor be
+      liable to You for damages, including any direct, indirect, special,
+      incidental, or consequential damages of any character arising as a
+      result of this License or out of the use or inability to use the
+      Work (including but not limited to damages for loss of goodwill,
+      work stoppage, computer failure or malfunction, or any and all
+      other commercial damages or losses), even if such Contributor
+      has been advised of the possibility of such damages.
+
+   9. Accepting Warranty or Additional Liability. While redistributing
+      the Work or Derivative Works thereof, You may choose to offer,
+      and charge a fee for, acceptance of support, warranty, indemnity,
+      or other liability obligations and/or rights consistent with this
+      License. However, in accepting such obligations, You may act only
+      on Your own behalf and on Your sole responsibility, not on behalf
+      of any other Contributor, and only if You agree to indemnify,
+      defend, and hold each Contributor harmless for any liability
+      incurred by, or claims asserted against, such Contributor by reason
+      of your accepting any such warranty or additional liability.
+
+   END OF TERMS AND CONDITIONS
+
+   APPENDIX: How to apply the Apache License to your work.
+
+      To apply the Apache License to your work, attach the following
+      boilerplate notice, with the fields enclosed by brackets "[]"
+      replaced with your own identifying information. (Don't include
+      the brackets!)  The text should be enclosed in the appropriate
+      comment syntax for the file format. We also recommend that a
+      file or class name and description of purpose be included on the
+      same "printed page" as the copyright notice for easier
+      identification within third-party archives.
+
+   Copyright [yyyy] [name of copyright owner]
+
+   Licensed under the Apache License, Version 2.0 (the "License");
+   you may not use this file except in compliance with the License.
+   You may obtain a copy of the License at
+
+       http://www.apache.org/licenses/LICENSE-2.0
+
+   Unless required by applicable law or agreed to in writing, software
+   distributed under the License is distributed on an "AS IS" BASIS,
+   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+   See the License for the specific language governing permissions and
+   limitations under the License.
+
+For persistence-elasticsearch/plugins/security/src/main/java/org/apache/unomi/elasticsearch/plugin/security/IPRangeMatcher.java :
+
+The MIT License
+
+Copyright (c) 2013 Edin Dazdarevic (edin.dazdarevic@gmail.com)
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in
+all copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
+THE SOFTWARE.
\ No newline at end of file
diff --git a/cli/src/MANIFEST.in b/cli/src/MANIFEST.in
new file mode 100644
index 0000000..5216b7f
--- /dev/null
+++ b/cli/src/MANIFEST.in
@@ -0,0 +1 @@
+graft amaterasu/cli/resources
\ No newline at end of file
diff --git a/cli/src/README.md b/cli/src/README.md
new file mode 100644
index 0000000..e69de29
diff --git a/cli/src/__init__.py b/cli/src/__init__.py
new file mode 100644
index 0000000..fa6c0be
--- /dev/null
+++ b/cli/src/__init__.py
@@ -0,0 +1,16 @@
+"""
+Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements.  See the NOTICE file distributed with
+this work for additional information regarding copyright ownership.
+The ASF licenses this file to You under the Apache License, Version 2.0
+(the "License"); you may not use this file except in compliance with
+the License.  You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+"""
\ No newline at end of file
diff --git a/cli/src/amaterasu/__init__.py b/cli/src/amaterasu/__init__.py
new file mode 100644
index 0000000..e5f3f27
--- /dev/null
+++ b/cli/src/amaterasu/__init__.py
@@ -0,0 +1,19 @@
+"""
+Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements.  See the NOTICE file distributed with
+this work for additional information regarding copyright ownership.
+The ASF licenses this file to You under the Apache License, Version 2.0
+(the "License"); you may not use this file except in compliance with
+the License.  You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+"""
+from . import cli
+
+__all__ = ['cli']
\ No newline at end of file
diff --git a/cli/src/amaterasu/__main__.py b/cli/src/amaterasu/__main__.py
new file mode 100644
index 0000000..9c700aa
--- /dev/null
+++ b/cli/src/amaterasu/__main__.py
@@ -0,0 +1,132 @@
+"""
+Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements.  See the NOTICE file distributed with
+this work for additional information regarding copyright ownership.
+The ASF licenses this file to You under the Apache License, Version 2.0
+(the "License"); you may not use this file except in compliance with
+the License.  You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+"""
+__doc__ = """
+{amaterasu_logo}
+
+Usage: ama [--verbose] <command> [<args>...]
+
+Builtin commands:
+    init        Start a new Amaterasu compliant repository
+    setup       Initial setup for Amaterasu
+    update      Update an existing Amaterasu repository based on a maki file
+    run         Run an Amaterasu job
+
+Options:
+    -V --verbose    Enable verbose output.
+
+See 'ama <command> --help' for more detailed information.
+"""
+__version__ = '0.2.0-incubating-rc4'
+
+import colorama
+import pkgutil
+import importlib
+import logging
+import logging.config
+import os
+import yaml
+from .cli import common, consts, handlers
+from .cli.utils import exceptions
+from docopt import docopt
+
+if not os.getenv("AMATERASU_HOME"):
+    raise exceptions.ImproperlyConfiguredError("$AMATERASU_HOME isn't defined! Please export $AMATERASU_HOME or add it to .bashrc")
+
+ama_package_path = os.path.abspath(os.path.dirname(__file__))
+logging_config_path = '{}/cli/resources/logging.yml'.format(ama_package_path)
+with open(logging_config_path) as fp:
+    logging_cfg = yaml.load(fp)
+logging.config.dictConfig(logging_cfg)
+logger = logging.getLogger(__name__)
+
+
+colorama.init()
+lines = []
+for idx, line in enumerate(common.RESOURCES[consts.AMATERASU_LOGO]):
+    if idx <= 7:
+        lines.append("\033[38;5;202m" + line)
+    elif 7 < idx < 14:
+        lines.append("\033[38;5;214m" + line)
+    else:
+        lines.append("\033[38;5;220m" + line)
+desc = ''.join(lines)
+desc += colorama.Fore.RESET + '\n\n'
+desc += common.RESOURCES[consts.APACHE_LOGO]
+desc += common.RESOURCES[consts.AMATERASU_TXT]
+
+
+def load_handlers():
+    return {
+        name.split('.')[-1]: importlib.import_module(name)
+        for _, name, _
+        in pkgutil.iter_modules(handlers.__path__, handlers.__name__ + ".")
+        if not name.endswith('base')
+    }
+
+
+def extract_args(args):
+    """
+    Cleans docopt's output, collects the <arg> arguments, strips the "<" ">" and returns an equivalent dictionary
+    :param args: docopt result arguments
+    :type args: dict
+    :return:
+    """
+    kwargs = {}
+    for k,v in args.items():
+        if k.startswith('--'):
+            key = k.lstrip('--')
+        elif k.startswith('<') and k.endswith('>'):
+            key = k.strip('<').strip('>')
+        else:
+            key = k
+        kwargs[key] = v
+    return kwargs
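+
+# Illustrative example (values are hypothetical): extract_args turns docopt output
+# such as {'--verbose': True, '<command>': 'run', '<args>': []} into
+# {'verbose': True, 'command': 'run', 'args': []}.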
+
+
+def find_handler(handler_module, **kwargs):
+    """
+    Looks for a handler class that inherits from BaseHandler. We assume that the class with the longest MRO is
+    the one we look for
+    :param handler_module: A handler module loaded by importlib
+    :return:
+    """
+    try:
+        return handler_module.get_handler(**kwargs)
+    except AttributeError:
+        raise AttributeError("Module {} does not define a get_handler function".format(handler_module.__name__))
+
+
+def main():
+    doc = __doc__.format(amaterasu_logo=desc, additional_commands='')  # TODO: implement additional_commands
+    root_args = docopt(doc, version=__version__, options_first=True)
+    handler_modules = load_handlers()
+    command = root_args['<command>']
+    if root_args['--verbose']:
+        logging.basicConfig(level=logging.DEBUG)
+
+    logger.debug('CLI Started. Received the following arguments: {}'.format(root_args))
+    if command in handler_modules:
+        handler_vars = vars(handler_modules[command])
+        cmd_args = docopt(handler_vars['__doc__'], version=handler_vars.get('__version__', __version__))
+        handler = find_handler(handler_modules[command], **cmd_args)
+        handler(**extract_args(cmd_args)).handle()
+    else:
+        logger.debug("Unrecognized command received: {}".format(command))
+        print(doc)
+
+if __name__ == '__main__':
+    main()
\ No newline at end of file
diff --git a/cli/src/amaterasu/cli/__init__.py b/cli/src/amaterasu/cli/__init__.py
new file mode 100644
index 0000000..fa6c0be
--- /dev/null
+++ b/cli/src/amaterasu/cli/__init__.py
@@ -0,0 +1,16 @@
+"""
+Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements.  See the NOTICE file distributed with
+this work for additional information regarding copyright ownership.
+The ASF licenses this file to You under the Apache License, Version 2.0
+(the "License"); you may not use this file except in compliance with
+the License.  You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+"""
\ No newline at end of file
diff --git a/cli/src/amaterasu/cli/common.py b/cli/src/amaterasu/cli/common.py
new file mode 100644
index 0000000..e54a43b
--- /dev/null
+++ b/cli/src/amaterasu/cli/common.py
@@ -0,0 +1,80 @@
+"""
+Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements.  See the NOTICE file distributed with
+this work for additional information regarding copyright ownership.
+The ASF licenses this file to You under the Apache License, Version 2.0
+(the "License"); you may not use this file except in compliance with
+the License.  You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+"""
+import os
+from argparse import ArgumentParser, _SubParsersAction
+from multipledispatch import dispatch
+
+
+class User:
+    name = None
+    email = None
+
+    def __init__(self, name, email):
+        self.name = name
+        self.email = email
+
+
+class AmaterasuArgumentParser(ArgumentParser):
+    """
+    We use our own parser here so our handlers can expose a cleaner registration API.
+    """
+
+    def __init__(self, name=None, **kwargs):
+        if not name:
+            raise TypeError("Missing keyword argument 'name'")
+        self.name = name
+        self.arguments = []
+        self.kwargs = kwargs
+        super(AmaterasuArgumentParser, self).__init__(**kwargs)
+
+
+    def add_argument(self, *args, **kwargs):
+        self.arguments.append((args, kwargs))
+        return super(AmaterasuArgumentParser, self).add_argument(*args, **kwargs)
+
+    def add_subparsers(self, **kwargs):
+        return super(AmaterasuArgumentParser, self).add_subparsers(action=AmaterasuSubParsers, **kwargs)
+
+
+class AmaterasuSubParsers(_SubParsersAction):
+
+    @dispatch(object, object)
+    def add_parser(self, arg, **kwargs):
+        super(AmaterasuSubParsers, self).add_parser(arg, **kwargs)
+
+    @dispatch(AmaterasuArgumentParser)
+    def add_parser(self, ama_parser):
+        self._name_parser_map[ama_parser.name] = ama_parser
+
+
+class Resources(dict):
+    BASE_DIR = '{}/resources'.format(os.path.dirname(__file__))
+
+    def __init__(self, path=None):
+        super(Resources, self).__init__()
+        if path:
+            self.BASE_DIR = '{}/resources'.format(os.path.abspath(path))
+        for (_, _, files) in os.walk(self.BASE_DIR):
+            for f in files:
+                with open('{}/{}'.format(self.BASE_DIR, f), 'r') as fd:
+                    if f != 'banner2.txt':
+                        self[f] = fd.read()
+                    else:
+                        self[f] = fd.readlines()
+
+
+RESOURCES = Resources()
diff --git a/cli/src/amaterasu/cli/compat.py b/cli/src/amaterasu/cli/compat.py
new file mode 100644
index 0000000..aa4aab2
--- /dev/null
+++ b/cli/src/amaterasu/cli/compat.py
@@ -0,0 +1,93 @@
+"""
+Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements.  See the NOTICE file distributed with
+this work for additional information regarding copyright ownership.
+The ASF licenses this file to You under the Apache License, Version 2.0
+(the "License"); you may not use this file except in compliance with
+the License.  You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+
+Compatibility utilities for support of Python <3.5 and Python >3.5
+"""
+from __future__ import absolute_import
+import six
+import abc
+import os
+import sys
+import subprocess
+
+
+class _ABC(six.with_metaclass(abc.ABCMeta)):
+    """
+    Compatibility patching for Python 2
+    """
+    pass
+
+
+abc.ABC = _ABC
+
+
+def _makedirs(name, mode=0o777, exist_ok=False):
+    """makedirs(name [, mode=0o777][, exist_ok=False])
+
+    Super-mkdir; create a leaf directory and all intermediate ones.  Works like
+    mkdir, except that any intermediate path segment (not just the rightmost)
+    will be created if it does not exist. If the target directory already
+    exists, raise an OSError if exist_ok is False. Otherwise no exception is
+    raised.  This is recursive.
+
+
+    ported from Python3 os module so we can use it in python 2
+    """
+    head, tail = os.path.split(name)
+    if not tail:
+        head, tail = os.path.split(head)
+    if head and tail and not os.path.exists(head):
+        try:
+            os.makedirs(head, mode, exist_ok)
+        except FileExistsError:
+            # Defeats race condition when another thread created the path
+            pass
+        cdir = os.path.curdir
+        if isinstance(tail, bytes):
+            cdir = bytes(os.path.curdir, 'ASCII')
+        if tail == cdir:           # xxx/newdir/. exists if xxx/newdir exists
+            return
+    try:
+        os.mkdir(name, mode)
+    except OSError:
+        # Cannot rely on checking for EEXIST, since the operating system
+        # could give priority to other errors like EACCES or EROFS
+        if not exist_ok or not os.path.isdir(name):
+            raise
+
+
+os.makedirs = _makedirs
+
+
+try:
+    FileNotFoundError = FileNotFoundError
+except NameError:
+    FileNotFoundError = IOError
+
+try:
+    WindowsError = WindowsError
+except NameError:
+    WindowsError = OSError
+
+
+def run_subprocess(*args, **kwargs):
+    if sys.version_info.major >= 3 and sys.version_info.minor >= 5:
+        return subprocess.run(*args, check=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE, **kwargs)
+    else:
+        return subprocess.check_output(*args, stderr=subprocess.PIPE)
+
+
+__all__ = ['FileNotFoundError', 'WindowsError', 'run_subprocess']
diff --git a/cli/src/amaterasu/cli/conf/ama-base.conf b/cli/src/amaterasu/cli/conf/ama-base.conf
new file mode 100644
index 0000000..b73853a
--- /dev/null
+++ b/cli/src/amaterasu/cli/conf/ama-base.conf
@@ -0,0 +1,35 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+# Common properties must be configured!
+# Then you must configure the relevant section in accordance to your deployment
+
+#### COMMON PROPERTIES ###
+zk=localhost
+version=0.2.1-incubating
+user=hadoop
+amaterasu.home=/path/to/amaterasu
+# jobs.mem=1024
+# jobs.cpus=1
+# jobs.repoSize=1024
+# jobs.tasks.mem=1024
+# jobs.tasks.cpus=1
+# jobs.tasks.attempts=3
+# timeout=600000 # ms
+
+#### EXTRA SPARK PROPERTIES ####
+# spark.opts.<configuration>=<value>
\ No newline at end of file
diff --git a/cli/src/amaterasu/cli/conf/ama-mesos.conf b/cli/src/amaterasu/cli/conf/ama-mesos.conf
new file mode 100644
index 0000000..e20b61f
--- /dev/null
+++ b/cli/src/amaterasu/cli/conf/ama-mesos.conf
@@ -0,0 +1,7 @@
+{% extends "ama-base.conf" %}
+
+#### MESOS PROPERTIES ####
+# cluster.manager=mesos
+# spark.version=2.2.1-bin-hadoop2.7
+# webserver.port=8000
+# webserver.root=dist
diff --git a/cli/src/amaterasu/cli/conf/ama-yarn.conf b/cli/src/amaterasu/cli/conf/ama-yarn.conf
new file mode 100644
index 0000000..d1c0ab2
--- /dev/null
+++ b/cli/src/amaterasu/cli/conf/ama-yarn.conf
@@ -0,0 +1,20 @@
+{% extends "ama-base.conf" %}
+
+#### YARN PROPERTIES ####
+# cluster.manager=yarn
+# spark.version=2.3.0
+# yarn.queue=default
+# yarn.jarspath=hdfs:///apps/amaterasu
+# yarn.hadoop.home.dir=/etc/hadoop
+# yarn.master.memoryMB=1024
+# yarn.master.cores=1
+# yarn.worker.memoryMB=1024
+# yarn.worker.cores=1
+
+#### EMR ####
+# spark.home=/usr/lib/spark
+
+#### HDP PROPERTIES ####
+# spark.home=/usr/hdp/current/spark2-client
+# spark.opts.spark.yarn.am.extraJavaOptions="-Dhdp.version=2.6.1.0-129"
+# spark.opts.spark.driver.extraJavaOptions="-Dhdp.version=2.6.1.0-129"
\ No newline at end of file
diff --git a/cli/src/amaterasu/cli/consts.py b/cli/src/amaterasu/cli/consts.py
new file mode 100644
index 0000000..cc67c62
--- /dev/null
+++ b/cli/src/amaterasu/cli/consts.py
@@ -0,0 +1,28 @@
+"""
+Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements.  See the NOTICE file distributed with
+this work for additional information regarding copyright ownership.
+The ASF licenses this file to You under the Apache License, Version 2.0
+(the "License"); you may not use this file except in compliance with
+the License.  You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+"""
+INIT = 'init'
+UPDATE = 'update'
+RUN = 'run'
+MAKI = 'maki.yml'
+JOB_FILE = 'job.yml'
+SPARK_CONF = 'spark.yml'
+AMATERASU_LOGO = 'banner2.txt'
+APACHE_LOGO = 'apache.txt'
+AMATERASU_TXT = 'amaterasu.txt'
+AMATERASU_URL = 'https://s3.eu-central-1.amazonaws.com/amaterasu-assets/apache-amaterasu-0.2.0-incubating.tar'
+SPARK_URL = 'https://d3kbcqa49mib13.cloudfront.net/spark-2.1.1-bin-hadoop2.7.tgz'
+ANACONDA_URL = 'https://repo.continuum.io/miniconda/Miniconda2-latest-Linux-x86_64.sh'
\ No newline at end of file
diff --git a/cli/src/amaterasu/cli/handlers/__init__.py b/cli/src/amaterasu/cli/handlers/__init__.py
new file mode 100644
index 0000000..74a8e2f
--- /dev/null
+++ b/cli/src/amaterasu/cli/handlers/__init__.py
@@ -0,0 +1,18 @@
+"""
+Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements.  See the NOTICE file distributed with
+this work for additional information regarding copyright ownership.
+The ASF licenses this file to You under the Apache License, Version 2.0
+(the "License"); you may not use this file except in compliance with
+the License.  You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+"""
+
+__all__ = ['base', 'setup', 'run', 'update', 'init']
\ No newline at end of file
diff --git a/cli/src/amaterasu/cli/handlers/base.py b/cli/src/amaterasu/cli/handlers/base.py
new file mode 100644
index 0000000..6c4a88c
--- /dev/null
+++ b/cli/src/amaterasu/cli/handlers/base.py
@@ -0,0 +1,241 @@
+"""
+Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements.  See the NOTICE file distributed with
+this work for additional information regarding copyright ownership.
+The ASF licenses this file to You under the Apache License, Version 2.0
+(the "License"); you may not use this file except in compliance with
+the License.  You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+"""
+
+import abc
+import os
+import yaml
+import logging
+from six.moves import configparser
+
+__version__ = '0.2.0-incubating-rc4'
+
+git_parser = configparser.ConfigParser()
+git_parser.read(os.path.expanduser('~/.gitconfig'))
+logger = logging.getLogger(__name__)
+
+class HandlerError(Exception):
+    def __init__(self, *args, **kwargs):
+        inner_errors = kwargs.get('inner_errors', [])
+        if inner_errors:
+            message = 'Encountered the following errors: \r\n'
+            for error in inner_errors:
+                message += '{}: {}\r\n'.format(type(error).__name__, str(error))
+            super(HandlerError, self).__init__(message)
+        else:
+            super(HandlerError, self).__init__(*args)
+
+
+class ValidationError(Exception):
+    pass
+
+
+class BaseHandler(abc.ABC):
+    """
+    The CLI handlers detection -
+    We offer our own set of handlers, but in case you'd like to extend the CLI for your own needs, we offer that as well.
+    So this idea was basically taken from Django's management commands idea. All you need to do is subclass BaseHandler
+    in one way or another.
+
+    At runtime, we look for all the subclasses of BaseHandler and look for ones that implement the **handle** method.
+    If the handler has implemented the **handle** method, we then proceed to get its parser.
+    We mount all the parsers we find on the root Amaterasu parser defined in __main__.py
+
+    TL;DR - To create a handler of your own:
+     1. Subclass BaseHandler (or any of its subclasses)
+     2. Implement handle
+     3. Implement get_parser
+    """
+    CONFIGURATION_PATH = '/etc/amaterasu/amaterasu.conf'
+    AMATERASU_HOME = os.getenv("AMATERASU_HOME")
+
+    def __init__(self, **args):
+        self.args = args
+
+    @abc.abstractmethod
+    def handle(self):
+        """
+        This is where the magic happens. Write the handling logic here!
+        :return:
+        """
+        pass
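+
+# A minimal sketch of a custom handler following the convention in the BaseHandler
+# docstring above (the class name "EchoHandler" is hypothetical; the module-level
+# get_handler function mirrors the pattern used by the bundled handlers such as
+# handlers/init.py and handlers/run.py):
+#
+#     class EchoHandler(BaseHandler):
+#         def handle(self):
+#             print(self.args)
+#
+#     def get_handler(**kwargs):
+#         return EchoHandler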
+
+
+class BaseRepositoryHandler(BaseHandler):
+
+    def __init__(self, **args):
+        super(BaseRepositoryHandler, self).__init__(**args)
+        path = args['path'] or os.getcwd()
+        self.dir_path = path if os.path.isabs(path) else os.path.abspath(path)
+        self._validate_path()
+
+    def _validate_path(self):
+        root_dir_exists = os.path.exists(self.dir_path)
+        if not root_dir_exists:
+            base_path = os.path.split(self.dir_path)[0]
+            if not os.path.exists(base_path):
+                raise HandlerError("The base path: \"{}\" doesn't exist!".format(base_path))
+
+
+class MakiMixin(object):
+    """
+    A mixin that takes care of loading and validating a maki file.
+    Multiple handlers require this logic, e.g. - UpdateRepositoryHandler, RunPipelineHandler
+    and possibly any handler that has something to do with the maki file.
+    Please add only shared maki related code here, it is not the place for handler specific code!
+    I will personally kick your ass if you do.
+    Regardless, Sasuke sucks.
+    """
+
+    @staticmethod
+    def _validate_maki(maki):
+        """
+        A valid maki looks like the following:
+
+                job-name:    amaterasu-test [REQUIRED]
+                flow: [REQUIRED]
+                --  - name: start [REQUIRED]
+                |     runner: [REQUIRED]
+                |         group: spark [REQUIRED]
+       (1..n)  -|         type: scala [REQUIRED]
+                |     file: file.scala [REQUIRED]
+                |     exports: [OPTIONAL]
+                --        odd: parquet
+                    - name: step2
+                      runner:
+                          group: spark
+                          type: scala
+                      file: file2.scala
+        :param maki:
+        :return:
+        """
+
+        def str_ok(x):
+            return type(x) == str and len(x) > 0
+
+        VALID_GROUPS = ['spark']
+        VALID_TYPES = ['scala', 'sql', 'python', 'r']
+
+        if not maki:
+            raise HandlerError('Empty maki supplied')
+        first_level_ok = 'job-name' in maki and 'flow' in maki
+        if not first_level_ok:
+            raise HandlerError('Invalid maki!')
+        job_name_ok = str_ok(maki['job-name'])
+        flow_ok = type(maki['flow']) == list and len(maki['flow']) > 0
+        flow_steps_ok = True
+        for step in maki['flow']:
+            step_name_ok = lambda: 'name' in step and str_ok(step['name'])
+            step_runner_ok = lambda: 'runner' in step and type(step['runner']) == dict \
+                                     and 'group' in step['runner'] and str_ok(step['runner']['group']) \
+                                     and step['runner']['group'] in VALID_GROUPS \
+                                     and 'type' in step['runner'] and str_ok(step['runner']['type']) \
+                                     and step['runner']['type'] in VALID_TYPES
+            file_ok = lambda: 'file' in step and str_ok(step['file'])
+            step_ok = type(step) == dict and step_name_ok() and step_runner_ok() and file_ok()
+            if not step_ok:
+                flow_steps_ok = False
+                break
+        return job_name_ok and flow_ok and flow_steps_ok
+
+    @staticmethod
+    def load_maki(maki_path):
+        with open(maki_path, 'r') as f:
+            maki = yaml.load(f)
+        MakiMixin._validate_maki(maki)
+        return maki
+
+
+class ValidateRepositoryMixin(object):
+    """
+    We need valid repositories as inputs for the Amaterasu pipeline.
+    A valid repository looks like this:
+
+    /root_dir
+    |__ /src ## This is where the source code resides
+    |    |
+    |    |__ task1.scala
+    |    |
+    |    |__ task2.py
+    |    |
+    |    |__ task3.sql
+    |
+    |__ /env ## This is a configuration directory for each environment the user defines, there should be a "default" env.
+    |    |
+    |    |__ /default
+    |    |   |
+    |    |   |__ job.yml
+    |    |   |
+    |    |   |__ spark.yml
+    |    |
+    |    |__ /test
+    |    |
+    |    |__ /<some other env>
+    |
+    |__ maki.yml ## The job definition
+    """
+
+    def _validate_repository(self):
+        src_path = os.path.join(self.dir_path, 'src')
+        env_path = os.path.join(self.dir_path, 'env')
+        default_env_path = os.path.join(self.dir_path, 'env', 'default')
+        errors = []
+        print(src_path, env_path, default_env_path)
+        if not os.path.exists(src_path):
+            errors.append(ValidationError('Repository has no src directory'))
+        if not os.path.exists(env_path):
+            errors.append(ValidationError('Repository has no env directory'))
+        if not os.path.exists(default_env_path):
+            errors.append(ValidationError('Repository has no env/default directory'))
+        if errors:
+            raise HandlerError(inner_errors=errors)
+
+
+class ConfigurationFile(dict):
+
+    def __init__(self, path, **kwargs) -> None:
+        abs_path = os.path.expanduser(path) if path.startswith('~') else os.path.abspath(path)
+        self.path = abs_path
+        try:
+            with open(abs_path, 'r') as f:
+                logger.debug(
+                    "Opened an existing configuration file at {}".format(self.path))
+                for i, line in enumerate(f.read().splitlines()):
+                    try:
+                        parts = line.split('=')
+                        if len(parts) > 2:
+                            var = parts[0]
+                            value = '='.join(parts[1:])
+                        else:
+                            var, value = parts
+                        self[var.strip()] = value.strip()
+                    except ValueError:
+                        logger.warning('Improperly Configured: bad form of line {} in "{}"'.format(i, self.path))
+
+        except FileNotFoundError:
+            logger.info("No previous configuration file found at {}".format(self.path))
+
+        super().__init__(**kwargs)
+
+    def startswith(self, prefix:str):
+        return [(k, v) for k, v in self.items() if k.startswith(prefix)]
+
+    def save(self):
+        configuration_root = os.path.dirname(self.path)
+        os.makedirs(configuration_root, exist_ok=True)
+        with open(self.path, 'w') as f:
+            for k, v in self.items():
+                f.write('{}={}\n'.format(k, v))
\ No newline at end of file
diff --git a/cli/src/amaterasu/cli/handlers/init.py b/cli/src/amaterasu/cli/handlers/init.py
new file mode 100644
index 0000000..d802d3f
--- /dev/null
+++ b/cli/src/amaterasu/cli/handlers/init.py
@@ -0,0 +1,98 @@
+"""
+Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements.  See the NOTICE file distributed with
+this work for additional information regarding copyright ownership.
+The ASF licenses this file to You under the Apache License, Version 2.0
+(the "License"); you may not use this file except in compliance with
+the License.  You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+
+Start a new Amaterasu repository at the given path
+By default, uses PWD.
+
+Usage:
+    ama init [<path>]
+
+Options:
+    -h --help       Show this screen.
+
+"""
+
+from .base import BaseRepositoryHandler, git_parser, HandlerError
+from .. import common
+from ..repository import AmaRepository
+
+
+class InitRepositoryHandler(BaseRepositoryHandler):
+    """
+    A handler for creating a new Amaterasu repository
+    We generate the following structure:
+    /root_dir
+    |__ /src ## This is where the source code resides
+    |    |
+    |    |__ task1.scala
+    |    |
+    |    |__ task2.py
+    |    |
+    |    |__ task3.sql
+    |
+    |__ /env ## This is a configuration directory for each environment the user defines, there should be a "default" env.
+    |    |
+    |    |__ /default
+    |    |   |
+    |    |   |__ job.yml
+    |    |   |
+    |    |   |__ spark.yml
+    |    |
+    |    |__ /test
+    |
+    |__ maki.yml ## The job definition
+    """
+
+    @staticmethod
+    def _config_user():
+        """
+        First we try to get the user details from the global .gitconfig
+        If we fail at that, then we will ask the user for his credentials
+        :return:
+        """
+        try:
+            username = git_parser.get('user', 'name')
+        except KeyError:
+            username = ''
+        try:
+            email = git_parser.get('user', 'email')
+        except KeyError:
+            email = ''
+
+        new_name = input("Your name [{}]: ".format(username))
+        if new_name == username == '':
+            raise HandlerError('Username is required!')
+        elif new_name == '':
+            new_name = username
+
+        new_email = input("Your email [{}]:".format(email))
+        if new_email == email == '':
+            raise HandlerError('Email is required!')
+        elif new_email == '':
+            new_email = email
+
+        return common.User(new_name, new_email)
+
+    def handle(self):
+        print("Setting up an Amaterasu job repository at {}".format(self.dir_path))
+        repo = AmaRepository(self.dir_path)
+        repo.init_repo()
+        repo.commit()
+        print("Amaterasu job repository set up successfully")
+
+
+def get_handler(**kwargs):
+    return InitRepositoryHandler
\ No newline at end of file
diff --git a/cli/src/amaterasu/cli/handlers/run.py b/cli/src/amaterasu/cli/handlers/run.py
new file mode 100644
index 0000000..abcd45f
--- /dev/null
+++ b/cli/src/amaterasu/cli/handlers/run.py
@@ -0,0 +1,168 @@
+"""
+Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements.  See the NOTICE file distributed with
+this work for additional information regarding copyright ownership.
+The ASF licenses this file to You under the Apache License, Version 2.0
+(the "License"); you may not use this file except in compliance with
+the License.  You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+
+
+Run an Amaterasu pipeline.
+You have to have Mesos installed on the same machine where Amaterasu is installed to use this command.
+IMPORTANT:
+In the future we plan to enable remote execution, hence you will be required to connect to a cluster prior to executing this command
+
+Usage: ama run <repository_url> [-e <env>] [-r <report>] [-b <branch>] [-j <job-id>] [-n <name>] [-f]
+
+Options:
+    -h --help               Show this screen
+    -e --env=<env>          The environment to use for running this job [default: default]
+    -r --report=<report>    Verbosity, this controls how much of the job's logs is propagated to the CLI [default: none]
+    -b --branch=<branch>    What branch to use when running this job [default: master]
+    -j --job-id=<job-id>    Provide a job-id to resume a paused job.
+    -n --name=<name>        Provide a name for the job.
+    -f --force-bin          Force deleting and re-creating the HDFS amaterasu folder
+"""
+from .. import common, consts, compat
+from .base import MakiMixin, ValidateRepositoryMixin, BaseHandler, HandlerError, \
+    ConfigurationFile
+import abc
+import os
+import socket
+import uuid
+import git
+
+
+__version__ = '0.2.0-incubating-rc4'
+
+
+class BaseRunPipelineHandler(BaseHandler, MakiMixin):
+
+    cluster_manager = None
+
+    def __init__(self, **args):
+        super(BaseRunPipelineHandler, self).__init__(**args)
+        self.props = ConfigurationFile(self.CONFIGURATION_PATH)
+        self.base_dir = '/tmp/amaterasu/repos'
+        self.dir_path = '{}/{}'.format(self.base_dir, uuid.uuid4())
+        self.amaterasu_root = self.props['amaterasu.home']
+
+    def _validate_repository(self):
+        super(BaseRunPipelineHandler, self)._validate_repository()
+        BaseRunPipelineHandler.load_maki(
+            os.path.join(self.dir_path, 'maki.yml'))
+
+    @abc.abstractmethod
+    def _get_command_params(self):
+        pass
+
+    def handle(self):
+        try:
+            git.Repo.clone_from(self.args['repository_url'], self.dir_path)
+            self._validate_repository()
+            command_params = self._get_command_params()
+            os.environ.setdefault('AWS_ACCESS_KEY_ID', "0")
+            os.environ.setdefault('AWS_SECRET_ACCESS_KEY', "0")
+            os.environ.setdefault('AMA_NODE', socket.gethostname())
+            compat.run_subprocess(command_params, cwd=self.amaterasu_root)
+            print('W00t amaterasu job is finished!!!')
+        except git.GitError as e:
+            raise HandlerError(inner_errors=[e])
+
+
+class RunMesosPipelineHandler(BaseRunPipelineHandler, MakiMixin, ValidateRepositoryMixin):
+    """
+    This handler takes care of starting up the Amaterasu Scala process.
+    First, we validate the inputs we get. The user is expected to pass at least the repository URL.
+    We inspect the submitted repository and validate that it exists and fits the structure of a valid Amaterasu job repository.
+    If all validations pass, we invoke the Scala runtime.
+    """
+
+    cluster_manager = 'mesos'
+
+    def _get_command_params(self):
+        command_params = [
+            'java',
+            '-cp',
+            '{}/bin/leader-{}-all.jar'.format(self.amaterasu_root, __version__),
+            "-Djava.library.path=/usr/lib",
+            "org.apache.amaterasu.leader.mesos.MesosJobLauncher",
+            "--home",
+            self.amaterasu_root,
+            "--repo",
+            self.args['repository_url'],
+            "--env",
+            self.args.get('env', 'default'),
+            "--report",
+            self.args.get('report', 'code'),
+            "--branch",
+            self.args.get('branch', 'master'),
+            "--config-file",
+            self.args.get('config_file', self.CONFIGURATION_PATH)
+        ]
+        if self.args.get('job_id'):
+            command_params.extend(["--job-id", self.args['job_id']])
+        if self.args.get('name'):
+            command_params.extend(["--name", self.args['name']])
+        return command_params
+
+
+class RunYarnPipelineHandler(BaseRunPipelineHandler, MakiMixin, ValidateRepositoryMixin):
+
+    cluster_manager = 'yarn'
+
+    def _get_command_params(self):
+        """
+        yarn jar ${BASEDIR}/bin/leader-0.2.0-incubating-all.jar org.apache.amaterasu.leader.yarn.Client --home ${BASEDIR}
+        :return:
+        """
+        command_params = [
+            'yarn',
+            'jar',
+            '{}/bin/leader-{}-all.jar'.format(self.amaterasu_root, __version__),
+            'org.apache.amaterasu.leader.yarn.Client',
+            "--home",
+            self.amaterasu_root,
+            "--repo",
+            self.args['repository_url'],
+            "--env",
+            self.args.get('env', 'default'),
+            "--report",
+            self.args.get('report', 'code'),
+            "--branch",
+            self.args.get('branch', 'master'),
+            "--config-file",
+            self.args.get('config_file', self.CONFIGURATION_PATH)
+        ]
+        if self.args.get('job_id'):
+            command_params.extend(["--job-id", self.args['job_id']])
+        if self.args.get('name'):
+            command_params.extend(["--name", self.args['name']])
+        return command_params
+
+    def handle(self):
+        if self.args.get('force-bin', False):
+            compat.run_subprocess('hdfs', 'dfs', '-rm', '-R', '-skipTrash', self.props['yarn.jarspath'])
+        return super().handle()
+
+
+def get_handler(**kwargs):
+    try:
+        props = ConfigurationFile(BaseHandler.CONFIGURATION_PATH)
+        cluster_manager = props['cluster.manager']
+        if cluster_manager == 'mesos':
+            return RunMesosPipelineHandler
+        elif cluster_manager == 'yarn':
+            return RunYarnPipelineHandler
+        else:
+            raise NotImplementedError('Unsupported cluster manager: {}'.format(cluster_manager))
+    except KeyError:
+        raise HandlerError('cluster.manager is missing from configuration! Please run ama setup and try again.')
diff --git a/cli/src/amaterasu/cli/handlers/setup.py b/cli/src/amaterasu/cli/handlers/setup.py
new file mode 100644
index 0000000..28d2a31
--- /dev/null
+++ b/cli/src/amaterasu/cli/handlers/setup.py
@@ -0,0 +1,122 @@
+"""
+Create or change Amaterasu's configuration.
+
+Usage:
+    ama [-V] setup ( mesos | yarn [-f] )
+
+Options:
+    -f --force-bin  YARN only: remove all existing Amaterasu HDFS assets
+"""
+import shutil
+from .base import BaseHandler
+from ..compat import run_subprocess
+from ..utils import input
+import os
+import wget
+import colorama
+import logging
+import subprocess
+from jinja2 import Environment, FileSystemLoader
+
+logger = logging.getLogger(__name__)
+
+__version__ = '0.2.0-incubating-rc4'
+THIS_DIR = os.path.dirname(os.path.abspath(__file__))
+
+
+class BaseConfigurationHandler(BaseHandler):
+
+    TEMPLATE_NAME = None
+
+    def __init__(self, **args):
+        self.jinja_env = Environment(
+            loader=FileSystemLoader(os.path.normpath(os.path.join(THIS_DIR, os.path.pardir, 'conf')))
+        )
+        super().__init__(**args)
+
+    def _render_configuration_file(self):
+        if os.path.exists(self.CONFIGURATION_PATH):
+            answer = input.default_input("An Apache Amaterasu configuration file already exists, do you want to overwrite it (y/N)? ", "n")
+            generate_new_configuration = answer.lower() == 'y'
+        else:
+            generate_new_configuration = True
+        if generate_new_configuration:
+            self.jinja_env.get_template(self.TEMPLATE_NAME).stream().dump(self.CONFIGURATION_PATH)
+        logger.info("Successfully created Apache Amaterasu configuration file")
+
+    def _download_dependencies(self):
+        miniconda_dist_path = os.path.join(self.AMATERASU_HOME, 'dist', 'Miniconda2-latest-Linux-x86_64.sh')
+        if not os.path.exists(miniconda_dist_path):
+            print('\n', colorama.Style.BRIGHT, 'Fetching Miniconda distributable', colorama.Style.RESET_ALL)
+            wget.download(
+                'https://repo.continuum.io/miniconda/Miniconda2-latest-Linux-x86_64.sh',
+                out=miniconda_dist_path
+            )
+
+    def handle(self):
+        self._render_configuration_file()
+        self._download_dependencies()
+
+
+class MesosConfigurationHandler(BaseConfigurationHandler):
+
+    TEMPLATE_NAME = "ama-mesos.conf"
+
+    def _download_dependencies(self):
+        super()._download_dependencies()
+        spark_dist_path = os.path.join(self.AMATERASU_HOME, 'dist',
+                                       'spark-{}.tgz'.format(
+                                           self.spark_version))
+        if not os.path.exists(spark_dist_path):
+            print(colorama.Style.BRIGHT, 'Fetching Spark distributable', colorama.Style.RESET_ALL)
+            spark_url = 'http://apache.mirror.digitalpacific.com.au/spark/spark-{}/spark-{}.tgz'.format(self.spark_version.split('-')[0], self.spark_version)
+            wget.download(
+                spark_url,
+                out=spark_dist_path
+            )
+
+
+class YarnConfigurationHandler(BaseConfigurationHandler):
+
+    TEMPLATE_NAME = "ama-yarn.conf"
+
+    def _hdfs_directory_exists(self, dir_name):
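+        # "hdfs dfs -test -e" exits with 0 when the path exists and 1 when it does not;
+        # any other return code indicates a genuine failure and is re-raised below.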
+        try:
+            run_subprocess([
+                "su",
+                "hadoop",
+                "-c",
+                "hdfs dfs -test -e {}".format(dir_name)
+            ])
+            amaterasu_hdfs_dir_exists = True
+        except subprocess.CalledProcessError as e:
+            logger.debug("hdfs dfs -test returned %s", e.returncode)
+            if e.returncode == 1:
+                amaterasu_hdfs_dir_exists = False
+            else:
+                raise
+        return amaterasu_hdfs_dir_exists
+
+    def _remove_amaterasu_HDFS_assets(self):
+        run_subprocess([
+            "su",
+            self.user,
+            "-c",
+            "hdfs dfs -rm -r -skipTrash /apps/amaterasu"
+        ])
+
+    def handle(self):
+        super().handle()
+        if self.args.get('force-bin', False) and self._hdfs_directory_exists("/apps/amaterasu"):
+            self._remove_amaterasu_HDFS_assets()
+
+
+def get_handler(**kwargs):
+    if kwargs['mesos']:
+        return MesosConfigurationHandler
+    elif kwargs['yarn']:
+        return YarnConfigurationHandler
+    else:
+        raise ValueError('Could not find a handler for the given arguments')
\ No newline at end of file
diff --git a/cli/src/amaterasu/cli/handlers/update.py b/cli/src/amaterasu/cli/handlers/update.py
new file mode 100644
index 0000000..2d5c1ad
--- /dev/null
+++ b/cli/src/amaterasu/cli/handlers/update.py
@@ -0,0 +1,111 @@
+"""
+Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements.  See the NOTICE file distributed with
+this work for additional information regarding copyright ownership.
+The ASF licenses this file to You under the Apache License, Version 2.0
+(the "License"); you may not use this file except in compliance with
+the License.  You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+
+Update the repository FS based on the maki file.
+
+Usage: ama update [<path>]
+
+Options:
+    -h --help       Show this screen
+
+"""
+
+import os
+
+from .base import ValidateRepositoryMixin, MakiMixin, BaseRepositoryHandler, HandlerError
+
+
+class UpdateRepositoryHandler(BaseRepositoryHandler, MakiMixin, ValidateRepositoryMixin):
+    """
+    Handler that updates a repository based on its maki.yml file.
+    Currently, it fills in the src directory with templates for the specified source files.
+    If a source file exists in the src directory that is not specified in the maki.yml file,
+    the user will be prompted to take action.
+    """
+
+    def _validate_path(self):
+        super(UpdateRepositoryHandler, self)._validate_path()
+        validation_errors = self._validate_repository()
+        if validation_errors:
+            raise HandlerError('Repository structure isn\'t valid!', inner_errors=validation_errors)
+
+    def _load_existing_sources(self):
+        return set(os.listdir(os.path.join(self.dir_path, 'src')))
+
+    def _load_maki_sources(self):
+        maki = UpdateRepositoryHandler.load_maki(os.path.join(self.dir_path, 'maki.yml'))
+        source_files = {step['file'] for step in maki['flow']}
+        return source_files
+
+    def _write_sources_to_fs(self, sources):
+        for file in sources:
+            with open(os.path.join(self.dir_path, 'src', '{}'.format(file)), 'w'):
+                pass
+
+    def _get_user_input_for_source_not_on_maki(self, source):  # Separated out from _handle_sources_not_on_maki so it can be mocked in tests
+        print("The following source file: \"{}\" doesn't exist in the maki.yml file.".format(source))
+        decision = input("[k]eep [d]elete [A]ll (e.g.: \"dA\" delete all): ").strip()
+        while decision not in ['k', 'd', 'kA', 'dA']:
+            print('Invalid choice "{}"'.format(decision))
+            decision = input("[k]eep [d]elete [A]ll (e.g.: \"dA\" delete all): ").strip()
+        return decision
+
+    def _handle_sources_not_on_maki(self, sources):
+        """
+        We ask the user to give us answers about sources that are not in the maki file.
+        Currently, we only support either keeping them, or deleting them.
+        :param sources:
+        :return:
+        """
+        sources_iter = iter(sources)
+        for source in sources_iter:
+            decision = self._get_user_input_for_source_not_on_maki(source)
+            if decision == 'dA':
+                os.remove(os.path.join(self.dir_path, 'src', '{}'.format(source)))
+                break
+            elif decision == 'kA':
+                return
+            else:
+                if decision == 'd':
+                    os.remove(os.path.join(self.dir_path, 'src', '{}'.format(source)))
+                else:
+                    continue
+        else:
+            return
+
+        # In case the user decided to delete the rest in bulk:
+        for source in sources_iter:
+            os.remove(os.path.join(self.dir_path, 'src', '{}'.format(source)))
+
+    def handle(self):
+        """
+        The idea is as follows:
+        Find all the sources that are present in the repository
+        Find all the sources that are mentioned in the maki file
+        If a source is mentioned in the maki and doesn't exist in the repository, create it
+        If a source exists in the repository and doesn't exist in the maki, ask for user intervention
+        :return:
+        """
+        existing_sources = self._load_existing_sources()
+        maki_sources = self._load_maki_sources()
+        sources_not_in_fs = maki_sources.difference(existing_sources)
+        sources_not_in_maki = existing_sources.difference(maki_sources)
+        self._write_sources_to_fs(sources_not_in_fs)
+        self._handle_sources_not_on_maki(sources_not_in_maki)
+
+
+def get_handler(**kwargs):
+    return UpdateRepositoryHandler
\ No newline at end of file
diff --git a/cli/src/amaterasu/cli/repository.py b/cli/src/amaterasu/cli/repository.py
new file mode 100644
index 0000000..dafa129
--- /dev/null
+++ b/cli/src/amaterasu/cli/repository.py
@@ -0,0 +1,58 @@
+"""
+Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements.  See the NOTICE file distributed with
+this work for additional information regarding copyright ownership.
+The ASF licenses this file to You under the Apache License, Version 2.0
+(the "License"); you may not use this file except in compliance with
+the License.  You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+"""
+from . import consts, common
+import git
+import os
+
+
+class AmaRepository:
+
+    def __init__(self, root_path):
+        """
+
+        :param root_path:
+        :param user_info:
+        :type user_info: common.User
+        """
+        self.root_path = root_path
+        self.src_path = os.path.abspath('{}/src'.format(root_path))
+        self.env_path = os.path.abspath('{}/env'.format(root_path))
+        os.makedirs(self.root_path, exist_ok=True)
+        self.git_repository = git.Repo.init(self.root_path)
+        # self.signature = pygit2.Signature(user_info.name, user_info.email)
+
+    @property
+    def exists(self):
+        return os.path.exists('{}/.git'.format(self.root_path))
+
+    def init_repo(self):
+        default_env = os.path.abspath('{}/default'.format(self.env_path))
+        os.makedirs(self.src_path, exist_ok=True)
+        os.makedirs(self.env_path, exist_ok=True)
+        os.makedirs(default_env, exist_ok=True)
+        if not os.path.exists('{}/{}'.format(self.root_path, consts.MAKI)):
+            with open('{}/{}'.format(self.root_path, consts.MAKI), 'w') as f:
+                f.write(common.RESOURCES[consts.MAKI])
+        if not os.path.exists('{}/{}'.format(default_env, consts.JOB_FILE)):
+            with open('{}/{}'.format(default_env, consts.JOB_FILE), 'w') as f:
+                f.write(common.RESOURCES[consts.JOB_FILE])
+        if not os.path.exists('{}/{}'.format(default_env, consts.SPARK_CONF)):
+            with open('{}/{}'.format(default_env, consts.SPARK_CONF), 'w') as f:
+                f.write(common.RESOURCES[consts.SPARK_CONF])
+
+    def commit(self):
+        # Stage the files created by init_repo before committing, otherwise the commit is empty
+        self.git_repository.index.add(self.git_repository.untracked_files)
+        self.git_repository.index.commit("Amaterasu job repo init")
diff --git a/cli/src/amaterasu/cli/resources/amaterasu.txt b/cli/src/amaterasu/cli/resources/amaterasu.txt
new file mode 100644
index 0000000..062842b
--- /dev/null
+++ b/cli/src/amaterasu/cli/resources/amaterasu.txt
@@ -0,0 +1,9 @@
+
+            __  __         _______  ______  _____              _____  _    _
+     /\    |  \/  |    /\ |__   __||  ____||  __ \     /\     / ____|| |  | |
+    /  \   | \  / |   /  \   | |   | |__   | |__) |   /  \   | (___  | |  | |
+   / /\ \  | |\/| |  / /\ \  | |   |  __|  |  _  /   / /\ \   \___ \ | |  | |
+  / ____ \ | |  | | / ____ \ | |   | |____ | | \ \  / ____ \  ____) || |__| |
+ /_/    \_\|_|  |_|/_/    \_\|_|   |______||_|  \_\/_/    \_\|_____/  \____/
+
+
diff --git a/cli/src/amaterasu/cli/resources/apache.txt b/cli/src/amaterasu/cli/resources/apache.txt
new file mode 100644
index 0000000..30bebf4
--- /dev/null
+++ b/cli/src/amaterasu/cli/resources/apache.txt
@@ -0,0 +1,6 @@
+
+   ___    ___   ___   _____ __ __ ____
+  / _ |  / _ \ / _ | / ___// // // __/
+ / __ | / ___// __ |/ /__ / _  // _/
+/_/ |_|/_/   /_/ |_|\___//_//_//___/
+
diff --git a/cli/src/amaterasu/cli/resources/banner2.txt b/cli/src/amaterasu/cli/resources/banner2.txt
new file mode 100644
index 0000000..c338481
--- /dev/null
+++ b/cli/src/amaterasu/cli/resources/banner2.txt
@@ -0,0 +1,21 @@
+                 ++++        ++++         ++++
+               +     +     +     +      +     +
+              +       +   +       +    +       +
+             +       +   +       +    +       +
+            +       +   +       +    +       +
+           +       +   +       +    +       +
+           +      +   +        +    +      +
+            +   ++   +          +   ++    +
+             +++    +            +    +++
+                   +              +
+                  +       ++       +
+                 +       ++++      ++
+                ++      ++  ++      ++
+               ++      ++    +       ++
+              ++      ++      +       +
+             ++      ++        +       +
+            ++      ++          +       +
+           ++       +            +       +
+           +       +              +      +
+            ++  +++                ++   ++
+              ++                     +++
\ No newline at end of file
diff --git a/cli/src/amaterasu/cli/resources/job.yml b/cli/src/amaterasu/cli/resources/job.yml
new file mode 100644
index 0000000..d2319e1
--- /dev/null
+++ b/cli/src/amaterasu/cli/resources/job.yml
@@ -0,0 +1,24 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+name: default
+master: mesos://localhost:5050
+inputRootPath: hdfs://localhost:9000/user/amaterasu/input
+outputRootPath: hdfs://localhost:9000/user/amaterasu/output
+workingDir: alluxio://localhost:19998/
+configuration:
+    spark.cassandra.connection.host: 127.0.0.1
+    sourceTable: documents
\ No newline at end of file
diff --git a/cli/src/amaterasu/cli/resources/logging.yml b/cli/src/amaterasu/cli/resources/logging.yml
new file mode 100644
index 0000000..421ef38
--- /dev/null
+++ b/cli/src/amaterasu/cli/resources/logging.yml
@@ -0,0 +1,46 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+version: 1
+formatters:
+  simple:
+    format: '%(asctime)s - %(name)s - %(levelname)s - %(message)s'
+handlers:
+  console:
+    class: logging.StreamHandler
+    level: ERROR
+    formatter: simple
+    stream: ext://sys.stdout
+  logFile:
+    class: amaterasu.cli.utils.logging.AllUsersTimedRotatingFileHandler
+    level: INFO
+    filename: /var/log/amaterasu/amaterasu.log
+    formatter: simple
+    when: midnight
+  debugLog:
+    class: amaterasu.cli.utils.logging.AllUsersTimedRotatingFileHandler
+    level: DEBUG
+    filename: /var/log/amaterasu/amaterasu-debug.log
+    when: midnight
+    formatter: simple
+loggers:
+  amaterasu:
+    level: DEBUG
+    handlers: [console, logFile, debugLog]
+    propagate: no
+root:
+  level: ERROR
+  handlers: [console, logFile]
\ No newline at end of file
diff --git a/cli/src/amaterasu/cli/resources/maki.yml b/cli/src/amaterasu/cli/resources/maki.yml
new file mode 100644
index 0000000..06bcf1b
--- /dev/null
+++ b/cli/src/amaterasu/cli/resources/maki.yml
@@ -0,0 +1,32 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+---
+job-name: amaterasu-test # Replace this with your job's name
+flow:
+    - name: start # Name of this step
+      runner:
+          group: spark # Currently supporting spark only, but expect more here in the future!
+          type: scala # scala, sql, r, python
+      file: file.scala # Source code for the step
+      exports:
+          odd: parquet
+    - name: step2
+      runner:
+          group: spark
+          type: scala
+      file: file2.scala
+...
\ No newline at end of file
diff --git a/cli/src/amaterasu/cli/resources/spark.yml b/cli/src/amaterasu/cli/resources/spark.yml
new file mode 100644
index 0000000..2f4b9d5
--- /dev/null
+++ b/cli/src/amaterasu/cli/resources/spark.yml
@@ -0,0 +1,18 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+spark.executor.extraJavaOptions: -XX:+PrintGCDetails
+spark.executor.memory: 1g
\ No newline at end of file
diff --git a/cli/src/amaterasu/cli/utils/__init__.py b/cli/src/amaterasu/cli/utils/__init__.py
new file mode 100644
index 0000000..fa6c0be
--- /dev/null
+++ b/cli/src/amaterasu/cli/utils/__init__.py
@@ -0,0 +1,16 @@
+"""
+Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements.  See the NOTICE file distributed with
+this work for additional information regarding copyright ownership.
+The ASF licenses this file to You under the Apache License, Version 2.0
+(the "License"); you may not use this file except in compliance with
+the License.  You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+"""
\ No newline at end of file
diff --git a/cli/src/amaterasu/cli/utils/exceptions.py b/cli/src/amaterasu/cli/utils/exceptions.py
new file mode 100644
index 0000000..0733230
--- /dev/null
+++ b/cli/src/amaterasu/cli/utils/exceptions.py
@@ -0,0 +1,5 @@
+class ImproperlyConfiguredError(Exception):
+    pass
+
+class ValidationError(Exception):
+    pass
\ No newline at end of file
diff --git a/cli/src/amaterasu/cli/utils/input.py b/cli/src/amaterasu/cli/utils/input.py
new file mode 100644
index 0000000..5690329
--- /dev/null
+++ b/cli/src/amaterasu/cli/utils/input.py
@@ -0,0 +1,36 @@
+"""
+Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements.  See the NOTICE file distributed with
+this work for additional information regarding copyright ownership.
+The ASF licenses this file to You under the Apache License, Version 2.0
+(the "License"); you may not use this file except in compliance with
+the License.  You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+"""
+
+import readline
+
+
+def default_input(prompt, default=''):
+    """
+    This prints the default value next to the prompt and makes it editable in place.
+    Important note!
+    The default value cannot be displayed in the Mac OS X default shell.
+    You can work around this by installing zsh on Mac OS X.
+    :param prompt:
+    :param default:
+    :return:
+    """
+    readline.set_startup_hook(lambda: readline.insert_text(default))
+    try:
+        return input(prompt)
+    finally:
+        readline.set_startup_hook()
\ No newline at end of file
diff --git a/cli/src/amaterasu/cli/utils/logging.py b/cli/src/amaterasu/cli/utils/logging.py
new file mode 100644
index 0000000..d46772a
--- /dev/null
+++ b/cli/src/amaterasu/cli/utils/logging.py
@@ -0,0 +1,27 @@
+"""
+Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements.  See the NOTICE file distributed with
+this work for additional information regarding copyright ownership.
+The ASF licenses this file to You under the Apache License, Version 2.0
+(the "License"); you may not use this file except in compliance with
+the License.  You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+"""
+import os
+
+from logging import handlers
+
+class AllUsersTimedRotatingFileHandler(handlers.TimedRotatingFileHandler):
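+    """
+    A TimedRotatingFileHandler that clears the process umask while opening the log
+    file, so files under the shared log directory stay writable for every user who
+    runs the CLI.
+    """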
+
+    def _open(self):
+        prevumask = os.umask(0o000)
+        rtv = super(AllUsersTimedRotatingFileHandler, self)._open()
+        os.umask(prevumask)
+        return rtv
\ No newline at end of file
diff --git a/cli/src/requirements.txt b/cli/src/requirements.txt
new file mode 100644
index 0000000..fa361d3
--- /dev/null
+++ b/cli/src/requirements.txt
@@ -0,0 +1,19 @@
+behave==1.2.5
+cffi==1.11.4
+colorama==0.3.9
+parse==1.8.2
+parse-type==0.4.2
+pycparser==2.18
+pyfakefs==3.2
+pygit2==0.26.0
+PyHamcrest==1.9.0
+six==1.11.0
+mock; python_version <= '2.7'
+PyYAML
+wget
+netifaces
+multipledispatch
+GitPython
+docopt
+paramiko
+docker-py
\ No newline at end of file
diff --git a/cli/src/setup.py b/cli/src/setup.py
new file mode 100644
index 0000000..ebe147b
--- /dev/null
+++ b/cli/src/setup.py
@@ -0,0 +1,83 @@
+"""
+Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements.  See the NOTICE file distributed with
+this work for additional information regarding copyright ownership.
+The ASF licenses this file to You under the Apache License, Version 2.0
+(the "License"); you may not use this file except in compliance with
+the License.  You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+"""
+from setuptools import setup, find_packages
+from setuptools.command.install import install
+import os
+import shutil
+
+class PostInstallCommand(install):
+
+    AMATERASU_LOG_DIR = '/var/log/amaterasu'
+    AMATERASU_CONFIG_DIR = '/etc/amaterasu'
+
+    def run(self):
+        old_umask = os.umask(000)
+        if not os.path.exists(self.AMATERASU_LOG_DIR):
+            os.mkdir(self.AMATERASU_LOG_DIR)
+            os.chmod(self.AMATERASU_LOG_DIR, 0o777)  # octal permissions: world-writable log directory
+        if not os.path.exists(self.AMATERASU_CONFIG_DIR):
+            os.mkdir(self.AMATERASU_CONFIG_DIR)
+        res = super().run()
+        os.umask(old_umask)
+        return res
+
+
+setup(
+    name='amaterasu',
+    version='0.2.0-incubating-rc4',
+    packages=find_packages(),
+    url='https://github.com/apache/incubator-amaterasu',
+    license='Apache License 2.0 ',
+    author='Apache Amaterasu (incubating)',
+    author_email="dev@amaterasu.incubator.apache.org",
+    description='Apache Amaterasu (incubating) is an open source, configuration management and deployment framework for big data pipelines',
+    install_requires=['colorama', 'GitPython', 'six', 'PyYAML', 'netifaces', 'multipledispatch', 'docopt', 'paramiko', 'wget'],
+    tests_require=['behave'],
+    python_requires='!=3.0.*, !=3.1.*, !=3.2.*, <4',
+    entry_points={
+        'console_scripts': [
+            'ama=amaterasu.__main__:main'
+        ]
+    },
+    include_package_data=True,
+    package_data={
+        'amaterasu.cli.resources': ['*']
+    },
+    cmdclass={
+        'install': PostInstallCommand
+    },
+    classifiers=[
+        'Development Status :: 3 - Alpha',
+        'Environment :: Console',
+        'Intended Audience :: Developers',
+        'Intended Audience :: System Administrators',
+        'Intended Audience :: Information Technology',
+        'License :: OSI Approved :: Apache Software License',
+        'Natural Language :: English',
+        'Operating System :: POSIX :: Linux',
+        'Programming Language :: Java',
+        'Programming Language :: Python',
+        'Programming Language :: Python :: 3',
+        'Programming Language :: Python :: 3.3',
+        'Programming Language :: Python :: 3.4',
+        'Programming Language :: Python :: 3.5',
+        'Programming Language :: Python :: 3.6',
+        'Programming Language :: Python :: 3.7',
+        'Programming Language :: Python :: Implementation :: CPython',
+        'Topic :: Scientific/Engineering'
+    ]
+)
diff --git a/cli/src/tests/__init__.py b/cli/src/tests/__init__.py
new file mode 100644
index 0000000..56e3fc4
--- /dev/null
+++ b/cli/src/tests/__init__.py
@@ -0,0 +1,17 @@
+"""
+Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements.  See the NOTICE file distributed with
+this work for additional information regarding copyright ownership.
+The ASF licenses this file to You under the Apache License, Version 2.0
+(the "License"); you may not use this file except in compliance with
+the License.  You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+"""
+__all__ = ['compat', 'utils']
\ No newline at end of file
diff --git a/cli/src/tests/ama_cli/__init__.py b/cli/src/tests/ama_cli/__init__.py
new file mode 100644
index 0000000..fa6c0be
--- /dev/null
+++ b/cli/src/tests/ama_cli/__init__.py
@@ -0,0 +1,16 @@
+"""
+Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements.  See the NOTICE file distributed with
+this work for additional information regarding copyright ownership.
+The ASF licenses this file to You under the Apache License, Version 2.0
+(the "License"); you may not use this file except in compliance with
+the License.  You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+"""
\ No newline at end of file
diff --git a/cli/src/tests/ama_cli/compat_tests.py b/cli/src/tests/ama_cli/compat_tests.py
new file mode 100644
index 0000000..f93cdd8
--- /dev/null
+++ b/cli/src/tests/ama_cli/compat_tests.py
@@ -0,0 +1,24 @@
+"""
+Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements.  See the NOTICE file distributed with
+this work for additional information regarding copyright ownership.
+The ASF licenses this file to You under the Apache License, Version 2.0
+(the "License"); you may not use this file except in compliance with
+the License.  You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+"""
+import unittest
+from amaterasu.cli import compat
+
+class TestCompatRunSubprocess(unittest.TestCase):
+
+    def test_run_subprocess_with_valid_input_should_execute_subprocess_successfully(self):
+        inpt = ['echo', 'HELLO']
+        compat.run_subprocess(*inpt)
\ No newline at end of file
diff --git a/cli/src/tests/ama_cli/creation_of_new_repo.feature b/cli/src/tests/ama_cli/creation_of_new_repo.feature
new file mode 100644
index 0000000..ac987e0
--- /dev/null
+++ b/cli/src/tests/ama_cli/creation_of_new_repo.feature
@@ -0,0 +1,92 @@
+"""
+Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements.  See the NOTICE file distributed with
+this work for additional information regarding copyright ownership.
+The ASF licenses this file to You under the Apache License, Version 2.0
+(the "License"); you may not use this file except in compliance with
+the License.  You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+"""
+Feature: Support of creating a new Amaterasu job repository
+  This feature is supposed to result in a new git repository that has a very specific structure:
+  /root_dir
+    |__/src ## This is where the source code resides
+    |   |
+    |   |__task1.scala
+    |   |
+    |   |__task2.py
+    |   |
+    |   |__task3.sql
+    |
+    |__/env ## This is a configuration directory for each environment the user defines; there should be a "default" env.
+    |   |
+    |   |__/default
+    |   |  |
+    |   |  |__job.yml
+    |   |  |
+    |   |  |__spark.yml
+    |   |
+    |   |__/test
+    |
+    |__maki.yml ## The job definition
+
+  Scenario: Invoking the InitRepository handler with a valid path should result in a new git repository
+    Given The absolute path "/tmp/amaterasu/test"
+    When InitRepository handler is invoked with the given path
+    Then A directory with path "/tmp/amaterasu/test" should be created
+    And The directory in path "/tmp/amaterasu/test" should be a git repository
+    And The "/tmp/amaterasu/test" directory should have a "maki.yml" file
+    And The "/tmp/amaterasu/test" directory should have a "src" subdirectory
+    And The "/tmp/amaterasu/test" directory should have a "env" subdirectory
+    And the "/tmp/amaterasu/test/env" directory should have a "default" subdirectory
+    And the "/tmp/amaterasu/test/env/default" directory should have a "job.yml" file
+    And the "/tmp/amaterasu/test/env/default" directory should have a "spark.yml" file
+
+  Scenario: Invoking the InitRepository handler with a path that doesn't exist should result in an exception
+    Given The invalid absolute path "/aaa/bbb/ccc"
+    When InitRepository handler is invoked with the given path
+    Then An HandlerError should be raised
+
+  Scenario: Invoking the InitRepository handler with a valid path that is already an empty repository should create all the required Amaterasu file structure
+    Given The absolute path "/tmp/amaterasu/test"
+    Given The path is a repository
+    When InitRepository handler is invoked with the given path
+    Then The directory in path "/tmp/amaterasu/test" should be a git repository
+    And The "/tmp/amaterasu/test" directory should have a "maki.yml" file
+    And The "/tmp/amaterasu/test" directory should have a "src" subdirectory
+    And The "/tmp/amaterasu/test" directory should have a "env" subdirectory
+    And the "/tmp/amaterasu/test/env" directory should have a "default" subdirectory
+    And the "/tmp/amaterasu/test/env/default" directory should have a "job.yml" file
+    And the "/tmp/amaterasu/test/env/default" directory should have a "spark.yml" file
+
+  Scenario: Invoking the InitRepository handler with a valid path that is already a repository that is missing a maki file, should only create the maki file
+    Given The absolute path "/tmp/amaterasu/test"
+    Given The path is a repository
+    Given The "/tmp/amaterasu/test" directory has a "src" subdirectory
+    Given The "/tmp/amaterasu/test" directory has a "env" subdirectory
+    Given The "/tmp/amaterasu/test/env" directory has a "default" subdirectory
+    Given The "/tmp/amaterasu/test/env/default" directory has a "job.yml" file
+    Given The "/tmp/amaterasu/test/env/default" directory has a "spark.yml" file
+    When InitRepository handler is invoked with the given path
+    Then The "/tmp/amaterasu/test" directory should have a "maki.yml" file
+    And Only "maki.yml" should have changed
+
+  Scenario: Invoking the InitRepository handler with a valid path that is already a repository that is missing the env directory, should only create the env directory
+    Given The absolute path "/tmp/amaterasu/test"
+    Given The path is a repository
+    Given The "/tmp/amaterasu/test" directory has a "src" subdirectory
+    Given The "/tmp/amaterasu/test" directory has a "maki.yml" file
+    When InitRepository handler is invoked with the given path
+    Then The "/tmp/amaterasu/test" directory should have a "env" subdirectory
+    And the "/tmp/amaterasu/test/env" directory should have a "default" subdirectory
+    And the "/tmp/amaterasu/test/env/default" directory should have a "job.yml" file
+    And the "/tmp/amaterasu/test/env/default" directory should have a "spark.yml" file
+    And Only "env,env/default,env/default/job.yml,env/default/spark.yml" should have changed
+
diff --git a/cli/src/tests/ama_cli/environment.py b/cli/src/tests/ama_cli/environment.py
new file mode 100644
index 0000000..c85a642
--- /dev/null
+++ b/cli/src/tests/ama_cli/environment.py
@@ -0,0 +1,53 @@
+"""
+Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements.  See the NOTICE file distributed with
+this work for additional information regarding copyright ownership.
+The ASF licenses this file to You under the Apache License, Version 2.0
+(the "License"); you may not use this file except in compliance with
+the License.  You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+"""
+import os
+import shutil
+import stat
+import errno
+from amaterasu.cli.compat import *
+from amaterasu.cli.common import Resources
+
+def handleRemoveReadonly(func, path, exc):
+    excvalue = exc[1]
+    if func in (os.rmdir, os.remove, os.unlink) and excvalue.errno == errno.EACCES:
+        os.chmod(path, stat.S_IRWXU | stat.S_IRWXG | stat.S_IRWXO)  # 0777
+        func(path)
+    else:
+        raise exc[0]
+
+
+def before_all(context):
+    context.test_resources = Resources(os.path.join(os.getcwd(), 'tests'))
+    context.first_run = False  # overridden in run_pipeline_unit.feature::It is the first time the user runs a pipeline
+
+
+def before_scenario(context, scenario):
+    try:
+        shutil.rmtree("/tmp/amaterasu-repos", onerror=handleRemoveReadonly)
+    except (FileNotFoundError, WindowsError):
+        pass
+    context.stats_before = {}
+    context.stats_after = {}
+    try:
+        shutil.rmtree(os.path.abspath('tmp'), onerror=handleRemoveReadonly)
+    except (FileNotFoundError, WindowsError):
+        pass
+    os.mkdir(os.path.abspath('tmp'))
+
+
+def after_scenario(context, scenario):
+    shutil.rmtree(os.path.abspath('tmp'), onerror=handleRemoveReadonly)
\ No newline at end of file
diff --git a/cli/src/tests/ama_cli/run_pipeline_unit.feature b/cli/src/tests/ama_cli/run_pipeline_unit.feature
new file mode 100644
index 0000000..fca86f4
--- /dev/null
+++ b/cli/src/tests/ama_cli/run_pipeline_unit.feature
@@ -0,0 +1,52 @@
+"""
+Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements.  See the NOTICE file distributed with
+this work for additional information regarding copyright ownership.
+The ASF licenses this file to You under the Apache License, Version 2.0
+(the "License"); you may not use this file except in compliance with
+the License.  You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+"""
+Feature: Run an amaterasu pipeline
+
+
+  Scenario: Run a pipeline on Mesos for a valid repository, should not raise an error and produce a valid command
+
+    Given A valid repository
+    When Running a pipeline on Mesos with the given repository
+    Then An HandlerError should not be raised
+    And The resulting command looks like this
+      """
+      java -cp /tmp/amaterasu/assets/bin/leader-0.2.0-incubating-rc3-all.jar -Djava.library.path=/usr/lib org.apache.amaterasu.leader.mesos.MesosJobLauncher --home /tmp/amaterasu/assets --repo http://git.sunagakure.com/ama-job-valid.git --env default --report code --branch master --config-home /tmp/amaterasu
+      """
+
+  Scenario: Run a pipeline on YARN for a valid repository, should not raise an error and produce a valid command
+    Given A valid repository
+    When Running a pipeline on YARN with the given repository
+    Then An HandlerError should not be raised
+    And The resulting command looks like this
+      """
+      yarn jar /tmp/amaterasu/assets/bin/leader-0.2.0-incubating-rc3-all.jar org.apache.amaterasu.leader.yarn.Client --home /tmp/amaterasu/assets --repo http://git.sunagakure.com/ama-job-valid.git --env default --report code --branch master --config-home /Users/nadavh/.amaterasu
+      """
+
+  Scenario: Run a pipeline for a repository that doesn't exist, should raise an error
+    Given A repository that doesn't exist
+    When Running a pipeline on Mesos with the given repository
+    Then An HandlerError should be raised
+
+  Scenario: Run a pipeline for a repository that is not amaterasu compliant, should raise an error
+    Given A repository that is not Amaterasu compliant
+    When Running a pipeline on Mesos with the given repository
+    Then An HandlerError should be raised
+
+  Scenario: Run a pipeline for a valid repository by file URI should raise an error
+    Given A valid file URI repository
+    When Running a pipeline on Mesos with the given repository
+    Then Amaterasu should run
\ No newline at end of file
diff --git a/cli/src/tests/ama_cli/steps/creation_of_new_repo.py b/cli/src/tests/ama_cli/steps/creation_of_new_repo.py
new file mode 100644
index 0000000..0eae2e6
--- /dev/null
+++ b/cli/src/tests/ama_cli/steps/creation_of_new_repo.py
@@ -0,0 +1,171 @@
+"""
+Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements.  See the NOTICE file distributed with
+this work for additional information regarding copyright ownership.
+The ASF licenses this file to You under the Apache License, Version 2.0
+(the "License"); you may not use this file except in compliance with
+the License.  You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+"""
+import os
+import sys
+
+from behave import *
+from hamcrest import *
+from unittest import mock
+from amaterasu.cli import common
+from amaterasu.cli.handlers.base import HandlerError
+from amaterasu.cli.handlers.init import InitRepositoryHandler
+from tests.utils import collect_stats
+import git
+
+@given('The relative path "{path}"')
+def step_impl(context, path):
+    """
+    :type context: behave.runner.Context
+    """
+    abs_path = os.path.abspath(path)
+    if not os.path.exists(abs_path):
+        os.makedirs(abs_path, exist_ok=True)
+    context.given_path = abs_path
+
+
+@given('The absolute path "{path}"')
+def step_impl(context, path):
+    """
+    :type context: behave.runner.Context
+    """
+    context.given_path = path
+
+
+@when("InitRepository handler is invoked with the given path")
+def step_impl(context):
+    """
+    :type context: behave.runner.Context
+    """
+    try:
+        with mock.patch('amaterasu.cli.handlers.init.InitRepositoryHandler._config_user', return_value=common.User('Naruto Uzumaki', 'naruto@konoha.village')):
+            handler = InitRepositoryHandler(path=context.given_path)
+            handler.handle()
+        collect_stats(context, context.given_path)
+    except HandlerError as ex:
+        context.ex = ex
+
+
+@then('A directory with path "{expected_path}" should be created')
+def step_impl(context, expected_path):
+    """
+    :type context: behave.runner.Context
+    """
+    abs_path = os.path.abspath(expected_path)
+    path_exists = os.path.exists(abs_path)
+    assert_that(path_exists, is_(True))
+
+
+@step('The directory in path "{expected_repo_path}" should be a git repository')
+def step_impl(context, expected_repo_path):
+    """
+    :type context: behave.runner.Context
+    """
+    abs_path = os.path.abspath(expected_repo_path)
+    git_meta_path = os.path.join(abs_path, '.git')
+    repo_exists = os.path.exists(git_meta_path)
+    assert_that(repo_exists, is_(True))
+
+
+@step('The "{expected_path}" directory should have a "{expected_file}" file')
+def step_impl(context, expected_path, expected_file):
+    """
+    :type context: behave.runner.Context
+    """
+    abs_path = os.path.abspath(expected_path)
+    file_path = os.path.join(abs_path, expected_file)
+    file_exists = os.path.exists(file_path)
+    assert_that(file_exists, is_(True))
+
+
+@step('The "{expected_path}" directory should have a "{expected_subdir}" subdirectory')
+def step_impl(context, expected_path, expected_subdir):
+    """
+    :type context: behave.runner.Context
+    """
+    abs_path = os.path.abspath(expected_path)
+    subdir_path = os.path.join(abs_path, expected_subdir)
+    subdir_exists = os.path.exists(subdir_path)
+    assert_that(subdir_exists, is_(True))
+
+
+@then("An HandlerError should be raised")
+def step_impl(context):
+    """
+    :type context: behave.runner.Context
+    """
+    assert_that(context, has_property('ex', instance_of(HandlerError)))
+
+
+@given("The path is a repository")
+def step_impl(context):
+    """
+    :type context: behave.runner.Context
+    """
+    git.Repo.init(context.given_path)
+
+
+@given('The "{given_path}" directory has a "{given_subdir}" subdirectory')
+def step_impl(context, given_path, given_subdir):
+    """
+    :type context: behave.runner.Context
+    """
+    abs_path = os.path.abspath(given_path)
+    subdir_path = os.path.join(abs_path, given_subdir)
+    os.makedirs(subdir_path, exist_ok=True)
+
+
+@given('The "{given_path}" directory has a "{given_file}" file')
+def step_impl(context, given_path, given_file):
+    """
+    :type context: behave.runner.Context
+    """
+    abs_path = os.path.abspath(given_path)
+    file_path = os.path.join(abs_path, given_file)
+    with open(file_path, 'w'):
+        pass
+    stat = os.lstat(file_path)
+    context.stats_before[file_path] = stat
+
+@given('The invalid absolute path "{given_path}"')
+def step_impl(context, given_path):
+    """
+    :type context: behave.runner.Context
+    """
+    if sys.platform == 'win32':
+        given_path = 'xxxzzz:\\{}'.format(given_path)
+    context.given_path = given_path
+
+
+@step('Only "{expected_changed_files_str}" should have changed')
+def step_impl(context, expected_changed_files_str):
+    """
+    :type context: behave.runner.Context
+    """
+    expected_changed_files = [fname.strip() for fname in expected_changed_files_str.split(',')]
+    for fname in expected_changed_files:
+        path = os.path.abspath(os.path.join(context.given_path, fname))
+        after = context.stats_after[path]
+        before = context.stats_before.get(path, None)
+        if before:
+            assert_that(before.st_mtime, is_not(equal_to(after.st_mtime)))
+
+    expected_changed_paths = {os.path.abspath(os.path.join(context.given_path, fname)) for fname in expected_changed_files}
+    expected_unchanged_paths = set(context.stats_before) - expected_changed_paths
+    for path in expected_unchanged_paths:
+        after = context.stats_after[path]
+        before = context.stats_before[path]
+        assert_that(before.st_mtime, is_(equal_to(after.st_mtime)), path)
diff --git a/cli/src/tests/ama_cli/steps/run_pipeline.py b/cli/src/tests/ama_cli/steps/run_pipeline.py
new file mode 100644
index 0000000..c266afc
--- /dev/null
+++ b/cli/src/tests/ama_cli/steps/run_pipeline.py
@@ -0,0 +1,157 @@
+"""
+Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements.  See the NOTICE file distributed with
+this work for additional information regarding copyright ownership.
+The ASF licenses this file to You under the Apache License, Version 2.0
+(the "License"); you may not use this file except in compliance with
+the License.  You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+"""
+from functools import partial
+from unittest import mock
+from behave import *
+from hamcrest import *
+
+from amaterasu.cli import consts
+from amaterasu.cli.handlers.base import HandlerError
+from amaterasu.cli.handlers.run import RunMesosPipelineHandler, RunYarnPipelineHandler
+from uuid import uuid4
+
+import os
+import git
+
+
+
+def mock_git_clone(uid, context, url, dest_dir):
+    repository_dest = os.path.abspath('/tmp/amaterasu/repos/{}'.format(uid))
+    if url == 'http://git.sunagakure.com/ama-job-non-exist.git':
+        raise git.GitError("failed to send request: The server name or address could not be resolved")
+    elif url == "http://git.sunagakure.com/ama-job-valid.git":
+        os.makedirs(repository_dest, exist_ok=True)
+        os.makedirs(os.path.join(repository_dest, 'src'), exist_ok=True)
+        os.makedirs(os.path.join(repository_dest, 'env'), exist_ok=True)
+        os.makedirs(os.path.join(repository_dest, 'env', 'default'), exist_ok=True)
+        with open(os.path.join(repository_dest, 'maki.yml'), 'w') as maki:
+            maki.write(context.test_resources['maki_valid.yml'])
+        with open(os.path.join(repository_dest, 'env', 'default', consts.SPARK_CONF), 'w') as spark:
+            spark.write(context.test_resources[consts.SPARK_CONF])
+        with open(os.path.join(repository_dest, 'env', 'default', consts.JOB_FILE), 'w') as job:
+            job.write(context.test_resources[consts.JOB_FILE])
+
+    elif url == 'http://git.sunagakure.com/some-repo.git':
+        os.makedirs(repository_dest)
+        os.mkdir(os.path.join(repository_dest, 'sasuke'))
+        os.mkdir(os.path.join(repository_dest, 'sasuke', 'is'))
+        os.mkdir(os.path.join(repository_dest, 'sasuke', 'is', 'lame'))  # (NOT) TODO: Na nach nachman me oman
+    else:
+        raise NotImplementedError()
+
+
+def mock_subprocess_run(context, cmd, cwd):
+    context.command = cmd
+
+
+@given("A valid repository")
+def step_impl(context):
+    """
+    :type context: behave.runner.Context
+    """
+    context.repository_uri = 'http://git.sunagakure.com/ama-job-valid.git'
+
+
+@when("Running a pipeline on Mesos with the given repository")
+def step_impl(context):
+    """
+    :type context: behave.runner.Context
+    """
+    uid = uuid4()
+    with mock.patch('git.Repo.clone_from', partial(mock_git_clone, uid, context)), \
+         mock.patch('uuid.uuid4', lambda: uid), \
+         mock.patch('amaterasu.cli.compat.run_subprocess', partial(mock_subprocess_run, context)):
+        handler = RunMesosPipelineHandler(repository_url=context.repository_uri, env='default', name=None, report='code', branch='master', job_id=None, config_home='/tmp/amaterasu')
+        handler.amaterasu_root = '/tmp/amaterasu/assets'
+        os.makedirs(handler.amaterasu_root, exist_ok=True)
+        try:
+            handler.handle()
+        except HandlerError as ex:
+            context.ex = ex
+
+
+@given("A valid file URI repository")
+def step_impl(context):
+    """
+    :type context: behave.runner.Context
+    """
+    context.repository_uri = 'http://git.sunagakure.com/ama-job-valid.git'
+
+
+@given("A repository that doesn't exist")
+def step_impl(context):
+    """
+    :type context: behave.runner.Context
+    """
+    context.repository_uri = 'http://git.sunagakure.com/ama-job-non-exist.git'
+
+
+@given("A repository that is not Amaterasu compliant")
+def step_impl(context):
+    """
+    :type context: behave.runner.Context
+    """
+    context.repository_uri = 'http://git.sunagakure.com/some-repo.git'
+
+
+@then("Amaterasu should run")
+def step_impl(context):
+    """
+    :type context: behave.runner.Context
+    """
+    pass
+
+
+@given("It is the first time the user runs a pipeline")
+def step_impl(context):
+    """
+    :type context: behave.runner.Context
+    """
+    context.first_run = True
+
+
+@step("The resulting command looks like this")
+def step_impl(context):
+    """
+    :type context: behave.runner.Context
+    """
+    command = ' '.join(context.command)
+    assert_that(command, is_(equal_to(context.text)))
+
+
+@when("Running a pipeline on YARN with the given repository")
+def step_impl(context):
+    """
+    :type context: behave.runner.Context
+    """
+    uid = uuid4()
+    with mock.patch('git.Repo.clone_from',
+                    partial(mock_git_clone, uid, context)), \
+         mock.patch('uuid.uuid4', lambda: uid), \
+         mock.patch('amaterasu.cli.compat.run_subprocess',
+                    partial(mock_subprocess_run, context)):
+        handler = RunYarnPipelineHandler(repository_url=context.repository_uri,
+                                          env='default', name=None,
+                                          report='code', branch='master',
+                                          job_id=None,
+                                          config_home='/tmp/amaterasu')
+        handler.amaterasu_root = '/tmp/amaterasu/assets'
+        os.makedirs(handler.amaterasu_root, exist_ok=True)
+        try:
+            handler.handle()
+        except HandlerError as ex:
+            context.ex = ex
\ No newline at end of file
diff --git a/cli/src/tests/ama_cli/steps/update_of_existing_repo.py b/cli/src/tests/ama_cli/steps/update_of_existing_repo.py
new file mode 100644
index 0000000..367533e
--- /dev/null
+++ b/cli/src/tests/ama_cli/steps/update_of_existing_repo.py
@@ -0,0 +1,170 @@
+"""
+Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements.  See the NOTICE file distributed with
+this work for additional information regarding copyright ownership.
+The ASF licenses this file to You under the Apache License, Version 2.0
+(the "License"); you may not use this file except in compliance with
+the License.  You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+"""
+import os
+
+import yaml
+from behave import *
+from hamcrest import *
+from unittest import mock
+
+from amaterasu.cli.handlers.base import HandlerError
+from amaterasu.cli.handlers.update import UpdateRepositoryHandler
+from tests.utils import collect_stats
+
+
+@when("Updating the repository using the maki file")
+def step_impl(context):
+    """
+    :type context: behave.runner.Context
+    """
+    try:
+        handler = UpdateRepositoryHandler(path=context.given_path)
+        handler.handle()
+        collect_stats(context, context.given_path)
+    except HandlerError as ex:
+        context.ex = ex
+
+
+@given('The "{directory}" directory has an empty maki file')
+def step_impl(context, directory):
+    """
+    :type context: behave.runner.Context
+    """
+    maki_path = os.path.join(context.given_path, 'maki.yml')
+    with open(maki_path, 'w'):
+        pass
+
+
+@given('The "{directory}" directory has an invalid maki file')
+def step_impl(context, directory):
+    """
+    :type context: behave.runner.Context
+    """
+    maki_path = os.path.join(context.given_path, 'maki.yml')
+    resources = context.test_resources
+    with open(maki_path, 'w') as f:
+        f.write(resources['maki_invalid.yml'])
+
+
+@given('The "{directory}" directory has another invalid maki file')
+def step_impl(context, directory):
+    """
+    :type context: behave.runner.Context
+    """
+    maki_path = os.path.join(context.given_path, 'maki.yml')
+    resources = context.test_resources
+    with open(maki_path, 'w') as f:
+        f.write(resources['maki_invalid2.yml'])
+
+
+@given('The "{directory}" directory has a valid maki file')
+def step_impl(context, directory):
+    """
+    :type context: behave.runner.Context
+    """
+    maki_path = os.path.join(context.given_path, 'maki.yml')
+    resources = context.test_resources
+    with open(maki_path, 'w') as f:
+        f.write(resources['maki_valid.yml'])
+
+
+@then('"{filename}" should be added to the maki file')
+def step_impl(context, filename):
+    """
+    :type context: behave.runner.Context
+    """
+    maki_path = os.path.join(context.given_path, 'maki.yml')
+    with open(maki_path, 'r') as f:
+        maki = yaml.load(f)
+    source_files = [step['file'] for step in maki['flow']]
+    assert_that(source_files, contains(filename))
+
+
+@step('The "{directory}" shouldn\'t have a "{filename}" file')
+def step_impl(context, directory, filename):
+    """
+    :type context: behave.runner.Context
+    """
+    full_file_path = os.path.abspath(os.path.join(directory, filename))
+    assert_that(os.path.exists(full_file_path), is_not(True))
+
+
+@then("An HandlerError should not be raised")
+def step_impl(context):
+    """
+    :type context: behave.runner.Context
+    """
+    assert_that(context, is_not(has_property('ex')), "An exception was raised while invoking the handler")
+
+
+@when("Updating the repository using the maki file, with user keeping source files that are not in the maki")
+def step_impl(context):
+    """
+    :type context: behave.runner.Context
+    """
+    try:
+        with mock.patch('amaterasu.cli.handlers.update.UpdateRepositoryHandler._get_user_input_for_source_not_on_maki', return_value='kA'):
+            handler = UpdateRepositoryHandler(path=context.given_path)
+            handler.handle()
+        collect_stats(context, context.given_path)
+    except HandlerError as ex:
+        context.ex = ex
+
+
+@when("Updating the repository using the maki file, with user not keeping source files that are not in the maki")
+def step_impl(context):
+    """
+    :type context: behave.runner.Context
+    """
+    try:
+        with mock.patch('amaterasu.cli.handlers.update.UpdateRepositoryHandler._get_user_input_for_source_not_on_maki', return_value='dA'):
+            handler = UpdateRepositoryHandler(path=context.given_path)
+            handler.handle()
+        collect_stats(context, context.given_path)
+    except HandlerError as ex:
+        context.ex = ex
+
+
+@step('The "{directory}" directory shouldn\'t have a "{filename}" file')
+def step_impl(context, directory, filename):
+    """
+    :type context: behave.runner.Context
+    """
+    file_path = os.path.join(os.path.abspath(directory), filename)
+    assert_that(os.path.exists(file_path), is_(False))
+
+
+
+@when('Updating the repository using the maki file, with user not keeping "{file_to_delete}" and is keeping "{file_to_keep}"')
+def step_impl(context, file_to_delete, file_to_keep):
+    """
+    :type context: behave.runner.Context
+    """
+
+    def mock_user_input(handler, source):
+        if source == file_to_delete:
+            return 'd'
+        else:
+            return 'k'
+
+    try:
+        with mock.patch('amaterasu.cli.handlers.update.UpdateRepositoryHandler._get_user_input_for_source_not_on_maki', new=mock_user_input):
+            handler = UpdateRepositoryHandler(path=context.given_path)
+            handler.handle()
+        collect_stats(context, context.given_path)
+    except HandlerError as ex:
+        context.ex = ex
\ No newline at end of file
diff --git a/cli/src/tests/ama_cli/update_of_existing_repo.feature b/cli/src/tests/ama_cli/update_of_existing_repo.feature
new file mode 100644
index 0000000..7a1afd3
--- /dev/null
+++ b/cli/src/tests/ama_cli/update_of_existing_repo.feature
@@ -0,0 +1,146 @@
+"""
+Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements.  See the NOTICE file distributed with
+this work for additional information regarding copyright ownership.
+The ASF licenses this file to You under the Apache License, Version 2.0
+(the "License"); you may not use this file except in compliance with
+the License.  You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+"""
+Feature: Updating an existing repository from a maki file.
+
+  Scenario: Updating a non-existent repository, should throw an error
+    Given The relative path "tmp/amaterasu"
+    Given The "tmp/amaterasu" directory has a valid maki file
+    When Updating the repository using the maki file
+    Then An HandlerError should be raised
+    And The "tmp/amaterasu/src" shouldn't have a "example.py" file
+    And The "tmp/amaterasu/src" shouldn't have a "example.scala" file
+
+  Scenario: Updating an existing repository from an empty maki, should throw an error
+    Given The relative path "tmp/amaterasu"
+    Given The path is a repository
+    Given The "tmp/amaterasu" directory has a "src" subdirectory
+    Given The "tmp/amaterasu" directory has a "env" subdirectory
+    Given The "tmp/amaterasu/env" directory has a "default" subdirectory
+    Given The "tmp/amaterasu" directory has an empty maki file
+    When Updating the repository using the maki file
+    Then An HandlerError should be raised
+    And The "tmp/amaterasu/src" shouldn't have a "example.py" file
+    And The "tmp/amaterasu/src" shouldn't have a "example.scala" file
+
+  Scenario: Updating an existing repository from an invalid maki, should throw an error
+    Given The relative path "tmp/amaterasu"
+    Given The path is a repository
+    Given The "tmp/amaterasu" directory has an invalid maki file
+    Given The "tmp/amaterasu" directory has a "src" subdirectory
+    Given The "tmp/amaterasu" directory has a "env" subdirectory
+    Given The "tmp/amaterasu/env" directory has a "default" subdirectory
+    When Updating the repository using the maki file
+    Then An HandlerError should be raised
+    And The "tmp/amaterasu/src" shouldn't have a "example.py" file
+    And The "tmp/amaterasu/src" shouldn't have a "example.scala" file
+
+  Scenario: Updating an existing repository from another invalid maki, should throw an error
+    Given The relative path "tmp/amaterasu"
+    Given The path is a repository
+    Given The "tmp/amaterasu" directory has another invalid maki file
+    Given The "tmp/amaterasu" directory has a "src" subdirectory
+    Given The "tmp/amaterasu" directory has a "env" subdirectory
+    Given The "tmp/amaterasu/env" directory has a "default" subdirectory
+    When Updating the repository using the maki file
+    Then An HandlerError should be raised
+    And The "tmp/amaterasu/src" shouldn't have a "example.py" file
+    And The "tmp/amaterasu/src" shouldn't have a "example.scala" file
+
+
+  Scenario: Updating an existing repository with no sources from a valid maki, should create new sources
+    Given The relative path "tmp/amaterasu"
+    Given The path is a repository
+    Given The "tmp/amaterasu" directory has a "src" subdirectory
+    Given The "tmp/amaterasu" directory has a "env" subdirectory
+    Given The "tmp/amaterasu/env" directory has a "default" subdirectory
+    Given The "tmp/amaterasu" directory has a valid maki file
+    When Updating the repository using the maki file
+    Then An HandlerError should not be raised
+    And The "tmp/amaterasu/src" directory should have a "example.py" file
+    And The "tmp/amaterasu/src" directory should have a "example.scala" file
+
+  Scenario: Updating an existing repository with sources from a valid maki, and the sources in the repo are a subset of the ones in the maki, then new sources should be created
+    Given The relative path "tmp/amaterasu"
+    Given The path is a repository
+    Given The "tmp/amaterasu" directory has a "src" subdirectory
+    Given The "tmp/amaterasu" directory has a "env" subdirectory
+    Given The "tmp/amaterasu/env" directory has a "default" subdirectory
+    Given The "tmp/amaterasu" directory has a valid maki file
+    Given The "tmp/amaterasu/src" directory has a "example.scala" file
+    When Updating the repository using the maki file
+    Then An HandlerError should not be raised
+    And The "tmp/amaterasu/src" directory should have a "example.py" file
+
+  Scenario: Updating an existing repository with sources from a valid maki, and the sources in the repo are the same as the ones in the maki, then nothing should happen
+    Given The relative path "tmp/amaterasu"
+    Given The path is a repository
+    Given The "tmp/amaterasu" directory has a "src" subdirectory
+    Given The "tmp/amaterasu" directory has a "env" subdirectory
+    Given The "tmp/amaterasu/env" directory has a "default" subdirectory
+    Given The "tmp/amaterasu" directory has a valid maki file
+    Given The "tmp/amaterasu/src" directory has a "example.scala" file
+    Given The "tmp/amaterasu/src" directory has a "example.py" file
+    When Updating the repository using the maki file
+    Then An HandlerError should not be raised
+
+  Scenario: Updating an existing repository with sources from a valid maki, and the sources in the repo are a superset of the ones in the maki, the user is prompted to take action and chooses to keep the file and not update the maki, then nothing should happen
+    Given The relative path "tmp/amaterasu"
+    Given The path is a repository
+    Given The "tmp/amaterasu" directory has a "src" subdirectory
+    Given The "tmp/amaterasu" directory has a "env" subdirectory
+    Given The "tmp/amaterasu/env" directory has a "default" subdirectory
+    Given The "tmp/amaterasu" directory has a valid maki file
+    Given The "tmp/amaterasu/src" directory has a "example.scala" file
+    Given The "tmp/amaterasu/src" directory has a "example.py" file
+    Given The "tmp/amaterasu/src" directory has a "example.sql" file
+    When Updating the repository using the maki file, with user keeping source files that are not in the maki
+    Then An HandlerError should not be raised
+
+
+  Scenario: Updating an existing repository with sources from a valid maki, and the sources in the repo are a superset of the ones in the maki, the user chooses to not keep the files, then the extra files should be deleted
+    Given The relative path "tmp/amaterasu"
+    Given The path is a repository
+    Given The "tmp/amaterasu" directory has a "src" subdirectory
+    Given The "tmp/amaterasu" directory has a "env" subdirectory
+    Given The "tmp/amaterasu/env" directory has a "default" subdirectory
+    Given The "tmp/amaterasu" directory has a valid maki file
+    Given The "tmp/amaterasu/src" directory has a "example.scala" file
+    Given The "tmp/amaterasu/src" directory has a "example.py" file
+    Given The "tmp/amaterasu/src" directory has a "example.sql" file
+    Given The "tmp/amaterasu/src" directory has a "example.R" file
+    When Updating the repository using the maki file, with user not keeping source files that are not in the maki
+    Then An HandlerError should not be raised
+    And The "tmp/amaterasu/src" directory shouldn't have a "example.sql" file
+    And The "tmp/amaterasu/src" directory shouldn't have a "example.R" file
+
+
+
+  Scenario: Updating an existing repository with sources from a valid maki, and the sources in the repo are a superset of the ones in the maki, the user chooses to delete example.sql, only example.sql is deleted, example.R should stay
+    Given The relative path "tmp/amaterasu"
+    Given The path is a repository
+    Given The "tmp/amaterasu" directory has a "src" subdirectory
+    Given The "tmp/amaterasu" directory has a "env" subdirectory
+    Given The "tmp/amaterasu/env" directory has a "default" subdirectory
+    Given The "tmp/amaterasu" directory has a valid maki file
+    Given The "tmp/amaterasu/src" directory has a "example.scala" file
+    Given The "tmp/amaterasu/src" directory has a "example.py" file
+    Given The "tmp/amaterasu/src" directory has a "example.sql" file
+    Given The "tmp/amaterasu/src" directory has a "example.R" file
+    When Updating the repository using the maki file, with user not keeping "example.sql" and is keeping "example.R"
+    Then An HandlerError should not be raised
+    And The "tmp/amaterasu/src" directory shouldn't have a "example.sql" file
+    And The "tmp/amaterasu/src" directory should have a "example.R" file
\ No newline at end of file
diff --git a/cli/src/tests/compat.py b/cli/src/tests/compat.py
new file mode 100644
index 0000000..edb3528
--- /dev/null
+++ b/cli/src/tests/compat.py
@@ -0,0 +1,18 @@
+"""
+Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements.  See the NOTICE file distributed with
+this work for additional information regarding copyright ownership.
+The ASF licenses this file to You under the Apache License, Version 2.0
+(the "License"); you may not use this file except in compliance with
+the License.  You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+"""
+from six import add_move, MovedModule
+add_move(MovedModule('mock', 'mock', 'unittest.mock'))
diff --git a/cli/src/tests/resources/job.yml b/cli/src/tests/resources/job.yml
new file mode 100644
index 0000000..9aaea8c
--- /dev/null
+++ b/cli/src/tests/resources/job.yml
@@ -0,0 +1,23 @@
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+
+#     http://www.apache.org/licenses/LICENSE-2.0
+
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+name: default
+master: mesos://localhost:5050
+inputRootPath: hdfs://localhost:9000/user/amaterasu/input
+outputRootPath: hdfs://localhost:9000/user/amaterasu/output
+workingDir: alluxio://localhost:19998/
+configuration:
+    spark.cassandra.connection.host: 127.0.0.1,
+    sourceTable: documents
\ No newline at end of file
diff --git a/cli/src/tests/resources/maki_invalid.yml b/cli/src/tests/resources/maki_invalid.yml
new file mode 100644
index 0000000..e366d1e
--- /dev/null
+++ b/cli/src/tests/resources/maki_invalid.yml
@@ -0,0 +1,15 @@
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+
+#     http://www.apache.org/licenses/LICENSE-2.0
+
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+boo!
\ No newline at end of file
diff --git a/cli/src/tests/resources/maki_invalid2.yml b/cli/src/tests/resources/maki_invalid2.yml
new file mode 100644
index 0000000..e81ba82
--- /dev/null
+++ b/cli/src/tests/resources/maki_invalid2.yml
@@ -0,0 +1,19 @@
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+
+#     http://www.apache.org/licenses/LICENSE-2.0
+
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+foo: bar
+junk:
+  - kookoo:
+      a: 1
+      b: 2
\ No newline at end of file
diff --git a/cli/src/tests/resources/maki_valid.yml b/cli/src/tests/resources/maki_valid.yml
new file mode 100644
index 0000000..34cad71
--- /dev/null
+++ b/cli/src/tests/resources/maki_valid.yml
@@ -0,0 +1,30 @@
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+
+#     http://www.apache.org/licenses/LICENSE-2.0
+
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+---
+job-name: amaterasu-test # Replace this with your job's name
+flow:
+    - name: start # Name of this step
+      runner:
+          group: spark # Currently supporting spark only, but expect more here in the future!
+          type: scala # scala, sql, r, python
+      file: example.scala # Source code for the step
+      exports:
+          odd: parquet
+    - name: step2
+      runner:
+          group: spark
+          type: python
+      file: example.py
+...
\ No newline at end of file
diff --git a/cli/src/tests/resources/spark.yml b/cli/src/tests/resources/spark.yml
new file mode 100644
index 0000000..566b033
--- /dev/null
+++ b/cli/src/tests/resources/spark.yml
@@ -0,0 +1,16 @@
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+
+#     http://www.apache.org/licenses/LICENSE-2.0
+
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+spark.executor.extraJavaOptions: -XX:+PrintGCDetails
+spark.executor.memory: 1g
\ No newline at end of file
diff --git a/cli/src/tests/utils.py b/cli/src/tests/utils.py
new file mode 100644
index 0000000..de76ba1
--- /dev/null
+++ b/cli/src/tests/utils.py
@@ -0,0 +1,120 @@
+"""
+Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements.  See the NOTICE file distributed with
+this work for additional information regarding copyright ownership.
+The ASF licenses this file to You under the Apache License, Version 2.0
+(the "License"); you may not use this file except in compliance with
+the License.  You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+"""
+from hamcrest.core.base_matcher import BaseMatcher
+import yaml
+import os
+import sys
+
+str_type = str if sys.version_info[0] > 2 else unicode
+
+
+def str_ok(x):
+    return type(x) == str_type and len(x) > 0
+
+
+class MockArgs:
+    def __init__(self, **kwargs):
+        for kwarg, value in kwargs.items():
+            setattr(self, kwarg, value)
+
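+# Walks the given path (skipping .git) and records an lstat entry for every file and
+# directory on context.stats_after, so scenarios can inspect the resulting repository state.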
+def collect_stats(context, path):
+    for base_dir, dirs, files in os.walk(path):
+        if base_dir.endswith('.git'): continue
+        for f in files:
+            try:
+                f_path = os.path.join(base_dir, f)
+                stat = os.lstat(f_path)
+                context.stats_after[f_path] = stat
+            except FileNotFoundError:
+                pass
+        for d in dirs:
+            if d == '.git': continue
+            try:
+                d_path = os.path.join(base_dir, d)
+                stat = os.lstat(d_path)
+                context.stats_after[d_path] = stat
+            except FileNotFoundError:
+                pass
+
+
+# region Custom Matchers
+
+
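+# Hamcrest matcher asserting that a directory contains a maki.yml whose content is
+# empty, valid or invalid, depending on the status passed to the constructor.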
+class MakiExistsInDirectoryAndIsStatus(BaseMatcher):
+    EMPTY = 0
+    VALID = 1
+    INVALID = 2
+
+    VALID_GROUPS = ['spark']
+    VALID_TYPES = ['scala', 'sql', 'python', 'r']
+
+    def __init__(self, status):
+        if status == self.EMPTY:
+            self._content_matcher = self._content_empty
+        elif status == self.VALID:
+            self._content_matcher = self._content_valid
+        elif status == self.INVALID:
+            self._content_matcher = self._content_invalid
+        self.status = status
+
+    def _matches(self, directory):
+        maki_path = os.path.abspath(os.path.join(directory, 'maki.yml'))
+        if not os.path.exists(maki_path):
+            return False
+        with open(maki_path) as f:
+            content = f.read()
+        return self._content_matcher(content)
+
+    def _content_empty(self, content):
+        return content == '' or content is None
+
+    def _content_valid(self, content):
+        maki = yaml.load(content)
+        first_level_ok = 'job_name' in maki and 'flow' in maki
+        if not first_level_ok:
+            return False
+        job_name_ok = str_ok(maki['job_name'])
+        flow_ok = type(maki['flow']) == list and len(maki['flow']) > 0
+        flow_steps_ok = True
+        for step in maki['flow']:
+            step_name_ok = lambda: 'name' in step and str_ok(step['name'])
+            step_runner_ok = lambda: 'runner' in step and type(step['runner']) == dict \
+                                     and 'group' in step['runner'] and str_ok(step['runner']['group']) and step['runner']['group'] in self.VALID_GROUPS \
+                                     and 'type' in step['runner'] and str_ok(step['runner']['type']) and step['runner']['type'] in self.VALID_TYPES
+            file_ok = lambda: 'file' in step and str_ok(step['file'])
+            step_ok = type(step) == dict and step_name_ok() and step_runner_ok() and file_ok()
+            if not step_ok:
+                flow_steps_ok = False
+                break
+        return job_name_ok and flow_ok and flow_steps_ok
+
+    def _content_invalid(self, content):
+        return not self._content_valid(content)
+
+    def describe_to(self, description):
+        description.append_text('Maki exists and is {}'.format(self.status))
+
+
+has_valid_maki = MakiExistsInDirectoryAndIsStatus(MakiExistsInDirectoryAndIsStatus.VALID)
+has_empty_maki = MakiExistsInDirectoryAndIsStatus(MakiExistsInDirectoryAndIsStatus.EMPTY)
+has_invalid_maki = MakiExistsInDirectoryAndIsStatus(MakiExistsInDirectoryAndIsStatus.INVALID)
+
+
+def noop(*args, **kwargs):
+    pass
+
+# endregion
diff --git a/common/src/main/scala/org/apache/amaterasu/common/configuration/ClusterConfig.scala b/common/src/main/scala/org/apache/amaterasu/common/configuration/ClusterConfig.scala
index 3661b48..f299c86 100755
--- a/common/src/main/scala/org/apache/amaterasu/common/configuration/ClusterConfig.scala
+++ b/common/src/main/scala/org/apache/amaterasu/common/configuration/ClusterConfig.scala
@@ -27,12 +27,10 @@ import scala.collection.mutable
 
 class ClusterConfig extends Logging {
 
-  val DEFAULT_FILE: InputStream = getClass.getResourceAsStream("/src/main/scripts/amaterasu.properties")
-  //val DEFAULT_FILE = getClass().getResourceAsStream("/amaterasu.properties")
+  val DEFAULT_FILE: InputStream = getClass.getResourceAsStream("/src/main/scripts/amaterasu.conf")
   var version: String = ""
   var user: String = ""
   var zk: String = ""
-  var mode: String = ""
   var master: String = "127.0.0.1"
   var masterPort: String = "5050"
   var timeout: Double = 600000
@@ -48,6 +46,17 @@ class ClusterConfig extends Logging {
   var additionalClassPath: String = ""
   var spark: Spark = new Spark()
 
+
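+  // Cluster manager selection ("yarn" or "mesos"), loaded from the cluster.manager
+  // property that replaces the removed top-level "mode" setting.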
+  class CLUSTER {
+    var manager: String = ""
+
+    def load(props: Properties): Unit = {
+      if (props.containsKey("cluster.manager")) manager = props.getProperty("cluster.manager")
+    }
+  }
+
+  val CLUSTER = new CLUSTER()
+
   //this should be a filesystem path that is reachable by all executors (HDFS, S3, local)
 
   class YARN {
@@ -190,8 +199,8 @@ class ClusterConfig extends Logging {
   }
 
   def validationCheck(): Unit = {
-    if (!Array("yarn", "mesos").contains(mode)) {
-      throw new ConfigurationException(s"mode $mode is not legal. Options are 'yarn' or 'mesos'!")
+    if (!Array("yarn", "mesos").contains(CLUSTER.manager)) {
+      throw new ConfigurationException(s"cluster manager ${CLUSTER.manager} is not legal. Options are 'yarn' or 'mesos'!")
     }
   }
 
@@ -207,13 +216,13 @@ class ClusterConfig extends Logging {
     if (props.containsKey("master")) master = props.getProperty("master")
     if (props.containsKey("masterPort")) masterPort = props.getProperty("masterPort")
     if (props.containsKey("timeout")) timeout = props.getProperty("timeout").asInstanceOf[Double]
-    if (props.containsKey("mode")) mode = props.getProperty("mode")
     if (props.containsKey("workingFolder")) workingFolder = props.getProperty("workingFolder", s"/user/$user")
     if (props.containsKey("pysparkPath")) pysparkPath = props.getProperty("pysparkPath")
     // TODO: rethink this
     Jar = this.getClass.getProtectionDomain.getCodeSource.getLocation.toURI.getPath
     JarName = Paths.get(this.getClass.getProtectionDomain.getCodeSource.getLocation.getPath).getFileName.toString
 
+    CLUSTER.load(props)
     Jobs.load(props)
     Webserver.load(props)
     YARN.load(props)
diff --git a/executor/src/main/scala/org/apache/amaterasu/executor/execution/actions/runners/spark/PySpark/PySparkRunner.scala b/executor/src/main/scala/org/apache/amaterasu/executor/execution/actions/runners/spark/PySpark/PySparkRunner.scala
index 79fe18a..61fd782 100755
--- a/executor/src/main/scala/org/apache/amaterasu/executor/execution/actions/runners/spark/PySpark/PySparkRunner.scala
+++ b/executor/src/main/scala/org/apache/amaterasu/executor/execution/actions/runners/spark/PySpark/PySparkRunner.scala
@@ -112,7 +112,7 @@ object PySparkRunner {
     if (pyDeps != null)
       condaPkgs = collectCondaPackages()
     var sparkCmd: Seq[String] = Seq()
-    config.mode match {
+    config.CLUSTER.manager match {
       case "yarn" =>
         pysparkPath = s"spark/bin/spark-submit"
         sparkCmd = Seq(pysparkPath, "--py-files", condaPkgs, "--master", "yarn", intpPath, port.toString)
diff --git a/executor/src/main/scala/org/apache/amaterasu/executor/execution/actions/runners/spark/SparkRunnersProvider.scala b/executor/src/main/scala/org/apache/amaterasu/executor/execution/actions/runners/spark/SparkRunnersProvider.scala
index ba7ff03..b9f6963 100644
--- a/executor/src/main/scala/org/apache/amaterasu/executor/execution/actions/runners/spark/SparkRunnersProvider.scala
+++ b/executor/src/main/scala/org/apache/amaterasu/executor/execution/actions/runners/spark/SparkRunnersProvider.scala
@@ -86,7 +86,7 @@ class SparkRunnersProvider extends RunnersProvider with Logging {
     runners.put(sparkScalaRunner.getIdentifier, sparkScalaRunner)
     var pypath = ""
     // TODO: get rid of hard-coded version
-    config.mode match {
+    config.CLUSTER.manager match {
       case "yarn" =>
         pypath = s"$$PYTHONPATH:$$SPARK_HOME/python:$$SPARK_HOME/python/build:${config.spark.home}/python:${config.spark.home}/python/pyspark:${config.spark.home}/python/pyspark/build:${config.spark.home}/python/pyspark/lib/py4j-0.10.4-src.zip:${new File(".").getAbsolutePath}"
       case "mesos" =>
@@ -111,7 +111,7 @@ class SparkRunnersProvider extends RunnersProvider with Logging {
   private def installAnacondaOnNode(): Unit = {
     // TODO: get rid of hard-coded version
 
-    this.clusterConfig.mode match {
+    this.clusterConfig.CLUSTER.manager match {
       case "yarn" => Seq("sh", "-c", "export HOME=$PWD && ./miniconda.sh -b -p miniconda") ! shellLoger
       case "mesos" => Seq("sh", "Miniconda2-latest-Linux-x86_64.sh", "-b", "-p", "miniconda") ! shellLoger
     }
diff --git a/executor/src/main/scala/org/apache/amaterasu/executor/mesos/executors/MesosActionsExecutor.scala b/executor/src/main/scala/org/apache/amaterasu/executor/mesos/executors/MesosActionsExecutor.scala
index 9ab75be..ca1742a 100755
--- a/executor/src/main/scala/org/apache/amaterasu/executor/mesos/executors/MesosActionsExecutor.scala
+++ b/executor/src/main/scala/org/apache/amaterasu/executor/mesos/executors/MesosActionsExecutor.scala
@@ -83,7 +83,7 @@ class MesosActionsExecutor extends Executor with Logging {
     notifier = new MesosNotifier(driver)
     notifier.info(s"Executor ${executorInfo.getExecutorId.getValue} registered")
     val outStream = new ByteArrayOutputStream()
-    providersFactory = ProvidersFactory(data, jobId, outStream, notifier, executorInfo.getExecutorId.getValue, hostName, propFile = "./amaterasu.properties")
+    providersFactory = ProvidersFactory(data, jobId, outStream, notifier, executorInfo.getExecutorId.getValue, hostName, propFile = "./amaterasu.conf")
 
   }
 
diff --git a/executor/src/main/scala/org/apache/amaterasu/executor/yarn/executors/ActionsExecutor.scala b/executor/src/main/scala/org/apache/amaterasu/executor/yarn/executors/ActionsExecutor.scala
index f4f553c..01a22b8 100644
--- a/executor/src/main/scala/org/apache/amaterasu/executor/yarn/executors/ActionsExecutor.scala
+++ b/executor/src/main/scala/org/apache/amaterasu/executor/yarn/executors/ActionsExecutor.scala
@@ -99,6 +99,6 @@ object ActionsExecutorLauncher extends App with Logging {
   val notifier = ActiveNotifier(notificationsAddress)
 
   log.info("Setup notifier")
-  actionsExecutor.providersFactory = ProvidersFactory(execData, jobId, baos, notifier, taskIdAndContainerId, hostName, propFile = "./amaterasu.properties")
+  actionsExecutor.providersFactory = ProvidersFactory(execData, jobId, baos, notifier, taskIdAndContainerId, hostName, propFile = "./amaterasu.conf")
   actionsExecutor.execute()
 }
diff --git a/executor/src/main/scala/org/apache/spark/repl/amaterasu/runners/spark/SparkRunnerHelper.scala b/executor/src/main/scala/org/apache/spark/repl/amaterasu/runners/spark/SparkRunnerHelper.scala
index f2c2afa..12fc25d 100644
--- a/executor/src/main/scala/org/apache/spark/repl/amaterasu/runners/spark/SparkRunnerHelper.scala
+++ b/executor/src/main/scala/org/apache/spark/repl/amaterasu/runners/spark/SparkRunnerHelper.scala
@@ -134,7 +134,7 @@ object SparkRunnerHelper extends Logging {
       env.master
     }
 
-    config.mode match {
+    config.CLUSTER.manager match {
 
       case "mesos" =>
         conf.set("spark.executor.uri", s"http://$getNode:${config.Webserver.Port}/spark-2.2.1-bin-hadoop2.7.tgz")
@@ -160,12 +160,12 @@ object SparkRunnerHelper extends Logging {
           .set("spark.history.fs.logDirectory", "hdfs:///spark2-history/")
           .set("hadoop.home.dir", config.YARN.hadoopHomeDir)
 
-      case _ => throw new Exception(s"mode ${config.mode} is not legal.")
+      case _ => throw new Exception(s"cluster manager ${config.CLUSTER.manager} is not legal.")
     }
 
     if (config.spark.opts != null && config.spark.opts.nonEmpty) {
       config.spark.opts.foreach(kv => {
-        log.info(s"Setting ${kv._1} to ${kv._2} as specified in amaterasu.properties")
+        log.info(s"Setting ${kv._1} to ${kv._2} as specified in amaterasu.conf")
         conf.set(kv._1, kv._2)
       })
     }
diff --git a/executor/src/test/resources/amaterasu.properties b/executor/src/test/resources/amaterasu.properties
index d402fed..1a67320 100755
--- a/executor/src/test/resources/amaterasu.properties
+++ b/executor/src/test/resources/amaterasu.properties
@@ -2,7 +2,7 @@ zk=127.0.0.1
 version=0.2.0-incubating
 master=192.168.33.11
 user=root
-mode=mesos
+cluster.manager=mesos
 webserver.port=8000
 webserver.root=dist
 spark.version=2.1.1-bin-hadoop2.7
diff --git a/executor/src/test/scala/org/apache/amaterasu/spark/SparkTestsSuite.scala b/executor/src/test/scala/org/apache/amaterasu/spark/SparkTestsSuite.scala
index b11a4f9..19cd9ce 100644
--- a/executor/src/test/scala/org/apache/amaterasu/spark/SparkTestsSuite.scala
+++ b/executor/src/test/scala/org/apache/amaterasu/spark/SparkTestsSuite.scala
@@ -80,7 +80,7 @@ class SparkTestsSuite extends Suites(
       new TestNotifier(),
       "test",
       "localhost",
-      getClass.getClassLoader.getResource("amaterasu.properties").getPath)
+      getClass.getClassLoader.getResource("amaterasu.conf").getPath)
     spark = factory.getRunner("spark", "scala").get.asInstanceOf[SparkScalaRunner].spark
 
     this.nestedSuites.filter(s => s.isInstanceOf[RunnersLoadingTests]).foreach(s => s.asInstanceOf[RunnersLoadingTests].factory = factory)
diff --git a/leader/src/main/java/org/apache/amaterasu/leader/yarn/ArgsParser.java b/leader/src/main/java/org/apache/amaterasu/leader/yarn/ArgsParser.java
index be0fc05..b5d90eb 100644
--- a/leader/src/main/java/org/apache/amaterasu/leader/yarn/ArgsParser.java
+++ b/leader/src/main/java/org/apache/amaterasu/leader/yarn/ArgsParser.java
@@ -30,6 +30,7 @@ private static Options getOptions() {
         options.addOption("j", "new-job-id", true, "The jobId - should never be passed by a user");
         options.addOption("r", "report", true, "The level of reporting");
         options.addOption("h", "home", true, "The level of reporting");
+        options.addOption("c", "config-file", true, "Path to where the Amaterasu configuration resides. Usually it should be at ~/.amaterasu/");
 
         return options;
     }
@@ -72,6 +73,10 @@ public static JobOpts getJobOpts(String[] args) throws ParseException {
             opts.name = cli.getOptionValue("name");
         }
 
+        if (cli.hasOption("config-file")) {
+            opts.configFile = cli.getOptionValue("config-file");
+        }
+
         return opts;
     }
 }
diff --git a/leader/src/main/java/org/apache/amaterasu/leader/yarn/Client.java b/leader/src/main/java/org/apache/amaterasu/leader/yarn/Client.java
index e3c2812..f3eb57b 100644
--- a/leader/src/main/java/org/apache/amaterasu/leader/yarn/Client.java
+++ b/leader/src/main/java/org/apache/amaterasu/leader/yarn/Client.java
@@ -73,7 +73,7 @@ private void run(JobOpts opts, String[] args) throws Exception {
 
         LogManager.resetConfiguration();
         ClusterConfig config = new ClusterConfig();
-        config.load(new FileInputStream(opts.home + "/amaterasu.properties"));
+        config.load(new FileInputStream(opts.configFile));
 
         // Create yarnClient
         YarnClient yarnClient = YarnClient.createYarnClient();
@@ -138,6 +138,7 @@ private void run(JobOpts opts, String[] args) throws Exception {
                 for (File f : home.listFiles()) {
                     fs.copyFromLocalFile(false, true, new Path(f.getAbsolutePath()), jarPathQualified);
                 }
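+                // Also ship the configuration file passed via --config-file to HDFS as amaterasu.conf.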
+                fs.copyFromLocalFile(false, true, new Path(opts.configFile), Path.mergePaths(jarPathQualified, new Path("/amaterasu.conf")));
 
                 // setup frameworks
                 FrameworkProvidersFactory frameworkFactory = FrameworkProvidersFactory.apply(opts.env, config);
@@ -182,24 +183,25 @@ private void run(JobOpts opts, String[] args) throws Exception {
 
         try {
             leaderJar = setLocalResourceFromPath(mergedPath);
-            propFile = setLocalResourceFromPath(Path.mergePaths(jarPath, new Path("/amaterasu.properties")));
+            propFile = setLocalResourceFromPath(Path.mergePaths(jarPath, new Path("/etc/amaterasu/amaterasu.conf")));
             log4jPropFile = setLocalResourceFromPath(Path.mergePaths(jarPath, new Path("/log4j.properties")));
         } catch (IOException e) {
             LOGGER.error("Error initializing yarn local resources.", e);
+            System.out.printf("ERROR: %s", e.getMessage());
             exit(4);
         }
 
         // set local resource on master container
         Map<String, LocalResource> localResources = new HashMap<>();
         localResources.put("leader.jar", leaderJar);
-        localResources.put("amaterasu.properties", propFile);
+        localResources.put("amaterasu.conf", propFile);
         localResources.put("log4j.properties", log4jPropFile);
         amContainer.setLocalResources(localResources);
 
         // Setup CLASSPATH for ApplicationMaster
         Map<String, String> appMasterEnv = new HashMap<>();
         setupAppMasterEnv(appMasterEnv);
-        appMasterEnv.put("AMA_CONF_PATH", String.format("%s/amaterasu.properties", config.YARN().hdfsJarsPath()));
+        appMasterEnv.put("AMA_CONF_PATH", String.format("%s/amaterasu.conf", config.YARN().hdfsJarsPath()));
         amContainer.setEnvironment(appMasterEnv);
 
         // Set up resource type requirements for ApplicationMaster
diff --git a/leader/src/main/java/org/apache/amaterasu/leader/yarn/JobOpts.java b/leader/src/main/java/org/apache/amaterasu/leader/yarn/JobOpts.java
index b8c29b7..0019012 100644
--- a/leader/src/main/java/org/apache/amaterasu/leader/yarn/JobOpts.java
+++ b/leader/src/main/java/org/apache/amaterasu/leader/yarn/JobOpts.java
@@ -25,4 +25,5 @@
     public String newJobId = null;
     public String report ="code";
     public String home ="";
+    public String configFile = "";
 }
\ No newline at end of file
diff --git a/leader/src/main/scala/org/apache/amaterasu/leader/mesos/schedulers/JobScheduler.scala b/leader/src/main/scala/org/apache/amaterasu/leader/mesos/schedulers/JobScheduler.scala
index 87a8f5d..1764112 100755
--- a/leader/src/main/scala/org/apache/amaterasu/leader/mesos/schedulers/JobScheduler.scala
+++ b/leader/src/main/scala/org/apache/amaterasu/leader/mesos/schedulers/JobScheduler.scala
@@ -205,7 +205,7 @@ class JobScheduler extends AmaterasuScheduler {
                     .setExtract(false)
                     .build())
                   .addUris(URI.newBuilder()
-                    .setValue(s"http://${sys.env("AMA_NODE")}:${config.Webserver.Port}/amaterasu.properties")
+                    .setValue(s"http://${sys.env("AMA_NODE")}:${config.Webserver.Port}/amaterasu.conf")
                     .setExecutable(false)
                     .setExtract(false)
                     .build())
diff --git a/leader/src/main/scala/org/apache/amaterasu/leader/utilities/Args.scala b/leader/src/main/scala/org/apache/amaterasu/leader/utilities/Args.scala
index c005256..7721a3d 100644
--- a/leader/src/main/scala/org/apache/amaterasu/leader/utilities/Args.scala
+++ b/leader/src/main/scala/org/apache/amaterasu/leader/utilities/Args.scala
@@ -24,10 +24,11 @@ case class Args(
                  jobId: String = null,
                  report: String = "code",
                  home: String = "",
+                 configFile: String = "",
                  newJobId: String = ""
                ) {
   def toCmdString: String = {
-    var cmd = s""" --repo $repo --branch $branch --env $env --name $name --report $report --home $home"""
+    var cmd = s""" --repo $repo --branch $branch --env $env --name $name --report $report --home $home --config-file $configFile"""
     if(jobId != null && !jobId.isEmpty) {
       cmd += s" --job-id $jobId"
     }
@@ -77,6 +78,9 @@ object Args {
       opt[String]('h', "home") action { (x, c) =>
         c.copy(home = x)
       }
+      opt[String]('c', "config-file") action { (x, c) =>
+        c.copy(configFile = x)
+      } text "Path to where the Amaterasu configuration resides. Default: /etc/amaterasu/amaterasu.conf"
     }
   }
 }
diff --git a/leader/src/main/scala/org/apache/amaterasu/leader/utilities/BaseJobLauncher.scala b/leader/src/main/scala/org/apache/amaterasu/leader/utilities/BaseJobLauncher.scala
index d1d0c53..f8d75e6 100644
--- a/leader/src/main/scala/org/apache/amaterasu/leader/utilities/BaseJobLauncher.scala
+++ b/leader/src/main/scala/org/apache/amaterasu/leader/utilities/BaseJobLauncher.scala
@@ -31,7 +31,7 @@ abstract class BaseJobLauncher extends App with Logging {
 
     case Some(arguments: Args) =>
 
-      val config = ClusterConfig(new FileInputStream(s"${arguments.home}/amaterasu.properties"))
+      val config = ClusterConfig(new FileInputStream(arguments.configFile))
       val resume = arguments.jobId != null
 
       run(arguments, config, resume)
diff --git a/leader/src/main/scala/org/apache/amaterasu/leader/utilities/HttpServer.scala b/leader/src/main/scala/org/apache/amaterasu/leader/utilities/HttpServer.scala
index 2e01963..5d852eb 100644
--- a/leader/src/main/scala/org/apache/amaterasu/leader/utilities/HttpServer.scala
+++ b/leader/src/main/scala/org/apache/amaterasu/leader/utilities/HttpServer.scala
@@ -44,6 +44,11 @@ object HttpServer extends Logging {
 
     BasicConfigurator.configure()
     initLogging()
+    log.debug(s"serverRoot $serverRoot")
+    log.info(s"serverRoot $serverRoot")
+    log.error(s"serverRoot $serverRoot")
+    log.warn(s"serverRoot $serverRoot")
+    println(s"serverRoot $serverRoot")
 
     server = new Server()
     val connector = new ServerConnector(server)
@@ -54,8 +59,11 @@ object HttpServer extends Logging {
     handler.setDirectoriesListed(true)
     handler.setWelcomeFiles(Array[String]("index.html"))
     handler.setResourceBase(serverRoot)
+    val configHandler = new ResourceHandler()
+    configHandler.setDirectoriesListed(true)
+    configHandler.setResourceBase("/etc/amaterasu")
     val handlers = new HandlerList()
-    handlers.setHandlers(Array(handler, new DefaultHandler()))
+    handlers.setHandlers(Array(handler, configHandler, new DefaultHandler()))
 
     server.setHandler(handlers)
     server.start()
diff --git a/leader/src/main/scala/org/apache/amaterasu/leader/yarn/ApplicationMaster.scala b/leader/src/main/scala/org/apache/amaterasu/leader/yarn/ApplicationMaster.scala
index 1828100..f1ce969 100644
--- a/leader/src/main/scala/org/apache/amaterasu/leader/yarn/ApplicationMaster.scala
+++ b/leader/src/main/scala/org/apache/amaterasu/leader/yarn/ApplicationMaster.scala
@@ -103,7 +103,7 @@ class ApplicationMaster extends AMRMClientAsync.CallbackHandler with Logging {
 
     log.info(s"started AM with args $arguments")
 
-    propPath = System.getenv("PWD") + "/amaterasu.properties"
+    propPath = System.getenv("PWD") + "/amaterasu.conf"
     props = new FileInputStream(new File(propPath))
 
     // no need for hdfs double check (nod to Aaron Rodgers)
@@ -139,7 +139,7 @@ class ApplicationMaster extends AMRMClientAsync.CallbackHandler with Logging {
     executorPath = Path.mergePaths(jarPath, new Path(s"/dist/executor-${config.version}-all.jar"))
     log.info("Executor jar path is {}", executorPath)
     executorJar = setLocalResourceFromPath(executorPath)
-    propFile = setLocalResourceFromPath(Path.mergePaths(jarPath, new Path("/amaterasu.properties")))
+    propFile = setLocalResourceFromPath(Path.mergePaths(jarPath, new Path("/amaterasu.conf")))
     log4jPropFile = setLocalResourceFromPath(Path.mergePaths(jarPath, new Path("/log4j.properties")))
 
     log.info("Started execute")
@@ -268,7 +268,7 @@ class ApplicationMaster extends AMRMClientAsync.CallbackHandler with Logging {
 
         val resources = mutable.Map[String, LocalResource](
           "executor.jar" -> executorJar,
-          "amaterasu.properties" -> propFile,
+          "amaterasu.conf" -> propFile,
           // TODO: Nadav/Eyal all of these should move to the executor resource setup
           "miniconda.sh" -> setLocalResourceFromPath(Path.mergePaths(jarPath, new Path("/dist/Miniconda2-latest-Linux-x86_64.sh"))),
           "codegen.py" -> setLocalResourceFromPath(Path.mergePaths(jarPath, new Path("/dist/codegen.py"))),
diff --git a/leader/src/main/scripts/ama-start-mesos.sh b/leader/src/main/scripts/ama-start-mesos.sh
index 18dbed9..3072688 100755
--- a/leader/src/main/scripts/ama-start-mesos.sh
+++ b/leader/src/main/scripts/ama-start-mesos.sh
@@ -130,7 +130,7 @@ if [ ! -f ${BASEDIR}/dist/Miniconda2-latest-Linux-x86_64.sh ]; then
     echo "${bold}Fetching miniconda distributable ${NC}"
     wget https://repo.continuum.io/miniconda/Miniconda2-latest-Linux-x86_64.sh -P ${BASEDIR}/dist
 fi
-cp ${BASEDIR}/amaterasu.properties ${BASEDIR}/dist
+cp /etc/amaterasu/amaterasu.conf ${BASEDIR}/dist
 eval $CMD | grep "===>"
 
 echo ""
diff --git a/leader/src/main/scripts/amaterasu.properties b/leader/src/main/scripts/amaterasu.conf
similarity index 100%
rename from leader/src/main/scripts/amaterasu.properties
rename to leader/src/main/scripts/amaterasu.conf
diff --git a/leader/src/test/resources/amaterasu.properties b/leader/src/test/resources/amaterasu.conf
similarity index 95%
rename from leader/src/test/resources/amaterasu.properties
rename to leader/src/test/resources/amaterasu.conf
index f16a0e1..09aad15 100755
--- a/leader/src/test/resources/amaterasu.properties
+++ b/leader/src/test/resources/amaterasu.conf
@@ -19,7 +19,7 @@ master=192.168.33.11
 
 user=root
 mode=mesos
-
+cluster.manager=mesos
 webserver.port=8000
 
 webserver.root=dist
diff --git a/leader/src/test/scala/org/apache/amaterasu/leader/mesos/ClusterSchedulerTests.scala b/leader/src/test/scala/org/apache/amaterasu/leader/mesos/ClusterSchedulerTests.scala
index ac5af36..19cb92f 100755
--- a/leader/src/test/scala/org/apache/amaterasu/leader/mesos/ClusterSchedulerTests.scala
+++ b/leader/src/test/scala/org/apache/amaterasu/leader/mesos/ClusterSchedulerTests.scala
@@ -26,7 +26,7 @@ class ClusterSchedulerTests extends FlatSpec with Matchers {
   "an offer" should "be accepted if has enough resources" in {
 
     val kami = Kami()
-    val config = ClusterConfig(getClass.getResourceAsStream("/amaterasu.properties"))
+    val config = ClusterConfig(getClass.getResourceAsStream("/amaterasu.conf"))
     config.Jobs.cpus = 1
     config.Jobs.mem = 1024
     config.Jobs.repoSize = 1024
@@ -42,7 +42,7 @@ class ClusterSchedulerTests extends FlatSpec with Matchers {
   it should "not be accepted if has missing resources" in {
 
     val kami = Kami()
-    val config = ClusterConfig(getClass.getResourceAsStream("/amaterasu.properties"))
+    val config = ClusterConfig(getClass.getResourceAsStream("/amaterasu.conf"))
     config.Jobs.cpus = 1
     config.Jobs.mem = 1024
     config.Jobs.repoSize = 1024
diff --git a/settings.gradle b/settings.gradle
index 1056e01..51ff8f1 100644
--- a/settings.gradle
+++ b/settings.gradle
@@ -18,5 +18,6 @@ include 'leader'
 include 'executor'
 include 'common'
 include 'sdk'
+include 'cli'
 findProject(':sdk')?.name = 'amaterasu-sdk'
 


 
