Posted to commits@climate.apache.org by jo...@apache.org on 2014/07/01 16:50:14 UTC

[50/56] [partial] gh-pages clean up

http://git-wip-us.apache.org/repos/asf/climate/blob/a53e3af5/docs/source/data_source/data_sources.rst
----------------------------------------------------------------------
diff --git a/docs/source/data_source/data_sources.rst b/docs/source/data_source/data_sources.rst
deleted file mode 100644
index c368d86..0000000
--- a/docs/source/data_source/data_sources.rst
+++ /dev/null
@@ -1,17 +0,0 @@
-Data Sources
-************
-
-Local Module
-============
-.. automodule:: local
-    :members:
-
-RCMED Module
-============
-.. automodule:: rcmed
-    :members:
-
-DAP Module
-==========
-.. automodule:: dap
-    :members:
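The automodule directives in the deleted file above use bare module names (local, rcmed, dap), which only resolve if Sphinx can import them. A conf.py snippet along these lines would put the packages on the import path — the directory layout shown is an assumption, not taken from the repository:

```python
# conf.py (sketch): extend sys.path so the bare module names used by
# the automodule directives (local, rcmed, dap) can be imported.
# The relative paths below are assumed, not verified against the repo.
import os
import sys

sys.path.insert(0, os.path.abspath('../../ocw'))
sys.path.insert(0, os.path.abspath('../../ocw/data_source'))
```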

http://git-wip-us.apache.org/repos/asf/climate/blob/a53e3af5/docs/source/index.rst
----------------------------------------------------------------------
diff --git a/docs/source/index.rst b/docs/source/index.rst
deleted file mode 100644
index f57dd82..0000000
--- a/docs/source/index.rst
+++ /dev/null
@@ -1,29 +0,0 @@
-.. Apache Open Climate Workbench documentation master file, created by
-   sphinx-quickstart on Fri Oct 25 07:58:45 2013.
-   You can adapt this file completely to your liking, but it should at least
-   contain the root `toctree` directive.
-
-Welcome to Apache Open Climate Workbench's documentation!
-=========================================================
-
-Contents:
-
-.. toctree::
-   :maxdepth: 4
-
-   ocw/dataset
-   ocw/dataset_processor
-   ocw/evaluation
-   ocw/metrics
-   ocw/plotter
-   data_source/data_sources
-   ui-backend/backend.rst
-
-
-Indices and tables
-==================
-
-* :ref:`genindex`
-* :ref:`modindex`
-* :ref:`search`
-

http://git-wip-us.apache.org/repos/asf/climate/blob/a53e3af5/docs/source/ocw/dataset.rst
----------------------------------------------------------------------
diff --git a/docs/source/ocw/dataset.rst b/docs/source/ocw/dataset.rst
deleted file mode 100644
index 7e8d172..0000000
--- a/docs/source/ocw/dataset.rst
+++ /dev/null
@@ -1,12 +0,0 @@
-Dataset Module
-**************
-
-Bounds
-======
-.. autoclass:: dataset.Bounds
-    :members:
-
-Dataset
-=======
-.. autoclass:: dataset.Dataset
-    :members:

http://git-wip-us.apache.org/repos/asf/climate/blob/a53e3af5/docs/source/ocw/dataset_processor.rst
----------------------------------------------------------------------
diff --git a/docs/source/ocw/dataset_processor.rst b/docs/source/ocw/dataset_processor.rst
deleted file mode 100644
index b221ef8..0000000
--- a/docs/source/ocw/dataset_processor.rst
+++ /dev/null
@@ -1,5 +0,0 @@
-Dataset Processor Module
-************************
-
-.. automodule:: dataset_processor
-   :members:

http://git-wip-us.apache.org/repos/asf/climate/blob/a53e3af5/docs/source/ocw/evaluation.rst
----------------------------------------------------------------------
diff --git a/docs/source/ocw/evaluation.rst b/docs/source/ocw/evaluation.rst
deleted file mode 100644
index 63a15a5..0000000
--- a/docs/source/ocw/evaluation.rst
+++ /dev/null
@@ -1,5 +0,0 @@
-Evaluation Module
-*****************
-
-.. autoclass:: evaluation.Evaluation
-    :members:

http://git-wip-us.apache.org/repos/asf/climate/blob/a53e3af5/docs/source/ocw/metrics.rst
----------------------------------------------------------------------
diff --git a/docs/source/ocw/metrics.rst b/docs/source/ocw/metrics.rst
deleted file mode 100644
index 93d7ce3..0000000
--- a/docs/source/ocw/metrics.rst
+++ /dev/null
@@ -1,22 +0,0 @@
-Metrics Module
-**************
-
-UnaryMetric
-===========
-.. autoclass:: metrics.UnaryMetric
-    :members:
-
-BinaryMetric
-============
-.. autoclass:: metrics.BinaryMetric
-    :members:
-
-Bias
-====
-.. autoclass:: metrics.Bias
-    :members:
-
-TemporalStdDev
-==============
-.. autoclass:: metrics.TemporalStdDev
-    :members:

http://git-wip-us.apache.org/repos/asf/climate/blob/a53e3af5/docs/source/ocw/plotter.rst
----------------------------------------------------------------------
diff --git a/docs/source/ocw/plotter.rst b/docs/source/ocw/plotter.rst
deleted file mode 100644
index 1972efc..0000000
--- a/docs/source/ocw/plotter.rst
+++ /dev/null
@@ -1,5 +0,0 @@
-Plotter Module
-**************
-
-.. automodule:: plotter
-    :members:

http://git-wip-us.apache.org/repos/asf/climate/blob/a53e3af5/docs/source/ui-backend/backend.rst
----------------------------------------------------------------------
diff --git a/docs/source/ui-backend/backend.rst b/docs/source/ui-backend/backend.rst
deleted file mode 100644
index f2fe6fb..0000000
--- a/docs/source/ui-backend/backend.rst
+++ /dev/null
@@ -1,80 +0,0 @@
-Evaluation UI Webservices
-*************************
-
-The OCW evaluation UI is a demonstration web application that is built upon the
-OCW toolkit. The web services for the application are written in Python on top
-of the Bottle Web Framework.
-
-Configuration and Dependencies
-==============================
-
-The Evaluation UI is built on top of the OCW toolkit and as such requires it to
-function properly. Please check the toolkit's documentation for relevant
-installation instructions. You will also need to ensure that you have Bottle
-installed. You can install it with:
-
-.. code::
-    
-    pip install bottle
-
-The backend serves the static files for the evaluation frontend as well. If you
-plan to use the frontend you need to ensure that the *app* directory is present
-in the main web service directory. The easiest way to do this is to create a
-symbolic link where the *run_webservices* module is located. Assuming you have
-the entire *ocw-ui* directory, you can do this with the following command.
-
-.. code::
-
-    cd ocw-ui/backend
-    ln -s ../frontend/app app
-
-Finally, to start the backend just run the following command.
-
-.. code::
-
-    python run_webservices.py
-    
-Web Service Explanation
-=======================
-
-The backend endpoints are broken up into a number of modules for ease of
-maintenance and understanding. The *run_webservices* module is the primary
-application module. It brings together all the various submodules into a
-useful system. It also defines a number of helpful endpoints for returning
-static files such as the index page, CSS files, JavaScript files, and more.
-
-Local File Metadata Extractors
-------------------------------
-
-The *local_file_metadata_extractors* module contains all the endpoints that are
-used to strip information out of various objects for display in the UI. At the
-moment, the main functionality is stripping out metadata from NetCDF files when
-a user wishes to *load* a local file into the evaluation.
-
-.. autobottle:: local_file_metadata_extractors:lfme_app
-
-Directory Helpers
------------------
-
-The *directory_helpers* module contains a number of endpoints for working
-directory manipulation. The frontend uses these endpoints to grab directory
-information (within a prefix path for security), return result directory
-information, and other things.
-
-.. autobottle:: directory_helpers:dir_app
-
-RCMED Helpers
--------------
-
-The *rcmed_helpers* module contains endpoints for loading datasets from the
-Regional Climate Model Evaluation Database at NASA's Jet Propulsion Laboratory.
-
-.. autobottle:: rcmed_helpers:rcmed_app
-
-Processing Endpoints
---------------------
-
-The *processing* module contains all the endpoints related to the running of
-evaluations.
-
-.. autobottle:: processing:processing_app
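The note above about serving directory information "within a prefix path for security" boils down to a path-containment check before anything is listed or returned. A minimal sketch of that idea — the prefix value and function name are hypothetical, not taken from the deleted *directory_helpers* module:

```python
import os.path

ALLOWED_PREFIX = '/usr/local/ocw'  # hypothetical configured prefix path

def is_within_prefix(requested_path):
    # realpath() collapses '..' components and symlinks, so traversal
    # tricks such as '/usr/local/ocw/../../etc' are resolved before the
    # prefix comparison is made.
    resolved = os.path.realpath(requested_path)
    return resolved == ALLOWED_PREFIX or resolved.startswith(ALLOWED_PREFIX + os.sep)
```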

http://git-wip-us.apache.org/repos/asf/climate/blob/a53e3af5/easy-ocw/install-osx.sh
----------------------------------------------------------------------
diff --git a/easy-ocw/install-osx.sh b/easy-ocw/install-osx.sh
deleted file mode 100755
index ca7581f..0000000
--- a/easy-ocw/install-osx.sh
+++ /dev/null
@@ -1,220 +0,0 @@
-#!/bin/bash
-#
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#    http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-export ARCHFLAGS=-Wno-error=unused-command-line-argument-hard-error-in-future
-
-help()
-{
-cat << ENDHELP
-
-Easy OCW assists with the building of the Apache Open Climate Workbench and its dependencies
-
-Flags:
-    -h  Display this help message.
-    -e  Install and configure a virtualenv environment before installation.
-    -q  Quiet install. User prompts are removed (when possible).
-
-It is recommended that you pass -e when running this script. If you don't, parts
-of this installation will pollute your global Python install. If you're unsure,
-pass -e just to be safe!
-
-ENDHELP
-}
-
-header()
-{
-    echo
-    echo $1
-}
-
-task()
-{
-    echo " - " $1
-}
-
-subtask()
-{
-    echo "     " $1
-}
-
-echo
-echo "---------------------------------------------------------------------------"
-echo "                         Welcome to Easy OCW"
-echo "---------------------------------------------------------------------------"
-echo
-
-WITH_VIRTUAL_ENV=0
-WITH_HOMEBREW=0
-WITH_INTERACT=1
-INIT_PWD=$PWD
-while getopts ":h :e :q" FLAG
-do
-    case $FLAG in
-        h)
-            help
-            exit 1
-            ;;
-        e)
-            WITH_VIRTUAL_ENV=1
-            ;;
-        q)
-            WITH_INTERACT=0
-            ;;
-        ?)
-            help
-            exit 1
-            ;;
-    esac
-done
-
-if [ $WITH_INTERACT == 1 ]; then
-cat << ENDINTRO
-A number of dependencies for OCW will now be installed. Please check the wiki
-for a complete list of dependencies. Additionally, please read the wiki for
-useful installation guidelines and information that may be pertinent to your
-situation. All of this can be found at http://s.apache.org/3p2
-
-ENDINTRO
-
-if [ $WITH_VIRTUAL_ENV != 1 ]; then
-cat << VIRTUALENV_WARNING
-It is highly recommended that you allow Easy OCW to install the dependencies
-into a virtualenv environment to ensure that your global Python install is
-not affected. If you're unsure, you should pass the -e flag
-to this script. If you aren't concerned, or you want to create your own
-virtualenv environment, then feel free to ignore this message.
-
-VIRTUALENV_WARNING
-fi
-
-read -p "Press [ENTER] to begin installation ..."
-fi
-
-header "Checking for pip ..."
-if ! command -v pip >/dev/null 2>&1; then
-    task "Unable to locate pip."
-    task "Installing Distribute"
-    curl http://python-distribute.org/distribute_setup.py | python >> install_log
-    subtask "done"
-
-    task "Installing Pip"
-    curl -O http://pypi.python.org/packages/source/p/pip/pip-1.2.1.tar.gz >> install_log
-    tar xzf pip-1.2.1.tar.gz >> install_log
-    cd pip-1.2.1/
-    python setup.py install >> install_log
-    subtask "done"
-fi
-
-if [ $WITH_VIRTUAL_ENV == 1 ]; then
-    header "Setting up a virtualenv ..."
-
-    # Check if virtualenv is installed. If it's not, we'll install it for the user.
-    if ! command -v virtualenv >/dev/null 2>&1; then
-        task "Installing virtualenv ..."
-        pip install virtualenv >> install_log
-        subtask "done"
-    fi
-
-    header "Checking for a previously installed ocw virtual environment ..."
-    if [ -e ~/ocw/bin/python ]; then
-        echo "We found an existing 'ocw' virtualenv on your system in ~/ocw."
-        read -n1 -p "Do you want to replace it with a clean install? [y/n]: " replace
-        if [ "$replace" == "y" ]; then
-            echo ""
-            echo "WARNING: this will delete all files and data in ~/ocw on your system."
-            read -p "To confirm and proceed type YES, or press ENTER to quit: " confirm
-            if [ "$confirm" == "YES" ]; then
-                echo "Deleting contents of ~/ocw" >> install_log
-                rm -rf ~/ocw
-            else
-                echo ""
-                echo "Stopping Open Climate Workbench Installation"
-                exit
-            fi
-        else
-            echo ""
-            echo "Stopping Open Climate Workbench Installation because an existing 'ocw'"
-            echo "virtual environment already exists."
-            exit
-        fi
-    fi
-
-    # Create a new environment for OCW work
-    task "Creating a new environment in ~/ocw..."
-    cd ~
-    virtualenv ocw >> install_log
-    source ~/ocw/bin/activate >> install_log
-    cd $INIT_PWD
-    subtask "done"
-fi
-
-header "Installing conda for package management ..."
-
-pip install conda >> install_log
-conda init >> install_log
-
-
-header "Installing dependencies with conda ..."
-echo | conda install --file ocw-conda-dependencies.txt
-
-# We only use conda for the annoying dependencies like numpy,
-# scipy, matplotlib, and basemap. For everything else, we stick
-# with pip.
-header "Installing additional Python packages"
-pip install -r ocw-pip-dependencies.txt >> install_log
-
-
-if [ $WITH_VIRTUAL_ENV == 1 ]; then
-    echo "***POST INSTALLATION NOTE***
-
-If you are familiar with virtualenv you should know that activating
-the new 'ocw' environment is different because we use conda to install
-packages.  An example of the command you want to run is listed in the
-alias below.
-
-To make it easier to change into the 'ocw' virtualenv add the
-following alias to your ~/.bash_profile
-
-    alias ocw='source ~/ocw/bin/activate ~/ocw/'
-
-When you want to use ocw in the future, you just have to type 'ocw'
-in your terminal."
-else
-    echo "***POST INSTALLATION NOTE***
-
-If you have run this script within your own virtualenv you need to know
-a couple of caveats/side effects that are caused by using conda to install
-packages within the virtualenv.
-
-- Virtualenv wrapper will throw errors like those outlined here:
-https://issues.apache.org/jira/browse/CLIMATE-451
-
-- You will not be able to 'activate' the environment using the normal
-virtualenv command, you must instead use the conda activate command as follows:
-
-source path/to/your_env/bin/activate path/to/your_env
-
-Example:  (assuming your env is in ~/.virtualenv/ocw)
-
-source ~/.virtualenv/ocw/bin/activate ~/.virtualenv/ocw
-
-"
-
-fi
-

http://git-wip-us.apache.org/repos/asf/climate/blob/a53e3af5/easy-ocw/install-ubuntu-12_04.sh
----------------------------------------------------------------------
diff --git a/easy-ocw/install-ubuntu-12_04.sh b/easy-ocw/install-ubuntu-12_04.sh
deleted file mode 100755
index 2270eb0..0000000
--- a/easy-ocw/install-ubuntu-12_04.sh
+++ /dev/null
@@ -1,189 +0,0 @@
-#!/bin/bash
-#
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#    http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-help()
-{
-cat << ENDHELP
-
-Easy OCW assists with the building of the Apache Open Climate Workbench and its dependencies
-
-Flags:
-    -h  Display this help message.
-    -e  Install and configure a virtualenv environment before installation.
-    -q  Quiet install. User prompts are removed (when possible).
-
-It is recommended that you pass -e when running this script. If you don't, parts
-of this installation will pollute your global Python install. If you're unsure,
-pass -e just to be safe!
-
-ENDHELP
-}
-
-header()
-{
-    echo
-    echo $1
-}
-
-task()
-{
-    echo " - " $1
-}
-
-subtask()
-{
-    echo "     " $1
-}
-
-echo
-echo "---------------------------------------------------------------------------"
-echo "                         Welcome to Easy OCW"
-echo "---------------------------------------------------------------------------"
-echo
-
-WITH_VIRTUAL_ENV=0
-WITH_HOMEBREW=0
-WITH_INTERACT=1
-
-while getopts ":h :e :q" FLAG
-do
-    case $FLAG in
-        h)
-            help
-            exit 1
-            ;;
-        e)
-            WITH_VIRTUAL_ENV=1
-            ;;
-        q)
-            WITH_INTERACT=0
-            ;;
-        ?)
-            help
-            exit 1
-            ;;
-    esac
-done
-
-if [ $WITH_INTERACT == 1 ]; then
-cat << ENDINTRO
-A number of dependencies for OCW will now be installed. Please check the wiki
-for a complete list of dependencies. Additionally, please read the wiki for
-useful installation guidelines and information that may be pertinent to your
-situation. All of this can be found at http://s.apache.org/3p2
-
-ENDINTRO
-
-if [ $WITH_VIRTUAL_ENV != 1 ]; then
-cat << VIRTUALENV_WARNING
-It is highly recommended that you allow Easy OCW to install the dependencies
-into a virtualenv environment to ensure that your global Python install is
-not affected. If you're unsure, you should pass the -e flag
-to this script. If you aren't concerned, or you want to create your own
-virtualenv environment, then feel free to ignore this message.
-
-VIRTUALENV_WARNING
-fi
-
-read -p "Press [ENTER] to begin installation ..."
-fi
-
-header "Checking for pip ..."
-if ! command -v pip >/dev/null 2>&1; then
-    task "Unable to locate pip."
-    task "Installing Pip"
-    sudo apt-get install python-pip >> install_log
-    subtask "done"
-fi
-
-if [ $WITH_VIRTUAL_ENV == 1 ]; then
-    header "Setting up a virtualenv ..."
-
-    # Check if virtualenv is installed. If it's not, we'll install it for the user.
-    if ! command -v virtualenv >/dev/null 2>&1; then
-        task "Installing virtualenv ..."
-        sudo apt-get install -y python-virtualenv >> install_log
-        subtask "done"
-    fi
-
-    # Create a new environment for OCW work
-    task "Creating a new environment ..."
-    virtualenv ocw >> install_log
-    source ocw/bin/activate
-    subtask "done"
-fi
-
-# Install Continuum Analytics Anaconda Python distribution. This gives
-# almost all the dependencies that OCW needs in a single, easy to
-# install package.
-
-header "Installing Anaconda Python distribution ..."
-echo
-echo "*** NOTE *** When asked to update your PATH, you should respond YES."
-read -p "Press [ENTER] to continue ..."
-
-cd
-task "Downloading Anaconda ..."
-wget -O Anaconda-1.9.2-Linux-x86_64.sh "http://repo.continuum.io/archive/Anaconda-1.9.2-Linux-x86_64.sh" 2> install_log
-subtask "done"
-
-task "Installing ..."
-bash Anaconda-1.9.2-Linux-x86_64.sh
-export PATH="/home/vagrant/anaconda/bin:$PATH"
-subtask "done"
-
-# Install Basemap. Conda cannot be used for this install since
-# it fails to analyse the dependencies (at the time of writing). This
-# will install it manually. At some point, this should be replaced with
-# 'conda install basemap' once it is working again!
-header "Handling Basemap install ..."
-
-cd
-task "Downloading basemap ..."
-wget -O basemap-1.0.7.tar.gz "http://sourceforge.net/projects/matplotlib/files/matplotlib-toolkits/basemap-1.0.7/basemap-1.0.7.tar.gz/download" 2> install_log
-tar xzf basemap-1.0.7.tar.gz >> install_log
-subtask "done"
-
-# Install GEOS
-task "Installing GEOS dependency ..."
-cd basemap-1.0.7/geos-3.3.3
-export GEOS_DIR=/usr/local
-./configure --prefix=$GEOS_DIR >> install_log
-sudo make >> install_log
-sudo make install >> install_log
-subtask "done"
-
-# Install basemap
-task "Installing Basemap ..."
-cd ..
-python setup.py install >> install_log
-subtask "done"
-
-cd
-
-# Install miscellaneous Python packages needed for OCW. Some of these
-# can be installed with Conda, but since none of them have an annoying
-# compiled component we just install them with Pip.
-header "Installing additional Python packages"
-pip install -r ocw-pip-dependencies.txt >> install_log
-
-# Ensure that the climate code is included in the Python Path
-header "Updating PYTHONPATH ..."
-echo "export PYTHONPATH=/home/vagrant/climate:/home/vagrant/climate/ocw/:/home/vagrant/climate/rcmet/src/main/python/rcmes" >> /home/vagrant/.bashrc
-subtask "done"

http://git-wip-us.apache.org/repos/asf/climate/blob/a53e3af5/easy-ocw/ocw-conda-dependencies.txt
----------------------------------------------------------------------
diff --git a/easy-ocw/ocw-conda-dependencies.txt b/easy-ocw/ocw-conda-dependencies.txt
deleted file mode 100644
index f6c5a26..0000000
--- a/easy-ocw/ocw-conda-dependencies.txt
+++ /dev/null
@@ -1,22 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#    http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-numpy=1.8.1
-scipy=0.13.3
-matplotlib=1.3.1
-basemap=1.0.7
-netcdf4=1.0.8

http://git-wip-us.apache.org/repos/asf/climate/blob/a53e3af5/easy-ocw/ocw-pip-dependencies.txt
----------------------------------------------------------------------
diff --git a/easy-ocw/ocw-pip-dependencies.txt b/easy-ocw/ocw-pip-dependencies.txt
deleted file mode 100644
index 929f1a7..0000000
--- a/easy-ocw/ocw-pip-dependencies.txt
+++ /dev/null
@@ -1,9 +0,0 @@
-requests==2.3.0
-bottle==0.11.6
-pydap==3.1.1
-webtest==2.0.15
-nose==1.3.3
-nose-exclude==0.2.0
-pylint==1.2.1
-sphinx==1.2.1
-sphinxcontrib-httpdomain==1.2.1

http://git-wip-us.apache.org/repos/asf/climate/blob/a53e3af5/esgf/README
----------------------------------------------------------------------
diff --git a/esgf/README b/esgf/README
deleted file mode 100644
index 019eb24..0000000
--- a/esgf/README
+++ /dev/null
@@ -1,17 +0,0 @@
-PROTOTYPE PYTHON PACKAGE FOR ESGF-RCMES INTEGRATION
-
-Python dependencies:
-    - Python 2.7
-    - PyOpenSSL (sudo easy_install PyOpenSSL==0.10 - NOTE: latest PyOpenSSL 0.13 needs latest SSL installed)
-    - myproxyclient (sudo easy_install MyProxyClient)
-    - esgf-pyclient (sudo easy_install esgf-pyclient)
-    
-Pre-requisites:
-    - user must be registered with ESGF and be assigned an OpenID to logon
-    - user must have been granted membership in group CMIP5 Research to be able to download CMIP5 datasets
-    
-Usage:
-    - install package under some directory: <INSTALL_DIR>
-    - export PYTHONPATH=<INSTALL_DIR>/src
-    - edit <INSTALL_DIR>/src/esgf/rcmes/main.py to configure openid, password, output directory
-    - python <INSTALL_DIR>/src/esgf/rcmes/main.py
\ No newline at end of file

http://git-wip-us.apache.org/repos/asf/climate/blob/a53e3af5/esgf/src/esgf/__init__.py
----------------------------------------------------------------------
diff --git a/esgf/src/esgf/__init__.py b/esgf/src/esgf/__init__.py
deleted file mode 100644
index 2872cad..0000000
--- a/esgf/src/esgf/__init__.py
+++ /dev/null
@@ -1,18 +0,0 @@
-#
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#    http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-#

http://git-wip-us.apache.org/repos/asf/climate/blob/a53e3af5/esgf/src/esgf/rcmes/__init__.py
----------------------------------------------------------------------
diff --git a/esgf/src/esgf/rcmes/__init__.py b/esgf/src/esgf/rcmes/__init__.py
deleted file mode 100644
index 2872cad..0000000
--- a/esgf/src/esgf/rcmes/__init__.py
+++ /dev/null
@@ -1,18 +0,0 @@
-#
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#    http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-#

http://git-wip-us.apache.org/repos/asf/climate/blob/a53e3af5/esgf/src/esgf/rcmes/constants.py
----------------------------------------------------------------------
diff --git a/esgf/src/esgf/rcmes/constants.py b/esgf/src/esgf/rcmes/constants.py
deleted file mode 100644
index a6baf9f..0000000
--- a/esgf/src/esgf/rcmes/constants.py
+++ /dev/null
@@ -1,35 +0,0 @@
-#
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#    http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-#
-'''Module containing constant parameters for ESGF RCMES integration.'''
-
-# default location of ESGF user credentials
-ESGF_CREDENTIALS = "~/.esg/credentials.pem"
-
-# URL for ESGF certificate service
-#CERT_SERVICE_URL = "https://localhost:8443/esgf-idp/idp/getcert.htm"
-CERT_SERVICE_URL = "https://esg-datanode.jpl.nasa.gov/esgf-idp/idp/getcert.htm"
-
-# Basic authentication realm
-REALM = "ESGF"
-
-# DN of JPL MyProxy server (needs to be explicitly set sometimes)
-JPL_MYPROXY_SERVER_DN = "/O=ESGF/OU=esg-datanode.jpl.nasa.gov/CN=host/esg-vm.jpl.nasa.gov"
-
-# URL of ESGF search service to contact
-JPL_SEARCH_SERVICE_URL = "http://esg-datanode.jpl.nasa.gov/esg-search/search"

http://git-wip-us.apache.org/repos/asf/climate/blob/a53e3af5/esgf/src/esgf/rcmes/download.py
----------------------------------------------------------------------
diff --git a/esgf/src/esgf/rcmes/download.py b/esgf/src/esgf/rcmes/download.py
deleted file mode 100644
index 389b982..0000000
--- a/esgf/src/esgf/rcmes/download.py
+++ /dev/null
@@ -1,67 +0,0 @@
-#
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#    http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-#
-'''
-RCMES module to download a file from ESGF.
-
-'''
-
-import urllib2, httplib
-from os.path import expanduser, join
-
-from esgf.rcmes.constants import ESGF_CREDENTIALS
-
-class HTTPSClientAuthHandler(urllib2.HTTPSHandler):
-    '''
-    HTTP handler that transmits an X509 certificate as part of the request
-    '''
-    
-    def __init__(self, key, cert):
-        urllib2.HTTPSHandler.__init__(self)
-        self.key = key
-        self.cert = cert
-
-    def https_open(self, req):
-        return self.do_open(self.getConnection, req)
-
-    def getConnection(self, host, timeout=300):
-        return httplib.HTTPSConnection(host, key_file=self.key, cert_file=self.cert)
-
-def download(url, toDirectory="/tmp"):
-    '''
-    Function to download a single file from ESGF.
-    
-    :param url: the URL of the file to download
-    :param toDirectory: target directory where the file will be written
-    '''
-    
-    # setup HTTP handler
-    certFile = expanduser(ESGF_CREDENTIALS)
-    opener = urllib2.build_opener(HTTPSClientAuthHandler(certFile,certFile))
-    opener.add_handler(urllib2.HTTPCookieProcessor())
-    
-    # download file
-    localFilePath = join(toDirectory,url.split('/')[-1])
-    print "\nDownloading url: %s to local path: %s ..." % (url, localFilePath)
-    localFile = open(localFilePath, 'wb')
-    webFile=opener.open(url)
-    localFile.write(webFile.read())
-    
-    # cleanup
-    localFile.close()
-    webFile.close()
-    opener.close()
-    print "... done"
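The deleted download module is Python 2 (urllib2, httplib). For reference, the same client-certificate pattern can be sketched in Python 3 with the standard library's ssl context; the helper names below are illustrative only and not part of the original code:

```python
import ssl
import urllib.request
from os.path import expanduser, join

def local_path_for(url, to_directory="/tmp"):
    # Mirror the original behaviour: the local file name is the last
    # path component of the URL.
    return join(to_directory, url.split('/')[-1])

def make_opener(cert_path="~/.esg/credentials.pem"):
    # An SSL context that presents the X509 certificate on each request
    # replaces the custom urllib2.HTTPSHandler subclass used in the
    # deleted module.
    ctx = ssl.create_default_context()
    ctx.load_cert_chain(certfile=expanduser(cert_path))
    handler = urllib.request.HTTPSHandler(context=ctx)
    return urllib.request.build_opener(handler, urllib.request.HTTPCookieProcessor())
```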

http://git-wip-us.apache.org/repos/asf/climate/blob/a53e3af5/esgf/src/esgf/rcmes/logon.py
----------------------------------------------------------------------
diff --git a/esgf/src/esgf/rcmes/logon.py b/esgf/src/esgf/rcmes/logon.py
deleted file mode 100644
index 0a6f4f7..0000000
--- a/esgf/src/esgf/rcmes/logon.py
+++ /dev/null
@@ -1,44 +0,0 @@
-#
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#    http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-#
-'''
-RCMES module to log on to the ESGF.
-'''
-
-from pyesgf.logon import LogonManager
-import os
-
-from esgf.rcmes.constants import JPL_MYPROXY_SERVER_DN
-
-def logon(openid, password):
-    '''
-    Function to retrieve a short-term X.509 certificate that can be used to authenticate with ESGF.
-    The certificate is written to the location ~/.esg/credentials.pem.
-    The trusted CA certificates are written in the directory ~/.esg/certificates.
-    '''
-    
-    # Must configure the DN of the JPL MyProxy server if using a JPL openid
-    if "esg-datanode.jpl.nasa.gov" in openid:  
-        os.environ['MYPROXY_SERVER_DN'] = JPL_MYPROXY_SERVER_DN
-        
-    lm = LogonManager()
-    lm.logon_with_openid(openid,password)
-    return lm.is_logged_on()
-    
-    
-    
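One detail of `logon()` worth noting: it decides whether to set the JPL MyProxy DN by substring-matching the OpenID. A slightly stricter sketch (a hypothetical helper, not part of the original module) compares the parsed hostname instead, so the string cannot accidentally match elsewhere in the URL:

```python
from urllib.parse import urlparse

def needs_jpl_dn_override(openid):
    """True when the OpenID is issued by the JPL ESGF data node.

    The deleted code tests `"esg-datanode.jpl.nasa.gov" in openid`;
    comparing the parsed hostname avoids matching the substring in a
    path or query component.
    """
    return urlparse(openid).hostname == "esg-datanode.jpl.nasa.gov"
```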

http://git-wip-us.apache.org/repos/asf/climate/blob/a53e3af5/esgf/src/esgf/rcmes/logon2.py
----------------------------------------------------------------------
diff --git a/esgf/src/esgf/rcmes/logon2.py b/esgf/src/esgf/rcmes/logon2.py
deleted file mode 100644
index 11a808a..0000000
--- a/esgf/src/esgf/rcmes/logon2.py
+++ /dev/null
@@ -1,43 +0,0 @@
-#
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#    http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-#
-'''
-RCMES module to log on to the ESGF by contacting the IdP RESTful service.
-'''
-
-from esgf.rcmes.constants import ESGF_CREDENTIALS, CERT_SERVICE_URL, REALM
-
-import urllib2
-from os.path import expanduser
-
-def logon2(openid, password):
-    '''
-    Function to retrieve a short-term X.509 certificate that can be used to authenticate with ESGF.
-    The certificate is written to the location specified by ESGF_CREDENTIALS.
-    '''
-    
-    password_mgr = urllib2.HTTPPasswordMgrWithDefaultRealm()
-    password_mgr.add_password(REALM, CERT_SERVICE_URL, openid, password)
-    handler = urllib2.HTTPBasicAuthHandler(password_mgr)
-    opener = urllib2.build_opener(urllib2.HTTPHandler, handler)
-    request = opener.open(CERT_SERVICE_URL)
-    #print request.read()
-    
-    localFilePath = expanduser(ESGF_CREDENTIALS)
-    certFile=open(localFilePath, 'w')
-    certFile.write(request.read())
\ No newline at end of file
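Two small bugs in the deleted `logon2()`: it never closes the output file, and it returns `None` even though `main.py` tests its result with `if logon2(...)`. A hedged Python 3 sketch of the same flow, with the constants (`CERT_SERVICE_URL`, `REALM`, `ESGF_CREDENTIALS`) passed in as parameters since the constants module is not shown here:

```python
import urllib.request

def build_authenticated_opener(realm, service_url, openid, password):
    """Opener that answers HTTP Basic-auth challenges from the
    certificate service, as the original does with urllib2."""
    password_mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
    password_mgr.add_password(realm, service_url, openid, password)
    return urllib.request.build_opener(
        urllib.request.HTTPBasicAuthHandler(password_mgr))

def logon2(openid, password, service_url, realm, credentials_path):
    """Fetch a short-term certificate and report success.

    opener.open() raises on HTTP errors, so reaching the return
    statement means the certificate was retrieved; the context
    managers close both the response and the output file.
    """
    opener = build_authenticated_opener(realm, service_url, openid, password)
    with opener.open(service_url) as response, \
            open(credentials_path, "wb") as cert_file:
        cert_file.write(response.read())
    return True
```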

http://git-wip-us.apache.org/repos/asf/climate/blob/a53e3af5/esgf/src/esgf/rcmes/main.py
----------------------------------------------------------------------
diff --git a/esgf/src/esgf/rcmes/main.py b/esgf/src/esgf/rcmes/main.py
deleted file mode 100644
index 4ffadcc..0000000
--- a/esgf/src/esgf/rcmes/main.py
+++ /dev/null
@@ -1,116 +0,0 @@
-#
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#    http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-#
-'''
-Example main program for ESGF-RCMES integration.
-    
-'''
-
-# constant parameters
-USER_OPENID = "https://esg-datanode.jpl.nasa.gov/esgf-idp/openid/lucacinquini"
-USER_PASSWORD = "*****"
-DATA_DIRECTORY = "/tmp"
-
-from esgf.rcmes.logon import logon
-from esgf.rcmes.logon2 import logon2
-from esgf.rcmes.search import SearchClient
-from esgf.rcmes.download import download
-
-def main():
-    '''Example driver program'''
-    
-    # step 1: obtain short-term certificate
-    print 'Retrieving ESGF certificate...'
-    # logon using client-side MyProxy libraries
-    #if logon(USER_OPENID, USER_PASSWORD):
-    #    print "...done."
-    # logon through server-side MyProxy service
-    if logon2(USER_OPENID, USER_PASSWORD):
-        print "...done"
-    
-    # step 2: execute faceted search for files
-    urls = main_obs4mips()
-    #urls = main_cmip5()
-    
-    # step 3: download file(s)
-    for i, url in enumerate(urls):
-        if i>=1:
-            break
-        download(url, toDirectory=DATA_DIRECTORY)
-
-    
-def main_cmip5():
-    '''
-    Example workflow to search for CMIP5 files
-    '''
-    
-    searchClient = SearchClient(searchServiceUrl="http://pcmdi9.llnl.gov/esg-search/search", distrib=False)
-    
-    print '\nAvailable projects=%s' % searchClient.getFacets('project')
-    searchClient.setConstraint(project='CMIP5')
-    print "Number of Datasets=%d" % searchClient.getNumberOfDatasets()
-    
-    print '\nAvailable models=%s' % searchClient.getFacets('model')
-    searchClient.setConstraint(model='INM-CM4')
-    print "Number of Datasets=%d" % searchClient.getNumberOfDatasets()
-    
-    print '\nAvailable experiments=%s' % searchClient.getFacets('experiment')
-    searchClient.setConstraint(experiment='historical')
-    print "Number of Datasets=%d" % searchClient.getNumberOfDatasets()
-    
-    print '\nAvailable time frequencies=%s' % searchClient.getFacets('time_frequency')
-    searchClient.setConstraint(time_frequency='mon')
-    print "Number of Datasets=%d" % searchClient.getNumberOfDatasets()
-
-    print '\nAvailable CF standard names=%s' % searchClient.getFacets('cf_standard_name')
-    searchClient.setConstraint(cf_standard_name='air_temperature')
-    print "Number of Datasets=%d" % searchClient.getNumberOfDatasets()
-    
-    urls = searchClient.getFiles()
-    return urls
-    
-    
-def main_obs4mips():
-    '''
-    Example workflow to search for obs4MIPs files.
-    '''
-    
-    searchClient = SearchClient(distrib=False)
-    
-    # obs4MIPs
-    print '\nAvailable projects=%s' % searchClient.getFacets('project')
-    searchClient.setConstraint(project='obs4MIPs')
-    print "Number of Datasets=%d" % searchClient.getNumberOfDatasets()
-    
-    print '\nAvailable variables=%s' % searchClient.getFacets('variable')
-    searchClient.setConstraint(variable='hus')
-    print "Number of Datasets=%d" % searchClient.getNumberOfDatasets()
-    
-    print '\nAvailable time frequencies=%s' % searchClient.getFacets('time_frequency')
-    searchClient.setConstraint(time_frequency='mon')
-    print "Number of Datasets=%d" % searchClient.getNumberOfDatasets()
-    
-    print '\nAvailable models=%s' % searchClient.getFacets('model')
-    searchClient.setConstraint(model='Obs-MLS')
-    print "Number of Datasets=%d" % searchClient.getNumberOfDatasets()
-    
-    urls = searchClient.getFiles()
-    return urls
-
-if __name__ == '__main__':
-    main()
\ No newline at end of file
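Both workflows above repeat an identical print-facets / set-constraint / count-datasets block once per facet. A small loop over `(facet, value)` pairs removes the duplication; `narrow` is a hypothetical helper that works with anything exposing the `SearchClient` interface from the next module:

```python
def narrow(client, steps):
    """Apply (facet, value) constraints in order, reporting progress.

    `client` needs the SearchClient interface: getFacets(),
    setConstraint(), getNumberOfDatasets(), getFiles().
    """
    for facet, value in steps:
        print("Available %s values: %s" % (facet, client.getFacets(facet)))
        client.setConstraint(**{facet: value})
        print("Number of datasets: %d" % client.getNumberOfDatasets())
    return client.getFiles()

# The obs4MIPs workflow then collapses to:
# urls = narrow(searchClient, [("project", "obs4MIPs"), ("variable", "hus"),
#                              ("time_frequency", "mon"), ("model", "Obs-MLS")])
```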

http://git-wip-us.apache.org/repos/asf/climate/blob/a53e3af5/esgf/src/esgf/rcmes/search.py
----------------------------------------------------------------------
diff --git a/esgf/src/esgf/rcmes/search.py b/esgf/src/esgf/rcmes/search.py
deleted file mode 100644
index 3a81f24..0000000
--- a/esgf/src/esgf/rcmes/search.py
+++ /dev/null
@@ -1,89 +0,0 @@
-#
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#    http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-#
-'''
-RCMES module to execute a faceted search for ESGF files.
-
-'''
-
-from pyesgf.search import SearchConnection
-
-from esgf.rcmes.constants import JPL_SEARCH_SERVICE_URL
-
-class SearchClient():
-    """
-    Simple ESGF search client for RCMES.
-    This class is a thin layer on top of the esgfpy-client package.
-    Note: this class always searches for latest versions, no replicas.
-    """
-    
-    def __init__(self, searchServiceUrl=JPL_SEARCH_SERVICE_URL, distrib=True):
-        """
-        :param searchServiceUrl: URL of ESGF search service to query
-        :param distrib: True to execute a federation-wide search, 
-                        False to search only the specified search service
-        """
-        connection = SearchConnection(searchServiceUrl, distrib=distrib)
-        
-        # dictionary of query constraints
-        self.constraints = { "latest":True, "replica":False, "distrib":distrib } 
-    
-        # initial search context
-        self.context = connection.new_context( **self.constraints )
-        
-        
-    def setConstraint(self, **constraints):
-        """
-        Sets one or more facet constraints.
-        :param constraints: dictionary of (facet name, facet value) constraints.
-        """
-        for key in constraints:
-            print 'Setting constraint: %s=%s' % (key, constraints[key])
-            self.constraints[key] = constraints[key]
-        self.context = self.context.constrain(**constraints)
-        
-    def getNumberOfDatasets(self):
-        """
-        :return: the number of datasets matching the current constraints.
-        """
-        return self.context.hit_count
-        
-    def getFacets(self, facet):
-        """
-        :return: a dictionary of (facet value, facet count) for the specified facet and current constraints.
-        Example (for facet='project'): {u'COUND': 4, u'CMIP5': 2657, u'obs4MIPs': 7} 
-        """
-        return self.context.facet_counts[facet]
-    
-    def getFiles(self):
-        """
-        Executes a search for files with the current constraints.
-        :return: list of file download URLs.
-        """
-        datasets = self.context.search()
-        urls = []
-        for dataset in datasets:
-            print "\nSearching files for dataset=%s with constraints: %s" % (dataset.dataset_id, self.constraints)
-            files = dataset.file_context().search(**self.constraints)
-            for file in files:
-                print 'Found file=%s' % file.download_url
-                urls.append(file.download_url)
-        return urls
-        
-    
-    
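The constraint bookkeeping in `SearchClient` can be illustrated without pyesgf. This minimal stand-in (hypothetical, dependency-free) mirrors the "latest versions, no replicas" policy pinned in the constructor, with later values for the same facet overriding earlier ones:

```python
class ConstraintSet:
    """Facet-constraint bookkeeping in the style of SearchClient.

    Pins latest=True / replica=False up front; calling constrain()
    again with the same facet name replaces the previous value.
    """
    def __init__(self, distrib=True):
        self.constraints = {"latest": True, "replica": False,
                            "distrib": distrib}

    def constrain(self, **facets):
        self.constraints.update(facets)
        return self  # allow chaining, like pyesgf's context.constrain()
```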

http://git-wip-us.apache.org/repos/asf/climate/blob/a53e3af5/examples/knmi_to_cru31_full_bias.py
----------------------------------------------------------------------
diff --git a/examples/knmi_to_cru31_full_bias.py b/examples/knmi_to_cru31_full_bias.py
deleted file mode 100644
index b90fd4c..0000000
--- a/examples/knmi_to_cru31_full_bias.py
+++ /dev/null
@@ -1,175 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#    http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-import datetime
-import urllib
-from os import path
-
-import numpy as np
-
-import ocw.data_source.local as local
-import ocw.data_source.rcmed as rcmed
-from ocw.dataset import Bounds as Bounds
-import ocw.dataset_processor as dsp
-import ocw.evaluation as evaluation
-import ocw.metrics as metrics
-import ocw.plotter as plotter
-
-# File URL leader
-FILE_LEADER = "http://zipper.jpl.nasa.gov/dist/"
-# This way we can easily adjust the time span of the retrievals
-YEARS = 3
-# Two Local Model Files 
-MODEL = "AFRICA_KNMI-RACMO2.2b_CTL_ERAINT_MM_50km_1989-2008_tasmax.nc"
-# Filename for the output image/plot (without file extension)
-OUTPUT_PLOT = "cru_31_tmax_knmi_africa_bias_full"
-
-# Download necessary NetCDF file if not present
-if path.exists(MODEL):
-    pass
-else:
-    urllib.urlretrieve(FILE_LEADER + MODEL, MODEL)
-
-""" Step 1: Load Local NetCDF File into OCW Dataset Objects """
-print("Loading %s into an OCW Dataset Object" % (MODEL,))
-knmi_dataset = local.load_file(MODEL, "tasmax")
-print("KNMI_Dataset.values shape: (times, lats, lons) - %s \n" % (knmi_dataset.values.shape,))
-
-""" Step 2: Fetch an OCW Dataset Object from the data_source.rcmed module """
-print("Working with the rcmed interface to get CRU3.1 Daily-Max Temp")
-metadata = rcmed.get_parameters_metadata()
-
-cru_31 = [m for m in metadata if m['parameter_id'] == "39"][0]
-
-""" The RCMED API uses the following function to query, subset and return the 
-raw data from the database:
-
-rcmed.parameter_dataset(dataset_id, parameter_id, min_lat, max_lat, min_lon, 
-                        max_lon, start_time, end_time)
-
-The first two required params are in the cru_31 variable we defined earlier
-"""
-# Must cast to int since the rcmed api requires ints
-dataset_id = int(cru_31['dataset_id'])
-parameter_id = int(cru_31['parameter_id'])
-
-print("We are going to use the Model to constrain the Spatial Domain")
-#  The spatial_boundaries() function returns the spatial extent of the dataset
-print("The KNMI_Dataset spatial bounds (min_lat, max_lat, min_lon, max_lon) are: \n"
-      "%s\n" % (knmi_dataset.spatial_boundaries(), ))
-print("The KNMI_Dataset spatial resolution (lat_resolution, lon_resolution) is: \n"
-      "%s\n\n" % (knmi_dataset.spatial_resolution(), ))
-min_lat, max_lat, min_lon, max_lon = knmi_dataset.spatial_boundaries()
-
-print("Calculating the Maximum Overlap in Time for the datasets")
-
-cru_start = datetime.datetime.strptime(cru_31['start_date'], "%Y-%m-%d")
-cru_end = datetime.datetime.strptime(cru_31['end_date'], "%Y-%m-%d")
-knmi_start, knmi_end = knmi_dataset.time_range()
-# Grab the Max Start Time
-start_time = max([cru_start, knmi_start])
-# Grab the Min End Time
-end_time = min([cru_end, knmi_end])
-print("Overlap computed to be: %s to %s" % (start_time.strftime("%Y-%m-%d"),
-                                          end_time.strftime("%Y-%m-%d")))
-print("We are going to grab the first %s year(s) of data" % YEARS)
-end_time = datetime.datetime(start_time.year + YEARS, start_time.month, start_time.day)
-print("Final Overlap is: %s to %s" % (start_time.strftime("%Y-%m-%d"),
-                                          end_time.strftime("%Y-%m-%d")))
-
-print("Fetching data from RCMED...")
-cru31_dataset = rcmed.parameter_dataset(dataset_id,
-                                        parameter_id,
-                                        min_lat,
-                                        max_lat,
-                                        min_lon,
-                                        max_lon,
-                                        start_time,
-                                        end_time)
-
-""" Step 3: Resample Datasets so they are the same shape """
-print("CRU31_Dataset.values shape: (times, lats, lons) - %s" % (cru31_dataset.values.shape,))
-print("KNMI_Dataset.values shape: (times, lats, lons) - %s" % (knmi_dataset.values.shape,))
-print("Our two datasets have a mismatch in time. We will subset on time to %s years\n" % YEARS)
-
-# Create a Bounds object to use for subsetting
-new_bounds = Bounds(min_lat, max_lat, min_lon, max_lon, start_time, end_time)
-knmi_dataset = dsp.subset(new_bounds, knmi_dataset)
-
-print("CRU31_Dataset.values shape: (times, lats, lons) - %s" % (cru31_dataset.values.shape,))
-print("KNMI_Dataset.values shape: (times, lats, lons) - %s \n" % (knmi_dataset.values.shape,))
-
-print("Temporally Rebinning the Datasets to a Single Timestep")
-# To run FULL temporal Rebinning use a timedelta > 366 days.  I used 999 in this example
-knmi_dataset = dsp.temporal_rebin(knmi_dataset, datetime.timedelta(days=999))
-cru31_dataset = dsp.temporal_rebin(cru31_dataset, datetime.timedelta(days=999))
-
-print("KNMI_Dataset.values shape: %s" % (knmi_dataset.values.shape,))
-print("CRU31_Dataset.values shape: %s \n\n" % (cru31_dataset.values.shape,))
- 
-""" Spatially Regrid the Dataset Objects to a 1/2 degree grid """
-# Using the bounds we will create a new set of lats and lons on a 1/2 degree step
-new_lons = np.arange(min_lon, max_lon, 0.5)
-new_lats = np.arange(min_lat, max_lat, 0.5)
- 
-# Spatially regrid datasets using the new_lats, new_lons numpy arrays
-print("Spatially Regridding the KNMI_Dataset...")
-knmi_dataset = dsp.spatial_regrid(knmi_dataset, new_lats, new_lons)
-print("Spatially Regridding the CRU31_Dataset...")
-cru31_dataset = dsp.spatial_regrid(cru31_dataset, new_lats, new_lons)
-print("Final shape of the KNMI_Dataset:%s" % (knmi_dataset.values.shape, ))
-print("Final shape of the CRU31_Dataset:%s" % (cru31_dataset.values.shape, ))
- 
-""" Step 4:  Build a Metric to use for Evaluation - Bias for this example """
-# You can build your own metrics, but OCW also ships with some common metrics
-print("Setting up a Bias metric to use for evaluation")
-bias = metrics.Bias()
-
-""" Step 5: Create an Evaluation Object using Datasets and our Metric """
-# The Evaluation Class Signature is:
-# Evaluation(reference, targets, metrics, subregions=None)
-# Evaluation can take in multiple targets and metrics, so we need to convert
-# our examples into Python lists.  Evaluation will iterate over the lists
-print("Making the Evaluation definition")
-bias_evaluation = evaluation.Evaluation(knmi_dataset, [cru31_dataset], [bias])
-print("Executing the Evaluation using the object's run() method")
-bias_evaluation.run()
- 
-""" Step 6: Make a Plot from the Evaluation.results """
-# The Evaluation.results are a set of nested lists to support many different
-# possible Evaluation scenarios.
-#
-# The Evaluation results docs say:
-# The shape of results is (num_metrics, num_target_datasets) if no subregion
-# Accessing the actual results when we have used 1 metric and 1 dataset is
-# done this way:
-print("Accessing the Results of the Evaluation run")
-results = bias_evaluation.results[0][0]
- 
-# From the bias output I want to make a Contour Map of the region
-print("Generating a contour map using ocw.plotter.draw_contour_map()")
- 
-lats = new_lats
-lons = new_lons
-fname = OUTPUT_PLOT
-gridshape = (1, 1)  # Using a 1 x 1 since we have a single Bias for the full time range
-plot_title = "TASMAX Bias of KNMI Compared to CRU 3.1 (%s - %s)" % (start_time.strftime("%Y/%m/%d"), end_time.strftime("%Y/%m/%d"))
-sub_titles = ["Full Temporal Range"]
- 
-plotter.draw_contour_map(results, lats, lons, fname,
-                         gridshape=gridshape, ptitle=plot_title, 
-                         subtitles=sub_titles)
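The overlap computation in Step 2 of the example above (max of the start times, min of the end times) generalizes to any pair of time ranges. A small sketch of it as a reusable function, with an explicit result for the disjoint case:

```python
import datetime

def temporal_overlap(range_a, range_b):
    """Intersection of two (start, end) datetime ranges.

    This is the max-of-starts / min-of-ends computation the example
    performs inline for the CRU and KNMI time ranges; returns None
    when the ranges do not overlap at all.
    """
    start = max(range_a[0], range_b[0])
    end = min(range_a[1], range_b[1])
    return (start, end) if start <= end else None
```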

http://git-wip-us.apache.org/repos/asf/climate/blob/a53e3af5/examples/simple_model_to_model_bias.py
----------------------------------------------------------------------
diff --git a/examples/simple_model_to_model_bias.py b/examples/simple_model_to_model_bias.py
deleted file mode 100644
index 32ecf15..0000000
--- a/examples/simple_model_to_model_bias.py
+++ /dev/null
@@ -1,127 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#    http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-import datetime
-from os import path
-import urllib
-
-import numpy as np
-
-import ocw.data_source.local as local
-import ocw.dataset_processor as dsp
-import ocw.evaluation as evaluation
-import ocw.metrics as metrics
-import ocw.plotter as plotter
-
-# File URL leader
-FILE_LEADER = "http://zipper.jpl.nasa.gov/dist/"
-# Two Local Model Files 
-FILE_1 = "AFRICA_KNMI-RACMO2.2b_CTL_ERAINT_MM_50km_1989-2008_tasmax.nc"
-FILE_2 = "AFRICA_UC-WRF311_CTL_ERAINT_MM_50km-rg_1989-2008_tasmax.nc"
-# Filename for the output image/plot (without file extension)
-OUTPUT_PLOT = "wrf_bias_compared_to_knmi"
-
-# Download necessary NetCDF files if not present
-if path.exists(FILE_1):
-    pass
-else:
-    urllib.urlretrieve(FILE_LEADER + FILE_1, FILE_1)
-
-if path.exists(FILE_2):
-    pass
-else:
-    urllib.urlretrieve(FILE_LEADER + FILE_2, FILE_2)
-
-""" Step 1: Load Local NetCDF Files into OCW Dataset Objects """
-print("Loading %s into an OCW Dataset Object" % (FILE_1,))
-knmi_dataset = local.load_file(FILE_1, "tasmax")
-print("KNMI_Dataset.values shape: (times, lats, lons) - %s \n" % (knmi_dataset.values.shape,))
-
-print("Loading %s into an OCW Dataset Object" % (FILE_2,))
-wrf_dataset = local.load_file(FILE_2, "tasmax")
-print("WRF_Dataset.values shape: (times, lats, lons) - %s \n" % (wrf_dataset.values.shape,))
-
-""" Step 2: Temporally Rebin the Data into an Annual Timestep """
-print("Temporally Rebinning the Datasets to an Annual Timestep")
-knmi_dataset = dsp.temporal_rebin(knmi_dataset, datetime.timedelta(days=365))
-wrf_dataset = dsp.temporal_rebin(wrf_dataset, datetime.timedelta(days=365))
-print("KNMI_Dataset.values shape: %s" % (knmi_dataset.values.shape,))
-print("WRF_Dataset.values shape: %s \n\n" % (wrf_dataset.values.shape,))
-
-""" Step 3: Spatially Regrid the Dataset Objects to a 1 degree grid """
-#  The spatial_boundaries() function returns the spatial extent of the dataset
-print("The KNMI_Dataset spatial bounds (min_lat, max_lat, min_lon, max_lon) are: \n"
-      "%s\n" % (knmi_dataset.spatial_boundaries(), ))
-print("The KNMI_Dataset spatial resolution (lat_resolution, lon_resolution) is: \n"
-      "%s\n\n" % (knmi_dataset.spatial_resolution(), ))
-
-min_lat, max_lat, min_lon, max_lon = knmi_dataset.spatial_boundaries()
-
-# Using the bounds we will create a new set of lats and lons on 1 degree step
-new_lons = np.arange(min_lon, max_lon, 1)
-new_lats = np.arange(min_lat, max_lat, 1)
-
-# Spatially regrid datasets using the new_lats, new_lons numpy arrays
-print("Spatially Regridding the KNMI_Dataset...")
-knmi_dataset = dsp.spatial_regrid(knmi_dataset, new_lats, new_lons)
-print("Final shape of the KNMI_Dataset: \n"
-      "%s\n" % (knmi_dataset.values.shape, ))
-print("Spatially Regridding the WRF_Dataset...")
-wrf_dataset = dsp.spatial_regrid(wrf_dataset, new_lats, new_lons)
-print("Final shape of the WRF_Dataset: \n"
-      "%s\n" % (wrf_dataset.values.shape, ))
-
-""" Step 4:  Build a Metric to use for Evaluation - Bias for this example """
-# You can build your own metrics, but OCW also ships with some common metrics
-print("Setting up a Bias metric to use for evaluation")
-bias = metrics.Bias()
-
-""" Step 5: Create an Evaluation Object using Datasets and our Metric """
-# The Evaluation Class Signature is:
-# Evaluation(reference, targets, metrics, subregions=None)
-# Evaluation can take in multiple targets and metrics, so we need to convert
-# our examples into Python lists.  Evaluation will iterate over the lists
-print("Making the Evaluation definition")
-bias_evaluation = evaluation.Evaluation(knmi_dataset, [wrf_dataset], [bias])
-print("Executing the Evaluation using the object's run() method")
-bias_evaluation.run()
-
-""" Step 6: Make a Plot from the Evaluation.results """
-# The Evaluation.results are a set of nested lists to support many different
-# possible Evaluation scenarios.
-#
-# The Evaluation results docs say:
-# The shape of results is (num_metrics, num_target_datasets) if no subregion
-# Accessing the actual results when we have used 1 metric and 1 dataset is
-# done this way:
-print("Accessing the Results of the Evaluation run")
-results = bias_evaluation.results[0][0]
-print("The results are of type: %s" % type(results))
-
-# From the bias output I want to make a Contour Map of the region
-print("Generating a contour map using ocw.plotter.draw_contour_map()")
-
-lats = new_lats
-lons = new_lons
-fname = OUTPUT_PLOT
-gridshape = (4, 5) # 20 years' worth of plots arranged in a 4 x 5 grid
-plot_title = "TASMAX Bias of WRF Compared to KNMI (1989 - 2008)"
-sub_titles = range(1989, 2009, 1)
-
-plotter.draw_contour_map(results, lats, lons, fname, 
-                         gridshape=gridshape, ptitle=plot_title, 
-                         subtitles=sub_titles)
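The Bias metric used in Step 4 reduces to an elementwise difference between two gridded fields. A dependency-free sketch on nested lists — the sign convention (target minus reference) is an assumption here, not something the example states:

```python
def bias(reference, target):
    """Elementwise target - reference over 2-D grids.

    metrics.Bias() computes the analogous quantity over Dataset.values
    arrays; plain nested lists keep the sketch dependency-free.
    """
    return [[t - r for t, r in zip(t_row, r_row)]
            for t_row, r_row in zip(target, reference)]
```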

http://git-wip-us.apache.org/repos/asf/climate/blob/a53e3af5/examples/simple_model_tstd.py
----------------------------------------------------------------------
diff --git a/examples/simple_model_tstd.py b/examples/simple_model_tstd.py
deleted file mode 100644
index 4c87813..0000000
--- a/examples/simple_model_tstd.py
+++ /dev/null
@@ -1,89 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#    http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-from os import path
-import urllib
-
-import ocw.data_source.local as local
-import ocw.evaluation as evaluation
-import ocw.metrics as metrics
-import ocw.plotter as plotter
-
-# File URL leader
-FILE_LEADER = "http://zipper.jpl.nasa.gov/dist/"
-# One Local Model Files
-FILE_1 = "AFRICA_KNMI-RACMO2.2b_CTL_ERAINT_MM_50km_1989-2008_tasmax.nc"
-
-# Filename for the output image/plot (without file extension)
-OUTPUT_PLOT = "knmi_temporal_std"
-
-# Download necessary NetCDF file if needed
-if path.exists(FILE_1):
-    pass
-else:
-    urllib.urlretrieve(FILE_LEADER + FILE_1, FILE_1)
-
-""" Step 1: Load Local NetCDF File into OCW Dataset Objects """
-print "Loading %s into an OCW Dataset Object" % (FILE_1,)
-# 'tasmax' is variable name of values
-knmi_dataset = local.load_file(FILE_1, "tasmax")
-
-print "KNMI_Dataset.values shape: (times, lats, lons) - %s \n" % (knmi_dataset.values.shape,)
-
-# Accessing the latitudes and longitudes of the netCDF file
-lats = knmi_dataset.lats
-lons = knmi_dataset.lons
-
-""" Step 2:  Build a Metric to use for Evaluation - Temporal STD for this example """
-# You can build your own metrics, but OCW also ships with some common metrics
-print "Setting up a Temporal STD metric to use for evaluation"
-std = metrics.TemporalStdDev()
-
-""" Step 3: Create an Evaluation Object using Datasets and our Metric """
-# The Evaluation Class Signature is:
-# Evaluation(reference, targets, metrics, subregions=None)
-# Evaluation can take in multiple targets and metrics, so we need to convert
-# our examples into Python lists.  Evaluation will iterate over the lists
-print "Making the Evaluation definition"
-# TemporalStdDev is a unary metric: it takes only target datasets, so the reference dataset is None
-std_evaluation = evaluation.Evaluation(None, [knmi_dataset], [std])
-print "Executing the Evaluation using the object's run() method"
-std_evaluation.run()
-
-""" Step 4: Make a Plot from the Evaluation.results """
-# The Evaluation.results are a set of nested lists to support many different
-# possible Evaluation scenarios.
-#
-# The Evaluation results docs say:
-# The shape of results is (num_metrics, num_target_datasets) if no subregion
-# Accessing the actual results when we have used 1 metric and 1 dataset is
-# done this way:
-print "Accessing the Results of the Evaluation run"
-results = std_evaluation.unary_results[0][0]
-print "The results are of type: %s" % type(results)
-
-# From the temporal std output I want to make a Contour Map of the region
-print "Generating a contour map using ocw.plotter.draw_contour_map()"
-
-fname = OUTPUT_PLOT
-gridshape = (4, 5) # 20 years' worth of plots arranged in a 4 x 5 grid
-plot_title = "TASMAX Temporal Standard Deviation (1989 - 2008)"
-sub_titles = range(1989, 2009, 1)
-
-plotter.draw_contour_map(results, lats, lons, fname,
-                         gridshape=gridshape, ptitle=plot_title,
-                         subtitles=sub_titles)
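The TemporalStdDev metric collapses the time axis to one standard-deviation value per grid cell. A stdlib sketch of that reduction on a (times, lats, lons) nested list — whether OCW uses the population or sample standard deviation is an assumption here:

```python
from statistics import pstdev

def temporal_stddev(values):
    """Per-grid-cell standard deviation over the leading time axis.

    `values` is a nested list shaped (times, lats, lons), matching the
    Dataset.values layout used throughout these examples.
    """
    n_times = len(values)
    n_lats = len(values[0])
    n_lons = len(values[0][0])
    return [[pstdev(values[t][i][j] for t in range(n_times))
             for j in range(n_lons)]
            for i in range(n_lats)]
```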

http://git-wip-us.apache.org/repos/asf/climate/blob/a53e3af5/examples/taylor_diagram_example.py
----------------------------------------------------------------------
diff --git a/examples/taylor_diagram_example.py b/examples/taylor_diagram_example.py
deleted file mode 100644
index f60bfe6..0000000
--- a/examples/taylor_diagram_example.py
+++ /dev/null
@@ -1,115 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#    http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-import datetime
-import sys
-from os import path
-import urllib
-
-import numpy
-
-from ocw.dataset import Bounds
-import ocw.data_source.local as local
-import ocw.dataset_processor as dsp
-import ocw.evaluation as evaluation
-import ocw.metrics as metrics
-import ocw.plotter as plotter
-
-FILE_LEADER = "http://zipper.jpl.nasa.gov/dist/"
-FILE_1 = "AFRICA_KNMI-RACMO2.2b_CTL_ERAINT_MM_50km_1989-2008_tasmax.nc"
-FILE_2 = "AFRICA_UC-WRF311_CTL_ERAINT_MM_50km-rg_1989-2008_tasmax.nc"
-
-# Download some example NetCDF files for the evaluation
-################################################################################
-if not path.exists(FILE_1):
-    urllib.urlretrieve(FILE_LEADER + FILE_1, FILE_1)
-
-if not path.exists(FILE_2):
-    urllib.urlretrieve(FILE_LEADER + FILE_2, FILE_2)
-
-# Load the example datasets into OCW Dataset objects. We want to load
-# the 'tasmax' variable values. We'll also name the datasets for use
-# when plotting.
-################################################################################
-knmi_dataset = local.load_file(FILE_1, "tasmax")
-wrf_dataset = local.load_file(FILE_2, "tasmax")
-
-knmi_dataset.name = "knmi"
-wrf_dataset.name = "wrf"
-
-# Date values from loaded datasets might not always fall on reasonable days.
-# With monthly data, we could have data falling on the 1st, 15th, or some other
-# day of the month. Let's fix that real quick.
-################################################################################
-knmi_dataset = dsp.normalize_dataset_datetimes(knmi_dataset, 'monthly')
-wrf_dataset = dsp.normalize_dataset_datetimes(wrf_dataset, 'monthly')
-
-# We're only going to run this evaluation over a year's worth of data. We'll
-# make a Bounds object and use it to subset our datasets.
-################################################################################
-subset = Bounds(-45, 42, -24, 60, datetime.datetime(1989, 1, 1), datetime.datetime(1989, 12, 1))
-knmi_dataset = dsp.subset(subset, knmi_dataset)
-wrf_dataset = dsp.subset(subset, wrf_dataset)
-
-# Temporally re-bin the data into a monthly timestep.
-################################################################################
-knmi_dataset = dsp.temporal_rebin(knmi_dataset, datetime.timedelta(days=30))
-wrf_dataset = dsp.temporal_rebin(wrf_dataset, datetime.timedelta(days=30))
-
-# Spatially regrid the datasets onto a 1 degree grid.
-################################################################################
-# Get the bounds of the reference dataset and use them to create a new
-# set of lat/lon values on a 1 degree step
-min_lat, max_lat, min_lon, max_lon = knmi_dataset.spatial_boundaries()
-new_lons = numpy.arange(min_lon, max_lon, 1)
-new_lats = numpy.arange(min_lat, max_lat, 1)
-
-# Spatially regrid datasets using the new_lats, new_lons numpy arrays
-knmi_dataset = dsp.spatial_regrid(knmi_dataset, new_lats, new_lons)
-wrf_dataset = dsp.spatial_regrid(wrf_dataset, new_lats, new_lons)
-
-# Load the metrics that we want to use for the evaluation.
-################################################################################
-sstdr = metrics.SpatialStdDevRatio()
-pc = metrics.PatternCorrelation()
-
-# Create our new evaluation object. The knmi dataset is the evaluation's
-# reference dataset. We then provide a list of 1 or more target datasets
-# to use for the evaluation. In this case, we only want to use the wrf dataset.
-# Then we pass a list of all the metrics that we want to use in the evaluation.
-################################################################################
-test_evaluation = evaluation.Evaluation(knmi_dataset, [wrf_dataset], [sstdr, pc])
-test_evaluation.run()
-
-# Pull out the evaluation results and prepare them for drawing a Taylor diagram.
-################################################################################
-spatial_stddev_ratio = test_evaluation.results[0][0]
-# Pattern correlation results are a tuple, so we need to index and grab
-# the component we care about.
-spatial_correlation = test_evaluation.results[0][1][0]
-
-taylor_data = numpy.array([[spatial_stddev_ratio], [spatial_correlation]]).transpose()
-
-# Draw our taylor diagram!
-################################################################################
-plotter.draw_taylor_diagram(taylor_data,
-                            [wrf_dataset.name],
-                            knmi_dataset.name,
-                            fname='taylor_plot',
-                            fmt='png',
-                            frameon=False)

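The deleted example above builds `taylor_data` for a single target dataset. The same array layout generalizes to several targets; this standalone sketch (with invented stddev-ratio and correlation values, no OCW dependency) shows the row-per-dataset shape that `draw_taylor_diagram` consumes:

```python
import numpy

# Hypothetical per-dataset results (values invented for illustration):
# one spatial stddev ratio and one pattern correlation per target dataset.
stddev_ratios = [1.12, 0.95]
correlations = [0.87, 0.91]

# Row i holds the (stddev_ratio, correlation) pair for target dataset i,
# mirroring the single-dataset construction in the example above.
taylor_data = numpy.array([stddev_ratios, correlations]).transpose()

print(taylor_data.shape)
```

Each row would then be labelled with the matching entry of the dataset-name list passed alongside it.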
http://git-wip-us.apache.org/repos/asf/climate/blob/a53e3af5/obs4MIPs/INSTALLATION
----------------------------------------------------------------------
diff --git a/obs4MIPs/INSTALLATION b/obs4MIPs/INSTALLATION
deleted file mode 100644
index 082b787..0000000
--- a/obs4MIPs/INSTALLATION
+++ /dev/null
@@ -1,275 +0,0 @@
-INSTALLATION
-============
-
-Pre-requisite
-==============
-
-Python Magic
--------------
-This package is used to identify the input file type.
-https://pypi.python.org/pypi/python-magic/
- python setup.py install --prefix=/whatever/directory/
-
-cdat_lite, full CDAT or UVCDAT installation
---------------------------------------------
-This package is mainly used to read grads CTL files.
-https://pypi.python.org/pypi/cdat-lite
-
-    Download cdat_lite
-    -------------------
-    http://ndg.nerc.ac.uk/dist/
-
-http://uv-cdat.org
-
-
-cmor2
------
-Main package used to convert data into CMIP5 file format using Python.
-
-http://www2-pcmdi.llnl.gov/cmor/download/
- python setup.py install --prefix=/whatever/directory/
-
-Download the required libraries following the installation procedure.
---------------------------------------------
-http://www2-pcmdi.llnl.gov/cmor/installation
-
-CMOR2 requires several external libraries to be installed first: zlib, HDF5, netCDF-4.0 and UDUNITS-2.
-
-        zlib
-        HDF5 library
-        netCDF4 library
-        Unidata units library - version 2
-        uuid library
-
-xlrd
-----
-Needed to read Excel spreadsheets in .xls format 97 or 95.
-https://pypi.python.org/pypi/xlrd
-
-NetCDF4-Python
---------------
-NetCDF4 package from google-code. 
-http://code.google.com/p/netcdf4-python/
-
- 
-
-RESOURCE file documentation
-=============================
-
-In order to run obs4MIPs, the first thing to do is to create a resource file describing your original data. The resource file contains all Global Attributes and information needed to convert a variable to a CMIP5 variable.
-
-Please refer to the section "Example of TRMM data obs4MIPs resource file" to follow this example.
-
-The parameters "file_template" and "years" are used to find the files that you want to process. The program understands matlab/grads/netCDF file formats as well as a file list containing files in these formats.
-
-Here is an example to convert TRMM data using a filelist:
-
-TRMM.rc
-…
-years=199,200,201
-file_template        = "v7/TRMM_%sx.lst"
-…
-
-The program will first look for files that match the file template:
-v7/TRMM_199x.lst
-v7/TRMM_200x.lst
-v7/TRMM_201x.lst
-
-Each file includes a "filelist" that will be read and aggregated in time.
-In this case we want 1 decade per CMIP5 variable.
-
-i.e.
-cat v7/TRMM_199x.lst
-v7/3B43.19980101.7.nc
-v7/3B43.19980201.7.nc
-….
-v7/3B43.19991201.7.nc
-
-
-
-The resource file specifies all Global Attributes that need to be added to the netCDF files.
-
-The CMIP5 table has to be in the directory set by the key "inpath".
-The CMIP5 table name is set using the key "Table".
-
-If the original file has internal time units, you can set "InputTimeUnits" to "internal".  You can also set it to any time units that correspond to your input file.
-
-The key "OutputTimeUnits" will be used to convert the time to the requested units.  ESGF requires distinct time values for a specific experiment, so make sure that you have unique time values in all your files.  It is not appropriate to start all the time values at "0" and change the time units for each file: ESGF uses the file time values directly, so duplicates would cause confusion in the system.
-
-DelGlbAttributes is used to remove global attributes that are present in the original files but are no longer needed in the final files.
-
-SetGlbAttributes is used to add new global attributes needed in the final files.  To add a new global attribute, first create a parameter and then assign the attribute using that parameter in the form of a Python list.
-
-i.e.
-processing_version   = '7'
-SetGlbAttributes     = "[(\processing_version\,rc[\processing_version\])]"
-
-The program ingests all the resources into a Python dictionary named "rc", so rc["processing_version"] will correspond to the value '7'.
-NOTE:  Quotes in the resource files are replaced by a '\' character in the Python list.  A parser will replace each '\' with the corresponding '"' character and create a global attribute from each (key,value) pair.  In this case, Python will create a netCDF attribute that corresponds to:
-
-:processing_version = "7"
-
-
-Example of TRMM data obs4MIPs resource file
-============================================
-#
-# TRMM resource file for CMIP5 conversion
-#
-
-# Program will loop on years and substitute in template.
-# Here we are creating decadal files using netcdf files
-#
-years=199,200,201
-#
-# Can be a grads, matlab, netCDF or a list of files.
-#
-file_template        = "v7/TRMM_%sx.lst"
-#
-# CMIP5 experiment ID that matches the Amon Table.
-# For obs4MIPs, all experiment_id values are set to "obs"
-#
-experiment_id        = 'obs'
-institution          = 'NASA Goddard Space Flight Center, Greenbelt MD, USA'
-#
-# Calendar is used by CMOR2
-#
-calendar             = 'gregorian'
-institute_id         = 'NASA-GSFC'
-model_id             = 'Obs-TRMM'
-source               = 'Global Precipitation Climatology Project (TRMM)'
-contact              = "George Huffman george.j.huffman@nasa.gov"
-references           = 'http://science.nasa.gov/missions/trmm/'
-instrument           = 'TRMM'
-processing_version   = '7'
-processing_level     = 'L3'
-mip_specs            = 'CMIP5'
-data_structure       = 'grid'
-source_type          = 'satellite_retrieval_and_gauge_analysis'
-#
-# source_fn will be added to the filename (up to the first _ character).
-# source_fn can be set to "SYNOPTIC" to fetch the synoptic time
-# from the filename. The program will extract the 2 characters before
-# "z."  i.e. (TRMM_1990_03z.nc) rc["SYNOPTIC"] = '03'
-source_fn            = ''
-source_id            = 'TRMM'
-#
-# CMOR2 table realm
-#
-realm                = 'atmos'
-obs_project          = 'TRMM'
-#
-# CMOR2 table path
-#
-inpath               = "Tables"
-#
-# CMOR2 table.
-#
-table                = 'CMIP5_Amon_obs'
-#
-# CMOR variable
-#
-cmor_var             = '[\pr\]'
-#
-# If we have 4D data (time, level, lat, lon )
-#
-level                = '[\\]'
-#
-# Python will evaluate this equation to convert the units to kg m-2 s-1
-#
-equation             = '[\(data*2.78e-4)\]'
-#
-# Original variable to read
-#
-original_var         = '[\pcp\]'
-#
-# Since we have an equation, we can say that the original units are now
-# the same as the CMIP5 units.
-#
-original_units       = '[\kg m-2 s-1\]'
-#
-# Time will be converted to these new time units.
-# The output file will contain these new units in the time variable.
-#
-OutputTimeUnits      = "months since 1900-1-1"
-#
-# Use internal time units found in the file
-#
-InputTimeUnits       = "internal"
-#
-# Define all new Global Attributes as a Python list of tuples.
-# The \ character will be replaced internally by ".
-#   :global=rc["product"] (found in the Amon table)
-#   :processing_version=rc["processing_version"]
-#   :title="obs-TRMM output prepared for obs4MIPs
-#           NASA-GSFC observations"
-#
-SetGlbAttributes     = "[(\global\,rc[\product\]),(\processing_version\,rc[\processing_version\]),(\title\,\obs-TRMM output prepared for obs4MIPs NASA-GSFC observations\)]"
-#
-# Delete the following Global Attributes found in the original files
-# and that are no longer valid for obs4MIPs.
-#
-DelGlbAttributes     = "[\realization\,\experiment\,\physics_version\,\initialization_method\]"
-
-
-Creating variable matches using an Excel spreadsheet
-----------------------------------------------
-It is possible to use an Excel spreadsheet to map variables from the original file to their CMIP5 equivalents.
-
-Add the parameter "excel_file" to the resource file and point it at your Excel spreadsheet.  The Excel file replaces the parameters cmor_var, original_var, original_units, level and equation, so you should not have those in your resource file if you use the "excel_file" parameter.  This was developed to make the process of creating many variables at once more readable.
-
-The Excel spreadsheet needs to be saved in "xls" format "97" or "95", which are understood by the Python package 'xlrd'.  The program looks
-for the sheet named "Variables".
-
-Column 0 corresponds to "cmor_var"
-Column 1 corresponds to "original_var"
-Column 2 corresponds to "original_units"
-Column 3 corresponds to "level"
-Column 6 corresponds to "equation"
-
- 
-
-Here is an example of a "Variables" excel sheet (obs4MIPs ECMWF):
-
-CMOR Variable Name   User Variable Name   User Units   Height     Equation
-ta                   t                    K            levelist   data
-ua                   u                    m s-1        levelist   data
-va                   v                    m s-1        levelist   data
-hur                  r                    %            levelist   data
-hus                  q                    %            levelist   data
-
-
-How to run obs4MIPs.py
-========================
-
-Set your PYTHONPATH to include the corresponding Python packages.
-
-./obs4MIPs_process.py [-h] -r resource
-   resource:   File containing Global attributes
-
-
-e.g. ./obs4MIPs_process.py -r TRMM.rc
-Working on variable pcp 
-reading v7/3B43.19980201.7.nc
-reading v7/3B43.19980301.7.nc
-reading v7/3B43.19980401.7.nc
-reading v7/3B43.19980501.7.nc
-reading v7/3B43.19980601.7.nc
-reading v7/3B43.19980701.7.nc
-…
-obs4MIPs/observations/NASA-GSFC/Obs-TRMM/obs/mon/atmos/pr/r1i1p1/pr_Amon_obs_Obs-TRMM_obs_r1i1p1_199801-199912.nc
-Deleting attribute: realization
-Deleting attribute: experiment
-Deleting attribute: physics_version
-Deleting attribute: initialization_method
-Assigning attribute (global,observations)
-Assigning attribute (processing_version,7)
-Assigning attribute (title,obs-TRMM output prepared for obs4MIPs NASA-GSFC observations)
-obs4MIPs/observations/NASA-GSFC/Obs-TRMM/obs/mon/atmos/pr/r1i1p1/pr_Amon_obs_Obs-TRMM_obs_r1i1p1_199801-199912.nc
-NASA-GSFC/TRMM/obs4MIPs/observations/atmos/pr/mon/grid/NASA-GSFC/TRMM/V7/pr_TRMM-L3_v7_199801-199912.nc
-
-
-
-
-
-
-

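The backslash-for-quote convention that the INSTALLATION file documents for SetGlbAttributes can be illustrated with a small standalone sketch. This is not the actual obs4MIPs parser, just a minimal reconstruction of the described behavior: each '\' in the attribute string is replaced by '"' and the result is evaluated with the resource dictionary `rc` in scope.

```python
# Minimal sketch of the documented SetGlbAttributes convention; the real
# obs4MIPs parser may differ. rc holds resource-file keys already ingested.
rc = {"product": "observations", "processing_version": "7"}

# Raw value as it would appear in the resource file (r-string keeps the \).
raw = r"[(\global\,rc[\product\]),(\processing_version\,rc[\processing_version\])]"

# Replace each '\' with '"', then evaluate the resulting Python list of
# (attribute_name, value) tuples with rc available.
attrs = eval(raw.replace("\\", '"'), {"rc": rc})

for name, value in attrs:
    print("Assigning attribute (%s,%s)" % (name, value))
```

The printed lines correspond to the "Assigning attribute (...)" messages shown in the sample program output above.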
http://git-wip-us.apache.org/repos/asf/climate/blob/a53e3af5/obs4MIPs/README
----------------------------------------------------------------------
diff --git a/obs4MIPs/README b/obs4MIPs/README
deleted file mode 100644
index f9983b8..0000000
--- a/obs4MIPs/README
+++ /dev/null
@@ -1,20 +0,0 @@
-
------------
-About
------------
-obs4MIPs.py is a front end to CMOR2 (Climate Model Output Rewriter), an existing free software package written by Lawrence Livermore National Laboratory (LLNL). It reads a multitude of standard data formats, such as netcdf3, netcdf4, Grads control files, Matlab data files or a list of netcdf files, and converts the data into the CMIP5 format to allow publication on an Earth System Grid Federation (ESGF) data node.
-
----------------
-Examples
---------------
-Convert TRMM data to CMIP5 format:
-obs4MIPs.py -r trmm.rc
-
-Convert ECMWF data to CMIP5 format using an Excel spreadsheet:
-obs4MIPs.py -r ecmwf.rc
-
---------------
-Installation 
----------------
-Installation instructions and documentation are provided in the INSTALLATION file.
-
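The years/file_template lookup described in the INSTALLATION file amounts to simple %-substitution of each comma-separated "years" entry into the template. A minimal sketch (not the actual obs4MIPs code), using the values from the TRMM.rc example:

```python
# Sketch of the documented years/file_template lookup; values taken from
# the TRMM.rc example in the INSTALLATION file. Not the real implementation.
years = "199,200,201"
file_template = "v7/TRMM_%sx.lst"

# Substitute each year prefix into the template to get the candidate files.
candidates = [file_template % year for year in years.split(",")]
print(candidates)
```

Each resulting .lst file is then read as a file list whose entries are aggregated in time, one decade per CMIP5 variable.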