You are viewing a plain text version of this content. The canonical link for it is here.
Posted to commits@zeppelin.apache.org by mi...@apache.org on 2017/03/17 03:22:31 UTC

[01/23] zeppelin git commit: [ZEPPELIN-1964] Layout info is lost after refresh

Repository: zeppelin
Updated Branches:
  refs/heads/branch-0.7 de65886f1 -> 4d80ec461


[ZEPPELIN-1964] Layout info is lost after refresh

### What is this PR for?
This PR is for `branch-0.7` of https://github.com/apache/zeppelin/pull/2053.

### What type of PR is it?
Bug Fix

### What is the Jira issue?
https://issues.apache.org/jira/browse/ZEPPELIN-1964

### How should this be tested?
Please do resize paragraph and then refresh browser.

### Questions:
* Does the licenses files need update? no
* Is there breaking changes for older versions? no
* Does this needs documentation? no

Author: hyungsung <as...@gmail.com>
Author: AhyoungRyu <fb...@hanmail.net>
Author: Jeff Zhang <zj...@apache.org>

Closes #2080 from astroshim/fix/layoutbroken and squashes the following commits:

b7c9397 [hyungsung] fix layout broken issue
5df4975 [Jeff Zhang] [MINOR] add pig wiki page to pig doc
04332c9 [AhyoungRyu] [DOCS][ZEPPELIN-2140] Add docs for notebookRepo REST API


Project: http://git-wip-us.apache.org/repos/asf/zeppelin/repo
Commit: http://git-wip-us.apache.org/repos/asf/zeppelin/commit/f9630a58
Tree: http://git-wip-us.apache.org/repos/asf/zeppelin/tree/f9630a58
Diff: http://git-wip-us.apache.org/repos/asf/zeppelin/diff/f9630a58

Branch: refs/heads/branch-0.7
Commit: f9630a58a86a4a456a8024c0bae73525634aa22c
Parents: de65886
Author: hyungsung <as...@gmail.com>
Authored: Wed Mar 1 03:51:11 2017 +0900
Committer: ahyoungryu <ah...@apache.org>
Committed: Sun Mar 5 16:12:14 2017 +0900

----------------------------------------------------------------------
 docs/_includes/themes/zeppelin/_navigation.html |   1 +
 .../zeppelin/img/pig_zeppelin_tutorial.png      | Bin 0 -> 280450 bytes
 docs/index.md                                   |   4 +
 docs/interpreter/pig.md                         |  62 +++++--
 docs/rest-api/rest-helium.md                    |   2 +-
 docs/rest-api/rest-notebookRepo.md              | 179 +++++++++++++++++++
 notebook/2C57UKYWR/note.json                    |  32 ++--
 .../notebook/paragraph/paragraph.controller.js  |   6 +-
 8 files changed, 250 insertions(+), 36 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/zeppelin/blob/f9630a58/docs/_includes/themes/zeppelin/_navigation.html
----------------------------------------------------------------------
diff --git a/docs/_includes/themes/zeppelin/_navigation.html b/docs/_includes/themes/zeppelin/_navigation.html
index bc06b85..401c691 100644
--- a/docs/_includes/themes/zeppelin/_navigation.html
+++ b/docs/_includes/themes/zeppelin/_navigation.html
@@ -105,6 +105,7 @@
                 <li class="title"><span><b>REST API</b><span></li>
                 <li><a href="{{BASE_PATH}}/rest-api/rest-interpreter.html">Interpreter API</a></li>
                 <li><a href="{{BASE_PATH}}/rest-api/rest-notebook.html">Notebook API</a></li>
+                <li><a href="{{BASE_PATH}}/rest-api/rest-notebookRepo.html">Notebook Repository API</a></li>
                 <li><a href="{{BASE_PATH}}/rest-api/rest-configuration.html">Configuration API</a></li>
                 <li><a href="{{BASE_PATH}}/rest-api/rest-credential.html">Credential API</a></li>
                 <li><a href="{{BASE_PATH}}/rest-api/rest-helium.html">Helium API</a></li>

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/f9630a58/docs/assets/themes/zeppelin/img/pig_zeppelin_tutorial.png
----------------------------------------------------------------------
diff --git a/docs/assets/themes/zeppelin/img/pig_zeppelin_tutorial.png b/docs/assets/themes/zeppelin/img/pig_zeppelin_tutorial.png
new file mode 100644
index 0000000..b90b982
Binary files /dev/null and b/docs/assets/themes/zeppelin/img/pig_zeppelin_tutorial.png differ

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/f9630a58/docs/index.md
----------------------------------------------------------------------
diff --git a/docs/index.md b/docs/index.md
index 5010830..543242a 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -163,8 +163,10 @@ Join to our [Mailing list](https://zeppelin.apache.org/community.html) and repor
 * REST API: available REST API list in Apache Zeppelin
   * [Interpreter API](./rest-api/rest-interpreter.html)
   * [Notebook API](./rest-api/rest-notebook.html)
+  * [Notebook Repository API](./rest-api/rest-notebookRepo.html)
   * [Configuration API](./rest-api/rest-configuration.html)
   * [Credential API](./rest-api/rest-credential.html)
+  * [Helium API](./rest-api/rest-helium.html)
 * Security: available security support in Apache Zeppelin
   * [Authentication for NGINX](./security/authentication.html)
   * [Shiro Authentication](./security/shiroauthentication.html)
@@ -179,6 +181,8 @@ Join to our [Mailing list](https://zeppelin.apache.org/community.html) and repor
 * Contribute
   * [Writing Zeppelin Interpreter](./development/writingzeppelininterpreter.html)
   * [Writing Zeppelin Application (Experimental)](./development/writingzeppelinapplication.html)
+  * [Writing Zeppelin Spell (Experimental)](./development/writingzeppelinspell.html)
+  * [Writing Zeppelin Visualization (Experimental)](./development/writingzeppelinvisualization.html)
   * [How to contribute (code)](./development/howtocontribute.html)
   * [How to contribute (documentation website)](./development/howtocontributewebsite.html)
 

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/f9630a58/docs/interpreter/pig.md
----------------------------------------------------------------------
diff --git a/docs/interpreter/pig.md b/docs/interpreter/pig.md
index ad2e80a..d1f18fa 100644
--- a/docs/interpreter/pig.md
+++ b/docs/interpreter/pig.md
@@ -15,14 +15,16 @@ group: manual
 [Apache Pig](https://pig.apache.org/) is a platform for analyzing large data sets that consists of a high-level language for expressing data analysis programs, coupled with infrastructure for evaluating these programs. The salient property of Pig programs is that their structure is amenable to substantial parallelization, which in turns enables them to handle very large data sets.
 
 ## Supported interpreter type
-  - `%pig.script` (default)
+  - `%pig.script` (default Pig interpreter, so you can use `%pig`)
     
-    All the pig script can run in this type of interpreter, and display type is plain text.
+    `%pig.script` is like the Pig grunt shell. Anything you can run in Pig grunt shell can be run in `%pig.script` interpreter, it is used for running Pig script where you don\u2019t need to visualize the data, it is suitable for data munging. 
   
   - `%pig.query`
  
-    Almost the same as `%pig.script`. The only difference is that you don't need to add alias in the last statement. And the display type is table.   
-
+    `%pig.query` is a little different compared with `%pig.script`. It is used for exploratory data analysis via Pig latin where you can leverage Zeppelin\u2019s visualization ability. There're 2 minor differences in the last statement between `%pig.script` and `%pig.query`
+    - No pig alias in the last statement in `%pig.query` (read the examples below).
+    - The last statement must be in single line in `%pig.query`
+    
 ## Supported runtime mode
   - Local
   - MapReduce
@@ -52,8 +54,8 @@ group: manual
 ### How to configure interpreter
 
 At the Interpreters menu, you have to create a new Pig interpreter. Pig interpreter has below properties by default.
-And you can set any pig properties here which will be passed to pig engine. (like tez.queue.name & mapred.job.queue.name).
-Besides, we use paragraph title as job name if it exists, else use the last line of pig script. So you can use that to find app running in YARN RM UI.
+And you can set any Pig properties here which will be passed to Pig engine. (like tez.queue.name & mapred.job.queue.name).
+Besides, we use paragraph title as job name if it exists, else use the last line of Pig script. So you can use that to find app running in YARN RM UI.
 
 <table class="table-configuration">
     <tr>
@@ -95,22 +97,52 @@ Besides, we use paragraph title as job name if it exists, else use the last line
 ```
 %pig
 
-raw_data = load 'dataset/sf_crime/train.csv' using PigStorage(',') as (Dates,Category,Descript,DayOfWeek,PdDistrict,Resolution,Address,X,Y);
-b = group raw_data all;
-c = foreach b generate COUNT($1);
-dump c;
+bankText = load 'bank.csv' using PigStorage(';');
+bank = foreach bankText generate $0 as age, $1 as job, $2 as marital, $3 as education, $5 as balance; 
+bank = filter bank by age != '"age"';
+bank = foreach bank generate (int)age, REPLACE(job,'"','') as job, REPLACE(marital, '"', '') as marital, (int)(REPLACE(balance, '"', '')) as balance;
+store bank into 'clean_bank.csv' using PigStorage(';'); -- this statement is optional, it just show you that most of time %pig.script is used for data munging before querying the data. 
 ```
 
 ##### pig.query
 
+Get the number of each age where age is less than 30
+
+```
+%pig.query
+ 
+bank_data = filter bank by age < 30;
+b = group bank_data by age;
+foreach b generate group, COUNT($1);
+```
+
+The same as above, but use dynamic text form so that use can specify the variable maxAge in textbox. (See screenshot below). Dynamic form is a very cool feature of Zeppelin, you can refer this [link]((../manual/dynamicform.html)) for details.
+
 ```
 %pig.query
+ 
+bank_data = filter bank by age < ${maxAge=40};
+b = group bank_data by age;
+foreach b generate group, COUNT($1) as count;
+```
+
+Get the number of each age for specific marital type, also use dynamic form here. User can choose the marital type in the dropdown list (see screenshot below).
 
-b = foreach raw_data generate Category;
-c = group b by Category;
-foreach c generate group as category, COUNT($1) as count;
+```
+%pig.query
+ 
+bank_data = filter bank by marital=='${marital=single,single|divorced|married}';
+b = group bank_data by age;
+foreach b generate group, COUNT($1) as count;
 ```
 
+The above examples are in the Pig tutorial note in Zeppelin, you can check that for details. Here's the screenshot.
+
+<img class="img-responsive" width="1024px" style="margin:0 auto; padding: 26px;" src="../assets/themes/zeppelin/img/pig_zeppelin_tutorial.png" />
+
+
 Data is shared between `%pig` and `%pig.query`, so that you can do some common work in `%pig`, and do different kinds of query based on the data of `%pig`. 
-Besides, we recommend you to specify alias explicitly so that the visualization can display the column name correctly. Here, we name `COUNT($1)` as `count`, if you don't do this,
-then we will name it using position, here we will use `col_1` to represent `COUNT($1)` if you don't specify alias for it. There's one pig tutorial note in zeppelin for your reference.
+Besides, we recommend you to specify alias explicitly so that the visualization can display the column name correctly. In the above example 2 and 3 of `%pig.query`, we name `COUNT($1)` as `count`. If you don't do this,
+then we will name it using position. E.g. in the above first example of `%pig.query`, we will use `col_1` in chart to represent `COUNT($1)`.
+
+

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/f9630a58/docs/rest-api/rest-helium.md
----------------------------------------------------------------------
diff --git a/docs/rest-api/rest-helium.md b/docs/rest-api/rest-helium.md
index 8d2ff4d..b78b576 100644
--- a/docs/rest-api/rest-helium.md
+++ b/docs/rest-api/rest-helium.md
@@ -103,7 +103,7 @@ If you work with Apache Zeppelin and find a need for an additional REST API, ple
         "enabled": false
       }
     ],
-    "zeppelin_horizontalbar": [
+    "zeppelin\_horizontalbar": [
       {
         "registry": "local",
         "pkg": {

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/f9630a58/docs/rest-api/rest-notebookRepo.md
----------------------------------------------------------------------
diff --git a/docs/rest-api/rest-notebookRepo.md b/docs/rest-api/rest-notebookRepo.md
new file mode 100644
index 0000000..a11d387
--- /dev/null
+++ b/docs/rest-api/rest-notebookRepo.md
@@ -0,0 +1,179 @@
+---
+layout: page
+title: "Apache Zeppelin notebook repository REST API"
+description: "This page contains Apache Zeppelin notebook repository REST API information."
+group: rest-api
+---
+<!--
+Licensed under the Apache License, Version 2.0 (the "License");
+you may not use this file except in compliance with the License.
+You may obtain a copy of the License at
+
+http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+-->
+{% include JB/setup %}
+
+# Apache Zeppelin Notebook Repository API
+
+<div id="toc"></div>
+
+## Overview
+Apache Zeppelin provides several REST APIs for interaction and remote activation of zeppelin functionality.
+All REST APIs are available starting with the following endpoint `http://[zeppelin-server]:[zeppelin-port]/api`. 
+Note that Apache Zeppelin REST APIs receive or return JSON objects, it is recommended for you to install some JSON viewers such as [JSONView](https://chrome.google.com/webstore/detail/jsonview/chklaanhfefbnpoihckbnefhakgolnmc).
+
+If you work with Apache Zeppelin and find a need for an additional REST API, please [file an issue or send us an email](http://zeppelin.apache.org/community.html).
+
+## Notebook Repository REST API List
+
+### List all available notebook repositories
+
+  <table class="table-configuration">
+    <col width="200">
+    <tr>
+      <td>Description</td>
+      <td>This ```GET``` method returns all the available notebook repositories.</td>
+    </tr>
+    <tr>
+      <td>URL</td>
+      <td>```http://[zeppelin-server]:[zeppelin-port]/api/notebook-repositories```</td>
+    </tr>
+    <tr>
+      <td>Success code</td>
+      <td>200</td>
+    </tr>
+    <tr>
+      <td>Fail code</td>
+      <td>500</td>
+    </tr>
+    <tr>
+      <td>Sample JSON response</td>
+      <td>
+        <pre>
+{
+  "status": "OK",
+  "message": "",
+  "body": [
+    {
+      "name": "GitNotebookRepo",
+      "className": "org.apache.zeppelin.notebook.repo.GitNotebookRepo",
+      "settings": [
+        {
+          "type": "INPUT",
+          "value": [],
+          "selected": "ZEPPELIN_HOME/zeppelin/notebook/",
+          "name": "Notebook Path"
+        }
+      ]
+    }
+  ]
+}
+        </pre>
+      </td>
+    </tr>
+  </table>
+
+<br/>
+
+### Reload a notebook repository
+
+  <table class="table-configuration">
+    <col width="200">
+    <tr>
+      <td>Description</td>
+      <td>This ```GET``` method triggers reloading and broadcasting of the note list.</td>
+    </tr>
+    <tr>
+      <td>URL</td>
+      <td>```http://[zeppelin-server]:[zeppelin-port]/api/notebook-repositories/reload```</td>
+    </tr>
+    <tr>
+      <td>Success code</td>
+      <td>200</td>
+    </tr>
+    <tr>
+      <td>Fail code</td>
+      <td>500</td>
+    </tr>
+    <tr>
+      <td>Sample JSON response</td>
+      <td>
+        <pre>
+{
+  "status": "OK",
+  "message": ""
+}
+        </pre>
+      </td>
+    </tr>
+  </table>
+
+<br/>
+
+### Update a specific notebook repository
+
+  <table class="table-configuration">
+    <col width="200">
+    <tr>
+      <td>Description</td>
+      <td>This ```PUT``` method updates a specific notebook repository.</td>
+    </tr>
+    <tr>
+      <td>URL</td>
+      <td>```http://[zeppelin-server]:[zeppelin-port]/api/notebook-repositories```</td>
+    </tr>
+    <tr>
+      <td>Success code</td>
+      <td>200</td>
+    </tr>
+    <tr>
+      <td>Fail code</td>
+      <td>
+        404 when the specified notebook repository doesn't exist <br/> 
+        406 for invalid payload <br/>
+        500 for any other errors
+      </td>
+    </tr>
+    <tr>
+      <td>Sample JSON input</td>
+      <td>
+        <pre>
+{
+  "name":"org.apache.zeppelin.notebook.repo.GitNotebookRepo",
+  "settings":{
+    "Notebook Path":"/tmp/notebook/"
+  }
+}
+        </pre>
+      </td>
+    </tr>
+    <tr>
+      <td>Sample JSON response</td>
+      <td>
+        <pre>
+{
+  "status": "OK",
+  "message": "",
+  "body": {
+    "name": "GitNotebookRepo",
+    "className": "org.apache.zeppelin.notebook.repo.GitNotebookRepo",
+    "settings": [
+      {
+        "type": "INPUT",
+        "value": [],
+        "selected": "/tmp/notebook/",
+        "name": "Notebook Path"
+      }
+    ]
+  }
+}
+        </pre>
+      </td>
+    </tr>
+  </table>

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/f9630a58/notebook/2C57UKYWR/note.json
----------------------------------------------------------------------
diff --git a/notebook/2C57UKYWR/note.json b/notebook/2C57UKYWR/note.json
index 21d1231..22afb2a 100644
--- a/notebook/2C57UKYWR/note.json
+++ b/notebook/2C57UKYWR/note.json
@@ -115,7 +115,7 @@
     {
       "text": "%pig\n\nbankText \u003d load \u0027bank.csv\u0027 using PigStorage(\u0027;\u0027);\nbank \u003d foreach bankText generate $0 as age, $1 as job, $2 as marital, $3 as education, $5 as balance; \nbank \u003d filter bank by age !\u003d \u0027\"age\"\u0027;\nbank \u003d foreach bank generate (int)age, REPLACE(job,\u0027\"\u0027,\u0027\u0027) as job, REPLACE(marital, \u0027\"\u0027, \u0027\u0027) as marital, (int)(REPLACE(balance, \u0027\"\u0027, \u0027\u0027)) as balance;\n\n-- The following statement is optional, it depends on whether your needs.\n-- store bank into \u0027clean_bank.csv\u0027 using PigStorage(\u0027;\u0027);\n\n\n",
       "user": "anonymous",
-      "dateUpdated": "Jan 22, 2017 12:49:11 PM",
+      "dateUpdated": "Feb 24, 2017 5:08:08 PM",
       "config": {
         "colWidth": 12.0,
         "editorMode": "ace/mode/pig",
@@ -138,15 +138,15 @@
       "jobName": "paragraph_1483277250237_-466604517",
       "id": "20161228-140640_1560978333",
       "dateCreated": "Jan 1, 2017 9:27:30 PM",
-      "dateStarted": "Jan 22, 2017 12:49:11 PM",
-      "dateFinished": "Jan 22, 2017 12:49:13 PM",
+      "dateStarted": "Feb 24, 2017 5:08:08 PM",
+      "dateFinished": "Feb 24, 2017 5:08:11 PM",
       "status": "FINISHED",
       "progressUpdateIntervalMs": 500
     },
     {
       "text": "%pig.query\n\nbank_data \u003d filter bank by age \u003c 30;\nb \u003d group bank_data by age;\nforeach b generate group, COUNT($1);\n\n",
       "user": "anonymous",
-      "dateUpdated": "Jan 22, 2017 12:49:16 PM",
+      "dateUpdated": "Feb 24, 2017 5:08:13 PM",
       "config": {
         "colWidth": 4.0,
         "editorMode": "ace/mode/pig",
@@ -183,15 +183,15 @@
       "jobName": "paragraph_1483277250238_-465450270",
       "id": "20161228-140730_1903342877",
       "dateCreated": "Jan 1, 2017 9:27:30 PM",
-      "dateStarted": "Jan 22, 2017 12:49:16 PM",
-      "dateFinished": "Jan 22, 2017 12:49:30 PM",
+      "dateStarted": "Feb 24, 2017 5:08:13 PM",
+      "dateFinished": "Feb 24, 2017 5:08:26 PM",
       "status": "FINISHED",
       "progressUpdateIntervalMs": 500
     },
     {
-      "text": "%pig.query\n\nbank_data \u003d filter bank by age \u003c ${maxAge\u003d40};\nb \u003d group bank_data by age;\nforeach b generate group, COUNT($1);",
+      "text": "%pig.query\n\nbank_data \u003d filter bank by age \u003c ${maxAge\u003d40};\nb \u003d group bank_data by age;\nforeach b generate group, COUNT($1) as count;",
       "user": "anonymous",
-      "dateUpdated": "Jan 22, 2017 12:49:18 PM",
+      "dateUpdated": "Feb 24, 2017 5:08:14 PM",
       "config": {
         "colWidth": 4.0,
         "editorMode": "ace/mode/pig",
@@ -228,7 +228,7 @@
         "msg": [
           {
             "type": "TABLE",
-            "data": "group\tcol_1\n19\t4\n20\t3\n21\t7\n22\t9\n23\t20\n24\t24\n25\t44\n26\t77\n27\t94\n28\t103\n29\t97\n30\t150\n31\t199\n32\t224\n33\t186\n34\t231\n35\t180\n"
+            "data": "group\tcount\n19\t4\n20\t3\n21\t7\n22\t9\n23\t20\n24\t24\n25\t44\n26\t77\n27\t94\n28\t103\n29\t97\n30\t150\n31\t199\n32\t224\n33\t186\n34\t231\n35\t180\n"
           }
         ]
       },
@@ -236,15 +236,15 @@
       "jobName": "paragraph_1483277250239_-465835019",
       "id": "20161228-154918_1551591203",
       "dateCreated": "Jan 1, 2017 9:27:30 PM",
-      "dateStarted": "Jan 22, 2017 12:49:18 PM",
-      "dateFinished": "Jan 22, 2017 12:49:32 PM",
+      "dateStarted": "Feb 24, 2017 5:08:14 PM",
+      "dateFinished": "Feb 24, 2017 5:08:29 PM",
       "status": "FINISHED",
       "progressUpdateIntervalMs": 500
     },
     {
-      "text": "%pig.query\n\nbank_data \u003d filter bank by marital\u003d\u003d\u0027${marital\u003dsingle,single|divorced|married}\u0027;\nb \u003d group bank_data by age;\nforeach b generate group, COUNT($1) as c;\n\n\n",
+      "text": "%pig.query\n\nbank_data \u003d filter bank by marital\u003d\u003d\u0027${marital\u003dsingle,single|divorced|married}\u0027;\nb \u003d group bank_data by age;\nforeach b generate group, COUNT($1) as count;\n\n\n",
       "user": "anonymous",
-      "dateUpdated": "Jan 22, 2017 12:49:20 PM",
+      "dateUpdated": "Feb 24, 2017 5:08:15 PM",
       "config": {
         "colWidth": 4.0,
         "editorMode": "ace/mode/pig",
@@ -292,7 +292,7 @@
         "msg": [
           {
             "type": "TABLE",
-            "data": "group\tc\n23\t3\n24\t11\n25\t11\n26\t18\n27\t26\n28\t23\n29\t37\n30\t56\n31\t104\n32\t105\n33\t103\n34\t142\n35\t109\n36\t117\n37\t100\n38\t99\n39\t88\n40\t105\n41\t97\n42\t91\n43\t79\n44\t68\n45\t76\n46\t82\n47\t78\n48\t91\n49\t87\n50\t74\n51\t63\n52\t66\n53\t75\n54\t56\n55\t68\n56\t50\n57\t78\n58\t67\n59\t56\n60\t36\n61\t15\n62\t5\n63\t7\n64\t6\n65\t4\n66\t7\n67\t5\n68\t1\n69\t5\n70\t5\n71\t5\n72\t4\n73\t6\n74\t2\n75\t3\n76\t1\n77\t5\n78\t2\n79\t3\n80\t6\n81\t1\n83\t2\n86\t1\n87\t1\n"
+            "data": "group\tcount\n23\t3\n24\t11\n25\t11\n26\t18\n27\t26\n28\t23\n29\t37\n30\t56\n31\t104\n32\t105\n33\t103\n34\t142\n35\t109\n36\t117\n37\t100\n38\t99\n39\t88\n40\t105\n41\t97\n42\t91\n43\t79\n44\t68\n45\t76\n46\t82\n47\t78\n48\t91\n49\t87\n50\t74\n51\t63\n52\t66\n53\t75\n54\t56\n55\t68\n56\t50\n57\t78\n58\t67\n59\t56\n60\t36\n61\t15\n62\t5\n63\t7\n64\t6\n65\t4\n66\t7\n67\t5\n68\t1\n69\t5\n70\t5\n71\t5\n72\t4\n73\t6\n74\t2\n75\t3\n76\t1\n77\t5\n78\t2\n79\t3\n80\t6\n81\t1\n83\t2\n86\t1\n87\t1\n"
           }
         ]
       },
@@ -300,8 +300,8 @@
       "jobName": "paragraph_1483277250240_-480070728",
       "id": "20161228-142259_575675591",
       "dateCreated": "Jan 1, 2017 9:27:30 PM",
-      "dateStarted": "Jan 22, 2017 12:49:30 PM",
-      "dateFinished": "Jan 22, 2017 12:49:34 PM",
+      "dateStarted": "Feb 24, 2017 5:08:27 PM",
+      "dateFinished": "Feb 24, 2017 5:08:31 PM",
       "status": "FINISHED",
       "progressUpdateIntervalMs": 500
     },

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/f9630a58/zeppelin-web/src/app/notebook/paragraph/paragraph.controller.js
----------------------------------------------------------------------
diff --git a/zeppelin-web/src/app/notebook/paragraph/paragraph.controller.js b/zeppelin-web/src/app/notebook/paragraph/paragraph.controller.js
index 8659071..937f89b 100644
--- a/zeppelin-web/src/app/notebook/paragraph/paragraph.controller.js
+++ b/zeppelin-web/src/app/notebook/paragraph/paragraph.controller.js
@@ -423,10 +423,8 @@ function ParagraphCtrl($scope, $rootScope, $route, $window, $routeParams, $locat
 
   $scope.changeColWidth = function(paragraph, width) {
     angular.element('.navbar-right.open').removeClass('open');
-    if (width !== paragraph.config.colWidth) {
-      paragraph.config.colWidth = width;
-      commitParagraph(paragraph);
-    }
+    paragraph.config.colWidth = width;
+    commitParagraph(paragraph);
   };
 
   $scope.toggleOutput = function(paragraph) {


[14/23] zeppelin git commit: Fix CI build failure on branch-0.7

Posted by mi...@apache.org.
Fix CI build failure on branch-0.7

### What is this PR for?
CI build is failing on `branch-0.7`. This PR cherry-pick some necessary commits from

https://github.com/apache/zeppelin/pull/2003
https://github.com/apache/zeppelin/pull/2081

and a commit (3ae8760) fixes changes made by https://github.com/apache/zeppelin/pull/2071 for branch-0.7.

### What type of PR is it?
Hot Fix

### Todos
* [ ] - Make CI green

### How should this be tested?
See if CI becomes green

### Questions:
* Does the licenses files need update? no
* Is there breaking changes for older versions? no
* Does this needs documentation? no

Author: Lee moon soo <mo...@apache.org>

Closes #2103 from Leemoonsoo/fix-branch-0.7-ci and squashes the following commits:

9539c9b [Lee moon soo] Try start and terminate spark context after each test class
f077980 [Lee moon soo] Correct test implementation with Authentication Enable
0eefb66 [Lee moon soo] Handle multiple Set-Cookie headers
8cfc5f9 [Lee moon soo] Remove unnecessary assert
d4a8807 [Lee moon soo] helium.bundle.js -> vis.bundle.js
9b6ec4a [Lee moon soo] create zeppelin-web/dist directory for test
129b40f [Lee moon soo] reduce build time
7d9489b [Lee moon soo] Reduce log


Project: http://git-wip-us.apache.org/repos/asf/zeppelin/repo
Commit: http://git-wip-us.apache.org/repos/asf/zeppelin/commit/bfa812a9
Tree: http://git-wip-us.apache.org/repos/asf/zeppelin/tree/bfa812a9
Diff: http://git-wip-us.apache.org/repos/asf/zeppelin/diff/bfa812a9

Branch: refs/heads/branch-0.7
Commit: bfa812a9d0d989e9abf24fcb00e87f0431b21a8b
Parents: 730784b
Author: Lee moon soo <mo...@apache.org>
Authored: Tue Feb 14 02:48:59 2017 +0900
Committer: Lee moon soo <mo...@apache.org>
Committed: Tue Mar 14 20:13:07 2017 -0700

----------------------------------------------------------------------
 .travis.yml                                     | 26 +++-----
 .../spark/PySparkInterpreterMatplotlibTest.java | 48 +++++++-------
 .../zeppelin/spark/PySparkInterpreterTest.java  | 51 ++++++++-------
 .../zeppelin/spark/SparkInterpreterTest.java    | 53 ++++++++--------
 .../zeppelin/spark/SparkSqlInterpreterTest.java | 67 ++++++++++----------
 .../zeppelin/rest/AbstractTestRestApi.java      | 18 +++++-
 .../zeppelin/rest/SecurityRestApiTest.java      | 13 ++--
 .../interpreter/InterpreterSetting.java         |  4 +-
 .../src/main/resources/helium/webpack.config.js |  2 +-
 .../apache/zeppelin/notebook/NotebookTest.java  |  5 +-
 10 files changed, 144 insertions(+), 143 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/zeppelin/blob/bfa812a9/.travis.yml
----------------------------------------------------------------------
diff --git a/.travis.yml b/.travis.yml
index c2a47e5..a73f7a8 100644
--- a/.travis.yml
+++ b/.travis.yml
@@ -48,21 +48,13 @@ matrix:
     - jdk: "oraclejdk7"
       env: SCALA_VER="2.11" SPARK_VER="2.0.2" HADOOP_VER="2.6" PROFILE="-Pspark-2.0 -Phadoop-2.6 -Ppyspark -Psparkr -Pscalding -Phelium-dev -Pexamples -Pscala-2.11" BUILD_FLAG="package -Pbuild-distr -DskipRat" TEST_FLAG="verify -Pusing-packaged-distr -DskipRat" TEST_PROJECTS=""
 
-    # Test all modules with scala 2.10
+    # Test spark module for 1.6.3 with scala 2.10
     - jdk: "oraclejdk7"
-      env: SCALA_VER="2.10" SPARK_VER="1.6.3" HADOOP_VER="2.6" PROFILE="-Pspark-1.6 -Pr -Phadoop-2.6 -Ppyspark -Psparkr -Pscalding -Pbeam -Phelium-dev -Pexamples -Pscala-2.10" BUILD_FLAG="package -Pbuild-distr -DskipRat" TEST_FLAG="verify -Pusing-packaged-distr -DskipRat" TEST_PROJECTS=""
+      env: SCALA_VER="2.10" SPARK_VER="1.6.3" HADOOP_VER="2.6" PROFILE="-Pspark-1.6 -Phadoop-2.6 -Ppyspark -Psparkr -Pscala-2.10" BUILD_FLAG="package -DskipTests -DskipRat" TEST_FLAG="test -DskipRat" MODULES="-pl zeppelin-interpreter,zeppelin-zengine,zeppelin-server,zeppelin-display,spark-dependencies,spark" TEST_PROJECTS="-Dtest=ZeppelinSparkClusterTest,org.apache.zeppelin.spark.* -DfailIfNoTests=false"
 
-    # Test all modules with scala 2.11
+    # Test spark module for 1.6.3 with scala 2.11
     - jdk: "oraclejdk7"
-      env: SCALA_VER="2.11" SPARK_VER="1.6.3" HADOOP_VER="2.6" PROFILE="-Pspark-1.6 -Pr -Phadoop-2.6 -Ppyspark -Psparkr -Pscalding -Phelium-dev -Pexamples -Pscala-2.11" BUILD_FLAG="package -Pbuild-distr -DskipRat" TEST_FLAG="verify -Pusing-packaged-distr -DskipRat" TEST_PROJECTS=""
-
-    # Test spark module for 1.5.2
-    - jdk: "oraclejdk7"
-      env: SCALA_VER="2.10" SPARK_VER="1.5.2" HADOOP_VER="2.6" PROFILE="-Pspark-1.5 -Pr -Phadoop-2.6 -Ppyspark -Psparkr" BUILD_FLAG="package -DskipTests -DskipRat" TEST_FLAG="verify -DskipRat" TEST_PROJECTS="-pl zeppelin-interpreter,zeppelin-zengine,zeppelin-server,zeppelin-display,spark-dependencies,spark,r -Dtest=org.apache.zeppelin.rest.*Test,org.apache.zeppelin.spark.* -DfailIfNoTests=false"
-
-    # Test spark module for 1.4.1
-    - jdk: "oraclejdk7"
-      env: SCALA_VER="2.10" SPARK_VER="1.4.1" HADOOP_VER="2.6" PROFILE="-Pspark-1.4 -Pr -Phadoop-2.6 -Ppyspark -Psparkr" BUILD_FLAG="package -DskipTests -DskipRat" TEST_FLAG="verify -DskipRat" TEST_PROJECTS="-pl zeppelin-interpreter,zeppelin-zengine,zeppelin-server,zeppelin-display,spark-dependencies,spark,r -Dtest=org.apache.zeppelin.rest.*Test,org.apache.zeppelin.spark.* -DfailIfNoTests=false"
+      env: SCALA_VER="2.11" SPARK_VER="1.6.3" HADOOP_VER="2.6" PROFILE="-Pspark-1.6 -Phadoop-2.6 -Ppyspark -Psparkr -Pscala-2.11 -Dscala.version=2.11.7 -Dscala.binary.version=2.11" BUILD_FLAG="package -DskipTests -DskipRat" TEST_FLAG="test -DskipRat" MODULES="-pl zeppelin-interpreter,zeppelin-zengine,zeppelin-server,zeppelin-display,spark-dependencies,spark" TEST_PROJECTS="-Dtest=ZeppelinSparkClusterTest,org.apache.zeppelin.spark.* -DfailIfNoTests=false"
 
     # Test selenium with spark module for 1.6.3
     - jdk: "oraclejdk7"
@@ -70,15 +62,15 @@ matrix:
 
     # Test python/pyspark with python 2
     - jdk: "oraclejdk7"
-      env: PYTHON="2" SCALA_VER="2.10" SPARK_VER="1.6.1" HADOOP_VER="2.6" PROFILE="-Pspark-1.6 -Phadoop-2.6 -Ppyspark" BUILD_FLAG="package -pl spark,python -am -DskipTests -DskipRat" TEST_FLAG="verify -DskipRat" TEST_PROJECTS="-pl zeppelin-interpreter,zeppelin-display,spark-dependencies,spark,python -Dtest=org.apache.zeppelin.spark.PySpark*Test,org.apache.zeppelin.python.* -Dpyspark.test.exclude='' -DfailIfNoTests=false"
+      env: PYTHON="2" SCALA_VER="2.10" SPARK_VER="1.6.1" HADOOP_VER="2.6" PROFILE="-Pspark-1.6 -Phadoop-2.6 -Ppyspark" BUILD_FLAG="package -pl spark,python -am -DskipTests -DskipRat" TEST_FLAG="test -DskipRat" MODULES="-pl zeppelin-interpreter,zeppelin-display,spark-dependencies,spark,python" TEST_PROJECTS="-Dtest=org.apache.zeppelin.spark.PySpark*Test,org.apache.zeppelin.python.* -Dpyspark.test.exclude='' -DfailIfNoTests=false"
 
     # Test python/pyspark with python 3
     - jdk: "oraclejdk7"
-      env: PYTHON="3" SCALA_VER="2.11" SPARK_VER="2.0.0" HADOOP_VER="2.6" PROFILE="-Pspark-2.0 -Phadoop-2.6 -Ppyspark -Pscala-2.11" BUILD_FLAG="package -pl spark,python -am -DskipTests -DskipRat" TEST_FLAG="verify -DskipRat" TEST_PROJECTS="-pl zeppelin-interpreter,zeppelin-display,spark-dependencies,spark,python -Dtest=org.apache.zeppelin.spark.PySpark*Test,org.apache.zeppelin.python.* -Dpyspark.test.exclude='' -DfailIfNoTests=false"
+      env: PYTHON="3" SCALA_VER="2.11" SPARK_VER="2.0.0" HADOOP_VER="2.6" PROFILE="-Pspark-2.0 -Phadoop-2.6 -Ppyspark -Pscala-2.11" BUILD_FLAG="package -pl spark,python -am -DskipTests -DskipRat" TEST_FLAG="test -DskipRat" MODULES="-pl zeppelin-interpreter,zeppelin-display,spark-dependencies,spark,python" TEST_PROJECTS="-Dtest=org.apache.zeppelin.spark.PySpark*Test,org.apache.zeppelin.python.* -Dpyspark.test.exclude='' -DfailIfNoTests=false"
 
     # Test livy with spark 1.5.2 and hadoop 2.6
     - jdk: "oraclejdk7"
-      env: SCALA_VER="2.10" $LIVY_VER="0.2.0" SPARK_VER="1.5.2" HADOOP_VER="2.6" PROFILE="-Pspark-1.5 -Phadoop-2.6" BUILD_FLAG="package -DskipTests -DskipRat" TEST_FLAG="verify -DskipRat" TEST_PROJECTS="-pl zeppelin-interpreter,livy -DfailIfNoTests=false"
+      env: SCALA_VER="2.10" $LIVY_VER="0.2.0" SPARK_VER="1.5.2" HADOOP_VER="2.6" PROFILE="-Pspark-1.5 -Phadoop-2.6" BUILD_FLAG="package -DskipTests -DskipRat" TEST_FLAG="verify -DskipRat" MODULES="-pl zeppelin-interpreter,livy" TEST_PROJECTS="-DfailIfNoTests=false"
 
 before_install:
   - echo "MAVEN_OPTS='-Xms1024M -Xmx2048M -XX:MaxPermSize=1024m -XX:-UseGCOverheadLimit -Dorg.slf4j.simpleLogger.defaultLogLevel=warn'" >> ~/.mavenrc
@@ -90,7 +82,7 @@ before_install:
   - source ~/.environ
 
 install:
-  - mvn $BUILD_FLAG $PROFILE -B
+  - mvn $BUILD_FLAG $MODULES $PROFILE -B
 
 before_script:
   - travis_retry ./testing/downloadSpark.sh $SPARK_VER $HADOOP_VER
@@ -101,7 +93,7 @@ before_script:
   - tail conf/zeppelin-env.sh
 
 script:
-  - mvn $TEST_FLAG $PROFILE -B $TEST_PROJECTS
+  - mvn $TEST_FLAG $MODULES $PROFILE -B $TEST_PROJECTS
 
 after_success:
   - echo "Travis exited with ${TRAVIS_TEST_RESULT}"

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/bfa812a9/spark/src/test/java/org/apache/zeppelin/spark/PySparkInterpreterMatplotlibTest.java
----------------------------------------------------------------------
diff --git a/spark/src/test/java/org/apache/zeppelin/spark/PySparkInterpreterMatplotlibTest.java b/spark/src/test/java/org/apache/zeppelin/spark/PySparkInterpreterMatplotlibTest.java
index 17b2128..7fe8b5e 100644
--- a/spark/src/test/java/org/apache/zeppelin/spark/PySparkInterpreterMatplotlibTest.java
+++ b/spark/src/test/java/org/apache/zeppelin/spark/PySparkInterpreterMatplotlibTest.java
@@ -39,14 +39,14 @@ import static org.junit.Assert.*;
 @FixMethodOrder(MethodSorters.NAME_ASCENDING)
 public class PySparkInterpreterMatplotlibTest {
 
-  @Rule
-  public TemporaryFolder tmpDir = new TemporaryFolder();
-
-  public static SparkInterpreter sparkInterpreter;
-  public static PySparkInterpreter pyspark;
-  public static InterpreterGroup intpGroup;
-  public static Logger LOGGER = LoggerFactory.getLogger(PySparkInterpreterTest.class);
-  private InterpreterContext context;
+  @ClassRule
+  public static TemporaryFolder tmpDir = new TemporaryFolder();
+
+  static SparkInterpreter sparkInterpreter;
+  static PySparkInterpreter pyspark;
+  static InterpreterGroup intpGroup;
+  static Logger LOGGER = LoggerFactory.getLogger(PySparkInterpreterTest.class);
+  static InterpreterContext context;
   
   public static class AltPySparkInterpreter extends PySparkInterpreter {
     /**
@@ -80,7 +80,7 @@ public class PySparkInterpreterMatplotlibTest {
     }
   }
 
-  private Properties getPySparkTestProperties() throws IOException {
+  private static Properties getPySparkTestProperties() throws IOException {
     Properties p = new Properties();
     p.setProperty("master", "local[*]");
     p.setProperty("spark.app.name", "Zeppelin Test");
@@ -106,24 +106,20 @@ public class PySparkInterpreterMatplotlibTest {
     return version;
   }
 
-  @Before
-  public void setUp() throws Exception {
+  @BeforeClass
+  public static void setUp() throws Exception {
     intpGroup = new InterpreterGroup();
     intpGroup.put("note", new LinkedList<Interpreter>());
 
-    if (sparkInterpreter == null) {
-      sparkInterpreter = new SparkInterpreter(getPySparkTestProperties());
-      intpGroup.get("note").add(sparkInterpreter);
-      sparkInterpreter.setInterpreterGroup(intpGroup);
-      sparkInterpreter.open();
-    }
+    sparkInterpreter = new SparkInterpreter(getPySparkTestProperties());
+    intpGroup.get("note").add(sparkInterpreter);
+    sparkInterpreter.setInterpreterGroup(intpGroup);
+    sparkInterpreter.open();
 
-    if (pyspark == null) {
-      pyspark = new AltPySparkInterpreter(getPySparkTestProperties());
-      intpGroup.get("note").add(pyspark);
-      pyspark.setInterpreterGroup(intpGroup);
-      pyspark.open();
-    }
+    pyspark = new AltPySparkInterpreter(getPySparkTestProperties());
+    intpGroup.get("note").add(pyspark);
+    pyspark.setInterpreterGroup(intpGroup);
+    pyspark.open();
 
     context = new InterpreterContext("note", "id", null, "title", "text",
       new AuthenticationInfo(),
@@ -135,6 +131,12 @@ public class PySparkInterpreterMatplotlibTest {
       new InterpreterOutput(null));
   }
 
+  @AfterClass
+  public static void tearDown() {
+    pyspark.close();
+    sparkInterpreter.close();
+  }
+
   @Test
   public void dependenciesAreInstalled() {
     // matplotlib

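The diffs above replace JUnit 4's per-test `@Rule`/`@Before` lifecycle with the per-class `@ClassRule`/`@BeforeClass`/`@AfterClass` lifecycle, so the expensive interpreter is opened once per test class instead of being lazily guarded with `if (... == null)` checks. A plain-Java simulation of the two lifecycles (a hypothetical counter standing in for `SparkInterpreter.open()`, not Zeppelin code) illustrates the difference:

```java
// Simulates JUnit 4 setup semantics: @Before runs before every test method,
// while @BeforeClass runs once for the whole class. The counter stands in
// for an expensive call such as SparkInterpreter.open().
public class LifecycleSketch {

    // old style: @Before invokes setup before each of n test methods
    static int opensPerTest(int n) {
        int opens = 0;
        for (int i = 0; i < n; i++) {
            opens++;  // interpreter opened again for every test
        }
        return opens;
    }

    // new style: @BeforeClass invokes setup once, shared by all n tests
    static int opensPerClass(int n) {
        return 1;     // interpreter opened once, reused by every test
    }

    public static void main(String[] args) {
        System.out.println("@Before opens for 3 tests:      " + opensPerTest(3));
        System.out.println("@BeforeClass opens for 3 tests: " + opensPerClass(3));
    }
}
```

The matching `@AfterClass` tearDown in the diff closes the shared interpreters exactly once, after the last test in the class.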
http://git-wip-us.apache.org/repos/asf/zeppelin/blob/bfa812a9/spark/src/test/java/org/apache/zeppelin/spark/PySparkInterpreterTest.java
----------------------------------------------------------------------
diff --git a/spark/src/test/java/org/apache/zeppelin/spark/PySparkInterpreterTest.java b/spark/src/test/java/org/apache/zeppelin/spark/PySparkInterpreterTest.java
index 55c405d..3697512 100644
--- a/spark/src/test/java/org/apache/zeppelin/spark/PySparkInterpreterTest.java
+++ b/spark/src/test/java/org/apache/zeppelin/spark/PySparkInterpreterTest.java
@@ -23,10 +23,7 @@ import org.apache.zeppelin.interpreter.*;
 import org.apache.zeppelin.interpreter.thrift.InterpreterCompletion;
 import org.apache.zeppelin.resource.LocalResourcePool;
 import org.apache.zeppelin.user.AuthenticationInfo;
-import org.junit.Before;
-import org.junit.FixMethodOrder;
-import org.junit.Rule;
-import org.junit.Test;
+import org.junit.*;
 import org.junit.rules.TemporaryFolder;
 import org.junit.runners.MethodSorters;
 import org.slf4j.Logger;
@@ -44,16 +41,16 @@ import static org.junit.Assert.*;
 @FixMethodOrder(MethodSorters.NAME_ASCENDING)
 public class PySparkInterpreterTest {
 
-  @Rule
-  public TemporaryFolder tmpDir = new TemporaryFolder();
+  @ClassRule
+  public static TemporaryFolder tmpDir = new TemporaryFolder();
 
-  public static SparkInterpreter sparkInterpreter;
-  public static PySparkInterpreter pySparkInterpreter;
-  public static InterpreterGroup intpGroup;
-  public static Logger LOGGER = LoggerFactory.getLogger(PySparkInterpreterTest.class);
-  private InterpreterContext context;
+  static SparkInterpreter sparkInterpreter;
+  static PySparkInterpreter pySparkInterpreter;
+  static InterpreterGroup intpGroup;
+  static Logger LOGGER = LoggerFactory.getLogger(PySparkInterpreterTest.class);
+  static InterpreterContext context;
 
-  private Properties getPySparkTestProperties() throws IOException {
+  private static Properties getPySparkTestProperties() throws IOException {
     Properties p = new Properties();
     p.setProperty("master", "local[*]");
     p.setProperty("spark.app.name", "Zeppelin Test");
@@ -79,24 +76,20 @@ public class PySparkInterpreterTest {
     return version;
   }
 
-  @Before
-  public void setUp() throws Exception {
+  @BeforeClass
+  public static void setUp() throws Exception {
     intpGroup = new InterpreterGroup();
     intpGroup.put("note", new LinkedList<Interpreter>());
 
-    if (sparkInterpreter == null) {
-      sparkInterpreter = new SparkInterpreter(getPySparkTestProperties());
-      intpGroup.get("note").add(sparkInterpreter);
-      sparkInterpreter.setInterpreterGroup(intpGroup);
-      sparkInterpreter.open();
-    }
+    sparkInterpreter = new SparkInterpreter(getPySparkTestProperties());
+    intpGroup.get("note").add(sparkInterpreter);
+    sparkInterpreter.setInterpreterGroup(intpGroup);
+    sparkInterpreter.open();
 
-    if (pySparkInterpreter == null) {
-      pySparkInterpreter = new PySparkInterpreter(getPySparkTestProperties());
-      intpGroup.get("note").add(pySparkInterpreter);
-      pySparkInterpreter.setInterpreterGroup(intpGroup);
-      pySparkInterpreter.open();
-    }
+    pySparkInterpreter = new PySparkInterpreter(getPySparkTestProperties());
+    intpGroup.get("note").add(pySparkInterpreter);
+    pySparkInterpreter.setInterpreterGroup(intpGroup);
+    pySparkInterpreter.open();
 
     context = new InterpreterContext("note", "id", null, "title", "text",
       new AuthenticationInfo(),
@@ -108,6 +101,12 @@ public class PySparkInterpreterTest {
       new InterpreterOutput(null));
   }
 
+  @AfterClass
+  public static void tearDown() {
+    pySparkInterpreter.close();
+    sparkInterpreter.close();
+  }
+
   @Test
   public void testBasicIntp() {
     if (getSparkVersionNumber() > 11) {

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/bfa812a9/spark/src/test/java/org/apache/zeppelin/spark/SparkInterpreterTest.java
----------------------------------------------------------------------
diff --git a/spark/src/test/java/org/apache/zeppelin/spark/SparkInterpreterTest.java b/spark/src/test/java/org/apache/zeppelin/spark/SparkInterpreterTest.java
index 1410890..8552e24 100644
--- a/spark/src/test/java/org/apache/zeppelin/spark/SparkInterpreterTest.java
+++ b/spark/src/test/java/org/apache/zeppelin/spark/SparkInterpreterTest.java
@@ -35,10 +35,7 @@ import org.apache.zeppelin.user.AuthenticationInfo;
 import org.apache.zeppelin.display.GUI;
 import org.apache.zeppelin.interpreter.*;
 import org.apache.zeppelin.interpreter.InterpreterResult.Code;
-import org.junit.Before;
-import org.junit.FixMethodOrder;
-import org.junit.Rule;
-import org.junit.Test;
+import org.junit.*;
 import org.junit.rules.TemporaryFolder;
 import org.junit.runners.MethodSorters;
 import org.slf4j.Logger;
@@ -47,19 +44,19 @@ import org.slf4j.LoggerFactory;
 @FixMethodOrder(MethodSorters.NAME_ASCENDING)
 public class SparkInterpreterTest {
 
-  @Rule
-  public TemporaryFolder tmpDir = new TemporaryFolder();
+  @ClassRule
+  public static TemporaryFolder tmpDir = new TemporaryFolder();
 
-  public static SparkInterpreter repl;
-  public static InterpreterGroup intpGroup;
-  private InterpreterContext context;
-  public static Logger LOGGER = LoggerFactory.getLogger(SparkInterpreterTest.class);
+  static SparkInterpreter repl;
+  static InterpreterGroup intpGroup;
+  static InterpreterContext context;
+  static Logger LOGGER = LoggerFactory.getLogger(SparkInterpreterTest.class);
 
   /**
    * Get spark version number as a numerical value.
    * eg. 1.1.x => 11, 1.2.x => 12, 1.3.x => 13 ...
    */
-  public static int getSparkVersionNumber() {
+  public static int getSparkVersionNumber(SparkInterpreter repl) {
     if (repl == null) {
       return 0;
     }
@@ -81,16 +78,14 @@ public class SparkInterpreterTest {
     return p;
   }
 
-  @Before
-  public void setUp() throws Exception {
-    if (repl == null) {
-      intpGroup = new InterpreterGroup();
-      intpGroup.put("note", new LinkedList<Interpreter>());
-      repl = new SparkInterpreter(getSparkTestProperties(tmpDir));
-      repl.setInterpreterGroup(intpGroup);
-      intpGroup.get("note").add(repl);
-      repl.open();
-    }
+  @BeforeClass
+  public static void setUp() throws Exception {
+    intpGroup = new InterpreterGroup();
+    intpGroup.put("note", new LinkedList<Interpreter>());
+    repl = new SparkInterpreter(getSparkTestProperties(tmpDir));
+    repl.setInterpreterGroup(intpGroup);
+    intpGroup.get("note").add(repl);
+    repl.open();
 
     context = new InterpreterContext("note", "id", null, "title", "text",
         new AuthenticationInfo(),
@@ -102,6 +97,11 @@ public class SparkInterpreterTest {
         new InterpreterOutput(null));
   }
 
+  @AfterClass
+  public static void tearDown() {
+    repl.close();
+  }
+
   @Test
   public void testBasicIntp() {
     assertEquals(InterpreterResult.Code.SUCCESS,
@@ -150,7 +150,7 @@ public class SparkInterpreterTest {
 
   @Test
   public void testCreateDataFrame() {
-    if (getSparkVersionNumber() >= 13) {
+    if (getSparkVersionNumber(repl) >= 13) {
       repl.interpret("case class Person(name:String, age:Int)\n", context);
       repl.interpret("val people = sc.parallelize(Seq(Person(\"moon\", 33), Person(\"jobs\", 51), Person(\"gates\", 51), Person(\"park\", 34)))\n", context);
       repl.interpret("people.toDF.count", context);
@@ -166,7 +166,7 @@ public class SparkInterpreterTest {
     String code = "";
     repl.interpret("case class Person(name:String, age:Int)\n", context);
     repl.interpret("val people = sc.parallelize(Seq(Person(\"moon\", 33), Person(\"jobs\", 51), Person(\"gates\", 51), Person(\"park\", 34)))\n", context);
-    if (getSparkVersionNumber() < 13) {
+    if (getSparkVersionNumber(repl) < 13) {
       repl.interpret("people.registerTempTable(\"people\")", context);
       code = "z.show(sqlc.sql(\"select * from people\"))";
     } else {
@@ -182,7 +182,8 @@ public class SparkInterpreterTest {
     assertEquals(Code.SUCCESS, repl.interpret("people.take(3)", context).code());
 
 
-    if (getSparkVersionNumber() <= 11) { // spark 1.2 or later does not allow create multiple SparkContext in the same jvm by default.
+    if (getSparkVersionNumber(repl) <= 11) { // spark 1.2 or later does not allow create multiple
+      // SparkContext in the same jvm by default.
       // create new interpreter
       SparkInterpreter repl2 = new SparkInterpreter(getSparkTestProperties(tmpDir));
       repl2.setInterpreterGroup(intpGroup);
@@ -235,7 +236,7 @@ public class SparkInterpreterTest {
 
   @Test
   public void testEnableImplicitImport() throws IOException {
-    if (getSparkVersionNumber() >= 13) {
+    if (getSparkVersionNumber(repl) >= 13) {
       // Set option of importing implicits to "true", and initialize new Spark repl
       Properties p = getSparkTestProperties(tmpDir);
       p.setProperty("zeppelin.spark.importImplicit", "true");
@@ -252,7 +253,7 @@ public class SparkInterpreterTest {
 
   @Test
   public void testDisableImplicitImport() throws IOException {
-    if (getSparkVersionNumber() >= 13) {
+    if (getSparkVersionNumber(repl) >= 13) {
       // Set option of importing implicits to "false", and initialize new Spark repl
       // this test should return error status when creating DataFrame from sequence
       Properties p = getSparkTestProperties(tmpDir);

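The `getSparkVersionNumber` helper threaded through the diff above maps a Spark version string to a numeric value (eg. 1.1.x => 11) so tests can gate on minimum versions such as `>= 13` for the DataFrame API. A standalone sketch of that mapping (an illustration only, not the Zeppelin method, which reads the version from a live `SparkInterpreter`):

```java
public class SparkVersionSketch {

    // Maps "major.minor.patch" to major*10 + minor,
    // eg. "1.1.x" => 11, "1.2.x" => 12, "1.3.x" => 13.
    static int versionNumber(String sparkVersion) {
        String[] parts = sparkVersion.split("\\.");
        return Integer.parseInt(parts[0]) * 10 + Integer.parseInt(parts[1]);
    }

    public static void main(String[] args) {
        // version gates like the tests above: DataFrame API needs >= 13
        System.out.println(versionNumber("1.6.3") >= 13); // true
        System.out.println(versionNumber("1.2.1") >= 13); // false
    }
}
```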
http://git-wip-us.apache.org/repos/asf/zeppelin/blob/bfa812a9/spark/src/test/java/org/apache/zeppelin/spark/SparkSqlInterpreterTest.java
----------------------------------------------------------------------
diff --git a/spark/src/test/java/org/apache/zeppelin/spark/SparkSqlInterpreterTest.java b/spark/src/test/java/org/apache/zeppelin/spark/SparkSqlInterpreterTest.java
index 89cd712..5984645 100644
--- a/spark/src/test/java/org/apache/zeppelin/spark/SparkSqlInterpreterTest.java
+++ b/spark/src/test/java/org/apache/zeppelin/spark/SparkSqlInterpreterTest.java
@@ -27,9 +27,7 @@ import org.apache.zeppelin.user.AuthenticationInfo;
 import org.apache.zeppelin.display.GUI;
 import org.apache.zeppelin.interpreter.*;
 import org.apache.zeppelin.interpreter.InterpreterResult.Type;
-import org.junit.Before;
-import org.junit.Rule;
-import org.junit.Test;
+import org.junit.*;
 import org.junit.rules.TemporaryFolder;
 
 import static org.junit.Assert.assertEquals;
@@ -37,45 +35,38 @@ import static org.junit.Assert.assertTrue;
 
 public class SparkSqlInterpreterTest {
 
-  @Rule
-  public TemporaryFolder tmpDir = new TemporaryFolder();
+  @ClassRule
+  public static TemporaryFolder tmpDir = new TemporaryFolder();
 
-  private SparkSqlInterpreter sql;
-  private SparkInterpreter repl;
-  private InterpreterContext context;
-  private InterpreterGroup intpGroup;
+  static SparkSqlInterpreter sql;
+  static SparkInterpreter repl;
+  static InterpreterContext context;
+  static InterpreterGroup intpGroup;
 
-  @Before
-  public void setUp() throws Exception {
+  @BeforeClass
+  public static void setUp() throws Exception {
     Properties p = new Properties();
     p.putAll(SparkInterpreterTest.getSparkTestProperties(tmpDir));
     p.setProperty("zeppelin.spark.maxResult", "1000");
     p.setProperty("zeppelin.spark.concurrentSQL", "false");
     p.setProperty("zeppelin.spark.sql.stacktrace", "false");
 
-    if (repl == null) {
-
-      if (SparkInterpreterTest.repl == null) {
-        repl = new SparkInterpreter(p);
-        intpGroup = new InterpreterGroup();
-        repl.setInterpreterGroup(intpGroup);
-        repl.open();
-        SparkInterpreterTest.repl = repl;
-        SparkInterpreterTest.intpGroup = intpGroup;
-      } else {
-        repl = SparkInterpreterTest.repl;
-        intpGroup = SparkInterpreterTest.intpGroup;
-      }
-
-      sql = new SparkSqlInterpreter(p);
-
-      intpGroup = new InterpreterGroup();
-      intpGroup.put("note", new LinkedList<Interpreter>());
-      intpGroup.get("note").add(repl);
-      intpGroup.get("note").add(sql);
-      sql.setInterpreterGroup(intpGroup);
-      sql.open();
-    }
+    repl = new SparkInterpreter(p);
+    intpGroup = new InterpreterGroup();
+    repl.setInterpreterGroup(intpGroup);
+    repl.open();
+    SparkInterpreterTest.repl = repl;
+    SparkInterpreterTest.intpGroup = intpGroup;
+
+    sql = new SparkSqlInterpreter(p);
+
+    intpGroup = new InterpreterGroup();
+    intpGroup.put("note", new LinkedList<Interpreter>());
+    intpGroup.get("note").add(repl);
+    intpGroup.get("note").add(sql);
+    sql.setInterpreterGroup(intpGroup);
+    sql.open();
+
     context = new InterpreterContext("note", "id", null, "title", "text", new AuthenticationInfo(),
         new HashMap<String, Object>(), new GUI(),
         new AngularObjectRegistry(intpGroup.getId(), null),
@@ -83,8 +74,14 @@ public class SparkSqlInterpreterTest {
         new LinkedList<InterpreterContextRunner>(), new InterpreterOutput(null));
   }
 
+  @AfterClass
+  public static void tearDown() {
+    sql.close();
+    repl.close();
+  }
+
   boolean isDataFrameSupported() {
-    return SparkInterpreterTest.getSparkVersionNumber() >= 13;
+    return SparkInterpreterTest.getSparkVersionNumber(repl) >= 13;
   }
 
   @Test

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/bfa812a9/zeppelin-server/src/test/java/org/apache/zeppelin/rest/AbstractTestRestApi.java
----------------------------------------------------------------------
diff --git a/zeppelin-server/src/test/java/org/apache/zeppelin/rest/AbstractTestRestApi.java b/zeppelin-server/src/test/java/org/apache/zeppelin/rest/AbstractTestRestApi.java
index 19e40bc..7ea2774 100644
--- a/zeppelin-server/src/test/java/org/apache/zeppelin/rest/AbstractTestRestApi.java
+++ b/zeppelin-server/src/test/java/org/apache/zeppelin/rest/AbstractTestRestApi.java
@@ -31,6 +31,7 @@ import java.util.regex.Pattern;
 import org.apache.commons.exec.CommandLine;
 import org.apache.commons.exec.DefaultExecutor;
 import org.apache.commons.exec.PumpStreamHandler;
+import org.apache.commons.httpclient.Header;
 import org.apache.commons.httpclient.HttpClient;
 import org.apache.commons.httpclient.HttpMethodBase;
 import org.apache.commons.httpclient.cookie.CookiePolicy;
@@ -127,6 +128,11 @@ public abstract class AbstractTestRestApi {
     if (!wasRunning) {
       System.setProperty(ZeppelinConfiguration.ConfVars.ZEPPELIN_HOME.getVarName(), "../");
       System.setProperty(ZeppelinConfiguration.ConfVars.ZEPPELIN_WAR.getVarName(), "../zeppelin-web/dist");
+
+      // some test profile does not build zeppelin-web.
+      // to prevent zeppelin starting up fail, create zeppelin-web/dist directory
+      new File("../zeppelin-web/dist").mkdirs();
+
       LOG.info("Staring test Zeppelin up...");
       ZeppelinConfiguration conf = ZeppelinConfiguration.create();
 
@@ -328,7 +334,7 @@ public abstract class AbstractTestRestApi {
     GetMethod request = null;
     boolean isRunning = true;
     try {
-      request = httpGet("/");
+      request = httpGet("/version");
       isRunning = request.getStatusCode() == 200;
     } catch (IOException e) {
       LOG.error("AbstractTestRestApi.checkIfServerIsRunning() fails .. ZeppelinServer is not running");
@@ -422,8 +428,14 @@ public abstract class AbstractTestRestApi {
     httpClient.executeMethod(postMethod);
     LOG.info("{} - {}", postMethod.getStatusCode(), postMethod.getStatusText());
     Pattern pattern = Pattern.compile("JSESSIONID=([a-zA-Z0-9-]*)");
-    java.util.regex.Matcher matcher = pattern.matcher(postMethod.getResponseHeaders("Set-Cookie")[0].toString());
-    return matcher.find()? matcher.group(1) : StringUtils.EMPTY;
+    Header[] setCookieHeaders = postMethod.getResponseHeaders("Set-Cookie");
+    for (Header setCookie : setCookieHeaders) {
+      java.util.regex.Matcher matcher = pattern.matcher(setCookie.toString());
+      if (matcher.find()) {
+        return matcher.group(1);
+      }
+    }
+    return StringUtils.EMPTY;
   }
 
   protected static boolean userAndPasswordAreNotBlank(String user, String pwd) {

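The Set-Cookie change above stops assuming `JSESSIONID` arrives in the first `Set-Cookie` header and instead scans all of them, returning empty when none matches. The same loop in self-contained form (plain strings instead of commons-httpclient `Header` objects, an assumption made here purely for illustration):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class SessionIdSketch {

    // Scan every Set-Cookie header value for JSESSIONID, as in the fix
    // above; returns "" when no header carries it.
    static String extractSessionId(String[] setCookieHeaders) {
        Pattern pattern = Pattern.compile("JSESSIONID=([a-zA-Z0-9-]*)");
        for (String header : setCookieHeaders) {
            Matcher matcher = pattern.matcher(header);
            if (matcher.find()) {
                return matcher.group(1);
            }
        }
        return "";
    }

    public static void main(String[] args) {
        // e.g. Shiro can emit a rememberMe cookie before the session cookie
        String[] headers = {
            "Set-Cookie: rememberMe=deleteMe; Path=/; Max-Age=0",
            "Set-Cookie: JSESSIONID=abc-123; Path=/; HttpOnly"
        };
        System.out.println(extractSessionId(headers)); // abc-123
    }
}
```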
http://git-wip-us.apache.org/repos/asf/zeppelin/blob/bfa812a9/zeppelin-server/src/test/java/org/apache/zeppelin/rest/SecurityRestApiTest.java
----------------------------------------------------------------------
diff --git a/zeppelin-server/src/test/java/org/apache/zeppelin/rest/SecurityRestApiTest.java b/zeppelin-server/src/test/java/org/apache/zeppelin/rest/SecurityRestApiTest.java
index b56763a..bc38f74 100644
--- a/zeppelin-server/src/test/java/org/apache/zeppelin/rest/SecurityRestApiTest.java
+++ b/zeppelin-server/src/test/java/org/apache/zeppelin/rest/SecurityRestApiTest.java
@@ -40,7 +40,7 @@ public class SecurityRestApiTest extends AbstractTestRestApi {
 
   @BeforeClass
   public static void init() throws Exception {
-    AbstractTestRestApi.startUpWithAuthenticationEnable();;
+    AbstractTestRestApi.startUpWithAuthenticationEnable();
   }
 
   @AfterClass
@@ -50,21 +50,21 @@ public class SecurityRestApiTest extends AbstractTestRestApi {
 
   @Test
   public void testTicket() throws IOException {
-    GetMethod get = httpGet("/security/ticket");
+    GetMethod get = httpGet("/security/ticket", "admin", "password1");
     get.addRequestHeader("Origin", "http://localhost");
     Map<String, Object> resp = gson.fromJson(get.getResponseBodyAsString(),
         new TypeToken<Map<String, Object>>(){}.getType());
     Map<String, String> body = (Map<String, String>) resp.get("body");
     collector.checkThat("Paramater principal", body.get("principal"),
-        CoreMatchers.equalTo("anonymous"));
+        CoreMatchers.equalTo("admin"));
     collector.checkThat("Paramater ticket", body.get("ticket"),
-        CoreMatchers.equalTo("anonymous"));
+        CoreMatchers.not("anonymous"));
     get.releaseConnection();
   }
 
   @Test
   public void testGetUserList() throws IOException {
-    GetMethod get = httpGet("/security/userlist/admi");
+    GetMethod get = httpGet("/security/userlist/admi", "admin", "password1");
     get.addRequestHeader("Origin", "http://localhost");
     Map<String, Object> resp = gson.fromJson(get.getResponseBodyAsString(),
         new TypeToken<Map<String, Object>>(){}.getType());
@@ -75,7 +75,7 @@ public class SecurityRestApiTest extends AbstractTestRestApi {
         CoreMatchers.equalTo(true));
     get.releaseConnection();
 
-    GetMethod notUser = httpGet("/security/userlist/randomString");
+    GetMethod notUser = httpGet("/security/userlist/randomString", "admin", "password1");
     notUser.addRequestHeader("Origin", "http://localhost");
     Map<String, Object> notUserResp = gson.fromJson(notUser.getResponseBodyAsString(),
         new TypeToken<Map<String, Object>>(){}.getType());
@@ -85,6 +85,5 @@ public class SecurityRestApiTest extends AbstractTestRestApi {
 
     notUser.releaseConnection();
   }
-
 }
 

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/bfa812a9/zeppelin-zengine/src/main/java/org/apache/zeppelin/interpreter/InterpreterSetting.java
----------------------------------------------------------------------
diff --git a/zeppelin-zengine/src/main/java/org/apache/zeppelin/interpreter/InterpreterSetting.java b/zeppelin-zengine/src/main/java/org/apache/zeppelin/interpreter/InterpreterSetting.java
index bd7d664..3e20d80 100644
--- a/zeppelin-zengine/src/main/java/org/apache/zeppelin/interpreter/InterpreterSetting.java
+++ b/zeppelin-zengine/src/main/java/org/apache/zeppelin/interpreter/InterpreterSetting.java
@@ -140,8 +140,8 @@ public class InterpreterSetting {
       key = SHARED_PROCESS;
     }
 
-    logger.debug("getInterpreterProcessKey: {} for InterpreterSetting Id: {}, Name: {}",
-        key, getId(), getName());
+    //logger.debug("getInterpreterProcessKey: {} for InterpreterSetting Id: {}, Name: {}",
+    //    key, getId(), getName());
     return key;
   }
 

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/bfa812a9/zeppelin-zengine/src/main/resources/helium/webpack.config.js
----------------------------------------------------------------------
diff --git a/zeppelin-zengine/src/main/resources/helium/webpack.config.js b/zeppelin-zengine/src/main/resources/helium/webpack.config.js
index 69592ae..ded2d4e 100644
--- a/zeppelin-zengine/src/main/resources/helium/webpack.config.js
+++ b/zeppelin-zengine/src/main/resources/helium/webpack.config.js
@@ -17,7 +17,7 @@
 
 module.exports = {
     entry: './load.js',
-    output: { path: './', filename: 'helium.bundle.js', },
+    output: { path: './', filename: 'vis.bundle.js', },
     module: {
         loaders: [{
             test: /\.js$/,

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/bfa812a9/zeppelin-zengine/src/test/java/org/apache/zeppelin/notebook/NotebookTest.java
----------------------------------------------------------------------
diff --git a/zeppelin-zengine/src/test/java/org/apache/zeppelin/notebook/NotebookTest.java b/zeppelin-zengine/src/test/java/org/apache/zeppelin/notebook/NotebookTest.java
index 434dd5b..48a4e2e 100644
--- a/zeppelin-zengine/src/test/java/org/apache/zeppelin/notebook/NotebookTest.java
+++ b/zeppelin-zengine/src/test/java/org/apache/zeppelin/notebook/NotebookTest.java
@@ -1186,7 +1186,7 @@ public class NotebookTest implements JobListenerFactory{
     assertEquals(notebookAuthorization.getOwners(notePublic.getId()).size(), 1);
     assertEquals(notebookAuthorization.getReaders(notePublic.getId()).size(), 0);
     assertEquals(notebookAuthorization.getWriters(notePublic.getId()).size(), 0);
-    
+
     // case of private note
     System.setProperty(ConfVars.ZEPPELIN_NOTEBOOK_PUBLIC.getVarName(), "false");
     ZeppelinConfiguration conf2 = ZeppelinConfiguration.create();
@@ -1208,8 +1208,7 @@ public class NotebookTest implements JobListenerFactory{
     notes2 = notebook.getAllNotes(user2);
     assertEquals(notes1.size(), 2);
     assertEquals(notes2.size(), 1);
-    assertEquals(notes1.get(1).getId(), notePrivate.getId());
-    
+
     // user1 have all rights
     assertEquals(notebookAuthorization.getOwners(notePrivate.getId()).size(), 1);
     assertEquals(notebookAuthorization.getReaders(notePrivate.getId()).size(), 1);


[22/23] zeppelin git commit: [ZEPPELIN-2113] Paragraph border is not highlighted when focused

Posted by mi...@apache.org.
[ZEPPELIN-2113] Paragraph border is not highlighted when focused

### What is this PR for?
https://github.com/apache/zeppelin/pull/2054 [removes the `paragraph-col` css class](https://github.com/apache/zeppelin/pull/2054/files#diff-fe483b8eb5467153a772f202838dfb18L115). As a result, the paragraph is not highlighted when focused.

This PR applies the `paragraph-col` css class when it is not in iframe mode.

### What type of PR is it?
Bug Fix

### Todos
* [x] - apply `paragraph-col` css class

### What is the Jira issue?
http://issues.apache.org/jira/browse/ZEPPELIN-2113

### How should this be tested?
Click any paragraph and see if paragraph border is highlighted

### Screenshots (if appropriate)

before
![zeppelin_focus_no](https://cloud.githubusercontent.com/assets/1540981/23927184/78061500-08d5-11e7-8136-aaa53d8a4f35.gif)

after
![zeppelin_focus_on](https://cloud.githubusercontent.com/assets/1540981/23927192/804d93a0-08d5-11e7-9136-35dc17aca902.gif)

### Questions:
* Does the licenses files need update? no
* Is there breaking changes for older versions? no
* Does this needs documentation? no

Author: Lee moon soo <mo...@apache.org>

Closes #2136 from Leemoonsoo/ZEPPELIN-2113-followup and squashes the following commits:

45c1717 [Lee moon soo] apply paragraph-col class

(cherry picked from commit a47ff95ed97cfdc52baeb22aa923d784e82afbe4)
Signed-off-by: Lee moon soo <mo...@apache.org>


Project: http://git-wip-us.apache.org/repos/asf/zeppelin/repo
Commit: http://git-wip-us.apache.org/repos/asf/zeppelin/commit/000900fa
Tree: http://git-wip-us.apache.org/repos/asf/zeppelin/tree/000900fa
Diff: http://git-wip-us.apache.org/repos/asf/zeppelin/diff/000900fa

Branch: refs/heads/branch-0.7
Commit: 000900faf1064cbfce9009cc8dcf8b1717b119ee
Parents: 75cf72e
Author: Lee moon soo <mo...@apache.org>
Authored: Tue Mar 14 16:36:02 2017 -0700
Committer: Lee moon soo <mo...@apache.org>
Committed: Thu Mar 16 08:49:10 2017 -0700

----------------------------------------------------------------------
 zeppelin-web/src/app/notebook/paragraph/paragraph.controller.js | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/zeppelin/blob/000900fa/zeppelin-web/src/app/notebook/paragraph/paragraph.controller.js
----------------------------------------------------------------------
diff --git a/zeppelin-web/src/app/notebook/paragraph/paragraph.controller.js b/zeppelin-web/src/app/notebook/paragraph/paragraph.controller.js
index 937f89b..222ab12 100644
--- a/zeppelin-web/src/app/notebook/paragraph/paragraph.controller.js
+++ b/zeppelin-web/src/app/notebook/paragraph/paragraph.controller.js
@@ -417,7 +417,7 @@ function ParagraphCtrl($scope, $rootScope, $route, $window, $routeParams, $locat
     if ($scope.asIframe) {
       return 'col-md-12';
     } else {
-      return 'col-md-' + n;
+      return 'paragraph-col col-md-' + n;
     }
   };
 


[04/23] zeppelin git commit: [ZEPPELIN-2166] HeliumBundleFactory can't transpile imported es6+

Posted by mi...@apache.org.
[ZEPPELIN-2166] HeliumBundleFactory can't transpile imported es6+

Currently, we don't use any babel preset. This causes error messages like the following
when a helium package imports another package that includes es6+ syntax.

```
SyntaxError: Unexpected token import
    at helium.service.js:36
    at angular.js:10973
    at processQueue (angular.js:15552)
    at angular.js:15568
    at Scope.$eval (angular.js:16820)
    at Scope.$digest (angular.js:16636)
    at Scope.$apply (angular.js:16928)
    at done (angular.js:11266)
    at completeRequest (angular.js:11464)
    at XMLHttpRequest.requestLoaded (angular.js:11405)
```

- https://github.com/1ambda/zeppelin-advanced-transformation/blob/master/examples/example-highcharts-columnrange/index.js#L3
- https://github.com/1ambda/zeppelin-advanced-transformation/blob/master/index.js#L11

### What type of PR is it?
[Improvement]

### Todos
* [x] - Install required NPM packages
* [x] - fix babel configuration

### What is the Jira issue?
[ZEPPELIN-2166](https://issues.apache.org/jira/browse/ZEPPELIN-2166)

### How should this be tested?
- Should be able to bundle existing helium vis
- Should be able to bundle https://github.com/1ambda/zeppelin-advanced-transformation/tree/master/examples/example-highcharts-columnrange

### Screenshots (if appropriate)
NONE

### Questions:
* Does the licenses files need update? - NONE
* Is there breaking changes for older versions? - NONE
* Does this needs documentation? - NONE

Author: 1ambda <1a...@gmail.com>

Closes #2071 from 1ambda/ZEPPELIN-2166/fix-webpack-config-for-es6 and squashes the following commits:

40f6b51 [1ambda] fix: Update babel configuration

(cherry picked from commit 252055571bfb444696ddae86b3a4dac24fa16ddf)
Signed-off-by: Lee moon soo <mo...@apache.org>


Project: http://git-wip-us.apache.org/repos/asf/zeppelin/repo
Commit: http://git-wip-us.apache.org/repos/asf/zeppelin/commit/7eef7a82
Tree: http://git-wip-us.apache.org/repos/asf/zeppelin/tree/7eef7a82
Diff: http://git-wip-us.apache.org/repos/asf/zeppelin/diff/7eef7a82

Branch: refs/heads/branch-0.7
Commit: 7eef7a82f448e4a39c41b245ca8f6998f37d76de
Parents: 3779100
Author: 1ambda <1a...@gmail.com>
Authored: Mon Feb 27 11:35:38 2017 +0900
Committer: Lee moon soo <mo...@apache.org>
Committed: Mon Mar 6 12:06:16 2017 +0900

----------------------------------------------------------------------
 .../src/main/resources/helium/package.json          |  6 ++++--
 .../src/main/resources/helium/webpack.config.js     | 16 ++++++----------
 2 files changed, 10 insertions(+), 12 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/zeppelin/blob/7eef7a82/zeppelin-zengine/src/main/resources/helium/package.json
----------------------------------------------------------------------
diff --git a/zeppelin-zengine/src/main/resources/helium/package.json b/zeppelin-zengine/src/main/resources/helium/package.json
index e6ec612..0fb5ac0 100644
--- a/zeppelin-zengine/src/main/resources/helium/package.json
+++ b/zeppelin-zengine/src/main/resources/helium/package.json
@@ -9,7 +9,9 @@
   },
   "devDependencies": {
     "webpack": "^1.12.2",
-    "babel": "^5.8.23",
-    "babel-loader": "^5.3.2"
+    "babel-core": "^6.23.1",
+    "babel-loader": "^6.3.2",
+    "babel-preset-es2015": "^6.22.0",
+    "babel-preset-stage-0": "^6.22.0"
   }
 }

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/7eef7a82/zeppelin-zengine/src/main/resources/helium/webpack.config.js
----------------------------------------------------------------------
diff --git a/zeppelin-zengine/src/main/resources/helium/webpack.config.js b/zeppelin-zengine/src/main/resources/helium/webpack.config.js
index 2b5015e..69592ae 100644
--- a/zeppelin-zengine/src/main/resources/helium/webpack.config.js
+++ b/zeppelin-zengine/src/main/resources/helium/webpack.config.js
@@ -14,20 +14,16 @@
  * See the License for the specific language governing permissions and
  * limitations under the License.
  */
+
 module.exports = {
-    entry: ['./'],
-    output: {
-        path: './',
-        filename: 'vis.bundle.js',
-    },
-    resolve: {
-        root: __dirname + "/node_modules"
-    },
+    entry: './load.js',
+    output: { path: './', filename: 'helium.bundle.js', },
     module: {
         loaders: [{
             test: /\.js$/,
-            //exclude: /node_modules/,
-            loader: 'babel-loader'
+            // DON'T exclude. since zeppelin will bundle all necessary packages: `exclude: /node_modules/,`
+            loader: 'babel-loader',
+            query: { presets: ['es2015', 'stage-0'] },
         }]
     }
 }


[05/23] zeppelin git commit: [ZEPPELIN-2094] Decrease npm install retry time (for branch-0.7)

Posted by mi...@apache.org.
[ZEPPELIN-2094] Decrease npm install retry time (for branch-0.7)

### What is this PR for?
**This pr is for branch-0.7**
npm install is delayed for too long when the computer is not connected to any network,
because npm's default retry timeout is too long.
This PR decreases the retry timeout for npm install.
[pr for master](https://github.com/apache/zeppelin/pull/2060) https://github.com/apache/zeppelin/pull/2095

### What type of PR is it?
Improvement

### What is the Jira issue?
https://issues.apache.org/jira/browse/ZEPPELIN-2094

### How should this be tested?
Enable at least one helium package before testing.

At line 197 in zeppelin-zengine's org.apache.zeppelin.helium.HeliumBundleFactory.java:

First, set
`String npmCommand = "install --loglevel=error";`
disconnect from any ethernet or wireless network, then
build & run.

Then set
`String npmCommand = "install --fetch-retries=2 --fetch-retry-factor=1 --fetch-retry-mintimeout=5000 --loglevel=error";`
again with no network connected, and build & run.

### Why `retries = 2`, `factor = 1`, `mintimeout = 5 (sec)`?

npm uses the [retry](https://github.com/tim-kos/node-retry) module to retry.
It refers to [this article](http://dthain.blogspot.kr/2009/02/exponential-backoff-in-distributed.html) for its backoff algorithm.
The delay is computed as _Math.min(Math.round(random * minTimeout * Math.pow(factor, attempt)), maxTimeout)_.
In the retry source code, the first of the two retries doesn't apply _Math.min()_; it uses just _Math.round(random * minTimeout * Math.pow(factor, attempt))_.

Description | Before | After
------- | ------- | -------
Condition | npm's default setting<br>random = False = 1<br>retry = 2<br>minTimeout = 10 (sec)<br>maxTimeout = 60 (sec)<br>factor = 10 | custom setting<br>random = False = 1<br>retry = 2<br>minTimeout = 5 (sec)<br>maxTimeout = 60 (sec)<br>factor = 1<br>
First retry | Math.round(1 * 10 (sec) * 10^1)) | Math.round(1 * 5 (sec) * 1^1))
First retry result (Approximately) | 100 (sec) | 5 (sec)
Second retry | Math.min(Math.round(1 * 10 (sec) * 10^2), 60 (sec)) | Math.min(Math.round(1 * 5 (sec) * 1^2), 60 (sec))
Second retry result (Approximately) | 60 (sec) | 5 (sec)
Total waiting time (Approximately) | 160 (sec) | 10 (sec)
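The arithmetic in the table can be reproduced with a short sketch (a Python translation of the formula above; the function name and the skip-min-on-first-retry detail follow the description here, not npm's actual source):

```python
def retry_delays(retries, factor, min_timeout, max_timeout):
    """Delays in seconds per the formula above; random is disabled (= 1)."""
    delays = []
    for attempt in range(1, retries + 1):
        d = round(1 * min_timeout * factor ** attempt)
        if attempt > 1:  # per the note above, the first retry skips Math.min()
            d = min(d, max_timeout)
        delays.append(d)
    return delays

# Before (npm defaults): delays [100, 60] -> ~160 s total waiting
print(retry_delays(2, 10, 10, 60))
# After (tuned flags): delays [5, 5] -> ~10 s total waiting
print(retry_delays(2, 1, 5, 60))
```
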

You can verify this in the screenshots below.

### Screenshots
Before | After
-------|-------
<img width="1077" alt="2017-02-24 12 32 06" src="https://cloud.githubusercontent.com/assets/1144643/23267951/9deaec6e-fa2f-11e6-9171-5792f24de76d.png"> | <img width="1081" alt="2017-02-24 12 37 10" src="https://cloud.githubusercontent.com/assets/1144643/23267954/a12c0c0a-fa2f-11e6-99cd-335deef607ac.png">

### Questions:
* Does the licenses files need update? N/A
* Is there breaking changes for older versions? N/A
* Does this needs documentation? N/A

Author: NohSeho <ia...@sehonoh.kr>

Closes #2095 from NohSeho/ZEPPELIN-2094-for-0.7 and squashes the following commits:

4388ff8 [NohSeho] [ZEPPELIN-2094] Decrease npm install retry time


Project: http://git-wip-us.apache.org/repos/asf/zeppelin/repo
Commit: http://git-wip-us.apache.org/repos/asf/zeppelin/commit/e684399b
Tree: http://git-wip-us.apache.org/repos/asf/zeppelin/tree/e684399b
Diff: http://git-wip-us.apache.org/repos/asf/zeppelin/diff/e684399b

Branch: refs/heads/branch-0.7
Commit: e684399bab5a2da428ae4f4e6685e1cbbaed9566
Parents: 7eef7a8
Author: NohSeho <ia...@sehonoh.kr>
Authored: Sun Mar 5 15:33:42 2017 +0900
Committer: Lee moon soo <mo...@apache.org>
Committed: Tue Mar 7 08:18:56 2017 +0900

----------------------------------------------------------------------
 .../zeppelin/helium/HeliumVisualizationFactory.java | 16 ++++++++++++++--
 1 file changed, 14 insertions(+), 2 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/zeppelin/blob/e684399b/zeppelin-zengine/src/main/java/org/apache/zeppelin/helium/HeliumVisualizationFactory.java
----------------------------------------------------------------------
diff --git a/zeppelin-zengine/src/main/java/org/apache/zeppelin/helium/HeliumVisualizationFactory.java b/zeppelin-zengine/src/main/java/org/apache/zeppelin/helium/HeliumVisualizationFactory.java
index 624f12a..d6b9c61 100644
--- a/zeppelin-zengine/src/main/java/org/apache/zeppelin/helium/HeliumVisualizationFactory.java
+++ b/zeppelin-zengine/src/main/java/org/apache/zeppelin/helium/HeliumVisualizationFactory.java
@@ -41,6 +41,10 @@ public class HeliumVisualizationFactory {
   private final String NODE_VERSION = "v6.9.1";
   private final String NPM_VERSION = "3.10.8";
   private final String DEFAULT_NPM_REGISTRY_URL = "http://registry.npmjs.org/";
+  private final int FETCH_RETRY_COUNT = 2;
+  private final int FETCH_RETRY_FACTOR_COUNT = 1;
+  // Milliseconds
+  private final int FETCH_RETRY_MIN_TIMEOUT = 5000;
 
   private final FrontendPluginFactory frontEndPluginFactory;
   private final File workingDirectory;
@@ -214,7 +218,11 @@ public class HeliumVisualizationFactory {
 
     out.reset();
     try {
-      npmCommand("install");
+      String commandForNpmInstall =
+              String.format("install --fetch-retries=%d --fetch-retry-factor=%d " +
+                              "--fetch-retry-mintimeout=%d",
+                      FETCH_RETRY_COUNT, FETCH_RETRY_FACTOR_COUNT, FETCH_RETRY_MIN_TIMEOUT);
+      npmCommand(commandForNpmInstall);
       npmCommand("run bundle");
     } catch (TaskRunnerException e) {
       throw new IOException(new String(out.toByteArray()));
@@ -334,7 +342,11 @@ public class HeliumVisualizationFactory {
   }
 
   public synchronized void install(HeliumPackage pkg) throws TaskRunnerException {
-    npmCommand("install " + pkg.getArtifact());
+    String commandForNpmInstallArtifact =
+        String.format("install %s --fetch-retries=%d --fetch-retry-factor=%d " +
+                        "--fetch-retry-mintimeout=%d", pkg.getArtifact(),
+                FETCH_RETRY_COUNT, FETCH_RETRY_FACTOR_COUNT, FETCH_RETRY_MIN_TIMEOUT);
+    npmCommand(commandForNpmInstallArtifact);
   }
 
   private void npmCommand(String args) throws TaskRunnerException {


[10/23] zeppelin git commit: ZEPPELIN-2199: Fix lapply issue in sparkR

Posted by mi...@apache.org.
ZEPPELIN-2199: Fix lapply issue in sparkR

### What is this PR for?
The function createRDDFromArray, used for creating an R RDD, expects a JavaSparkContext object instead of a SparkContext. This PR addresses that concern.

### What type of PR is it?
Bug Fix

### Todos
* [ ] - Task

### What is the Jira issue?
https://issues.apache.org/jira/browse/ZEPPELIN-2199

### How should this be tested?
Build Zeppelin and run:

```
%r
families <- c("gaussian", "poisson")
df <- createDataFrame(iris)
train <- function(family)
{
    model <- glm(Sepal.Length ~ Sepal.Width + Species, iris, family = family)
    summary(model)
}
model.summaries <- spark.lapply(families, train)
print(model.summaries)
```

It fails in current master but will pass in this branch.

### Screenshots (if appropriate)

### Questions:
* Does the licenses files need update?
No
* Is there breaking changes for older versions?
Not completely sure about this.
* Does this needs documentation?
No.

Author: Vipul Modi <vi...@Vipuls-MacBook-Air.local>
Author: Vipul Modi <vi...@qubole.com>

Closes #2090 from vipul1409/ZEPPELIN-2199 and squashes the following commits:

8fccad4 [Vipul Modi] Trigger build 2
f351a7a [Vipul Modi] Merge branch 'master' of https://github.com/apache/zeppelin into ZEPPELIN-2199
c89ed1e [Vipul Modi] Trigger build 2
509faf7 [Vipul Modi] Trigger build
b83121e [Vipul Modi] Nullify jsc on close and remove file:/ changes
1d5bd5b [Vipul Modi] Merge branch 'master' of https://github.com/apache/zeppelin into ZEPPELIN-2199
cebf970 [Vipul Modi] Removing dummy file.txt
39e8144 [Vipul Modi] Merge branch 'master' of https://github.com/apache/zeppelin into ZEPPELIN-2199
8a0651d [Vipul Modi] Dummy commit
70b19c1 [Vipul Modi] ZEPPELIN-2199: Fix lapply issue in sparkR

(cherry picked from commit 0e1964877654c56c72473ad07dac1de6f9646816)
Signed-off-by: Felix Cheung <fe...@apache.org>


Project: http://git-wip-us.apache.org/repos/asf/zeppelin/repo
Commit: http://git-wip-us.apache.org/repos/asf/zeppelin/commit/6d72db34
Tree: http://git-wip-us.apache.org/repos/asf/zeppelin/tree/6d72db34
Diff: http://git-wip-us.apache.org/repos/asf/zeppelin/diff/6d72db34

Branch: refs/heads/branch-0.7
Commit: 6d72db34fa67504bb7d528c799fba58aa48f4a12
Parents: c7847c1
Author: Vipul Modi <vi...@Vipuls-MacBook-Air.local>
Authored: Tue Mar 7 09:10:30 2017 +0530
Committer: Felix Cheung <fe...@apache.org>
Committed: Tue Mar 7 23:26:18 2017 -0800

----------------------------------------------------------------------
 .../org/apache/zeppelin/spark/SparkInterpreter.java     | 12 ++++++++++++
 .../org/apache/zeppelin/spark/SparkRInterpreter.java    |  4 ++++
 .../org/apache/zeppelin/spark/ZeppelinRContext.java     |  6 ++++++
 spark/src/main/resources/R/zeppelin_sparkr.R            |  2 +-
 4 files changed, 23 insertions(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/zeppelin/blob/6d72db34/spark/src/main/java/org/apache/zeppelin/spark/SparkInterpreter.java
----------------------------------------------------------------------
diff --git a/spark/src/main/java/org/apache/zeppelin/spark/SparkInterpreter.java b/spark/src/main/java/org/apache/zeppelin/spark/SparkInterpreter.java
index 1aecec4..47f8080 100644
--- a/spark/src/main/java/org/apache/zeppelin/spark/SparkInterpreter.java
+++ b/spark/src/main/java/org/apache/zeppelin/spark/SparkInterpreter.java
@@ -38,6 +38,7 @@ import org.apache.spark.SparkContext;
 import org.apache.spark.SparkEnv;
 
 import org.apache.spark.SecurityManager;
+import org.apache.spark.api.java.JavaSparkContext;
 import org.apache.spark.repl.SparkILoop;
 import org.apache.spark.scheduler.ActiveJob;
 import org.apache.spark.scheduler.DAGScheduler;
@@ -123,6 +124,7 @@ public class SparkInterpreter extends Interpreter {
   private SparkVersion sparkVersion;
   private static File outputDir;          // class outputdir for scala 2.11
   private Object classServer;      // classserver for scala 2.11
+  private JavaSparkContext jsc;
 
 
   public SparkInterpreter(Properties property) {
@@ -149,6 +151,15 @@ public class SparkInterpreter extends Interpreter {
     }
   }
 
+  public JavaSparkContext getJavaSparkContext() {
+    synchronized (sharedInterpreterLock) {
+      if (jsc == null) {
+        jsc = JavaSparkContext.fromSparkContext(sc);
+      }
+      return jsc;
+    }
+  }
+
   public boolean isSparkContextInitialized() {
     synchronized (sharedInterpreterLock) {
       return sc != null;
@@ -1390,6 +1401,7 @@ public class SparkInterpreter extends Interpreter {
       }
       sparkSession = null;
       sc = null;
+      jsc = null;
       if (classServer != null) {
         Utils.invokeMethod(classServer, "stop");
         classServer = null;

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/6d72db34/spark/src/main/java/org/apache/zeppelin/spark/SparkRInterpreter.java
----------------------------------------------------------------------
diff --git a/spark/src/main/java/org/apache/zeppelin/spark/SparkRInterpreter.java b/spark/src/main/java/org/apache/zeppelin/spark/SparkRInterpreter.java
index 8f3e93c..e2d01fe 100644
--- a/spark/src/main/java/org/apache/zeppelin/spark/SparkRInterpreter.java
+++ b/spark/src/main/java/org/apache/zeppelin/spark/SparkRInterpreter.java
@@ -23,6 +23,7 @@ import com.fasterxml.jackson.databind.JsonNode;
 import com.fasterxml.jackson.databind.ObjectMapper;
 import org.apache.spark.SparkContext;
 import org.apache.spark.SparkRBackend;
+import org.apache.spark.api.java.JavaSparkContext;
 import org.apache.zeppelin.interpreter.*;
 import org.apache.zeppelin.interpreter.thrift.InterpreterCompletion;
 import org.apache.zeppelin.scheduler.Scheduler;
@@ -45,6 +46,7 @@ public class SparkRInterpreter extends Interpreter {
   private SparkInterpreter sparkInterpreter;
   private ZeppelinR zeppelinR;
   private SparkContext sc;
+  private JavaSparkContext jsc;
 
   public SparkRInterpreter(Properties property) {
     super(property);
@@ -73,8 +75,10 @@ public class SparkRInterpreter extends Interpreter {
 
     this.sparkInterpreter = getSparkInterpreter();
     this.sc = sparkInterpreter.getSparkContext();
+    this.jsc = sparkInterpreter.getJavaSparkContext();
     SparkVersion sparkVersion = new SparkVersion(sc.version());
     ZeppelinRContext.setSparkContext(sc);
+    ZeppelinRContext.setJavaSparkContext(jsc);
     if (Utils.isSpark2()) {
       ZeppelinRContext.setSparkSession(sparkInterpreter.getSparkSession());
     }

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/6d72db34/spark/src/main/java/org/apache/zeppelin/spark/ZeppelinRContext.java
----------------------------------------------------------------------
diff --git a/spark/src/main/java/org/apache/zeppelin/spark/ZeppelinRContext.java b/spark/src/main/java/org/apache/zeppelin/spark/ZeppelinRContext.java
index 935410b..a2fc412 100644
--- a/spark/src/main/java/org/apache/zeppelin/spark/ZeppelinRContext.java
+++ b/spark/src/main/java/org/apache/zeppelin/spark/ZeppelinRContext.java
@@ -18,6 +18,7 @@
 package org.apache.zeppelin.spark;
 
 import org.apache.spark.SparkContext;
+import org.apache.spark.api.java.JavaSparkContext;
 import org.apache.spark.sql.SQLContext;
 
 /**
@@ -28,6 +29,7 @@ public class ZeppelinRContext {
   private static SQLContext sqlContext;
   private static ZeppelinContext zeppelinContext;
   private static Object sparkSession;
+  private static JavaSparkContext javaSparkContext;
 
   public static void setSparkContext(SparkContext sparkContext) {
     ZeppelinRContext.sparkContext = sparkContext;
@@ -60,4 +62,8 @@ public class ZeppelinRContext {
   public static Object getSparkSession() {
     return sparkSession;
   }
+
+  public static void setJavaSparkContext(JavaSparkContext jsc) { javaSparkContext = jsc; }
+
+  public static JavaSparkContext getJavaSparkContext() { return javaSparkContext; }
 }

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/6d72db34/spark/src/main/resources/R/zeppelin_sparkr.R
----------------------------------------------------------------------
diff --git a/spark/src/main/resources/R/zeppelin_sparkr.R b/spark/src/main/resources/R/zeppelin_sparkr.R
index e95513f..525c6c5 100644
--- a/spark/src/main/resources/R/zeppelin_sparkr.R
+++ b/spark/src/main/resources/R/zeppelin_sparkr.R
@@ -45,7 +45,7 @@ assign("sc", get(".sc", envir = SparkR:::.sparkREnv), envir=.GlobalEnv)
 if (version >= 20000) {
   assign(".sparkRsession", SparkR:::callJStatic("org.apache.zeppelin.spark.ZeppelinRContext", "getSparkSession"), envir = SparkR:::.sparkREnv)
   assign("spark", get(".sparkRsession", envir = SparkR:::.sparkREnv), envir = .GlobalEnv)
-  assign(".sparkRjsc", get(".sc", envir = SparkR:::.sparkREnv), envir=SparkR:::.sparkREnv)
+  assign(".sparkRjsc", SparkR:::callJStatic("org.apache.zeppelin.spark.ZeppelinRContext", "getJavaSparkContext"), envir = SparkR:::.sparkREnv)
 }
 assign(".sqlc", SparkR:::callJStatic("org.apache.zeppelin.spark.ZeppelinRContext", "getSqlContext"), envir = SparkR:::.sparkREnv)
 assign("sqlContext", get(".sqlc", envir = SparkR:::.sparkREnv), envir = .GlobalEnv)


[06/23] zeppelin git commit: [ZEPPELIN-2162] [ZEPPELIN-2142] Make travis_check.py work with fork under organization, and show guidance if travis is not configured

Posted by mi...@apache.org.
[ZEPPELIN-2162] [ZEPPELIN-2142] Make travis_check.py work with fork under organization, and show guidance if travis is not configured

### What is this PR for?

When a contributor uses a zeppelin fork under an organization, Jenkins can't check it with the current travis_check.py.

This PR updates travis_check.py to return error code 1 when the build fails and error code 2 when it can't find a build.

When it exits with 2, the Jenkins configuration can retry travis_check.py against the organization repo.

If the retry still returns 2, it prints instructions on how to configure travis-ci.
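The resulting exit-code contract can be sketched as follows (a hypothetical condensation of travis_check.py's decision logic; the function and data shapes here are illustrative, not the script's actual code):

```python
BUILD_FAILED = 1      # a build was found for the commit, but it failed
BUILD_NOT_FOUND = 2   # no build for this commit under this account

def check_build(author, commit, builds):
    """builds maps (author, commit) -> 'passed' or 'failed'."""
    state = builds.get((author, commit))
    if state is None:
        # caller (Jenkins) retries once with the organization account,
        # and prints the travis-ci setup guidance if it still gets 2
        return BUILD_NOT_FOUND
    return 0 if state == "passed" else BUILD_FAILED

builds = {("alice", "abc123"): "passed", ("org", "def456"): "failed"}
print(check_build("alice", "abc123", builds))  # success -> 0
print(check_build("org", "def456", builds))    # build failed -> 1
print(check_build("bob", "abc123", builds))    # no build found -> 2
```
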

The Jenkins configuration is updated

from
```bash
if [ -f "travis_check.py" ]; then
  git log -n 1
  STATUS=$(curl -s $BUILD_URL | grep -e "GitHub pull request.*from.*" | sed 's/.*GitHub pull request <a href=\"\(https[^"]*\).*from[^"]*.\(https[^"]*\).*/\1 \2/g')
  AUTHOR=$(echo $STATUS | sed 's/.*[/]\(.*\)$/\1/g')
  PR=$(echo $STATUS | awk '{print $1}' | sed 's/.*[/]\(.*\)$/\1/g')
  COMMIT=$(git log -n 1 | grep "^Merge:" | awk '{print $3}')
  if [ -z $COMMIT ]; then
    COMMIT=$(curl -s https://api.github.com/repos/apache/zeppelin/pulls/$PR | grep -e "\"ref\":" -e "\"sha\":" | tr '\n' ' ' | sed 's/\(.*sha[^,]*,\)\(.*ref.*\)/\1 = \2/g' | tr = '\n' | grep -v master | sed 's/.*sha.[^"]*["]\([^"]*\).*/\1/g')
  fi
  sleep 30 # sleep few moment to wait travis starts the build
  python ./travis_check.py ${AUTHOR} ${COMMIT}
else
  echo "travis_check.py does not exists"
  echo "assume it's gh-pages branch"
  echo "return okay"
fi
```

to
```bash
if [ -f "travis_check.py" ]; then
  git log -n 1
  STATUS=$(curl -s $BUILD_URL | grep -e "GitHub pull request.*from.*" | sed 's/.*GitHub pull request <a href=\"\(https[^"]*\).*from[^"]*.\(https[^"]*\).*/\1 \2/g')
  AUTHOR=$(echo $STATUS | sed 's/.*[/]\(.*\)$/\1/g')
  PR=$(echo $STATUS | awk '{print $1}' | sed 's/.*[/]\(.*\)$/\1/g')
  COMMIT=$(git log -n 1 | grep "^Merge:" | awk '{print $3}')
  if [ -z $COMMIT ]; then
    COMMIT=$(curl -s https://api.github.com/repos/apache/zeppelin/pulls/$PR | grep -e "\"ref\":" -e "\"sha\":" | tr '\n' ' ' | sed 's/\(.*sha[^,]*,\)\(.*ref.*\)/\1 = \2/g' | tr = '\n' | grep -v master | sed 's/.*sha.[^"]*["]\([^"]*\).*/\1/g')
  fi
  sleep 30 # sleep few moment to wait travis starts the build
  python ./travis_check.py ${AUTHOR} ${COMMIT}
  RET_CODE=$?
  if [ $RET_CODE -eq 2 ]; then # try with repository name when travis-ci is not available in the account
    AUTHOR=$(curl -s https://api.github.com/repos/apache/zeppelin/pulls/$PR | grep '"full_name":' | grep -v "apache/zeppelin" | sed 's/.*[:][^"]*["]\([^/]*\).*/\1/g')
 	python ./travis_check.py ${AUTHOR} ${COMMIT}
    RET_CODE=$?
  fi

  if [ $RET_CODE -eq 2 ]; then # fail with can't find build information in the travis
    echo "Looks like travis-ci is not configured for your fork."
    echo "Please setup by swich on 'zeppelin' repository at https://travis-ci.org/profile and travis-ci."
    echo "And then make sure 'Build pushes' option is enabled in the settings https://travis-ci.org/${AUTHOR}/zeppelin/settings."
    echo "See http://zeppelin.apache.org/contribution/contributions.html#continuous-integration."
  fi

  exit $RET_CODE
else
  echo "travis_check.py does not exists"
  echo "assume it's gh-pages branch"
  echo "return okay"
fi
```

### What type of PR is it?
Improvement

### Todos
* [x] - distinguish error code travis_check.py
* [x] - Update jenkins configuration

### What is the Jira issue?
https://issues.apache.org/jira/browse/ZEPPELIN-2162
https://issues.apache.org/jira/browse/ZEPPELIN-2142

### Questions:
* Does the licenses files need update? no
* Is there breaking changes for older versions? no
* Does this needs documentation? no

Author: Lee moon soo <mo...@apache.org>

Closes #2094 from Leemoonsoo/minor_update_travis_check and squashes the following commits:

5e3ade4 [Lee moon soo] use different error code when can't find build in travis, to distinguish with build fail

(cherry picked from commit 80e09eb0db2415b55b4bc5486d3be457ef1016c3)
Signed-off-by: Lee moon soo <mo...@apache.org>


Project: http://git-wip-us.apache.org/repos/asf/zeppelin/repo
Commit: http://git-wip-us.apache.org/repos/asf/zeppelin/commit/9c9b0fd7
Tree: http://git-wip-us.apache.org/repos/asf/zeppelin/tree/9c9b0fd7
Diff: http://git-wip-us.apache.org/repos/asf/zeppelin/diff/9c9b0fd7

Branch: refs/heads/branch-0.7
Commit: 9c9b0fd712b4a6313561b111bc257613d9803455
Parents: e684399
Author: Lee moon soo <mo...@apache.org>
Authored: Sat Mar 4 11:28:19 2017 +0900
Committer: Lee moon soo <mo...@apache.org>
Committed: Tue Mar 7 11:53:06 2017 +0900

----------------------------------------------------------------------
 travis_check.py | 9 ++++++---
 1 file changed, 6 insertions(+), 3 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/zeppelin/blob/9c9b0fd7/travis_check.py
----------------------------------------------------------------------
diff --git a/travis_check.py b/travis_check.py
index a2fa288..cbf9623 100644
--- a/travis_check.py
+++ b/travis_check.py
@@ -54,8 +54,11 @@ def getBuildStatus(author, commit):
     # get latest 25 builds
     resp = requests.get(url=travisApi + "/repos/" + author + "/zeppelin/builds")
     data = json.loads(resp.text)
-
     build = None
+
+    if len(data) == 0:
+        return build;
+
     for b in data:
         if b["commit"][:len(commit)] == commit:
             resp = requests.get(url=travisApi + "/repos/" + author + "/zeppelin/builds/" + str(b["id"]))
@@ -103,8 +106,8 @@ for sleep in check:
     info("Get build status ...")
     build = getBuildStatus(author, commit)
     if build == None:
-        info("Can't find build for commit= " + commit)
-        sys.exit(1)
+        info("Can't find build for commit " + commit + " from " + author)
+        sys.exit(2)
 
     print("Build https://travis-ci.org/" + author + "/zeppelin/builds/" + str(build["id"]))
     failure, running = printBuildStatus(build)


[18/23] zeppelin git commit: [ZEPPELIN-2175] Jdbc interpreter sometime doesn't show detailed error message

Posted by mi...@apache.org.
[ZEPPELIN-2175] Jdbc interpreter sometime doesn't show detailed error message

### What is this PR for?
Zeppelin's JDBC interpreter sometimes doesn't show a detailed error message on the notebook UI. It shows only a plain "ERROR" text near the run button in case of failure, and the user has to check the JDBC interpreter log file to see the detailed error message.

This mostly happens with an incompatible JAR, and I see errors like those mentioned below:
```
java.lang.NoSuchMethodError: org.apache.curator.utils.ZKPaths.fixForNamespace(Ljava/lang/String;Ljava/lang/String;Z)Ljava/lang/String;
	at org.apache.curator.framework.imps.NamespaceImpl.fixForNamespace(NamespaceImpl.java:82)
```

```
java.lang.NoSuchMethodError: org.apache.hive.service.auth.HiveAuthFactory.getSocketTransport(Ljava/lang/String;II)Lorg/apache/hive/org/apache/thrift/transport/TTransport;
	at org.apache.hive.jdbc.HiveConnection.createBinaryTransport(HiveConnection.java:447)
```

Hence, IMO, instead of `catch (Exception)` we should use `catch (Throwable)`.
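In Java, `NoSuchMethodError` extends `Error`, which `catch (Exception)` never sees; only `Throwable` sits above both branches. The same catch-the-root-type idea can be illustrated with a Python analogue (illustrative only, not the Zeppelin code — Python's split is `Exception` vs `BaseException`):

```python
class LinkageErrorLike(BaseException):
    """Stands in for java.lang.NoSuchMethodError: outside the Exception branch."""

def run_query():
    raise LinkageErrorLike("NoSuchMethodError: ZKPaths.fixForNamespace")

def run_and_report():
    try:
        run_query()
    except Exception as e:      # analogous to catch (Exception): never matches here
        return "caught as Exception: " + str(e)
    except BaseException as e:  # analogous to catch (Throwable): matches
        return "caught at root type: " + str(e)

# Only the broader handler sees the error, so its message can be reported to the UI
print(run_and_report())
```
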

### What type of PR is it?
[Improvement]

### What is the Jira issue?
* [ZEPPELIN-2175](https://issues.apache.org/jira/browse/ZEPPELIN-2175)

### How should this be tested?
Use any incompatible JAR in the interpreter dependencies, which would throw an Error instead of an Exception.

### Screenshots (if appropriate)

### Questions:
* Does the licenses files need update? N/A
* Is there breaking changes for older versions? N/A
* Does this needs documentation? N/A

Author: Prabhjyot Singh <pr...@gmail.com>

Closes #2122 from prabhjyotsingh/ZEPPELIN-2175 and squashes the following commits:

666ce8d [Prabhjyot Singh] use Throwable instead of Exception

(cherry picked from commit d0fc54bc782c9cc6bc9a5dda2d8ca92b46cc66c7)
Signed-off-by: Prabhjyot Singh <pr...@gmail.com>


Project: http://git-wip-us.apache.org/repos/asf/zeppelin/repo
Commit: http://git-wip-us.apache.org/repos/asf/zeppelin/commit/ae45495c
Tree: http://git-wip-us.apache.org/repos/asf/zeppelin/tree/ae45495c
Diff: http://git-wip-us.apache.org/repos/asf/zeppelin/diff/ae45495c

Branch: refs/heads/branch-0.7
Commit: ae45495cf05d1f35980da66037dc9b51db0647bd
Parents: 7998dd2
Author: Prabhjyot Singh <pr...@gmail.com>
Authored: Sat Mar 11 11:31:49 2017 +0530
Committer: Prabhjyot Singh <pr...@gmail.com>
Committed: Thu Mar 16 15:48:41 2017 +0530

----------------------------------------------------------------------
 jdbc/src/main/java/org/apache/zeppelin/jdbc/JDBCInterpreter.java | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/zeppelin/blob/ae45495c/jdbc/src/main/java/org/apache/zeppelin/jdbc/JDBCInterpreter.java
----------------------------------------------------------------------
diff --git a/jdbc/src/main/java/org/apache/zeppelin/jdbc/JDBCInterpreter.java b/jdbc/src/main/java/org/apache/zeppelin/jdbc/JDBCInterpreter.java
index 0f6ebad..4051398 100644
--- a/jdbc/src/main/java/org/apache/zeppelin/jdbc/JDBCInterpreter.java
+++ b/jdbc/src/main/java/org/apache/zeppelin/jdbc/JDBCInterpreter.java
@@ -621,7 +621,7 @@ public class JDBCInterpreter extends Interpreter {
         } catch (SQLException e) { /*ignored*/ }
       }
       getJDBCConfiguration(user).removeStatement(paragraphId);
-    } catch (Exception e) {
+    } catch (Throwable e) {
       if (e.getCause() instanceof TTransportException &&
           Throwables.getStackTraceAsString(e).contains("GSS") &&
           getJDBCConfiguration(user).isConnectionInDBDriverPoolSuccessful(propertyKey)) {


[03/23] zeppelin git commit: [ZEPPELIN-1588]: bumping nvd3 to 1.8.5

Posted by mi...@apache.org.
[ZEPPELIN-1588]: bumping nvd3 to 1.8.5

### What is this PR for?
* bump nvd3 to 1.8.5 (and remove deprecated functions)
* display percentage in pie chart [solve ZEPPELIN-1891]

NB: visualization-scatterchart.js's tooltip content generator has been updated to stop using the deprecated tooltip property and use tooltip.contentGenerator instead. However, I have commented out the code, as I think nvd3 scatterchart's default tooltip is far more elegant; open to discussion.

### What type of PR is it?
Improvement

### Todos
* [ ] -

### What is the Jira issue?
* <https://issues.apache.org/jira/browse/ZEPPELIN-1588>
* <https://issues.apache.org/jira/browse/ZEPPELIN-1891>

### How should this be tested?
Visual testing of the builtin visualization.

### Screenshots (if appropriate)
PieChart display before PR:
![piechart-beforepr](https://cloud.githubusercontent.com/assets/8293897/23157700/830bda76-f81c-11e6-92fd-2a7f1a0d6f01.png)
PieChart display with PR:
![piechart-withpr](https://cloud.githubusercontent.com/assets/8293897/23157702/831fcc98-f81c-11e6-9f11-9b0f4bf010cc.png)

ScatterChart tooltip before PR:
![scatterchart-beforepr](https://cloud.githubusercontent.com/assets/8293897/23157699/830ae558-f81c-11e6-8528-85303804a339.png)
ScatterChart tooltip with PR:
![scatterchart-withpr](https://cloud.githubusercontent.com/assets/8293897/23157701/830cac3a-f81c-11e6-8029-73a7a30fe428.png)

### Questions:
* Does the licenses files need update? NO
* Is there breaking changes for older versions? NO
* Does this needs documentation? NO

Author: Remilito <re...@gmail.com>

Closes #2042 from Remilito/ZEPPELIN-1588 and squashes the following commits:

bbb0e9f [Remilito] removing tooltip.contentgenerator override
f0b3961 [Remilito] [ZEPPELIN-1588]: bumping nvd3 to 1.8.5 * remove deprecated calls * solve pie display pct * better tooltip for scatterchart

(cherry picked from commit e4d488b3a039f9565426156a3ec666799f0ade44)
Signed-off-by: ahyoungryu <ah...@apache.org>


Project: http://git-wip-us.apache.org/repos/asf/zeppelin/repo
Commit: http://git-wip-us.apache.org/repos/asf/zeppelin/commit/3779100b
Tree: http://git-wip-us.apache.org/repos/asf/zeppelin/tree/3779100b
Diff: http://git-wip-us.apache.org/repos/asf/zeppelin/diff/3779100b

Branch: refs/heads/branch-0.7
Commit: 3779100b6e48834d9d58cb91267b0c0084600907
Parents: a57a287
Author: Remilito <re...@gmail.com>
Authored: Tue Feb 21 09:47:42 2017 +0100
Committer: ahyoungryu <ah...@apache.org>
Committed: Sun Mar 5 16:23:45 2017 +0900

----------------------------------------------------------------------
 zeppelin-web/bower.json                                  |  2 +-
 .../visualization/builtins/visualization-areachart.js    |  2 +-
 .../app/visualization/builtins/visualization-piechart.js |  7 ++++---
 .../visualization/builtins/visualization-scatterchart.js | 11 -----------
 4 files changed, 6 insertions(+), 16 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/zeppelin/blob/3779100b/zeppelin-web/bower.json
----------------------------------------------------------------------
diff --git a/zeppelin-web/bower.json b/zeppelin-web/bower.json
index 7c70110..7f292a9 100644
--- a/zeppelin-web/bower.json
+++ b/zeppelin-web/bower.json
@@ -17,7 +17,7 @@
     "ace-builds": "1.2.6",
     "angular-ui-ace": "0.1.3",
     "jquery.scrollTo": "~1.4.13",
-    "nvd3": "~1.7.1",
+    "nvd3": "~1.8.5",
     "angular-dragdrop": "~1.0.8",
     "perfect-scrollbar": "~0.5.4",
     "ng-sortable": "~1.3.3",

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/3779100b/zeppelin-web/src/app/visualization/builtins/visualization-areachart.js
----------------------------------------------------------------------
diff --git a/zeppelin-web/src/app/visualization/builtins/visualization-areachart.js b/zeppelin-web/src/app/visualization/builtins/visualization-areachart.js
index 719f3e3..25ed16f 100644
--- a/zeppelin-web/src/app/visualization/builtins/visualization-areachart.js
+++ b/zeppelin-web/src/app/visualization/builtins/visualization-areachart.js
@@ -60,7 +60,7 @@ export default class AreachartVisualization extends Nvd3ChartVisualization {
   configureChart(chart) {
     var self = this;
     chart.xAxis.tickFormat(function(d) {return self.xAxisTickFormat(d, self.xLabels);});
-    chart.yAxisTickFormat(function(d) {return self.yAxisTickFormat(d);});
+    chart.yAxis.tickFormat(function(d) {return self.yAxisTickFormat(d);});
     chart.yAxis.axisLabelDistance(50);
     chart.useInteractiveGuideline(true); // for better UX and performance issue. (https://github.com/novus/nvd3/issues/691)
 

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/3779100b/zeppelin-web/src/app/visualization/builtins/visualization-piechart.js
----------------------------------------------------------------------
diff --git a/zeppelin-web/src/app/visualization/builtins/visualization-piechart.js b/zeppelin-web/src/app/visualization/builtins/visualization-piechart.js
index fdc67b1..8c8f8f2 100644
--- a/zeppelin-web/src/app/visualization/builtins/visualization-piechart.js
+++ b/zeppelin-web/src/app/visualization/builtins/visualization-piechart.js
@@ -21,7 +21,6 @@ import PivotTransformation from '../../tabledata/pivot';
 export default class PiechartVisualization extends Nvd3ChartVisualization {
   constructor(targetEl, config) {
     super(targetEl, config);
-
     this.pivot = new PivotTransformation(config);
   };
 
@@ -43,7 +42,6 @@ export default class PiechartVisualization extends Nvd3ChartVisualization {
       true,
       false,
       false);
-
     var d = d3Data.d3g;
     var d3g = [];
     if (d.length > 0) {
@@ -67,6 +65,9 @@ export default class PiechartVisualization extends Nvd3ChartVisualization {
   };
 
   configureChart(chart) {
-    chart.x(function(d) { return d.label;}).y(function(d) { return d.value;});
+    chart.x(function(d) { return d.label;})
+	 .y(function(d) { return d.value;})
+	 .showLabels(false)
+	 .showTooltipPercent(true);
   };
 }

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/3779100b/zeppelin-web/src/app/visualization/builtins/visualization-scatterchart.js
----------------------------------------------------------------------
diff --git a/zeppelin-web/src/app/visualization/builtins/visualization-scatterchart.js b/zeppelin-web/src/app/visualization/builtins/visualization-scatterchart.js
index 267693e..410c435 100644
--- a/zeppelin-web/src/app/visualization/builtins/visualization-scatterchart.js
+++ b/zeppelin-web/src/app/visualization/builtins/visualization-scatterchart.js
@@ -69,17 +69,6 @@ export default class ScatterchartVisualization extends Nvd3ChartVisualization {
     chart.xAxis.tickFormat(function(d) {return self.xAxisTickFormat(d, self.xLabels);});
     chart.yAxis.tickFormat(function(d) {return self.yAxisTickFormat(d, self.yLabels);});
 
-    // configure how the tooltip looks.
-    chart.tooltipContent(function(key, x, y, graph, data) {
-      var tooltipContent = '<h3>' + key + '</h3>';
-      if (self.config.size &&
-        self.isValidSizeOption(self.config, self.tableData.rows)) {
-        tooltipContent += '<p>' + data.point.size + '</p>';
-      }
-
-      return tooltipContent;
-    });
-
     chart.showDistX(true).showDistY(true);
     //handle the problem of tooltip not showing when muliple points have same value.
   };


[13/23] zeppelin git commit: [ZEPPELIN-2075] Can't stop infinite `while` statement in pyspark Interpreter.

Posted by mi...@apache.org.
[ZEPPELIN-2075] Can't stop infinite `while` statement in pyspark Interpreter.

### What is this PR for?
If the following code runs in the PySpark interpreter, there is no way to cancel it except by restarting the Zeppelin server.
```
%spark.pyspark
import time

while True:
    time.sleep(1)
    print("running..")
```
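The fix in the diff below works by having the Python process report its own PID to the JVM at startup; `cancel()` then sends SIGINT to that PID, which Python surfaces as `KeyboardInterrupt`. A minimal standalone sketch of that mechanism (the worker/queue names are illustrative, not Zeppelin's actual API; Unix only, like the `kill -SIGINT` in the fix):

```python
import multiprocessing as mp
import os
import signal
import time

mp.set_start_method("fork", force=True)  # Unix-only, as is SIGINT-based cancel

def worker(q):
    q.put(os.getpid())          # report PID, like onPythonScriptInitialized(os.getpid())
    try:
        while True:             # the un-cancellable loop from the report
            time.sleep(0.1)
    except KeyboardInterrupt:   # SIGINT is raised here as KeyboardInterrupt
        q.put("KeyboardInterrupt")

q = mp.Queue()
p = mp.Process(target=worker, args=(q,))
p.start()
child_pid = q.get(timeout=10)        # the "JVM" side learns the PID
time.sleep(0.3)                      # let the child enter its loop
os.kill(child_pid, signal.SIGINT)    # equivalent of `kill -SIGINT <pid>`
result = q.get(timeout=10)
p.join(timeout=10)
print(result)
```

The test case in the diff checks for the same `KeyboardInterrupt` marker in the interpreter output.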

### What type of PR is it?
Bug Fix | Improvement

### What is the Jira issue?
https://issues.apache.org/jira/browse/ZEPPELIN-2075

### How should this be tested?
Run the above code with the PySpark interpreter and try to cancel it.

### Screenshots (if appropriate)
- before
![pyspark before](https://cloud.githubusercontent.com/assets/3348133/22696141/615c1206-ed90-11e6-9bbb-339ecdec73fc.gif)

- after
![pyspark after](https://cloud.githubusercontent.com/assets/3348133/22696168/70899172-ed90-11e6-99e1-342eb4094b2c.gif)

### Questions:
* Does the licenses files need update? no
* Is there breaking changes for older versions? no
* Does this needs documentation? no

Author: astroshim <hs...@zepl.com>

Closes #1985 from astroshim/ZEPPELIN-2075 and squashes the following commits:

84bf09a [astroshim] fix testcase
bc12eaa [astroshim] pass pid to java
b60d89a [astroshim] Merge branch 'master' into ZEPPELIN-2075
f26eacf [astroshim] add test-case for canceling.
c0cac4e [astroshim] fix logging
678c183 [astroshim] remove signal handler
65d8cc6 [astroshim] init python pid variable
6731e56 [astroshim] add signal to cancel job

(cherry picked from commit 9f22db91c279b7daf6a13b2d805a874074b070fd)
Signed-off-by: Jongyoul Lee <jo...@apache.org>


Project: http://git-wip-us.apache.org/repos/asf/zeppelin/repo
Commit: http://git-wip-us.apache.org/repos/asf/zeppelin/commit/730784ba
Tree: http://git-wip-us.apache.org/repos/asf/zeppelin/tree/730784ba
Diff: http://git-wip-us.apache.org/repos/asf/zeppelin/diff/730784ba

Branch: refs/heads/branch-0.7
Commit: 730784bab1004e5ecf6d938b26380c3cd4ca6d1f
Parents: a90004b
Author: astroshim <hs...@zepl.com>
Authored: Sun Feb 19 00:36:45 2017 +0900
Committer: Jongyoul Lee <jo...@apache.org>
Committed: Tue Mar 14 15:32:53 2017 +0900

----------------------------------------------------------------------
 .../zeppelin/spark/PySparkInterpreter.java      | 20 ++++++++++++-
 .../main/resources/python/zeppelin_pyspark.py   |  2 +-
 .../zeppelin/spark/PySparkInterpreterTest.java  | 30 ++++++++++++++++++++
 3 files changed, 50 insertions(+), 2 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/zeppelin/blob/730784ba/spark/src/main/java/org/apache/zeppelin/spark/PySparkInterpreter.java
----------------------------------------------------------------------
diff --git a/spark/src/main/java/org/apache/zeppelin/spark/PySparkInterpreter.java b/spark/src/main/java/org/apache/zeppelin/spark/PySparkInterpreter.java
index 5a8e040..371578c 100644
--- a/spark/src/main/java/org/apache/zeppelin/spark/PySparkInterpreter.java
+++ b/spark/src/main/java/org/apache/zeppelin/spark/PySparkInterpreter.java
@@ -73,10 +73,12 @@ public class PySparkInterpreter extends Interpreter implements ExecuteResultHand
   private String scriptPath;
   boolean pythonscriptRunning = false;
   private static final int MAX_TIMEOUT_SEC = 10;
+  private long pythonPid;
 
   public PySparkInterpreter(Properties property) {
     super(property);
 
+    pythonPid = -1;
     try {
       File scriptFile = File.createTempFile("zeppelin_pyspark-", ".py");
       scriptPath = scriptFile.getAbsolutePath();
@@ -310,7 +312,8 @@ public class PySparkInterpreter extends Interpreter implements ExecuteResultHand
   boolean pythonScriptInitialized = false;
   Integer pythonScriptInitializeNotifier = new Integer(0);
 
-  public void onPythonScriptInitialized() {
+  public void onPythonScriptInitialized(long pid) {
+    pythonPid = pid;
     synchronized (pythonScriptInitializeNotifier) {
       pythonScriptInitialized = true;
       pythonScriptInitializeNotifier.notifyAll();
@@ -411,10 +414,25 @@ public class PySparkInterpreter extends Interpreter implements ExecuteResultHand
     }
   }
 
+  public void interrupt() throws IOException {
+    if (pythonPid > -1) {
+      logger.info("Sending SIGINT signal to PID : " + pythonPid);
+      Runtime.getRuntime().exec("kill -SIGINT " + pythonPid);
+    } else {
+      logger.warn("Non UNIX/Linux system, close the interpreter");
+      close();
+    }
+  }
+
   @Override
   public void cancel(InterpreterContext context) {
     SparkInterpreter sparkInterpreter = getSparkInterpreter();
     sparkInterpreter.cancel(context);
+    try {
+      interrupt();
+    } catch (IOException e) {
+      logger.error("Error", e);
+    }
   }
 
   @Override

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/730784ba/spark/src/main/resources/python/zeppelin_pyspark.py
----------------------------------------------------------------------
diff --git a/spark/src/main/resources/python/zeppelin_pyspark.py b/spark/src/main/resources/python/zeppelin_pyspark.py
index c59d2f4..d9c68c2 100644
--- a/spark/src/main/resources/python/zeppelin_pyspark.py
+++ b/spark/src/main/resources/python/zeppelin_pyspark.py
@@ -252,7 +252,7 @@ java_import(gateway.jvm, "org.apache.spark.api.python.*")
 java_import(gateway.jvm, "org.apache.spark.mllib.api.python.*")
 
 intp = gateway.entry_point
-intp.onPythonScriptInitialized()
+intp.onPythonScriptInitialized(os.getpid())
 
 jsc = intp.getJavaSparkContext()
 

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/730784ba/spark/src/test/java/org/apache/zeppelin/spark/PySparkInterpreterTest.java
----------------------------------------------------------------------
diff --git a/spark/src/test/java/org/apache/zeppelin/spark/PySparkInterpreterTest.java b/spark/src/test/java/org/apache/zeppelin/spark/PySparkInterpreterTest.java
index 35b876d..55c405d 100644
--- a/spark/src/test/java/org/apache/zeppelin/spark/PySparkInterpreterTest.java
+++ b/spark/src/test/java/org/apache/zeppelin/spark/PySparkInterpreterTest.java
@@ -36,6 +36,8 @@ import java.util.HashMap;
 import java.util.LinkedList;
 import java.util.List;
 import java.util.Properties;
+import java.util.regex.Matcher;
+import java.util.regex.Pattern;
 
 import static org.junit.Assert.*;
 
@@ -121,4 +123,32 @@ public class PySparkInterpreterTest {
       assertTrue(completions.size() > 0);
     }
   }
+
+  private class infinityPythonJob implements Runnable {
+    @Override
+    public void run() {
+      String code = "import time\nwhile True:\n  time.sleep(1)" ;
+      InterpreterResult ret = pySparkInterpreter.interpret(code, context);
+      assertNotNull(ret);
+      Pattern expectedMessage = Pattern.compile("KeyboardInterrupt");
+      Matcher m = expectedMessage.matcher(ret.message().toString());
+      assertTrue(m.find());
+    }
+  }
+
+  @Test
+  public void testCancelIntp() throws InterruptedException {
+    if (getSparkVersionNumber() > 11) {
+      assertEquals(InterpreterResult.Code.SUCCESS,
+        pySparkInterpreter.interpret("a = 1\n", context).code());
+
+      Thread t = new Thread(new infinityPythonJob());
+      t.start();
+      Thread.sleep(5000);
+      pySparkInterpreter.cancel(context);
+      assertTrue(t.isAlive());
+      t.join(2000);
+      assertFalse(t.isAlive());
+    }
+  }
 }


[02/23] zeppelin git commit: [ZEPPELIN-2130][Doc]Do not use web development port

Posted by mi...@apache.org.
[ZEPPELIN-2130][Doc]Do not use web development port

### What is this PR for?
If a user runs Zeppelin on the web application development port (9000 by default), Zeppelin does not work because of this [line](https://github.com/apache/zeppelin/blob/master/zeppelin-web/src/components/baseUrl/baseUrl.service.js#L27). So the Zeppelin site needs to document this until that line is fixed (I'll make the web application development port configurable later).

### What type of PR is it?
[ Documentation ]

### What is the Jira issue?
* [ZEPPELIN-2130](https://issues.apache.org/jira/browse/ZEPPELIN-2130)

### How should this be tested?
1. Run document development mode.
2. Connect `http://localhost:4000/install/configuration.html#zeppelin-properties` on browser.
3. Check the description of `ZEPPELIN_PORT`

### Screenshots (if appropriate)
![z_not_use_port](https://cloud.githubusercontent.com/assets/8110458/23350768/32cf941a-fd00-11e6-8a3c-3390ddf2d7df.png)

### Questions:
* Does the licenses files need update? No
* Is there breaking changes for older versions? No
* Does this needs documentation? Yes

Author: soralee <so...@zepl.com>

Closes #2073 from soralee/ZEPPELIN-2130_webDevPort_Doc and squashes the following commits:

8ae57cf [soralee] ZEPPELIN-2130_remove_dot
0499bd3 [soralee] ZEPPELIN-2130_update_sentence
094f29e [soralee] ZEPPELIN-2130_update_sentence
ee02c62 [soralee] ZEPPELIN-2130_update_sentence
ac133ca [soralee] ZEPPELIN-2130_do_not_use_webDebPort

(cherry picked from commit 281e326d1cd8e9dbf71152ceb390bd5c6ee7b4de)
Signed-off-by: ahyoungryu <ah...@apache.org>


Project: http://git-wip-us.apache.org/repos/asf/zeppelin/repo
Commit: http://git-wip-us.apache.org/repos/asf/zeppelin/commit/a57a2879
Tree: http://git-wip-us.apache.org/repos/asf/zeppelin/tree/a57a2879
Diff: http://git-wip-us.apache.org/repos/asf/zeppelin/diff/a57a2879

Branch: refs/heads/branch-0.7
Commit: a57a2879f751e5706ae9e1f1bd0d54707b691af1
Parents: f9630a5
Author: soralee <so...@zepl.com>
Authored: Mon Feb 27 20:46:29 2017 +0900
Committer: ahyoungryu <ah...@apache.org>
Committed: Sun Mar 5 16:14:48 2017 +0900

----------------------------------------------------------------------
 docs/install/configuration.md | 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/zeppelin/blob/a57a2879/docs/install/configuration.md
----------------------------------------------------------------------
diff --git a/docs/install/configuration.md b/docs/install/configuration.md
index 7a87838..80a7555 100644
--- a/docs/install/configuration.md
+++ b/docs/install/configuration.md
@@ -42,7 +42,9 @@ If both are defined, then the **environment variables** will take priority.
     <td>ZEPPELIN_PORT</td>
     <td>zeppelin.server.port</td>
     <td>8080</td>
-    <td>Zeppelin server port</td>
+    <td>Zeppelin server port </br>
+      <span style="font-style:italic; color: gray"> Note: Please make sure you're not using the same port with 
+      <a href="https://zeppelin.apache.org/contribution/webapplication.html#dev-mode" target="_blank">Zeppelin web application development port</a> (default: 9000).</span></td>
   </tr>
   <tr>
     <td>ZEPPELIN_SSL_PORT</td>


[20/23] zeppelin git commit: [ZEPPELIN-2124] Missing dependencies array in interpreter.json after upgrade from 0.6.2 to 0.7.0

Posted by mi...@apache.org.
[ZEPPELIN-2124] Missing dependencies array in interpreter.json after upgrade from 0.6.2 to 0.7.0

### What is this PR for?
If there is no `dependencies` field in `interpreter.json`, the front-end throws an error because it tries to push a new element onto an undefined variable. This PR fixes the issue by initializing `dependencies` to an empty array.
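The failure mode is the same in any language: appending to a field that was never initialized fails, while defaulting it to an empty collection makes the append safe. A minimal Python sketch of the pattern behind the one-line Java fix below (class and method names are illustrative):

```python
class InterpreterSetting:
    def __init__(self, dependencies=None):
        # Mirror of the fix: default a missing 'dependencies' field to an
        # empty list instead of leaving it None/undefined.
        self.dependencies = dependencies if dependencies is not None else []

    def add_dependency(self, dep):
        self.dependencies.append(dep)  # safe even when none were loaded

setting = InterpreterSetting()   # e.g. field absent in interpreter.json
setting.add_dependency("com.example:artifact:1.0")
print(setting.dependencies)
```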

### What type of PR is it?
Bug Fix

### What is the Jira issue?
[ZEPPELIN-2124](https://issues.apache.org/jira/browse/ZEPPELIN-2124)

### How should this be tested?
Remove the `dependencies` field from `conf/interpreter.json` and try to add new dependencies on the http://localhost:8080/#/interpreter page.

### Questions:
* Does the licenses files need update? no
* Is there breaking changes for older versions? no
* Does this needs documentation? no

Author: Mina Lee <mi...@apache.org>

Closes #2142 from minahlee/ZEPPELIN-2124 and squashes the following commits:

01b27eb [Mina Lee] Assign init value for dependencies field

(cherry picked from commit f9a8a6f6e8e9959b5bed0a69723b934a25b7f261)
Signed-off-by: Jongyoul Lee <jo...@apache.org>


Project: http://git-wip-us.apache.org/repos/asf/zeppelin/repo
Commit: http://git-wip-us.apache.org/repos/asf/zeppelin/commit/56fa8b53
Tree: http://git-wip-us.apache.org/repos/asf/zeppelin/tree/56fa8b53
Diff: http://git-wip-us.apache.org/repos/asf/zeppelin/diff/56fa8b53

Branch: refs/heads/branch-0.7
Commit: 56fa8b5349fa6889fa4d5097497bef7282a7c647
Parents: adf2b12
Author: Mina Lee <mi...@apache.org>
Authored: Thu Mar 16 10:57:38 2017 +0900
Committer: Jongyoul Lee <jo...@apache.org>
Committed: Fri Mar 17 00:00:37 2017 +0900

----------------------------------------------------------------------
 .../java/org/apache/zeppelin/interpreter/InterpreterSetting.java   | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/zeppelin/blob/56fa8b53/zeppelin-zengine/src/main/java/org/apache/zeppelin/interpreter/InterpreterSetting.java
----------------------------------------------------------------------
diff --git a/zeppelin-zengine/src/main/java/org/apache/zeppelin/interpreter/InterpreterSetting.java b/zeppelin-zengine/src/main/java/org/apache/zeppelin/interpreter/InterpreterSetting.java
index 57c6acc..30a4967 100644
--- a/zeppelin-zengine/src/main/java/org/apache/zeppelin/interpreter/InterpreterSetting.java
+++ b/zeppelin-zengine/src/main/java/org/apache/zeppelin/interpreter/InterpreterSetting.java
@@ -65,7 +65,7 @@ public class InterpreterSetting {
   @SerializedName("interpreterGroup")
   private List<InterpreterInfo> interpreterInfos;
   private final transient Map<String, InterpreterGroup> interpreterGroupRef = new HashMap<>();
-  private List<Dependency> dependencies;
+  private List<Dependency> dependencies = new LinkedList<>();
   private InterpreterOption option;
   private transient String path;
 


[23/23] zeppelin git commit: [ZEPPELIN-2234][BUG] Can't display the same chart again (branch-0.7)

Posted by mi...@apache.org.
[ZEPPELIN-2234][BUG] Can't display the same chart again (branch-0.7)

### What is this PR for?

Can't display the same chart again. I attached a screenshot.

- this is the same fix with https://github.com/apache/zeppelin/pull/2110
- except refactoring PR
- based on branch-0.7

and

- CI failure might be related with https://github.com/apache/zeppelin/pull/2103

#### Implementation Details

After https://github.com/apache/zeppelin/pull/2092,

- result.html will draw the chart every time since we use `ng-if` instead of `ng-show`
- that means the DOM is deleted and re-created
- so we have to create a visualization instance every time, which requires the newly created DOM.

```js
builtInViz.instance = new Visualization(loadedElem, config); // `loadedElem` is the newly created DOM.
```
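The `retryRenderer` added in the diff below applies the same idea on the other side: poll until the target DOM element exists, then render. The poll-until-ready pattern, sketched generically in Python with a plain loop standing in for `$timeout` (all names here are illustrative):

```python
import time

def retry_until_ready(is_ready, action, interval=0.01, timeout=1.0):
    """Poll is_ready(); run action() once it returns True, or give up."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if is_ready():
            return action()
        time.sleep(interval)   # like $timeout(retryRenderer, 10)
    raise TimeoutError("element never became ready")

# Simulate a DOM element that appears after a short delay.
appear_at = time.monotonic() + 0.05
rendered = retry_until_ready(
    lambda: time.monotonic() >= appear_at,   # stand-in for elem.length
    lambda: "rendered",                      # stand-in for renderGraph()
)
print(rendered)
```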

### What type of PR is it?
[Bug Fix]

### Todos

NONE

### What is the Jira issue?
* https://issues.apache.org/jira/browse/ZEPPELIN-2234

### How should this be tested?

I attached a screenshot

### Screenshots (if appropriate)

##### Before: buggy

![2234](https://cloud.githubusercontent.com/assets/4968473/23694278/4451594e-041c-11e7-9971-f0bb5945a1be.gif)

##### After: fixed

![2234-2](https://cloud.githubusercontent.com/assets/4968473/23694270/34866ba8-041c-11e7-83a8-693a93646fa4.gif)

### Questions:
* Does the licenses files need update? - NO
* Is there breaking changes for older versions? - NO
* Does this needs documentation? - NO

Author: 1ambda <1a...@gmail.com>

Closes #2114 from 1ambda/ZEPPELIN-2234/cant-display-same-chart-again-for-070 and squashes the following commits:

936123d [1ambda] fix: Retry until graph DOM is ready
475b532 [1ambda] fix: Reert #2092 for 0.7.0


Project: http://git-wip-us.apache.org/repos/asf/zeppelin/repo
Commit: http://git-wip-us.apache.org/repos/asf/zeppelin/commit/4d80ec46
Tree: http://git-wip-us.apache.org/repos/asf/zeppelin/tree/4d80ec46
Diff: http://git-wip-us.apache.org/repos/asf/zeppelin/diff/4d80ec46

Branch: refs/heads/branch-0.7
Commit: 4d80ec4610e55e3a8b2f0459600f5fb530168339
Parents: 000900f
Author: 1ambda <1a...@gmail.com>
Authored: Thu Mar 16 05:37:39 2017 +0900
Committer: Lee moon soo <mo...@apache.org>
Committed: Thu Mar 16 08:51:33 2017 -0700

----------------------------------------------------------------------
 .../src/app/notebook/paragraph/result/result.controller.js    | 7 ++++++-
 zeppelin-web/src/app/notebook/paragraph/result/result.html    | 6 +++---
 2 files changed, 9 insertions(+), 4 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/zeppelin/blob/4d80ec46/zeppelin-web/src/app/notebook/paragraph/result/result.controller.js
----------------------------------------------------------------------
diff --git a/zeppelin-web/src/app/notebook/paragraph/result/result.controller.js b/zeppelin-web/src/app/notebook/paragraph/result/result.controller.js
index 40f8248..9b95b40 100644
--- a/zeppelin-web/src/app/notebook/paragraph/result/result.controller.js
+++ b/zeppelin-web/src/app/notebook/paragraph/result/result.controller.js
@@ -263,7 +263,12 @@ function ResultCtrl($scope, $rootScope, $route, $window, $routeParams, $location
       renderApp(app);
     } else {
       if (type === 'TABLE') {
-        $scope.renderGraph($scope.graphMode, refresh);
+        var retryRenderer = function() {
+          var elem = angular.element('#p' + $scope.id + '_graph');
+          if (elem.length) { $scope.renderGraph($scope.graphMode, refresh); }
+          else { $timeout(retryRenderer, 10); }
+        };
+        $timeout(retryRenderer);
       } else if (type === 'HTML') {
         renderHtml();
       } else if (type === 'ANGULAR') {

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/4d80ec46/zeppelin-web/src/app/notebook/paragraph/result/result.html
----------------------------------------------------------------------
diff --git a/zeppelin-web/src/app/notebook/paragraph/result/result.html b/zeppelin-web/src/app/notebook/paragraph/result/result.html
index d76b8a4..df09c4d 100644
--- a/zeppelin-web/src/app/notebook/paragraph/result/result.html
+++ b/zeppelin-web/src/app/notebook/paragraph/result/result.html
@@ -28,10 +28,10 @@ limitations under the License.
                     && config.graph.optionOpen && !asIframe && !viewOnly">
         <div ng-repeat="viz in builtInTableDataVisualizationList track by $index"
              id="trsetting{{id}}_{{viz.id}}"
-             ng-if="graphMode == viz.id"></div>
+             ng-show="graphMode == viz.id"></div>
         <div ng-repeat="viz in builtInTableDataVisualizationList track by $index"
              id="vizsetting{{id}}_{{viz.id}}"
-             ng-if="graphMode == viz.id"></div>
+             ng-show="graphMode == viz.id"></div>
       </div>
 
       <!-- graph -->
@@ -41,7 +41,7 @@ limitations under the License.
            >
         <div ng-repeat="viz in builtInTableDataVisualizationList track by $index"
              id="p{{id}}_{{viz.id}}"
-             ng-if="graphMode == viz.id">
+             ng-show="graphMode == viz.id">
         </div>
       </div>
 


[07/23] zeppelin git commit: [ZEPPELIN-2139] Interpreters based on scala_2.11 aren't installed correctly

Posted by mi...@apache.org.
[ZEPPELIN-2139] Interpreters based on scala_2.11 aren't installed correctly

pom variables such as `${scala.version}` are not replaced with their values when you run `mvn install`.
This makes leaf poms look up the value in the parent pom, which is always `2.10.5` for `${scala.version}`, causing a Scala library dependency conflict. For example, zeppelin-flink_2.11 will end up depending on Scala 2.10.5.
This PR fixes the problem by using the Maven flatten plugin.
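Part of the diff below also updates `change_scala_version.sh` to rewrite `<scala.version>` in the parent pom, using a sed address range (`1,/<scala\.version>.../s/...`) so only the first occurrence changes and repeated runs stay idempotent. A rough Python equivalent of that first-occurrence-only rewrite (function name and sample pom text are illustrative):

```python
import re

def replace_first(pom_text, tag, new_value):
    # Like the script's sed range: rewrite only the FIRST <tag>... value,
    # matching any existing version so repeated runs are idempotent.
    pattern = re.compile(r"(<{0}>)[^<]*".format(re.escape(tag)))
    return pattern.sub(r"\g<1>" + new_value, pom_text, count=1)

pom = (
    "<scala.version>2.10.5</scala.version>\n"
    "<scala.version>2.10.5</scala.version>"
)
updated = replace_first(pom, "scala.version", "2.11.7")
print(updated)   # only the first line is rewritten
```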

Bug Fix

[ZEPPELIN-2139](https://issues.apache.org/jira/browse/ZEPPELIN-2139)

```
$ ./dev/change_scala_version.sh 2.11
$ mvn clean install -pl 'zeppelin-server,zeppelin-zengine,zeppelin-interpreter,flink' -am -DskipRat -DskipTests -Pscala-2.11
```
Open the `~/.m2/repository/org/apache/zeppelin/zeppelin-flink_2.11/0.8.0-SNAPSHOT/zeppelin-flink_2.11-0.8.0-SNAPSHOT.pom` file and check that the Scala-related library dependency versions are set to 2.11.7

* Does the licenses files need update? no
* Is there breaking changes for older versions? no
* Does this needs documentation? no

Author: Mina Lee <mi...@apache.org>

Closes #2059 from minahlee/ZEPPELIN-2139 and squashes the following commits:

62d852a [Mina Lee] Change <scala.version> property in parent pom file
489c843 [Mina Lee] Use maven flatten plugin to make pom.xml variables to be replaced by value
783c014 [Mina Lee] Fix indentation and add default properties to be used in flattened-pom

(cherry picked from commit e6b32c0f244b7c68e243f9076291c341ee2a1eb5)
Signed-off-by: Mina Lee <mi...@apache.org>

Conflicts:
	livy/pom.xml


Project: http://git-wip-us.apache.org/repos/asf/zeppelin/repo
Commit: http://git-wip-us.apache.org/repos/asf/zeppelin/commit/11e897df
Tree: http://git-wip-us.apache.org/repos/asf/zeppelin/tree/11e897df
Diff: http://git-wip-us.apache.org/repos/asf/zeppelin/diff/11e897df

Branch: refs/heads/branch-0.7
Commit: 11e897dff20b2969dfd9a81eec377c9bed70882e
Parents: 9c9b0fd
Author: Mina Lee <mi...@apache.org>
Authored: Thu Feb 23 18:21:35 2017 +0900
Committer: Mina Lee <mi...@apache.org>
Committed: Tue Mar 7 14:29:40 2017 +0900

----------------------------------------------------------------------
 .gitignore                  |   3 +
 dev/change_scala_version.sh |  24 +-
 livy/pom.xml                | 944 +++++++++++++++++++--------------------
 pom.xml                     |  28 +-
 4 files changed, 518 insertions(+), 481 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/zeppelin/blob/11e897df/.gitignore
----------------------------------------------------------------------
diff --git a/.gitignore b/.gitignore
index 5b638fa..9555435 100644
--- a/.gitignore
+++ b/.gitignore
@@ -104,6 +104,9 @@ Thumbs.db
 target/
 **/target/
 
+# maven flattened pom files
+**/.flattened-pom.xml
+
 # Generated by Jekyll 
 docs/_site/
 

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/11e897df/dev/change_scala_version.sh
----------------------------------------------------------------------
diff --git a/dev/change_scala_version.sh b/dev/change_scala_version.sh
index cb2c842..0ccfe7e 100755
--- a/dev/change_scala_version.sh
+++ b/dev/change_scala_version.sh
@@ -34,7 +34,7 @@ if [[ ($# -ne 1) || ( $1 == "--help") ||  $1 == "-h" ]]; then
   usage
 fi
 
-TO_VERSION=$1
+TO_VERSION="$1"
 
 check_scala_version() {
   for i in ${VALID_VERSIONS[*]}; do [ $i = "$1" ] && return 0; done
@@ -42,12 +42,14 @@ check_scala_version() {
   exit 1
 }
 
-check_scala_version "$TO_VERSION"
+check_scala_version "${TO_VERSION}"
 
-if [ $TO_VERSION = "2.11" ]; then
+if [ "${TO_VERSION}" = "2.11" ]; then
   FROM_VERSION="2.10"
+  SCALA_LIB_VERSION="2.11.7"
 else
   FROM_VERSION="2.11"
+  SCALA_LIB_VERSION="2.10.5"
 fi
 
 sed_i() {
@@ -57,11 +59,17 @@ sed_i() {
 export -f sed_i
 
 BASEDIR=$(dirname $0)/..
-find "$BASEDIR" -name 'pom.xml' -not -path '*target*' -print \
-  -exec bash -c "sed_i 's/\(artifactId.*\)_'$FROM_VERSION'/\1_'$TO_VERSION'/g' {}" \;
+find "${BASEDIR}" -name 'pom.xml' -not -path '*target*' -print \
+  -exec bash -c "sed_i 's/\(artifactId.*\)_'${FROM_VERSION}'/\1_'${TO_VERSION}'/g' {}" \;
 
-# Also update <scala.binary.version> in parent POM
+# update <scala.binary.version> in parent POM
 # Match any scala binary version to ensure idempotency
-sed_i '1,/<scala\.binary\.version>[0-9]*\.[0-9]*</s/<scala\.binary\.version>[0-9]*\.[0-9]*</<scala.binary.version>'$TO_VERSION'</' \
-  "$BASEDIR/pom.xml"
+sed_i '1,/<scala\.binary\.version>[0-9]*\.[0-9]*</s/<scala\.binary\.version>[0-9]*\.[0-9]*</<scala.binary.version>'${TO_VERSION}'</' \
+  "${BASEDIR}/pom.xml"
 
+# update <scala.version> in parent POM
+# This is to make variables in leaf pom to be substituted to real value when flattened-pom is created. 
+# maven-flatten plugin doesn't take properties defined under profile even if scala-2.11/scala-2.10 is activated via -Pscala-2.11/-Pscala-2.10,
+# and use default defined properties to create flatten pom.
+sed_i '1,/<scala\.version>[0-9]*\.[0-9]*\.[0-9]*</s/<scala\.version>[0-9]*\.[0-9]*\.[0-9]*</<scala.version>'${SCALA_LIB_VERSION}'</' \
+  "${BASEDIR}/pom.xml"

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/11e897df/livy/pom.xml
----------------------------------------------------------------------
diff --git a/livy/pom.xml b/livy/pom.xml
index 3673691..8ea62a2 100644
--- a/livy/pom.xml
+++ b/livy/pom.xml
@@ -16,481 +16,481 @@
   ~ limitations under the License.
   -->
 
-<project xmlns="http://maven.apache.org/POM/4.0.0"
-         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+<project xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+         xmlns="http://maven.apache.org/POM/4.0.0"
          xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
-    <modelVersion>4.0.0</modelVersion>
-
-    <parent>
-        <artifactId>zeppelin</artifactId>
-        <groupId>org.apache.zeppelin</groupId>
-        <version>0.7.1-SNAPSHOT</version>
-        <relativePath>..</relativePath>
-    </parent>
+  <modelVersion>4.0.0</modelVersion>
 
+  <parent>
+    <artifactId>zeppelin</artifactId>
     <groupId>org.apache.zeppelin</groupId>
-    <artifactId>zeppelin-livy</artifactId>
-    <packaging>jar</packaging>
     <version>0.7.1-SNAPSHOT</version>
-    <name>Zeppelin: Livy interpreter</name>
-
-    <properties>
-        <!--library versions-->
-        <commons.exec.version>1.3</commons.exec.version>
-        <httpcomponents.client.version>4.3.4</httpcomponents.client.version>
-        <spring.web.version>4.3.0.RELEASE</spring.web.version>
-        <spring.security.kerberosclient>1.0.1.RELEASE</spring.security.kerberosclient>
-
-        <!--test library versions-->
-        <achilles.version>3.2.4-Zeppelin</achilles.version>
-        <assertj.version>1.7.0</assertj.version>
-        <mockito.version>1.9.5</mockito.version>
-        <livy.version>0.2.0</livy.version>
-        <spark.version>1.5.2</spark.version>
-        <hadoop.version>2.6.0</hadoop.version>
-
-        <!--plugin versions-->
-        <plugin.failsafe.version>2.16</plugin.failsafe.version>
-        <plugin.antrun.version>1.8</plugin.antrun.version>
-    </properties>
-
-    <dependencies>
-        <dependency>
-            <groupId>${project.groupId}</groupId>
-            <artifactId>zeppelin-interpreter</artifactId>
-            <version>${project.version}</version>
-            <scope>provided</scope>
-        </dependency>
-
-        <dependency>
-            <groupId>org.apache.commons</groupId>
-            <artifactId>commons-exec</artifactId>
-            <version>${commons.exec.version}</version>
-        </dependency>
-
-        <dependency>
-            <groupId>org.slf4j</groupId>
-            <artifactId>slf4j-api</artifactId>
-        </dependency>
-
-        <dependency>
-            <groupId>org.slf4j</groupId>
-            <artifactId>slf4j-log4j12</artifactId>
-        </dependency>
-
-        <dependency>
-            <groupId>org.apache.httpcomponents</groupId>
-            <artifactId>httpclient</artifactId>
-            <version>${httpcomponents.client.version}</version>
-        </dependency>
-
-        <dependency>
-            <groupId>com.google.code.gson</groupId>
-            <artifactId>gson</artifactId>
-        </dependency>
-
-        <dependency>
-            <groupId>org.springframework.security.kerberos</groupId>
-            <artifactId>spring-security-kerberos-client</artifactId>
-            <version>${spring.security.kerberosclient}</version>
-        </dependency>
-
-        <dependency>
-            <groupId>org.springframework</groupId>
-            <artifactId>spring-web</artifactId>
-            <version>${spring.web.version}</version>
-        </dependency>
-
-        <dependency>
-            <groupId>junit</groupId>
-            <artifactId>junit</artifactId>
-            <scope>test</scope>
-        </dependency>
-
-        <dependency>
-            <groupId>org.assertj</groupId>
-            <artifactId>assertj-core</artifactId>
-            <version>${assertj.version}</version>
-            <scope>test</scope>
-        </dependency>
-        <dependency>
-            <groupId>org.mockito</groupId>
-            <artifactId>mockito-all</artifactId>
-            <version>${mockito.version}</version>
-            <scope>test</scope>
-        </dependency>
-
-        <dependency>
-            <groupId>com.cloudera.livy</groupId>
-            <artifactId>livy-integration-test</artifactId>
-            <version>${livy.version}</version>
-            <scope>test</scope>
-            <exclusions>
-                <exclusion>
-                    <groupId>org.xerial.snappy</groupId>
-                    <artifactId>snappy-java</artifactId>
-                </exclusion>
-                <exclusion>
-                    <groupId>org.apache.spark</groupId>
-                    <artifactId>spark-core_2.10</artifactId>
-                </exclusion>
-                <exclusion>
-                    <groupId>org.apache.spark</groupId>
-                    <artifactId>spark-sql_2.10</artifactId>
-                </exclusion>
-                <exclusion>
-                    <groupId>org.apache.spark</groupId>
-                    <artifactId>spark-streaming_2.10</artifactId>
-                </exclusion>
-                <exclusion>
-                    <groupId>org.apache.spark</groupId>
-                    <artifactId>spark-hive_2.10</artifactId>
-                </exclusion>
-                <exclusion>
-                    <groupId>org.apache.spark</groupId>
-                    <artifactId>spark-repl_2.10</artifactId>
-                </exclusion>
-                <exclusion>
-                    <groupId>org.apache.spark</groupId>
-                    <artifactId>spark-yarn_2.10</artifactId>
-                </exclusion>
-                <exclusion>
-                    <groupId>org.apache.hadoop</groupId>
-                    <artifactId>hadoop-auth</artifactId>
-                </exclusion>
-                <exclusion>
-                    <groupId>org.apache.hadoop</groupId>
-                    <artifactId>hadoop-common</artifactId>
-                </exclusion>
-                <exclusion>
-                    <groupId>org.apache.hadoop</groupId>
-                    <artifactId>hadoop-hdfs</artifactId>
-                </exclusion>
-                <exclusion>
-                    <groupId>org.apache.hadoop</groupId>
-                    <artifactId>hadoop-yarn-client</artifactId>
-                </exclusion>
-                <exclusion>
-                    <groupId>org.apache.hadoop</groupId>
-                    <artifactId>hadoop-client</artifactId>
-                </exclusion>
-                <exclusion>
-                    <groupId>org.apache.hadoop</groupId>
-                    <artifactId>hadoop-yarn-server-tests</artifactId>
-                </exclusion>
-            </exclusions>
-        </dependency>
-        <dependency>
-            <groupId>com.cloudera.livy</groupId>
-            <artifactId>livy-test-lib</artifactId>
-            <version>${livy.version}</version>
-            <scope>test</scope>
-            <exclusions>
-                <exclusion>
-                    <groupId>org.xerial.snappy</groupId>
-                    <artifactId>snappy-java</artifactId>
-                </exclusion>
-                <exclusion>
-                    <groupId>org.apache.spark</groupId>
-                    <artifactId>spark-core_2.10</artifactId>
-                </exclusion>
-                <exclusion>
-                    <groupId>org.apache.spark</groupId>
-                    <artifactId>spark-sql_2.10</artifactId>
-                </exclusion>
-                <exclusion>
-                    <groupId>org.apache.spark</groupId>
-                    <artifactId>spark-streaming_2.10</artifactId>
-                </exclusion>
-                <exclusion>
-                    <groupId>org.apache.spark</groupId>
-                    <artifactId>spark-hive_2.10</artifactId>
-                </exclusion>
-                <exclusion>
-                    <groupId>org.apache.spark</groupId>
-                    <artifactId>spark-repl_2.10</artifactId>
-                </exclusion>
-                <exclusion>
-                    <groupId>org.apache.spark</groupId>
-                    <artifactId>spark-yarn_2.10</artifactId>
-                </exclusion>
-            </exclusions>
-        </dependency>
-        <dependency>
-            <groupId>com.cloudera.livy</groupId>
-            <artifactId>livy-core</artifactId>
-            <version>${livy.version}</version>
-            <scope>test</scope>
-            <exclusions>
-                <exclusion>
-                    <groupId>org.xerial.snappy</groupId>
-                    <artifactId>snappy-java</artifactId>
-                </exclusion>
-                <exclusion>
-                    <groupId>org.apache.spark</groupId>
-                    <artifactId>spark-core_2.10</artifactId>
-                </exclusion>
-                <exclusion>
-                    <groupId>org.apache.spark</groupId>
-                    <artifactId>spark-sql_2.10</artifactId>
-                </exclusion>
-                <exclusion>
-                    <groupId>org.apache.spark</groupId>
-                    <artifactId>spark-streaming_2.10</artifactId>
-                </exclusion>
-                <exclusion>
-                    <groupId>org.apache.spark</groupId>
-                    <artifactId>spark-hive_2.10</artifactId>
-                </exclusion>
-                <exclusion>
-                    <groupId>org.apache.spark</groupId>
-                    <artifactId>spark-repl_2.10</artifactId>
-                </exclusion>
-                <exclusion>
-                    <groupId>org.apache.spark</groupId>
-                    <artifactId>spark-yarn_2.10</artifactId>
-                </exclusion>
-            </exclusions>
-        </dependency>
-
-        <dependency>
-            <groupId>org.apache.spark</groupId>
-            <artifactId>spark-sql_${scala.binary.version}</artifactId>
-            <version>${spark.version}</version>
-            <scope>test</scope>
-        </dependency>
-
-        <dependency>
-            <groupId>org.apache.spark</groupId>
-            <artifactId>spark-streaming_${scala.binary.version}</artifactId>
-            <version>${spark.version}</version>
-            <scope>test</scope>
-        </dependency>
-
-        <dependency>
-            <groupId>org.apache.spark</groupId>
-            <artifactId>spark-hive_${scala.binary.version}</artifactId>
-            <version>${spark.version}</version>
-            <scope>test</scope>
-        </dependency>
-
-        <dependency>
-            <groupId>org.apache.spark</groupId>
-            <artifactId>spark-repl_${scala.binary.version}</artifactId>
-            <version>${spark.version}</version>
-            <scope>test</scope>
-        </dependency>
-
-        <dependency>
-            <groupId>org.apache.spark</groupId>
-            <artifactId>spark-yarn_${scala.binary.version}</artifactId>
-            <version>${spark.version}</version>
-            <scope>test</scope>
-            <exclusions>
-                <exclusion>
-                    <groupId>org.apache.hadoop</groupId>
-                    <artifactId>hadoop-yarn-common</artifactId>
-                </exclusion>
-                <exclusion>
-                    <groupId>org.apache.hadoop</groupId>
-                    <artifactId>hadoop-yarn-server-web-proxy</artifactId>
-                </exclusion>
-            </exclusions>
-        </dependency>
-
-        <dependency>
-            <groupId>org.apache.hadoop</groupId>
-            <artifactId>hadoop-auth</artifactId>
-            <version>${hadoop.version}</version>
-            <scope>test</scope>
-        </dependency>
-
-        <dependency>
-            <groupId>org.apache.hadoop</groupId>
-            <artifactId>hadoop-common</artifactId>
-            <version>${hadoop.version}</version>
-            <scope>test</scope>
-        </dependency>
-
-        <dependency>
-            <groupId>org.apache.hadoop</groupId>
-            <artifactId>hadoop-common</artifactId>
-            <classifier>tests</classifier>
-            <version>${hadoop.version}</version>
-            <scope>test</scope>
-        </dependency>
-
-        <dependency>
-            <groupId>org.apache.hadoop</groupId>
-            <artifactId>hadoop-hdfs</artifactId>
-            <version>${hadoop.version}</version>
-            <scope>test</scope>
-        </dependency>
-
-        <dependency>
-            <groupId>org.apache.hadoop</groupId>
-            <artifactId>hadoop-hdfs</artifactId>
-            <classifier>tests</classifier>
-            <version>${hadoop.version}</version>
-            <scope>test</scope>
-        </dependency>
-
-        <dependency>
-            <groupId>org.apache.hadoop</groupId>
-            <artifactId>hadoop-client</artifactId>
-            <version>${hadoop.version}</version>
-            <scope>test</scope>
-        </dependency>
-
-        <dependency>
-            <groupId>org.apache.hadoop</groupId>
-            <artifactId>hadoop-yarn-client</artifactId>
-            <version>${hadoop.version}</version>
-            <scope>test</scope>
-        </dependency>
-
-        <dependency>
-            <groupId>org.apache.hadoop</groupId>
-            <artifactId>hadoop-yarn-api</artifactId>
-            <version>${hadoop.version}</version>
-            <scope>test</scope>
-        </dependency>
-
-        <dependency>
-            <groupId>org.apache.hadoop</groupId>
-            <artifactId>hadoop-yarn-server-tests</artifactId>
-            <classifier>tests</classifier>
-            <version>${hadoop.version}</version>
-            <scope>test</scope>
-        </dependency>
-    </dependencies>
-
-    <repositories>
-        <repository>
-            <id>ossrh</id>
-            <name>ossrh repository</name>
-            <url>https://oss.sonatype.org/content/repositories/releases/</url>
-            <releases>
-                <enabled>true</enabled>
-            </releases>
-            <snapshots>
-                <enabled>false</enabled>
-            </snapshots>
-        </repository>
-    </repositories>
-
-    <build>
-        <plugins>
-            <plugin>
-                <artifactId>maven-enforcer-plugin</artifactId>
-                <executions>
-                    <execution>
-                        <id>enforce</id>
-                        <phase>none</phase>
-                    </execution>
-                </executions>
-            </plugin>
-
-            <plugin>
-                <artifactId>maven-dependency-plugin</artifactId>
-                <executions>
-                    <execution>
-                        <id>copy-dependencies</id>
-                        <phase>package</phase>
-                        <goals>
-                            <goal>copy-dependencies</goal>
-                        </goals>
-                        <configuration>
-                            <outputDirectory>${project.build.directory}/../../interpreter/livy
-                            </outputDirectory>
-                            <overWriteReleases>false</overWriteReleases>
-                            <overWriteSnapshots>false</overWriteSnapshots>
-                            <overWriteIfNewer>true</overWriteIfNewer>
-                            <includeScope>runtime</includeScope>
-                        </configuration>
-                    </execution>
-                    <execution>
-                        <id>copy-artifact</id>
-                        <phase>package</phase>
-                        <goals>
-                            <goal>copy</goal>
-                        </goals>
-                        <configuration>
-                            <outputDirectory>${project.build.directory}/../../interpreter/livy
-                            </outputDirectory>
-                            <overWriteReleases>false</overWriteReleases>
-                            <overWriteSnapshots>false</overWriteSnapshots>
-                            <overWriteIfNewer>true</overWriteIfNewer>
-                            <includeScope>runtime</includeScope>
-                            <artifactItems>
-                                <artifactItem>
-                                    <groupId>${project.groupId}</groupId>
-                                    <artifactId>${project.artifactId}</artifactId>
-                                    <version>${project.version}</version>
-                                    <type>${project.packaging}</type>
-                                </artifactItem>
-                            </artifactItems>
-                        </configuration>
-                    </execution>
-                </executions>
-            </plugin>
-
-            <plugin>
-                <artifactId>maven-failsafe-plugin</artifactId>
-                <version>${plugin.failsafe.version}</version>
-                <executions>
-                    <execution>
-                        <goals>
-                            <goal>integration-test</goal>
-                            <goal>verify</goal>
-                        </goals>
-                    </execution>
-                </executions>
-                <configuration>
-                    <systemPropertyVariables>
-                        <java.io.tmpdir>${project.build.directory}/tmp</java.io.tmpdir>
-                    </systemPropertyVariables>
-                    <argLine>-Xmx2048m</argLine>
-                </configuration>
-            </plugin>
-
-            <plugin>
-                <groupId>org.apache.maven.plugins</groupId>
-                <artifactId>maven-antrun-plugin</artifactId>
-                <version>${plugin.antrun.version}</version>
-                <executions>
-                    <!-- Cleans up files that tests append to (because we have two test plugins). -->
-                    <execution>
-                        <id>pre-test-clean</id>
-                        <phase>generate-test-resources</phase>
-                        <goals>
-                            <goal>run</goal>
-                        </goals>
-                        <configuration>
-                            <target>
-                                <delete file="${project.build.directory}/unit-tests.log"
-                                        quiet="true"/>
-                                <delete file="${project.build.directory}/jacoco.exec" quiet="true"/>
-                                <delete dir="${project.build.directory}/tmp" quiet="true"/>
-                            </target>
-                        </configuration>
-                    </execution>
-                    <!-- Create the temp directory to be  used by tests. -->
-                    <execution>
-                        <id>create-tmp-dir</id>
-                        <phase>generate-test-resources</phase>
-                        <goals>
-                            <goal>run</goal>
-                        </goals>
-                        <configuration>
-                            <target>
-                                <mkdir dir="${project.build.directory}/tmp"/>
-                            </target>
-                        </configuration>
-                    </execution>
-                </executions>
-            </plugin>
-        </plugins>
-    </build>
+    <relativePath>..</relativePath>
+  </parent>
+
+  <groupId>org.apache.zeppelin</groupId>
+  <artifactId>zeppelin-livy</artifactId>
+  <packaging>jar</packaging>
+  <version>0.7.1-SNAPSHOT</version>
+  <name>Zeppelin: Livy interpreter</name>
+
+  <properties>
+    <!--library versions-->
+    <commons.exec.version>1.3</commons.exec.version>
+    <httpcomponents.client.version>4.3.4</httpcomponents.client.version>
+    <spring.web.version>4.3.0.RELEASE</spring.web.version>
+    <spring.security.kerberosclient>1.0.1.RELEASE</spring.security.kerberosclient>
+
+    <!--test library versions-->
+    <achilles.version>3.2.4-Zeppelin</achilles.version>
+    <assertj.version>1.7.0</assertj.version>
+    <mockito.version>1.9.5</mockito.version>
+    <livy.version>0.2.0</livy.version>
+    <spark.version>1.5.2</spark.version>
+    <hadoop.version>2.6.0</hadoop.version>
+
+    <!--plugin versions-->
+    <plugin.failsafe.version>2.16</plugin.failsafe.version>
+    <plugin.antrun.version>1.8</plugin.antrun.version>
+  </properties>
+
+  <dependencies>
+    <dependency>
+      <groupId>${project.groupId}</groupId>
+      <artifactId>zeppelin-interpreter</artifactId>
+      <version>${project.version}</version>
+      <scope>provided</scope>
+    </dependency>
+
+    <dependency>
+      <groupId>org.apache.commons</groupId>
+      <artifactId>commons-exec</artifactId>
+      <version>${commons.exec.version}</version>
+    </dependency>
+
+    <dependency>
+      <groupId>org.slf4j</groupId>
+      <artifactId>slf4j-api</artifactId>
+    </dependency>
+
+    <dependency>
+      <groupId>org.slf4j</groupId>
+      <artifactId>slf4j-log4j12</artifactId>
+    </dependency>
+
+    <dependency>
+      <groupId>org.apache.httpcomponents</groupId>
+      <artifactId>httpclient</artifactId>
+      <version>${httpcomponents.client.version}</version>
+    </dependency>
+
+    <dependency>
+      <groupId>com.google.code.gson</groupId>
+      <artifactId>gson</artifactId>
+    </dependency>
+
+    <dependency>
+      <groupId>org.springframework.security.kerberos</groupId>
+      <artifactId>spring-security-kerberos-client</artifactId>
+      <version>${spring.security.kerberosclient}</version>
+    </dependency>
+
+    <dependency>
+      <groupId>org.springframework</groupId>
+      <artifactId>spring-web</artifactId>
+      <version>${spring.web.version}</version>
+    </dependency>
+
+    <dependency>
+      <groupId>junit</groupId>
+      <artifactId>junit</artifactId>
+      <scope>test</scope>
+    </dependency>
+
+    <dependency>
+      <groupId>org.assertj</groupId>
+      <artifactId>assertj-core</artifactId>
+      <version>${assertj.version}</version>
+      <scope>test</scope>
+    </dependency>
+    <dependency>
+      <groupId>org.mockito</groupId>
+      <artifactId>mockito-all</artifactId>
+      <version>${mockito.version}</version>
+      <scope>test</scope>
+    </dependency>
+
+    <dependency>
+      <groupId>com.cloudera.livy</groupId>
+      <artifactId>livy-integration-test</artifactId>
+      <version>${livy.version}</version>
+      <scope>test</scope>
+      <exclusions>
+        <exclusion>
+          <groupId>org.xerial.snappy</groupId>
+          <artifactId>snappy-java</artifactId>
+        </exclusion>
+        <exclusion>
+          <groupId>org.apache.spark</groupId>
+          <artifactId>spark-core_2.10</artifactId>
+        </exclusion>
+        <exclusion>
+          <groupId>org.apache.spark</groupId>
+          <artifactId>spark-sql_2.10</artifactId>
+        </exclusion>
+        <exclusion>
+          <groupId>org.apache.spark</groupId>
+          <artifactId>spark-streaming_2.10</artifactId>
+        </exclusion>
+        <exclusion>
+          <groupId>org.apache.spark</groupId>
+          <artifactId>spark-hive_2.10</artifactId>
+        </exclusion>
+        <exclusion>
+          <groupId>org.apache.spark</groupId>
+          <artifactId>spark-repl_2.10</artifactId>
+        </exclusion>
+        <exclusion>
+          <groupId>org.apache.spark</groupId>
+          <artifactId>spark-yarn_2.10</artifactId>
+        </exclusion>
+        <exclusion>
+          <groupId>org.apache.hadoop</groupId>
+          <artifactId>hadoop-auth</artifactId>
+        </exclusion>
+        <exclusion>
+          <groupId>org.apache.hadoop</groupId>
+          <artifactId>hadoop-common</artifactId>
+        </exclusion>
+        <exclusion>
+          <groupId>org.apache.hadoop</groupId>
+          <artifactId>hadoop-hdfs</artifactId>
+        </exclusion>
+        <exclusion>
+          <groupId>org.apache.hadoop</groupId>
+          <artifactId>hadoop-yarn-client</artifactId>
+        </exclusion>
+        <exclusion>
+          <groupId>org.apache.hadoop</groupId>
+          <artifactId>hadoop-client</artifactId>
+        </exclusion>
+        <exclusion>
+          <groupId>org.apache.hadoop</groupId>
+          <artifactId>hadoop-yarn-server-tests</artifactId>
+        </exclusion>
+      </exclusions>
+    </dependency>
+    <dependency>
+      <groupId>com.cloudera.livy</groupId>
+      <artifactId>livy-test-lib</artifactId>
+      <version>${livy.version}</version>
+      <scope>test</scope>
+      <exclusions>
+        <exclusion>
+          <groupId>org.xerial.snappy</groupId>
+          <artifactId>snappy-java</artifactId>
+        </exclusion>
+        <exclusion>
+          <groupId>org.apache.spark</groupId>
+          <artifactId>spark-core_2.10</artifactId>
+        </exclusion>
+        <exclusion>
+          <groupId>org.apache.spark</groupId>
+          <artifactId>spark-sql_2.10</artifactId>
+        </exclusion>
+        <exclusion>
+          <groupId>org.apache.spark</groupId>
+          <artifactId>spark-streaming_2.10</artifactId>
+        </exclusion>
+        <exclusion>
+          <groupId>org.apache.spark</groupId>
+          <artifactId>spark-hive_2.10</artifactId>
+        </exclusion>
+        <exclusion>
+          <groupId>org.apache.spark</groupId>
+          <artifactId>spark-repl_2.10</artifactId>
+        </exclusion>
+        <exclusion>
+          <groupId>org.apache.spark</groupId>
+          <artifactId>spark-yarn_2.10</artifactId>
+        </exclusion>
+      </exclusions>
+    </dependency>
+    <dependency>
+      <groupId>com.cloudera.livy</groupId>
+      <artifactId>livy-core</artifactId>
+      <version>${livy.version}</version>
+      <scope>test</scope>
+      <exclusions>
+        <exclusion>
+          <groupId>org.xerial.snappy</groupId>
+          <artifactId>snappy-java</artifactId>
+        </exclusion>
+        <exclusion>
+          <groupId>org.apache.spark</groupId>
+          <artifactId>spark-core_2.10</artifactId>
+        </exclusion>
+        <exclusion>
+          <groupId>org.apache.spark</groupId>
+          <artifactId>spark-sql_2.10</artifactId>
+        </exclusion>
+        <exclusion>
+          <groupId>org.apache.spark</groupId>
+          <artifactId>spark-streaming_2.10</artifactId>
+        </exclusion>
+        <exclusion>
+          <groupId>org.apache.spark</groupId>
+          <artifactId>spark-hive_2.10</artifactId>
+        </exclusion>
+        <exclusion>
+          <groupId>org.apache.spark</groupId>
+          <artifactId>spark-repl_2.10</artifactId>
+        </exclusion>
+        <exclusion>
+          <groupId>org.apache.spark</groupId>
+          <artifactId>spark-yarn_2.10</artifactId>
+        </exclusion>
+      </exclusions>
+    </dependency>
+
+    <dependency>
+      <groupId>org.apache.spark</groupId>
+      <artifactId>spark-sql_${scala.binary.version}</artifactId>
+      <version>${spark.version}</version>
+      <scope>test</scope>
+    </dependency>
+
+    <dependency>
+      <groupId>org.apache.spark</groupId>
+      <artifactId>spark-streaming_${scala.binary.version}</artifactId>
+      <version>${spark.version}</version>
+      <scope>test</scope>
+    </dependency>
+
+    <dependency>
+      <groupId>org.apache.spark</groupId>
+      <artifactId>spark-hive_${scala.binary.version}</artifactId>
+      <version>${spark.version}</version>
+      <scope>test</scope>
+    </dependency>
+
+    <dependency>
+      <groupId>org.apache.spark</groupId>
+      <artifactId>spark-repl_${scala.binary.version}</artifactId>
+      <version>${spark.version}</version>
+      <scope>test</scope>
+    </dependency>
+
+    <dependency>
+      <groupId>org.apache.spark</groupId>
+      <artifactId>spark-yarn_${scala.binary.version}</artifactId>
+      <version>${spark.version}</version>
+      <scope>test</scope>
+      <exclusions>
+        <exclusion>
+          <groupId>org.apache.hadoop</groupId>
+          <artifactId>hadoop-yarn-common</artifactId>
+        </exclusion>
+        <exclusion>
+          <groupId>org.apache.hadoop</groupId>
+          <artifactId>hadoop-yarn-server-web-proxy</artifactId>
+        </exclusion>
+      </exclusions>
+    </dependency>
+
+    <dependency>
+      <groupId>org.apache.hadoop</groupId>
+      <artifactId>hadoop-auth</artifactId>
+      <version>${hadoop.version}</version>
+      <scope>test</scope>
+    </dependency>
+
+    <dependency>
+      <groupId>org.apache.hadoop</groupId>
+      <artifactId>hadoop-common</artifactId>
+      <version>${hadoop.version}</version>
+      <scope>test</scope>
+    </dependency>
+
+    <dependency>
+      <groupId>org.apache.hadoop</groupId>
+      <artifactId>hadoop-common</artifactId>
+      <classifier>tests</classifier>
+      <version>${hadoop.version}</version>
+      <scope>test</scope>
+    </dependency>
+
+    <dependency>
+      <groupId>org.apache.hadoop</groupId>
+      <artifactId>hadoop-hdfs</artifactId>
+      <version>${hadoop.version}</version>
+      <scope>test</scope>
+    </dependency>
+
+    <dependency>
+      <groupId>org.apache.hadoop</groupId>
+      <artifactId>hadoop-hdfs</artifactId>
+      <classifier>tests</classifier>
+      <version>${hadoop.version}</version>
+      <scope>test</scope>
+    </dependency>
+
+    <dependency>
+      <groupId>org.apache.hadoop</groupId>
+      <artifactId>hadoop-client</artifactId>
+      <version>${hadoop.version}</version>
+      <scope>test</scope>
+    </dependency>
+
+    <dependency>
+      <groupId>org.apache.hadoop</groupId>
+      <artifactId>hadoop-yarn-client</artifactId>
+      <version>${hadoop.version}</version>
+      <scope>test</scope>
+    </dependency>
+
+    <dependency>
+      <groupId>org.apache.hadoop</groupId>
+      <artifactId>hadoop-yarn-api</artifactId>
+      <version>${hadoop.version}</version>
+      <scope>test</scope>
+    </dependency>
+
+    <dependency>
+      <groupId>org.apache.hadoop</groupId>
+      <artifactId>hadoop-yarn-server-tests</artifactId>
+      <classifier>tests</classifier>
+      <version>${hadoop.version}</version>
+      <scope>test</scope>
+    </dependency>
+  </dependencies>
+
+  <repositories>
+    <repository>
+      <id>ossrh</id>
+      <name>ossrh repository</name>
+      <url>https://oss.sonatype.org/content/repositories/releases/</url>
+      <releases>
+        <enabled>true</enabled>
+      </releases>
+      <snapshots>
+        <enabled>false</enabled>
+      </snapshots>
+    </repository>
+  </repositories>
+
+  <build>
+    <plugins>
+      <plugin>
+        <artifactId>maven-enforcer-plugin</artifactId>
+        <executions>
+          <execution>
+            <id>enforce</id>
+            <phase>none</phase>
+          </execution>
+        </executions>
+      </plugin>
+
+      <plugin>
+        <artifactId>maven-dependency-plugin</artifactId>
+        <executions>
+          <execution>
+            <id>copy-dependencies</id>
+            <phase>package</phase>
+            <goals>
+              <goal>copy-dependencies</goal>
+            </goals>
+            <configuration>
+              <outputDirectory>${project.build.directory}/../../interpreter/livy
+              </outputDirectory>
+              <overWriteReleases>false</overWriteReleases>
+              <overWriteSnapshots>false</overWriteSnapshots>
+              <overWriteIfNewer>true</overWriteIfNewer>
+              <includeScope>runtime</includeScope>
+            </configuration>
+          </execution>
+          <execution>
+            <id>copy-artifact</id>
+            <phase>package</phase>
+            <goals>
+              <goal>copy</goal>
+            </goals>
+            <configuration>
+              <outputDirectory>${project.build.directory}/../../interpreter/livy
+              </outputDirectory>
+              <overWriteReleases>false</overWriteReleases>
+              <overWriteSnapshots>false</overWriteSnapshots>
+              <overWriteIfNewer>true</overWriteIfNewer>
+              <includeScope>runtime</includeScope>
+              <artifactItems>
+                <artifactItem>
+                  <groupId>${project.groupId}</groupId>
+                  <artifactId>${project.artifactId}</artifactId>
+                  <version>${project.version}</version>
+                  <type>${project.packaging}</type>
+                </artifactItem>
+              </artifactItems>
+            </configuration>
+          </execution>
+        </executions>
+      </plugin>
+
+      <plugin>
+        <artifactId>maven-failsafe-plugin</artifactId>
+        <version>${plugin.failsafe.version}</version>
+        <executions>
+          <execution>
+            <goals>
+              <goal>integration-test</goal>
+              <goal>verify</goal>
+            </goals>
+          </execution>
+        </executions>
+        <configuration>
+          <systemPropertyVariables>
+            <java.io.tmpdir>${project.build.directory}/tmp</java.io.tmpdir>
+          </systemPropertyVariables>
+          <argLine>-Xmx2048m</argLine>
+        </configuration>
+      </plugin>
+
+      <plugin>
+        <groupId>org.apache.maven.plugins</groupId>
+        <artifactId>maven-antrun-plugin</artifactId>
+        <version>${plugin.antrun.version}</version>
+        <executions>
+          <!-- Cleans up files that tests append to (because we have two test plugins). -->
+          <execution>
+            <id>pre-test-clean</id>
+            <phase>generate-test-resources</phase>
+            <goals>
+              <goal>run</goal>
+            </goals>
+            <configuration>
+              <target>
+                <delete file="${project.build.directory}/unit-tests.log"
+                        quiet="true"/>
+                <delete file="${project.build.directory}/jacoco.exec" quiet="true"/>
+                <delete dir="${project.build.directory}/tmp" quiet="true"/>
+              </target>
+            </configuration>
+          </execution>
+          <!-- Create the temp directory to be  used by tests. -->
+          <execution>
+            <id>create-tmp-dir</id>
+            <phase>generate-test-resources</phase>
+            <goals>
+              <goal>run</goal>
+            </goals>
+            <configuration>
+              <target>
+                <mkdir dir="${project.build.directory}/tmp"/>
+              </target>
+            </configuration>
+          </execution>
+        </executions>
+      </plugin>
+    </plugins>
+  </build>
 
 </project>
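The three `com.cloudera.livy` test dependencies above each repeat the same block of Spark 2.10 exclusions, which is easy to get out of sync when editing by hand. As a small illustrative sketch (not part of the commit; the function name and the inline POM fragment are hypothetical), one can check such exclusions mechanically with Python's stdlib XML parser:

```python
import xml.etree.ElementTree as ET

# Minimal inline fragment mirroring the exclusion structure repeated for
# livy-integration-test, livy-test-lib and livy-core in the POM above.
POM_FRAGMENT = """
<dependencies>
  <dependency>
    <groupId>com.cloudera.livy</groupId>
    <artifactId>livy-core</artifactId>
    <exclusions>
      <exclusion>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
      </exclusion>
    </exclusions>
  </dependency>
</dependencies>
"""

def livy_deps_missing_exclusion(pom_xml, excluded_artifact):
    """Return artifactIds of com.cloudera.livy dependencies that do not
    exclude the given artifact."""
    root = ET.fromstring(pom_xml)
    missing = []
    for dep in root.findall("dependency"):
        if dep.findtext("groupId") != "com.cloudera.livy":
            continue
        excluded = {exc.findtext("artifactId")
                    for exc in dep.findall("exclusions/exclusion")}
        if excluded_artifact not in excluded:
            missing.append(dep.findtext("artifactId"))
    return missing

print(livy_deps_missing_exclusion(POM_FRAGMENT, "spark-core_2.10"))  # prints []
```

A real check would parse the actual `livy/pom.xml` (note that full POM files carry the Maven XML namespace, which the fragment above omits for brevity).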

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/11e897df/pom.xml
----------------------------------------------------------------------
diff --git a/pom.xml b/pom.xml
index aec34c9..a7e7a5b 100644
--- a/pom.xml
+++ b/pom.xml
@@ -309,6 +309,32 @@
         </configuration>
       </plugin>
 
+      <plugin>
+        <groupId>org.codehaus.mojo</groupId>
+        <artifactId>flatten-maven-plugin</artifactId>
+        <version>1.0.0</version>
+        <configuration>
+          <flattenMode>ossrh</flattenMode>
+          <updatePomFile>true</updatePomFile>
+        </configuration>
+        <executions>
+          <execution>
+            <id>flatten</id>
+            <phase>process-resources</phase>
+            <goals>
+              <goal>flatten</goal>
+            </goals>
+          </execution>
+          <execution>
+            <id>flatten.clean</id>
+            <phase>clean</phase>
+            <goals>
+              <goal>clean</goal>
+            </goals>
+          </execution>
+        </executions>
+      </plugin>
+
       <!-- Test coverage plugin -->
       <plugin>
         <groupId>org.codehaus.mojo</groupId>
@@ -476,7 +502,7 @@
         <version>${plugin.deploy.version}</version>
       </plugin>
 
-      <!--TODO(alex): make part of the build and reconcile conflicts
+    <!--TODO(alex): make part of the build and reconcile conflicts
     <plugin>
       <groupId>com.ning.maven.plugins</groupId>
       <artifactId>maven-duplicate-finder-plugin</artifactId>
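The flatten-maven-plugin block added above binds the `flatten` goal to the `process-resources` phase, so each module writes a `.flattened-pom.xml` (used for install/deploy) with parent references and version properties resolved. As a hedged illustration only (hypothetical excerpt, not actual build output), a dependency such as the Livy module's commons-exec entry would appear in the flattened POM with its property expanded:

```xml
<!-- Hypothetical excerpt from .flattened-pom.xml:
     ${commons.exec.version} resolved to its concrete value -->
<dependency>
  <groupId>org.apache.commons</groupId>
  <artifactId>commons-exec</artifactId>
  <version>1.3</version>
</dependency>
```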


[11/23] zeppelin git commit: ZEPPELIN-2219 Apache Ignite version updated to 1.9

Posted by mi...@apache.org.
ZEPPELIN-2219 Apache Ignite version updated to 1.9

### What is this PR for?
Updates the Apache Ignite version to 1.9 in the Ignite interpreter.

### What type of PR is it?
[Improvement]

### What is the Jira issue?
ZEPPELIN-2219

Author: agura <ag...@apache.org>

Closes #2101 from agura/ZEPPELIN-2219 and squashes the following commits:

7f053d7 [agura] ZEPPELIN-2219 Apache Ignite version updated to 1.9

(cherry picked from commit f0cf85f09b960e49faa1627c959d3a8b11595ccc)
Signed-off-by: ahyoungryu <ah...@apache.org>


Project: http://git-wip-us.apache.org/repos/asf/zeppelin/repo
Commit: http://git-wip-us.apache.org/repos/asf/zeppelin/commit/aca96c31
Tree: http://git-wip-us.apache.org/repos/asf/zeppelin/tree/aca96c31
Diff: http://git-wip-us.apache.org/repos/asf/zeppelin/diff/aca96c31

Branch: refs/heads/branch-0.7
Commit: aca96c3191309c5b8296ebf7d2716531e03bdf12
Parents: 6d72db3
Author: agura <ag...@apache.org>
Authored: Mon Mar 6 21:15:37 2017 +0300
Committer: ahyoungryu <ah...@apache.org>
Committed: Mon Mar 13 20:12:51 2017 +0900

----------------------------------------------------------------------
 docs/install/build.md | 2 +-
 ignite/pom.xml        | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/zeppelin/blob/aca96c31/docs/install/build.md
----------------------------------------------------------------------
diff --git a/docs/install/build.md b/docs/install/build.md
index e452559..985f2da 100644
--- a/docs/install/build.md
+++ b/docs/install/build.md
@@ -210,7 +210,7 @@ mvn clean package -Pspark-1.5 -Pmapr50 -DskipTests
 Ignite Interpreter
 
 ```bash
-mvn clean package -Dignite.version=1.8.0 -DskipTests
+mvn clean package -Dignite.version=1.9.0 -DskipTests
 ```
 
 Scalding Interpreter

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/aca96c31/ignite/pom.xml
----------------------------------------------------------------------
diff --git a/ignite/pom.xml b/ignite/pom.xml
index 2dd5442..91fdacc 100644
--- a/ignite/pom.xml
+++ b/ignite/pom.xml
@@ -32,7 +32,7 @@
   <name>Zeppelin: Apache Ignite interpreter</name>
 
   <properties>
-    <ignite.version>1.8.0</ignite.version>
+    <ignite.version>1.9.0</ignite.version>
   </properties>
 
   <dependencies>


[15/23] zeppelin git commit: ZEPPELIN-2241: JDBC interpreter throws npe on connecting to any db that has a schema with "null" name

Posted by mi...@apache.org.
ZEPPELIN-2241: JDBC interpreter throws npe on connecting to any db that has a schema with "null" name

Prevents JDBC interpreter from throwing a stacktrace when the database has a schema with no name (null).
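The guard added in `SqlCompleter` can be sketched in isolation. The class and method below are illustrative only, not Zeppelin's actual API:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch of the SqlCompleter fix: schema names read from
// DatabaseMetaData.getSchemas() may be null (Apache Phoenix's default
// schema is one example). Normalizing null to "" before the blank check
// avoids the NullPointerException.
public class SchemaNameGuard {
  public static List<String> schemaCompletions(List<String> rawSchemaNames) {
    List<String> names = new ArrayList<>();
    for (String schemaName : rawSchemaNames) {
      if (schemaName == null) {
        schemaName = "";  // the added guard: treat a null name as blank
      }
      if (!schemaName.trim().isEmpty()) {
        names.add(schemaName + ".");  // completion candidates end with a dot
      }
    }
    return names;
  }
}
```

With this guard, a schema list such as `["PUBLIC", null, "SYS"]` produces the completions `["PUBLIC.", "SYS."]` instead of failing on the null entry.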

[Bug Fix]

https://issues.apache.org/jira/browse/ZEPPELIN-2241

Use JDBC interpreter to connect to any database that has a schema without a name. Apache Phoenix in particular has such a schema by default.

* Does the licenses files need update?

No

* Is there breaking changes for older versions?

No

* Does this needs documentation?

No

Author: Randy Gelhausen <rg...@gmail.com>

Closes #2117 from randerzander/master and squashes the following commits:

49d33f9 [Randy Gelhausen] Removing comment per feedback
79d8a23 [Randy Gelhausen] Added comment to the change
0101296 [Randy Gelhausen] ZEPPELIN-2241: JDBC interpreter throws npe on connecting to any db that has a schema with "null" name

(cherry picked from commit 623b4ace9e5c8f1667bd34c21b944b9d4636a2bd)
Signed-off-by: Lee moon soo <mo...@apache.org>


Project: http://git-wip-us.apache.org/repos/asf/zeppelin/repo
Commit: http://git-wip-us.apache.org/repos/asf/zeppelin/commit/b00e27c7
Tree: http://git-wip-us.apache.org/repos/asf/zeppelin/tree/b00e27c7
Diff: http://git-wip-us.apache.org/repos/asf/zeppelin/diff/b00e27c7

Branch: refs/heads/branch-0.7
Commit: b00e27c732dd0ad9ca9bbbfab2d6c61ca105917f
Parents: bfa812a
Author: Randy Gelhausen <rg...@gmail.com>
Authored: Mon Mar 13 12:46:47 2017 -0400
Committer: Lee moon soo <mo...@apache.org>
Committed: Wed Mar 15 08:22:36 2017 -0700

----------------------------------------------------------------------
 jdbc/src/main/java/org/apache/zeppelin/jdbc/SqlCompleter.java | 2 ++
 1 file changed, 2 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/zeppelin/blob/b00e27c7/jdbc/src/main/java/org/apache/zeppelin/jdbc/SqlCompleter.java
----------------------------------------------------------------------
diff --git a/jdbc/src/main/java/org/apache/zeppelin/jdbc/SqlCompleter.java b/jdbc/src/main/java/org/apache/zeppelin/jdbc/SqlCompleter.java
index e5c8987..f282992 100644
--- a/jdbc/src/main/java/org/apache/zeppelin/jdbc/SqlCompleter.java
+++ b/jdbc/src/main/java/org/apache/zeppelin/jdbc/SqlCompleter.java
@@ -217,6 +217,8 @@ public class SqlCompleter extends StringsCompleter {
       try {
         while (schemas.next()) {
           String schemaName = schemas.getString("TABLE_SCHEM");
+          if (schemaName == null)
+            schemaName = "";
           if (!isBlank(schemaName)) {
             names.add(schemaName + ".");
           }


[08/23] zeppelin git commit: [HOTFIX][ZEPPELIN-2178] Prevent from cleaning output in "Personalized Mode"

Posted by mi...@apache.org.
[HOTFIX][ZEPPELIN-2178] Prevent from cleaning output in "Personalized Mode"


Project: http://git-wip-us.apache.org/repos/asf/zeppelin/repo
Commit: http://git-wip-us.apache.org/repos/asf/zeppelin/commit/42385220
Tree: http://git-wip-us.apache.org/repos/asf/zeppelin/tree/42385220
Diff: http://git-wip-us.apache.org/repos/asf/zeppelin/diff/42385220

Branch: refs/heads/branch-0.7
Commit: 42385220e6d43e7757429f4a10df2a2e6fcadbe4
Parents: 11e897d
Author: Jongyoul Lee <jo...@gmail.com>
Authored: Tue Mar 7 14:46:33 2017 +0900
Committer: Jongyoul Lee <jo...@gmail.com>
Committed: Tue Mar 7 14:47:59 2017 +0900

----------------------------------------------------------------------
 .../apache/zeppelin/socket/NotebookServer.java  |  34 +++++-
 .../java/org/apache/zeppelin/notebook/Note.java |   9 ++
 .../org/apache/zeppelin/notebook/Paragraph.java |  32 ++++--
 .../apache/zeppelin/notebook/ParagraphTest.java | 107 +++++++++++++++++++
 4 files changed, 172 insertions(+), 10 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/zeppelin/blob/42385220/zeppelin-server/src/main/java/org/apache/zeppelin/socket/NotebookServer.java
----------------------------------------------------------------------
diff --git a/zeppelin-server/src/main/java/org/apache/zeppelin/socket/NotebookServer.java b/zeppelin-server/src/main/java/org/apache/zeppelin/socket/NotebookServer.java
index e2ffa0a..6791b63 100644
--- a/zeppelin-server/src/main/java/org/apache/zeppelin/socket/NotebookServer.java
+++ b/zeppelin-server/src/main/java/org/apache/zeppelin/socket/NotebookServer.java
@@ -1127,6 +1127,17 @@ public class NotebookServer extends WebSocketServlet
     p.setConfig(config);
     p.setTitle((String) fromMessage.get("title"));
     p.setText((String) fromMessage.get("paragraph"));
+
+    subject = new AuthenticationInfo(fromMessage.principal);
+    if (note.isPersonalizedMode()) {
+      p = p.getUserParagraph(subject.getUser());
+      p.settings.setParams(params);
+      p.setConfig(config);
+      p.setTitle((String) fromMessage.get("title"));
+      p.setText((String) fromMessage.get("paragraph"));
+    }
+
+
     note.persist(subject);
 
     if (note.isPersonalizedMode()) {
@@ -1647,6 +1658,15 @@ public class NotebookServer extends WebSocketServlet
     p.settings.setParams(params);
     p.setConfig(config);
 
+    if (note.isPersonalizedMode()) {
+      p = note.getParagraph(paragraphId);
+      p.setText(text);
+      p.setTitle(title);
+      p.setAuthenticationInfo(subject);
+      p.settings.setParams(params);
+      p.setConfig(config);
+    }
+
     return p;
   }
 
@@ -1767,7 +1787,15 @@ public class NotebookServer extends WebSocketServlet
       InterpreterResult.Type type, String output) {
     Message msg = new Message(OP.PARAGRAPH_UPDATE_OUTPUT).put("noteId", noteId)
         .put("paragraphId", paragraphId).put("index", index).put("type", type).put("data", output);
-    broadcast(noteId, msg);
+    Note note = notebook().getNote(noteId);
+    if (note.isPersonalizedMode()) {
+      String user = note.getParagraph(paragraphId).getUser();
+      if (null != user) {
+        multicastToUser(user, msg);
+      }
+    } else {
+      broadcast(noteId, msg);
+    }
   }
 
 
@@ -2036,7 +2064,9 @@ public class NotebookServer extends WebSocketServlet
         }
       }
       if (job instanceof Paragraph) {
-        notebookServer.broadcastParagraph(note, (Paragraph) job);
+        Paragraph p = (Paragraph) job;
+        p.setStatusToUserParagraph(job.getStatus());
+        notebookServer.broadcastParagraph(note, p);
       }
       try {
         notebookServer.broadcastUpdateNoteJobInfo(System.currentTimeMillis() - 5000);

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/42385220/zeppelin-zengine/src/main/java/org/apache/zeppelin/notebook/Note.java
----------------------------------------------------------------------
diff --git a/zeppelin-zengine/src/main/java/org/apache/zeppelin/notebook/Note.java b/zeppelin-zengine/src/main/java/org/apache/zeppelin/notebook/Note.java
index 35f32f3..f341e16 100644
--- a/zeppelin-zengine/src/main/java/org/apache/zeppelin/notebook/Note.java
+++ b/zeppelin-zengine/src/main/java/org/apache/zeppelin/notebook/Note.java
@@ -137,6 +137,15 @@ public class Note implements Serializable, ParagraphJobListener {
       valueString = "false";
     }
     getConfig().put("personalizedMode", valueString);
+    clearUserParagraphs(value);
+  }
+
+  private void clearUserParagraphs(boolean isPersonalized) {
+    if (!isPersonalized) {
+      for (Paragraph p : paragraphs) {
+        p.clearUserParagraphs();
+      }
+    }
   }
 
   public String getId() {

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/42385220/zeppelin-zengine/src/main/java/org/apache/zeppelin/notebook/Paragraph.java
----------------------------------------------------------------------
diff --git a/zeppelin-zengine/src/main/java/org/apache/zeppelin/notebook/Paragraph.java b/zeppelin-zengine/src/main/java/org/apache/zeppelin/notebook/Paragraph.java
index f609ecb..cb6e0c7 100644
--- a/zeppelin-zengine/src/main/java/org/apache/zeppelin/notebook/Paragraph.java
+++ b/zeppelin-zengine/src/main/java/org/apache/zeppelin/notebook/Paragraph.java
@@ -49,6 +49,7 @@ import com.google.common.annotations.VisibleForTesting;
  * Paragraph is a representation of an execution unit.
  */
 public class Paragraph extends Job implements Serializable, Cloneable {
+
   private static final long serialVersionUID = -6328572073497992016L;
 
   private static Logger logger = LoggerFactory.getLogger(Paragraph.class);
@@ -123,6 +124,9 @@ public class Paragraph extends Job implements Serializable, Cloneable {
   }
 
   public Paragraph getUserParagraph(String user) {
+    if (!userParagraphMap.containsKey(user)) {
+      cloneParagraphForUser(user);
+    }
     return userParagraphMap.get(user);
   }
 
@@ -139,12 +143,16 @@ public class Paragraph extends Job implements Serializable, Cloneable {
     p.setTitle(getTitle());
     p.setText(getText());
     p.setResult(getReturn());
-    p.setStatus(getStatus());
+    p.setStatus(Status.READY);
     p.setId(getId());
     addUser(p, user);
     return p;
   }
 
+  public void clearUserParagraphs() {
+    userParagraphMap.clear();
+  }
+
   public void addUser(Paragraph p, String user) {
     userParagraphMap.put(user, p);
   }
@@ -370,6 +378,10 @@ public class Paragraph extends Job implements Serializable, Cloneable {
       }
     }
 
+    for (Paragraph p : userParagraphMap.values()) {
+      p.setText(getText());
+    }
+
     String script = getScriptBody();
     // inject form
     if (repl.getFormType() == FormType.NATIVE) {
@@ -401,13 +413,9 @@ public class Paragraph extends Job implements Serializable, Cloneable {
       List<InterpreterResultMessage> resultMessages = context.out.toInterpreterResultMessage();
       resultMessages.addAll(ret.message());
 
-      for (Paragraph p : userParagraphMap.values()) {
-        p.setText(getText());
-      }
-
       InterpreterResult res = new InterpreterResult(ret.code(), resultMessages);
 
-      Paragraph p = userParagraphMap.get(getUser());
+      Paragraph p = getUserParagraph(getUser());
       if (null != p) {
         p.setResult(res);
         p.settings.setParams(settings.getParams());
@@ -526,12 +534,12 @@ public class Paragraph extends Job implements Serializable, Cloneable {
     Credentials credentials = note.getCredentials();
     if (authenticationInfo != null) {
       UserCredentials userCredentials =
-              credentials.getUserCredentials(authenticationInfo.getUser());
+          credentials.getUserCredentials(authenticationInfo.getUser());
       authenticationInfo.setUserCredentials(userCredentials);
     }
 
     InterpreterContext interpreterContext =
-            new InterpreterContext(note.getId(), getId(), getRequiredReplName(), this.getTitle(),
+        new InterpreterContext(note.getId(), getId(), getRequiredReplName(), this.getTitle(),
             this.getText(), this.getAuthenticationInfo(), this.getConfig(), this.settings, registry,
             resourcePool, runners, output);
     return interpreterContext;
@@ -574,7 +582,15 @@ public class Paragraph extends Job implements Serializable, Cloneable {
     return new ParagraphRunner(note, note.getId(), getId());
   }
 
+  public void setStatusToUserParagraph(Status status) {
+    String user = getUser();
+    if (null != user) {
+      getUserParagraph(getUser()).setStatus(status);
+    }
+  }
+
   static class ParagraphRunner extends InterpreterContextRunner {
+
     private transient Note note;
 
     public ParagraphRunner(Note note, String noteId, String paragraphId) {

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/42385220/zeppelin-zengine/src/test/java/org/apache/zeppelin/notebook/ParagraphTest.java
----------------------------------------------------------------------
diff --git a/zeppelin-zengine/src/test/java/org/apache/zeppelin/notebook/ParagraphTest.java b/zeppelin-zengine/src/test/java/org/apache/zeppelin/notebook/ParagraphTest.java
index 69577e9..0e77846 100644
--- a/zeppelin-zengine/src/test/java/org/apache/zeppelin/notebook/ParagraphTest.java
+++ b/zeppelin-zengine/src/test/java/org/apache/zeppelin/notebook/ParagraphTest.java
@@ -19,22 +19,48 @@ package org.apache.zeppelin.notebook;
 
 
 import static org.junit.Assert.assertEquals;
+import static org.junit.Assert.assertNotEquals;
+import static org.junit.Assert.assertNotNull;
+import static org.mockito.Matchers.any;
+import static org.mockito.Matchers.anyObject;
 import static org.mockito.Matchers.anyString;
 import static org.mockito.Matchers.eq;
+import static org.mockito.Mockito.doNothing;
+import static org.mockito.Mockito.doReturn;
 import static org.mockito.Mockito.mock;
+import static org.mockito.Mockito.spy;
 import static org.mockito.Mockito.verify;
 import static org.mockito.Mockito.when;
 
+import com.google.common.collect.Lists;
+import java.util.List;
 import org.apache.zeppelin.display.AngularObject;
 import org.apache.zeppelin.display.AngularObjectBuilder;
 import org.apache.zeppelin.display.AngularObjectRegistry;
 import org.apache.zeppelin.display.Input;
 import org.apache.zeppelin.interpreter.Interpreter;
+import org.apache.zeppelin.interpreter.Interpreter.FormType;
+import org.apache.zeppelin.interpreter.InterpreterContext;
 import org.apache.zeppelin.interpreter.InterpreterFactory;
+import org.apache.zeppelin.interpreter.InterpreterGroup;
+import org.apache.zeppelin.interpreter.InterpreterOption;
+import org.apache.zeppelin.interpreter.InterpreterResult;
+import org.apache.zeppelin.interpreter.InterpreterResult.Code;
+import org.apache.zeppelin.interpreter.InterpreterResult.Type;
+import org.apache.zeppelin.interpreter.InterpreterResultMessage;
+import org.apache.zeppelin.interpreter.InterpreterSetting;
+import org.apache.zeppelin.interpreter.InterpreterSetting.Status;
+import org.apache.zeppelin.interpreter.InterpreterSettingManager;
+import org.apache.zeppelin.resource.ResourcePool;
+import org.apache.zeppelin.scheduler.JobListener;
+import org.apache.zeppelin.user.AuthenticationInfo;
+import org.apache.zeppelin.user.Credentials;
 import org.junit.Test;
 
 import java.util.HashMap;
 import java.util.Map;
+import org.mockito.ArgumentCaptor;
+import org.mockito.Mockito;
 
 public class ParagraphTest {
   @Test
@@ -125,4 +151,85 @@ public class ParagraphTest {
     verify(registry).get("age", noteId, null);
     assertEquals(actual, expected);
   }
+
+  @Test
+  public void returnDefaultParagraphWithNewUser() {
+    Paragraph p = new Paragraph("para_1", null, null, null, null);
+    Object defaultValue = "Default Value";
+    p.setResult(defaultValue);
+    Paragraph newUserParagraph = p.getUserParagraph("new_user");
+    assertNotNull(newUserParagraph);
+    assertEquals(defaultValue, newUserParagraph.getReturn());
+  }
+
+  @Test
+  public void returnUnchangedResultsWithDifferentUser() throws Throwable {
+    InterpreterSettingManager mockInterpreterSettingManager = mock(InterpreterSettingManager.class);
+    Note mockNote = mock(Note.class);
+    when(mockNote.getCredentials()).thenReturn(mock(Credentials.class));
+    Paragraph spyParagraph = spy(new Paragraph("para_1", mockNote,  null, null, mockInterpreterSettingManager));
+
+    doReturn("spy").when(spyParagraph).getRequiredReplName();
+
+
+    Interpreter mockInterpreter = mock(Interpreter.class);
+    doReturn(mockInterpreter).when(spyParagraph).getRepl(anyString());
+
+    InterpreterGroup mockInterpreterGroup = mock(InterpreterGroup.class);
+    when(mockInterpreter.getInterpreterGroup()).thenReturn(mockInterpreterGroup);
+    when(mockInterpreterGroup.getId()).thenReturn("mock_id_1");
+    when(mockInterpreterGroup.getAngularObjectRegistry()).thenReturn(mock(AngularObjectRegistry.class));
+    when(mockInterpreterGroup.getResourcePool()).thenReturn(mock(ResourcePool.class));
+
+    List<InterpreterSetting> spyInterpreterSettingList = spy(Lists.<InterpreterSetting>newArrayList());
+    InterpreterSetting mockInterpreterSetting = mock(InterpreterSetting.class);
+    InterpreterOption mockInterpreterOption = mock(InterpreterOption.class);
+    when(mockInterpreterSetting.getOption()).thenReturn(mockInterpreterOption);
+    when(mockInterpreterOption.permissionIsSet()).thenReturn(false);
+    when(mockInterpreterSetting.getStatus()).thenReturn(Status.READY);
+    when(mockInterpreterSetting.getId()).thenReturn("mock_id_1");
+    when(mockInterpreterSetting.getInterpreterGroup(anyString(), anyString())).thenReturn(mockInterpreterGroup);
+    spyInterpreterSettingList.add(mockInterpreterSetting);
+    when(mockNote.getId()).thenReturn("any_id");
+    when(mockInterpreterSettingManager.getInterpreterSettings(anyString())).thenReturn(spyInterpreterSettingList);
+
+    doReturn("spy script body").when(spyParagraph).getScriptBody();
+
+    when(mockInterpreter.getFormType()).thenReturn(FormType.NONE);
+
+    ParagraphJobListener mockJobListener = mock(ParagraphJobListener.class);
+    doReturn(mockJobListener).when(spyParagraph).getListener();
+    doNothing().when(mockJobListener).onOutputUpdateAll(Mockito.<Paragraph>any(), Mockito.anyList());
+
+    InterpreterResult mockInterpreterResult = mock(InterpreterResult.class);
+    when(mockInterpreter.interpret(anyString(), Mockito.<InterpreterContext>any())).thenReturn(mockInterpreterResult);
+    when(mockInterpreterResult.code()).thenReturn(Code.SUCCESS);
+
+
+    // Actual test
+    List<InterpreterResultMessage> result1 = Lists.newArrayList();
+    result1.add(new InterpreterResultMessage(Type.TEXT, "result1"));
+    when(mockInterpreterResult.message()).thenReturn(result1);
+
+    AuthenticationInfo user1 = new AuthenticationInfo("user1");
+    spyParagraph.setAuthenticationInfo(user1);
+    spyParagraph.jobRun();
+    Paragraph p1 = spyParagraph.getUserParagraph(user1.getUser());
+
+    List<InterpreterResultMessage> result2 = Lists.newArrayList();
+    result2.add(new InterpreterResultMessage(Type.TEXT, "result2"));
+    when(mockInterpreterResult.message()).thenReturn(result2);
+
+    AuthenticationInfo user2 = new AuthenticationInfo("user2");
+    spyParagraph.setAuthenticationInfo(user2);
+    spyParagraph.jobRun();
+    Paragraph p2 = spyParagraph.getUserParagraph(user2.getUser());
+
+    assertNotEquals(p1.getReturn().toString(), p2.getReturn().toString());
+
+    assertEquals(p1, spyParagraph.getUserParagraph(user1.getUser()));
+
+
+
+  }
 }


[09/23] zeppelin git commit: [ZEPPELIN-2172] Redirect to home if notebook authentication fails in realtime

Posted by mi...@apache.org.
[ZEPPELIN-2172] Redirect to home if notebook authentication fails in realtime

### What is this PR for?
Redirects to the home page if a user dismisses the access-failure dialog on a notebook.

### What type of PR is it?
[Bug Fix]

### What is the Jira issue?
* [ZEPPELIN-2172](https://issues.apache.org/jira/browse/ZEPPELIN-2172)

### How should this be tested?
1. Create a notebook with qa_user user as the owner
2. Give write permissions to user test_user1, and read permissions to user test_user3
3. Now in another tab, open the notebook with test_user1 user who has write permissions
4. In the original tab, have qa_user (the owner of the notebook) remove the write permissions from test_user1 and grant them to some other user, test_user5.
5. Go to the other tab where test_user1 was logged in. An error message is shown.
6. On clicking the close button, the UI should redirect to the homepage.

### Screenshots (if appropriate)

### Questions:
* Does the licenses files need update? No
* Is there breaking changes for older versions? No
* Does this needs documentation? No

Author: ess_ess <sr...@gmail.com>

Closes #2087 from sravan-s/ZEPPELIN-2172 and squashes the following commits:

0b9b1dd [ess_ess] Add explanatory comment
b73cc73 [ess_ess] [ZEPPELIN-2172] Redirect to home if auth fails

(cherry picked from commit 616301ae4762264a33623b3005a4397db23b215b)
Signed-off-by: Lee moon soo <mo...@apache.org>


Project: http://git-wip-us.apache.org/repos/asf/zeppelin/repo
Commit: http://git-wip-us.apache.org/repos/asf/zeppelin/commit/c7847c11
Tree: http://git-wip-us.apache.org/repos/asf/zeppelin/tree/c7847c11
Diff: http://git-wip-us.apache.org/repos/asf/zeppelin/diff/c7847c11

Branch: refs/heads/branch-0.7
Commit: c7847c1194ad008970c1e5cc3fcd0c9bf08ebfd7
Parents: 4238522
Author: ess_ess <sr...@gmail.com>
Authored: Sun Mar 5 16:17:39 2017 +0530
Committer: Lee moon soo <mo...@apache.org>
Committed: Tue Mar 7 15:29:13 2017 +0900

----------------------------------------------------------------------
 .../src/components/websocketEvents/websocketEvents.factory.js  | 6 +++++-
 1 file changed, 5 insertions(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/zeppelin/blob/c7847c11/zeppelin-web/src/components/websocketEvents/websocketEvents.factory.js
----------------------------------------------------------------------
diff --git a/zeppelin-web/src/components/websocketEvents/websocketEvents.factory.js b/zeppelin-web/src/components/websocketEvents/websocketEvents.factory.js
index d41daaa..a4a789f 100644
--- a/zeppelin-web/src/components/websocketEvents/websocketEvents.factory.js
+++ b/zeppelin-web/src/components/websocketEvents/websocketEvents.factory.js
@@ -89,7 +89,11 @@ function websocketEvents($rootScope, $websocket, $location, baseUrlSrv) {
           label: 'Cancel',
           action: function(dialog) {
             dialog.close();
-            $location.path('/');
+            // using $rootScope.apply to trigger angular digest cycle
+            // changing $location.path inside bootstrap modal wont trigger digest
+            $rootScope.$apply(function() {
+              $location.path('/');
+            });
           }
         }];
       }


[21/23] zeppelin git commit: [ZEPPELIN-2179] "clear output" paragraph doesn't work in personalized mode (branch-0.7)

Posted by mi...@apache.org.
[ZEPPELIN-2179] "clear output" paragraph doesn't work in personalized mode (branch-0.7)

### What is this PR for?

`clear output` (`cmd` + `opt` + `L`) doesn't work in the personalized mode.

- CI failure might be related with https://github.com/apache/zeppelin/pull/2103
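The fix routes the clear through the per-user copy of the paragraph when personalized mode is on, while still returning the original paragraph for broadcasting. A minimal sketch of that pattern (class and field names are illustrative, not Zeppelin's API):

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative model of Note.clearParagraphOutput(paragraphId, user): when a
// user is given (personalized mode), clear that user's private copy; the
// original paragraph is what gets broadcast back to clients either way.
public class PersonalizedClearSketch {
  static class Para {
    String result;
    Map<String, Para> userCopies = new HashMap<>();
  }

  public static Para clearOutput(Para original, String user) {
    Para target = (user != null && original.userCopies.containsKey(user))
        ? original.userCopies.get(user)   // personalized mode: user's copy
        : original;                       // normal mode: shared paragraph
    target.result = null;
    return original;  // broadcastParagraph requires the original paragraph
  }
}
```

In normal mode `user` is `null` and the shared paragraph itself is cleared, matching the `clearParagraphOutput(p1.getId(), null)` call in the updated test.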

### What type of PR is it?
[Bug Fix]

### Todos

NONE

### What is the Jira issue?

[ZEPPELIN-2179](https://issues.apache.org/jira/browse/ZEPPELIN-2179)

### How should this be tested?

1. Execute a paragraph
2. Click the `clear output` button (`CMD` + `OPT` + `L`)

### Screenshots (if appropriate)

NONE

### Questions:
* Does the licenses files need update? - NO
* Is there breaking changes for older versions? - NO
* Does this needs documentation? - NO

Author: 1ambda <1a...@gmail.com>

Closes #2116 from 1ambda/ZEPPELIN-2179/clear-output-doesnt-work-in-person-mode and squashes the following commits:

e2593f6 [1ambda] fix Remove unused param
65d1147 [1ambda] fix: Remove unused parameter
0386c5c [1ambda] fix: Clear personalized paragraph output


Project: http://git-wip-us.apache.org/repos/asf/zeppelin/repo
Commit: http://git-wip-us.apache.org/repos/asf/zeppelin/commit/75cf72e9
Tree: http://git-wip-us.apache.org/repos/asf/zeppelin/tree/75cf72e9
Diff: http://git-wip-us.apache.org/repos/asf/zeppelin/diff/75cf72e9

Branch: refs/heads/branch-0.7
Commit: 75cf72e9b374819fee550a33aa41258bd25b03e4
Parents: 56fa8b5
Author: 1ambda <1a...@gmail.com>
Authored: Thu Mar 16 06:17:15 2017 +0900
Committer: Jongyoul Lee <jo...@apache.org>
Committed: Fri Mar 17 00:07:28 2017 +0900

----------------------------------------------------------------------
 .../apache/zeppelin/socket/NotebookServer.java    | 17 +++++++++--------
 .../java/org/apache/zeppelin/notebook/Note.java   | 18 ++++++++++++++----
 .../apache/zeppelin/notebook/NotebookTest.java    |  2 +-
 3 files changed, 24 insertions(+), 13 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/zeppelin/blob/75cf72e9/zeppelin-server/src/main/java/org/apache/zeppelin/socket/NotebookServer.java
----------------------------------------------------------------------
diff --git a/zeppelin-server/src/main/java/org/apache/zeppelin/socket/NotebookServer.java b/zeppelin-server/src/main/java/org/apache/zeppelin/socket/NotebookServer.java
index 6791b63..3034d13 100644
--- a/zeppelin-server/src/main/java/org/apache/zeppelin/socket/NotebookServer.java
+++ b/zeppelin-server/src/main/java/org/apache/zeppelin/socket/NotebookServer.java
@@ -621,14 +621,13 @@ public class NotebookServer extends WebSocketServlet
 
   public void broadcastParagraph(Note note, Paragraph p) {
     if (note.isPersonalizedMode()) {
-      broadcastParagraphs(p.getUserParagraphMap(), p);
+      broadcastParagraphs(p.getUserParagraphMap());
     } else {
       broadcast(note.getId(), new Message(OP.PARAGRAPH).put("paragraph", p));
     }
   }
 
-  public void broadcastParagraphs(Map<String, Paragraph> userParagraphMap,
-      Paragraph defaultParagraph) {
+  public void broadcastParagraphs(Map<String, Paragraph> userParagraphMap) {
     if (null != userParagraphMap) {
       for (String user : userParagraphMap.keySet()) {
         multicastToUser(user,
@@ -1143,7 +1142,7 @@ public class NotebookServer extends WebSocketServlet
     if (note.isPersonalizedMode()) {
       Map<String, Paragraph> userParagraphMap =
           note.getParagraph(paragraphId).getUserParagraphMap();
-      broadcastParagraphs(userParagraphMap, p);
+      broadcastParagraphs(userParagraphMap);
     } else {
       broadcastParagraph(note, p);
     }
@@ -1239,9 +1238,11 @@ public class NotebookServer extends WebSocketServlet
           notebookAuthorization.getWriters(noteId));
       return;
     }
-    note.clearParagraphOutput(paragraphId);
-    Paragraph paragraph = note.getParagraph(paragraphId);
-    broadcastParagraph(note, paragraph);
+
+    String user = (note.isPersonalizedMode()) ?
+            new AuthenticationInfo(fromMessage.principal).getUser() : null;
+    Paragraph p = note.clearParagraphOutput(paragraphId, user);
+    broadcastParagraph(note, p);
   }
 
   private void completion(NotebookSocket conn, HashSet<String> userAndRoles, Notebook notebook,
@@ -1806,7 +1807,7 @@ public class NotebookServer extends WebSocketServlet
   public void onOutputClear(String noteId, String paragraphId) {
     Notebook notebook = notebook();
     final Note note = notebook.getNote(noteId);
-    note.clearParagraphOutput(paragraphId);
+    note.clearParagraphOutput(paragraphId, null);
     Paragraph paragraph = note.getParagraph(paragraphId);
     broadcastParagraph(note, paragraph);
   }

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/75cf72e9/zeppelin-zengine/src/main/java/org/apache/zeppelin/notebook/Note.java
----------------------------------------------------------------------
diff --git a/zeppelin-zengine/src/main/java/org/apache/zeppelin/notebook/Note.java b/zeppelin-zengine/src/main/java/org/apache/zeppelin/notebook/Note.java
index f341e16..e019ee5 100644
--- a/zeppelin-zengine/src/main/java/org/apache/zeppelin/notebook/Note.java
+++ b/zeppelin-zengine/src/main/java/org/apache/zeppelin/notebook/Note.java
@@ -395,15 +395,25 @@ public class Note implements Serializable, ParagraphJobListener {
    * Clear paragraph output by id.
    *
    * @param paragraphId ID of paragraph
+   * @param user not null if personalized mode is enabled
    * @return Paragraph
    */
-  public Paragraph clearParagraphOutput(String paragraphId) {
+  public Paragraph clearParagraphOutput(String paragraphId, String user) {
     synchronized (paragraphs) {
       for (Paragraph p : paragraphs) {
-        if (p.getId().equals(paragraphId)) {
-          p.setReturn(null, null);
-          return p;
+        if (!p.getId().equals(paragraphId)) {
+          continue;
         }
+
+        /** `broadcastParagraph` requires original paragraph */
+        Paragraph originParagraph = p;
+
+        if (user != null) {
+          p = p.getUserParagraphMap().get(user);
+        }
+
+        p.setReturn(null, null);
+        return originParagraph;
       }
     }
     return null;

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/75cf72e9/zeppelin-zengine/src/test/java/org/apache/zeppelin/notebook/NotebookTest.java
----------------------------------------------------------------------
diff --git a/zeppelin-zengine/src/test/java/org/apache/zeppelin/notebook/NotebookTest.java b/zeppelin-zengine/src/test/java/org/apache/zeppelin/notebook/NotebookTest.java
index 48a4e2e..5b394ee 100644
--- a/zeppelin-zengine/src/test/java/org/apache/zeppelin/notebook/NotebookTest.java
+++ b/zeppelin-zengine/src/test/java/org/apache/zeppelin/notebook/NotebookTest.java
@@ -284,7 +284,7 @@ public class NotebookTest implements JobListenerFactory{
     assertEquals("repl1: hello world", p1.getResult().message().get(0).getData());
 
     // clear paragraph output/result
-    note.clearParagraphOutput(p1.getId());
+    note.clearParagraphOutput(p1.getId(), null);
     assertNull(p1.getResult());
     notebook.removeNote(note.getId(), anonymous);
   }


[19/23] zeppelin git commit: [HOTFIX][ZEPPELIN-2037][ZEPPELIN-1832] "Restart" button does not work

Posted by mi...@apache.org.
[HOTFIX][ZEPPELIN-2037][ZEPPELIN-1832] "Restart" button does not work

### What is this PR for?
Makes restarting interpreters work correctly. Every restart button restarts only the current user's interpreter instance, including in "scoped" and "isolated" modes. If you shut down the server, Zeppelin terminates all interpreter processes.

### What type of PR is it?
[Bug Fix | Hot Fix]

### Todos
* [x] - Make "Restart" button work properly

### What is the Jira issue?
* https://issues.apache.org/jira/browse/ZEPPELIN-2037
* https://issues.apache.org/jira/browse/ZEPPELIN-1832

### How should this be tested?
1. Enable shiro
1. Login with "admin"
1. Set "Per user" to "scoped"
1. Run "sc.version" in note1 with "admin"
1. Login with "user1"
1. Run "sc.version" in note1 with "user1"
1. Click the "restart" button in note1 page with "admin"
1. Check the process with 'ps aux | grep RemoteInterpreterServer'. Will find one process
1. Click the "restart" button in note1 page with "user1"
1. Check the process with 'ps aux | grep RemoteInterpreterServer'. Won't find any process

### Screenshots (if appropriate)

### Questions:
* Does the licenses files need update? No
* Is there breaking changes for older versions? No
* Does this needs documentation? No

Author: Jongyoul Lee <jo...@gmail.com>

Closes #2140 from jongyoul/ZEPPELIN-2037 and squashes the following commits:

3aece9f [Jongyoul Lee] Fixed the style
4926567 [Jongyoul Lee] Reverted wrong changes
a8a884a [Jongyoul Lee] Fixed test cases
24d1958 [Jongyoul Lee] Fixed to remove interpreterGroup if it's empty
4d7ea0c [Jongyoul Lee] Changed the logic of closing interpreter Changed closing logic of lazyinterpreter to synchronous execution to guarantee the order of execution
559c78f [Jongyoul Lee] WIP Added unit test for all modes Fixed dereference bug

(cherry picked from commit 970b8117a48a31a9375bf7f76142117fd9b3bd86)
Signed-off-by: Jongyoul Lee <jo...@apache.org>


Project: http://git-wip-us.apache.org/repos/asf/zeppelin/repo
Commit: http://git-wip-us.apache.org/repos/asf/zeppelin/commit/adf2b12e
Tree: http://git-wip-us.apache.org/repos/asf/zeppelin/tree/adf2b12e
Diff: http://git-wip-us.apache.org/repos/asf/zeppelin/diff/adf2b12e

Branch: refs/heads/branch-0.7
Commit: adf2b12e9781e79ea1ab234a53947e9956bcf38e
Parents: ae45495
Author: Jongyoul Lee <jo...@gmail.com>
Authored: Thu Mar 16 01:23:54 2017 +0900
Committer: Jongyoul Lee <jo...@apache.org>
Committed: Thu Mar 16 23:53:26 2017 +0900

----------------------------------------------------------------------
 .../zeppelin/interpreter/InterpreterGroup.java  |  87 ++++++++-----
 .../interpreter/LazyOpenInterpreter.java        |  13 +-
 .../interpreter/remote/RemoteInterpreter.java   |   8 ++
 .../interpreter/InterpreterSetting.java         |  41 ++----
 .../interpreter/InterpreterSettingTest.java     | 128 +++++++++++++++++++
 5 files changed, 215 insertions(+), 62 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/zeppelin/blob/adf2b12e/zeppelin-interpreter/src/main/java/org/apache/zeppelin/interpreter/InterpreterGroup.java
----------------------------------------------------------------------
diff --git a/zeppelin-interpreter/src/main/java/org/apache/zeppelin/interpreter/InterpreterGroup.java b/zeppelin-interpreter/src/main/java/org/apache/zeppelin/interpreter/InterpreterGroup.java
index 32504dd..7367588 100644
--- a/zeppelin-interpreter/src/main/java/org/apache/zeppelin/interpreter/InterpreterGroup.java
+++ b/zeppelin-interpreter/src/main/java/org/apache/zeppelin/interpreter/InterpreterGroup.java
@@ -17,15 +17,21 @@
 
 package org.apache.zeppelin.interpreter;
 
-import java.util.*;
+import java.util.Collection;
+import java.util.LinkedList;
+import java.util.List;
+import java.util.Map;
+import java.util.Properties;
+import java.util.Random;
 import java.util.concurrent.ConcurrentHashMap;
 
-import org.apache.log4j.Logger;
 import org.apache.zeppelin.display.AngularObjectRegistry;
 import org.apache.zeppelin.interpreter.remote.RemoteInterpreterProcess;
 import org.apache.zeppelin.resource.ResourcePool;
 import org.apache.zeppelin.scheduler.Scheduler;
 import org.apache.zeppelin.scheduler.SchedulerFactory;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
 
 /**
  * InterpreterGroup is list of interpreters in the same interpreter group.
@@ -43,7 +49,7 @@ import org.apache.zeppelin.scheduler.SchedulerFactory;
 public class InterpreterGroup extends ConcurrentHashMap<String, List<Interpreter>> {
   String id;
 
-  Logger LOGGER = Logger.getLogger(InterpreterGroup.class);
+  private static final Logger LOGGER = LoggerFactory.getLogger(InterpreterGroup.class);
 
   AngularObjectRegistry angularObjectRegistry;
   InterpreterHookRegistry hookRegistry;
@@ -165,47 +171,70 @@ public class InterpreterGroup extends ConcurrentHashMap<String, List<Interpreter
    */
   public void close(String sessionId) {
     LOGGER.info("Close interpreter group " + getId() + " for session: " + sessionId);
-    List<Interpreter> intpForSession = this.get(sessionId);
+    final List<Interpreter> intpForSession = this.get(sessionId);
+
     close(intpForSession);
+  }
 
-    if (remoteInterpreterProcess != null) {
-      remoteInterpreterProcess.dereference();
-      if (remoteInterpreterProcess.referenceCount() <= 0) {
-        remoteInterpreterProcess = null;
-        allInterpreterGroups.remove(id);
-      }
-    }
+  private void close(final Collection<Interpreter> intpToClose) {
+    close(null, null, null, intpToClose);
   }
 
-  private void close(Collection<Interpreter> intpToClose) {
+  public void close(final Map<String, InterpreterGroup> interpreterGroupRef,
+      final String processKey, final String sessionKey) {
+    close(interpreterGroupRef, processKey, sessionKey, this.get(sessionKey));
+  }
+
+  private void close(final Map<String, InterpreterGroup> interpreterGroupRef,
+      final String processKey, final String sessionKey, final Collection<Interpreter> intpToClose) {
     if (intpToClose == null) {
       return;
     }
-    List<Thread> closeThreads = new LinkedList<>();
+    Thread t = new Thread() {
+      public void run() {
+        for (Interpreter interpreter : intpToClose) {
+          Scheduler scheduler = interpreter.getScheduler();
+          interpreter.close();
 
-    for (final Interpreter intp : intpToClose) {
-      Thread t = new Thread() {
-        public void run() {
-          Scheduler scheduler = intp.getScheduler();
-          intp.close();
-
-          if (scheduler != null) {
+          if (null != scheduler) {
             SchedulerFactory.singleton().removeScheduler(scheduler.getName());
           }
         }
-      };
 
-      t.start();
-      closeThreads.add(t);
-    }
+        if (remoteInterpreterProcess != null) {
+          //TODO(jl): Because interpreter.close() runs as a separate thread, we cannot guarantee
+          // referenceCount is a proper value. And for the same reason, we must not call
+          // remoteInterpreterProcess.dereference twice - this method is also called by
+          // interpreter.close().
 
-    for (Thread t : closeThreads) {
-      try {
-        t.join();
-      } catch (InterruptedException e) {
-        LOGGER.error("Can't close interpreter", e);
+          // remoteInterpreterProcess.dereference();
+          if (remoteInterpreterProcess.referenceCount() <= 0) {
+            remoteInterpreterProcess = null;
+            allInterpreterGroups.remove(id);
+          }
+        }
+
+        // TODO(jl): While closing interpreters in a same session, we should remove after all
+        // interpreters are removed. OMG. It's too dirty!!
+        if (null != interpreterGroupRef && null != processKey && null != sessionKey) {
+          InterpreterGroup interpreterGroup = interpreterGroupRef.get(processKey);
+          if (1 == interpreterGroup.size() && interpreterGroup.containsKey(sessionKey)) {
+            interpreterGroupRef.remove(processKey);
+          } else {
+            interpreterGroup.remove(sessionKey);
+          }
+        }
       }
+    };
+
+    t.start();
+    try {
+      t.join();
+    } catch (InterruptedException e) {
+      LOGGER.error("Can't close interpreter: {}", getId(), e);
     }
+
+
   }
 
   /**

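The restructured close() relies on reference counting of the shared remote interpreter process: each closing session drops one reference, and only when the count reaches zero is the process discarded and the group removed from the registry. A hedged sketch of that pattern, with illustrative names rather than Zeppelin's actual API:

```javascript
// Sketch: reference-counted teardown of a process shared by sessions.
function makeProcess() {
  let refCount = 0;
  return {
    reference() { refCount += 1; },
    dereference() { refCount -= 1; },
    referenceCount() { return refCount; }
  };
}

// `allGroups` plays the role of the global interpreter-group registry.
function closeSession(group, sessionId, allGroups) {
  delete group.sessions[sessionId];            // close this session's interpreters
  if (group.process != null) {
    group.process.dereference();               // drop this session's hold
    if (group.process.referenceCount() <= 0) { // last holder gone: tear down
      group.process = null;
      delete allGroups[group.id];
    }
  }
}
```

This mirrors the test plan above: restarting as "admin" leaves one RemoteInterpreterServer process (still referenced by "user1"), and restarting as "user1" removes the last reference.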
http://git-wip-us.apache.org/repos/asf/zeppelin/blob/adf2b12e/zeppelin-interpreter/src/main/java/org/apache/zeppelin/interpreter/LazyOpenInterpreter.java
----------------------------------------------------------------------
diff --git a/zeppelin-interpreter/src/main/java/org/apache/zeppelin/interpreter/LazyOpenInterpreter.java b/zeppelin-interpreter/src/main/java/org/apache/zeppelin/interpreter/LazyOpenInterpreter.java
index 6e11604..ebecd10 100644
--- a/zeppelin-interpreter/src/main/java/org/apache/zeppelin/interpreter/LazyOpenInterpreter.java
+++ b/zeppelin-interpreter/src/main/java/org/apache/zeppelin/interpreter/LazyOpenInterpreter.java
@@ -74,12 +74,10 @@ public class LazyOpenInterpreter
 
   @Override
   public void close() {
-    synchronized (intp) {
-      if (opened == true) {
-        intp.close();
-        opened = false;
-      }
-    }
+    // To close interpreter, you should open it first.
+    open();
+    intp.close();
+    opened = false;
   }
 
   public boolean isOpen() {
@@ -102,6 +100,9 @@ public class LazyOpenInterpreter
 
   @Override
   public FormType getFormType() {
+    // RemoteInterpreter's version of this method calls init() internally, which increases the
+    // referenceCount and affects it incorrectly
+    open();
     return intp.getFormType();
   }
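The fix above changes LazyOpenInterpreter.close() to open the wrapped interpreter first, so the inner close() — and the process dereference it performs — always runs. The lazy-open wrapper pattern in isolation (a sketch, not Zeppelin's class):

```javascript
// Sketch of a lazy-open wrapper: open() is idempotent, and close()
// opens first so the inner close-side bookkeeping always executes.
class LazyOpen {
  constructor(inner) {
    this.inner = inner;
    this.opened = false;
  }
  open() {
    if (!this.opened) {
      this.inner.open();
      this.opened = true;
    }
  }
  close() {
    this.open();         // the old code skipped inner.close() if never opened
    this.inner.close();
    this.opened = false;
  }
}
```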
 

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/adf2b12e/zeppelin-interpreter/src/main/java/org/apache/zeppelin/interpreter/remote/RemoteInterpreter.java
----------------------------------------------------------------------
diff --git a/zeppelin-interpreter/src/main/java/org/apache/zeppelin/interpreter/remote/RemoteInterpreter.java b/zeppelin-interpreter/src/main/java/org/apache/zeppelin/interpreter/remote/RemoteInterpreter.java
index 9162c88..c6dbb84 100644
--- a/zeppelin-interpreter/src/main/java/org/apache/zeppelin/interpreter/remote/RemoteInterpreter.java
+++ b/zeppelin-interpreter/src/main/java/org/apache/zeppelin/interpreter/remote/RemoteInterpreter.java
@@ -244,6 +244,14 @@ public class RemoteInterpreter extends Interpreter {
     synchronized (interpreterGroup) {
       // initialize all interpreters in this interpreter group
       List<Interpreter> interpreters = interpreterGroup.get(sessionKey);
+      // TODO(jl): this open method is called by LazyOpenInterpreter.open(). It, however,
+      // initializes all interpreters that share the same sessionKey. But LazyOpenInterpreter
+      // assumes that if its open method hasn't been called, it isn't open. This causes a problem
+      // while running intp.close(). In case of Spark, this method initializes all interpreters,
+      // and init() increases the reference count of RemoteInterpreterProcess. But while closing
+      // this interpreter group, the other interpreters don't do anything because those
+      // LazyInterpreters aren't open. For now, we have to initialize all interpreters anyway.
+      // See Interpreter.getInterpreterInTheSameSessionByClassName(String)
       for (Interpreter intp : new ArrayList<>(interpreters)) {
         Interpreter p = intp;
         while (p instanceof WrappedInterpreter) {

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/adf2b12e/zeppelin-zengine/src/main/java/org/apache/zeppelin/interpreter/InterpreterSetting.java
----------------------------------------------------------------------
diff --git a/zeppelin-zengine/src/main/java/org/apache/zeppelin/interpreter/InterpreterSetting.java b/zeppelin-zengine/src/main/java/org/apache/zeppelin/interpreter/InterpreterSetting.java
index 3e20d80..57c6acc 100644
--- a/zeppelin-zengine/src/main/java/org/apache/zeppelin/interpreter/InterpreterSetting.java
+++ b/zeppelin-zengine/src/main/java/org/apache/zeppelin/interpreter/InterpreterSetting.java
@@ -172,7 +172,7 @@ public class InterpreterSetting {
     }
   }
 
-  private String getInterpreterSessionKey(String user, String noteId) {
+  String getInterpreterSessionKey(String user, String noteId) {
     InterpreterOption option = getOption();
     String key;
     if (option.isExistingProcess()) {
@@ -250,15 +250,22 @@ public class InterpreterSetting {
     for (String intpKey : new HashSet<>(interpreterGroupRef.keySet())) {
       if (isEqualInterpreterKeyProcessKey(intpKey, processKey)) {
         interpreterGroupWriteLock.lock();
-        groupItem = interpreterGroupRef.remove(intpKey);
+        // TODO(jl): interpreterGroup has two or more sessionKeys inside it. thus we should not
+        // remove interpreterGroup if it has two or more values.
+        groupItem = interpreterGroupRef.get(intpKey);
         interpreterGroupWriteLock.unlock();
         groupToRemove.add(groupItem);
       }
+      for (InterpreterGroup groupToClose : groupToRemove) {
+        // TODO(jl): Fix the session-removal logic. For now, it's handled in groupToClose.close()
+        groupToClose.close(interpreterGroupRef, intpKey, sessionKey);
+      }
+      groupToRemove.clear();
     }
 
-    for (InterpreterGroup groupToClose : groupToRemove) {
-      groupToClose.close(sessionKey);
-    }
+    //Remove session because all interpreters in this session are closed
+    //TODO(jl): Change all code to handle interpreter one by one or all at once
+
   }
 
   void closeAndRemoveAllInterpreterGroups() {
@@ -268,29 +275,9 @@ public class InterpreterSetting {
     }
   }
 
-  void shutdownAndRemoveInterpreterGroup(String interpreterGroupKey) {
-    String key = getInterpreterProcessKey("", interpreterGroupKey);
-
-    List<InterpreterGroup> groupToRemove = new LinkedList<>();
-    InterpreterGroup groupItem;
-    for (String intpKey : new HashSet<>(interpreterGroupRef.keySet())) {
-      if (isEqualInterpreterKeyProcessKey(intpKey, key)) {
-        interpreterGroupWriteLock.lock();
-        groupItem = interpreterGroupRef.remove(intpKey);
-        interpreterGroupWriteLock.unlock();
-        groupToRemove.add(groupItem);
-      }
-    }
-
-    for (InterpreterGroup groupToClose : groupToRemove) {
-      groupToClose.shutdown();
-    }
-  }
-
   void shutdownAndRemoveAllInterpreterGroups() {
-    HashSet<String> groupsToRemove = new HashSet<>(interpreterGroupRef.keySet());
-    for (String interpreterGroupKey : groupsToRemove) {
-      shutdownAndRemoveInterpreterGroup(interpreterGroupKey);
+    for (InterpreterGroup interpreterGroup : interpreterGroupRef.values()) {
+      interpreterGroup.shutdown();
     }
   }
 

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/adf2b12e/zeppelin-zengine/src/test/java/org/apache/zeppelin/interpreter/InterpreterSettingTest.java
----------------------------------------------------------------------
diff --git a/zeppelin-zengine/src/test/java/org/apache/zeppelin/interpreter/InterpreterSettingTest.java b/zeppelin-zengine/src/test/java/org/apache/zeppelin/interpreter/InterpreterSettingTest.java
new file mode 100644
index 0000000..7e40a1b
--- /dev/null
+++ b/zeppelin-zengine/src/test/java/org/apache/zeppelin/interpreter/InterpreterSettingTest.java
@@ -0,0 +1,128 @@
+package org.apache.zeppelin.interpreter;
+
+import java.util.ArrayList;
+import java.util.List;
+import java.util.Properties;
+
+import org.junit.Test;
+
+import org.apache.zeppelin.dep.Dependency;
+import org.apache.zeppelin.interpreter.remote.RemoteInterpreter;
+
+import static org.junit.Assert.assertEquals;
+import static org.mockito.Mockito.mock;
+
+public class InterpreterSettingTest {
+
+  @Test
+  public void sharedModeCloseandRemoveInterpreterGroupTest() {
+    InterpreterOption interpreterOption = new InterpreterOption();
+    interpreterOption.setPerUser(InterpreterOption.SHARED);
+    InterpreterSetting interpreterSetting = new InterpreterSetting("", "", "", new ArrayList<InterpreterInfo>(), new Properties(), new ArrayList<Dependency>(), interpreterOption, "", null);
+
+    interpreterSetting.setInterpreterGroupFactory(new InterpreterGroupFactory() {
+      @Override
+      public InterpreterGroup createInterpreterGroup(String interpreterGroupId,
+          InterpreterOption option) {
+        return new InterpreterGroup(interpreterGroupId);
+      }
+    });
+
+    Interpreter mockInterpreter1 = mock(RemoteInterpreter.class);
+    List<Interpreter> interpreterList1 = new ArrayList<>();
+    interpreterList1.add(mockInterpreter1);
+    InterpreterGroup interpreterGroup = interpreterSetting.getInterpreterGroup("user1", "note1");
+    interpreterGroup.put(interpreterSetting.getInterpreterSessionKey("user1", "note1"), interpreterList1);
+
+    // This won't affect anything
+    Interpreter mockInterpreter2 = mock(RemoteInterpreter.class);
+    List<Interpreter> interpreterList2 = new ArrayList<>();
+    interpreterList2.add(mockInterpreter2);
+    interpreterGroup = interpreterSetting.getInterpreterGroup("user2", "note1");
+    interpreterGroup.put(interpreterSetting.getInterpreterSessionKey("user2", "note1"), interpreterList2);
+
+    assertEquals(1, interpreterSetting.getInterpreterGroup("user1", "note1").size());
+
+    interpreterSetting.closeAndRemoveInterpreterGroupByUser("user2");
+    assertEquals(0, interpreterSetting.getAllInterpreterGroups().size());
+  }
+
+  @Test
+  public void perUserScopedModeCloseAndRemoveInterpreterGroupTest() {
+    InterpreterOption interpreterOption = new InterpreterOption();
+    interpreterOption.setPerUser(InterpreterOption.SCOPED);
+    InterpreterSetting interpreterSetting = new InterpreterSetting("", "", "", new ArrayList<InterpreterInfo>(), new Properties(), new ArrayList<Dependency>(), interpreterOption, "", null);
+
+    interpreterSetting.setInterpreterGroupFactory(new InterpreterGroupFactory() {
+      @Override
+      public InterpreterGroup createInterpreterGroup(String interpreterGroupId,
+          InterpreterOption option) {
+        return new InterpreterGroup(interpreterGroupId);
+      }
+    });
+
+    Interpreter mockInterpreter1 = mock(RemoteInterpreter.class);
+    List<Interpreter> interpreterList1 = new ArrayList<>();
+    interpreterList1.add(mockInterpreter1);
+    InterpreterGroup interpreterGroup = interpreterSetting.getInterpreterGroup("user1", "note1");
+    interpreterGroup.put(interpreterSetting.getInterpreterSessionKey("user1", "note1"), interpreterList1);
+
+    Interpreter mockInterpreter2 = mock(RemoteInterpreter.class);
+    List<Interpreter> interpreterList2 = new ArrayList<>();
+    interpreterList2.add(mockInterpreter2);
+    interpreterGroup = interpreterSetting.getInterpreterGroup("user2", "note1");
+    interpreterGroup.put(interpreterSetting.getInterpreterSessionKey("user2", "note1"), interpreterList2);
+
+    assertEquals(1, interpreterSetting.getAllInterpreterGroups().size());
+    assertEquals(2, interpreterSetting.getInterpreterGroup("user1", "note1").size());
+    assertEquals(2, interpreterSetting.getInterpreterGroup("user2", "note1").size());
+
+    interpreterSetting.closeAndRemoveInterpreterGroupByUser("user1");
+    assertEquals(1, interpreterSetting.getInterpreterGroup("user2","note1").size());
+
+    // Check if non-existed key works or not
+    interpreterSetting.closeAndRemoveInterpreterGroupByUser("user1");
+    assertEquals(1, interpreterSetting.getInterpreterGroup("user2","note1").size());
+
+    interpreterSetting.closeAndRemoveInterpreterGroupByUser("user2");
+    assertEquals(0, interpreterSetting.getAllInterpreterGroups().size());
+  }
+
+  @Test
+  public void perUserIsolatedModeCloseAndRemoveInterpreterGroupTest() {
+    InterpreterOption interpreterOption = new InterpreterOption();
+    interpreterOption.setPerUser(InterpreterOption.ISOLATED);
+    InterpreterSetting interpreterSetting = new InterpreterSetting("", "", "", new ArrayList<InterpreterInfo>(), new Properties(), new ArrayList<Dependency>(), interpreterOption, "", null);
+
+    interpreterSetting.setInterpreterGroupFactory(new InterpreterGroupFactory() {
+      @Override
+      public InterpreterGroup createInterpreterGroup(String interpreterGroupId,
+          InterpreterOption option) {
+        return new InterpreterGroup(interpreterGroupId);
+      }
+    });
+
+    Interpreter mockInterpreter1 = mock(RemoteInterpreter.class);
+    List<Interpreter> interpreterList1 = new ArrayList<>();
+    interpreterList1.add(mockInterpreter1);
+    InterpreterGroup interpreterGroup = interpreterSetting.getInterpreterGroup("user1", "note1");
+    interpreterGroup.put(interpreterSetting.getInterpreterSessionKey("user1", "note1"), interpreterList1);
+
+    Interpreter mockInterpreter2 = mock(RemoteInterpreter.class);
+    List<Interpreter> interpreterList2 = new ArrayList<>();
+    interpreterList2.add(mockInterpreter2);
+    interpreterGroup = interpreterSetting.getInterpreterGroup("user2", "note1");
+    interpreterGroup.put(interpreterSetting.getInterpreterSessionKey("user2", "note1"), interpreterList2);
+
+    assertEquals(2, interpreterSetting.getAllInterpreterGroups().size());
+    assertEquals(1, interpreterSetting.getInterpreterGroup("user1", "note1").size());
+    assertEquals(1, interpreterSetting.getInterpreterGroup("user2", "note1").size());
+
+    interpreterSetting.closeAndRemoveInterpreterGroupByUser("user1");
+    assertEquals(1, interpreterSetting.getInterpreterGroup("user2","note1").size());
+    assertEquals(1, interpreterSetting.getAllInterpreterGroups().size());
+
+    interpreterSetting.closeAndRemoveInterpreterGroupByUser("user2");
+    assertEquals(0, interpreterSetting.getAllInterpreterGroups().size());
+  }
+}


[17/23] zeppelin git commit: [WIP] [Discuss] Make use of all grouped data to draw pie chart

Posted by mi...@apache.org.
[WIP] [Discuss] Make use of all grouped data to draw pie chart

### What is this PR for?
Currently, grouped pie charts use only the data from the first group.
With this fix:
* Add data from all groups to the variable d3g, so all groups can be rendered
* Rewrite for loop with map and concat
* Refactor some variables to const and let

### What type of PR is it?
[Bug Fix]

### What is the Jira issue?
*  [ZEPPELIN-2237](https://issues.apache.org/jira/browse/ZEPPELIN-2237)

### How should this be tested?
* Create a built in pie chart visualization
* Select a column to group the data
* Should display the visualization based on all the available grouped data

### Questions:
* Does the licenses files need update? No
* Is there breaking changes for older versions? No
* Does this needs documentation? No

Author: ess_ess <sr...@gmail.com>

This patch had conflicts when merged, resolved by
Committer: Lee moon soo <mo...@apache.org>

Closes #2128 from sravan-s/ZEPPELIN-2237-grouped-piechart and squashes the following commits:

652c943 [ess_ess] Make use of all grouped data to draw pie chart

(cherry picked from commit a2cd4ae4e17024501bb0654e71747a07ff68600d)
Signed-off-by: Lee moon soo <mo...@apache.org>


Project: http://git-wip-us.apache.org/repos/asf/zeppelin/repo
Commit: http://git-wip-us.apache.org/repos/asf/zeppelin/commit/7998dd2e
Tree: http://git-wip-us.apache.org/repos/asf/zeppelin/tree/7998dd2e
Diff: http://git-wip-us.apache.org/repos/asf/zeppelin/diff/7998dd2e

Branch: refs/heads/branch-0.7
Commit: 7998dd2ec749f49543fcb4ab9f6f54e537fbdb0f
Parents: 1ff2752
Author: ess_ess <sr...@gmail.com>
Authored: Fri Mar 10 09:19:55 2017 +0530
Committer: Lee moon soo <mo...@apache.org>
Committed: Wed Mar 15 08:42:21 2017 -0700

----------------------------------------------------------------------
 .../builtins/visualization-piechart.js          | 30 +++++++++++++-------
 1 file changed, 19 insertions(+), 11 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/zeppelin/blob/7998dd2e/zeppelin-web/src/app/visualization/builtins/visualization-piechart.js
----------------------------------------------------------------------
diff --git a/zeppelin-web/src/app/visualization/builtins/visualization-piechart.js b/zeppelin-web/src/app/visualization/builtins/visualization-piechart.js
index 9cc7922..f74ecd0 100644
--- a/zeppelin-web/src/app/visualization/builtins/visualization-piechart.js
+++ b/zeppelin-web/src/app/visualization/builtins/visualization-piechart.js
@@ -35,7 +35,7 @@ export default class PiechartVisualization extends Nvd3ChartVisualization {
   render(pivot) {
     // [ZEPPELIN-2253] New chart function will be created each time inside super.render()
     this.chart = null;
-    var d3Data = this.d3DataFromPivot(
+    const d3Data = this.d3DataFromPivot(
       pivot.schema,
       pivot.rows,
       pivot.keys,
@@ -44,17 +44,25 @@ export default class PiechartVisualization extends Nvd3ChartVisualization {
       true,
       false,
       false);
-    var d = d3Data.d3g;
-    var d3g = [];
-    if (d.length > 0) {
-      for (var i = 0; i < d[0].values.length ; i++) {
-        var e = d[0].values[i];
-        d3g.push({
-          label: e.x,
-          value: e.y
-        });
-      }
+    const d = d3Data.d3g;
+
+    let generateLabel;
+    // data is grouped
+    if (pivot.groups && pivot.groups.length > 0) {
+      generateLabel = (suffix, prefix) => `${prefix}.${suffix}`;
+    } else { // data isn't grouped
+      generateLabel = suffix => suffix;
     }
+
+    let d3g = d.map(group => {
+      return group.values.map(row => ({
+        label: generateLabel(row.x, group.key),
+        value: row.y
+      }));
+    });
+    // the map function returns d3g as a nested array
+    // [].concat flattens it, http://stackoverflow.com/a/10865042/5154397
+    d3g = [].concat.apply([], d3g);
     super.render({d3g: d3g});
   };
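The rewritten render() builds one {label, value} entry per row in every group, prefixing labels with the group key when the data is grouped, then flattens the nested arrays with `[].concat.apply`. The same transformation in isolation (the input shape mimics `d3Data.d3g`):

```javascript
// Sketch: turn grouped nvd3-style data into a flat pie-chart series.
function toPieSeries(groups, grouped) {
  // Prefix each label with the group key only when data is grouped.
  const label = grouped
    ? (suffix, prefix) => `${prefix}.${suffix}`
    : suffix => suffix;
  const nested = groups.map(group =>
    group.values.map(row => ({label: label(row.x, group.key), value: row.y})));
  // The map above returns a nested array; concat.apply flattens one level.
  return [].concat.apply([], nested);
}
```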
 


[16/23] zeppelin git commit: [ZEPPELIN-2253] Piechart won't render when column selected as 'key' is changed

Posted by mi...@apache.org.
[ZEPPELIN-2253] Piechart won't render when column selected as 'key' is changed

### What is this PR for?
* Fixes an issue with pie chart rendering when the user changes the pie chart's domain
* When the pie chart's key (domain) is changed, this error is logged:
  'Uncaught TypeError: arcs[idx] is not a function at pie.js:358'
* Even if the user changes the key and values again, the chart remains broken
* Fix: set this.chart to null, which makes the render function initialize a new pie chart constructor
### What type of PR is it?
[Bug Fix]

### What is the Jira issue?
* [ZEPPELIN-2253](https://issues.apache.org/jira/browse/ZEPPELIN-2253)

### How should this be tested?
* Create a new pie chart using built in visualization
* Remove column selected as 'key'
* Add a new column as 'key'
* Chart is rendered perfectly

### Questions:
* Does the licenses files need update? No
* Is there breaking changes for older versions? No
* Does this needs documentation? No

Author: ess_ess <sr...@gmail.com>

Closes #2132 from sravan-s/ZEPPELIN-2253 and squashes the following commits:

310aecf [ess_ess] Initialize chart to null inside 'render()'

(cherry picked from commit f2c865aaacb3788286de076246695a35fff277f8)
Signed-off-by: Lee moon soo <mo...@apache.org>


Project: http://git-wip-us.apache.org/repos/asf/zeppelin/repo
Commit: http://git-wip-us.apache.org/repos/asf/zeppelin/commit/1ff2752e
Tree: http://git-wip-us.apache.org/repos/asf/zeppelin/tree/1ff2752e
Diff: http://git-wip-us.apache.org/repos/asf/zeppelin/diff/1ff2752e

Branch: refs/heads/branch-0.7
Commit: 1ff2752edd9ac3472109f4c589c2abb4de6b16fa
Parents: b00e27c
Author: ess_ess <sr...@gmail.com>
Authored: Tue Mar 14 13:27:16 2017 +0530
Committer: Lee moon soo <mo...@apache.org>
Committed: Wed Mar 15 08:37:54 2017 -0700

----------------------------------------------------------------------
 .../src/app/visualization/builtins/visualization-piechart.js       | 2 ++
 1 file changed, 2 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/zeppelin/blob/1ff2752e/zeppelin-web/src/app/visualization/builtins/visualization-piechart.js
----------------------------------------------------------------------
diff --git a/zeppelin-web/src/app/visualization/builtins/visualization-piechart.js b/zeppelin-web/src/app/visualization/builtins/visualization-piechart.js
index 8c8f8f2..9cc7922 100644
--- a/zeppelin-web/src/app/visualization/builtins/visualization-piechart.js
+++ b/zeppelin-web/src/app/visualization/builtins/visualization-piechart.js
@@ -33,6 +33,8 @@ export default class PiechartVisualization extends Nvd3ChartVisualization {
   };
 
   render(pivot) {
+    // [ZEPPELIN-2253] New chart function will be created each time inside super.render()
+    this.chart = null;
     var d3Data = this.d3DataFromPivot(
       pivot.schema,
       pivot.rows,


[12/23] zeppelin git commit: [ZEPPELIN-2202] Disable personalized mode btn when note is running (branch-0.7)

Posted by mi...@apache.org.
[ZEPPELIN-2202] Disable personalized mode btn when note is running (branch-0.7)

### What is this PR for?

Disable the personalized mode button when a note is running.

- The same fix with https://github.com/apache/zeppelin/pull/2108 for branch-0.7
- CI failure might be related with https://github.com/apache/zeppelin/pull/2103

### What type of PR is it?
[Improvement]

### Todos

NONE

### What is the Jira issue?

[ZEPPELIN-2202](https://issues.apache.org/jira/browse/ZEPPELIN-2202)

### How should this be tested?

Refer the screenshot below.

### Screenshots (if appropriate)

![2202](https://cloud.githubusercontent.com/assets/4968473/23661339/c45dcd9e-038f-11e7-9551-6cde925aa5f4.gif)

### Questions:
* Does the licenses files need update? - NO
* Is there breaking changes for older versions? - NO
* Does this needs documentation? - NO

Author: 1ambda <1a...@gmail.com>

Closes #2115 from 1ambda/disable-person-mode-btn-when-note-is-running-for-070 and squashes the following commits:

5e0f0b0 [1ambda] feat: disable person mode btn when running


Project: http://git-wip-us.apache.org/repos/asf/zeppelin/repo
Commit: http://git-wip-us.apache.org/repos/asf/zeppelin/commit/a90004b2
Tree: http://git-wip-us.apache.org/repos/asf/zeppelin/tree/a90004b2
Diff: http://git-wip-us.apache.org/repos/asf/zeppelin/diff/a90004b2

Branch: refs/heads/branch-0.7
Commit: a90004b272f6e7c13571cf6ffee6793c295729b8
Parents: aca96c3
Author: 1ambda <1a...@gmail.com>
Authored: Thu Mar 9 18:11:14 2017 +0900
Committer: Jongyoul Lee <jo...@apache.org>
Committed: Tue Mar 14 11:51:16 2017 +0900

----------------------------------------------------------------------
 zeppelin-web/src/app/notebook/notebook-actionBar.html | 2 ++
 zeppelin-web/src/app/notebook/notebook.controller.js  | 9 ++++-----
 2 files changed, 6 insertions(+), 5 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/zeppelin/blob/a90004b2/zeppelin-web/src/app/notebook/notebook-actionBar.html
----------------------------------------------------------------------
diff --git a/zeppelin-web/src/app/notebook/notebook-actionBar.html b/zeppelin-web/src/app/notebook/notebook-actionBar.html
index 99d7d7a..7fa6d74 100644
--- a/zeppelin-web/src/app/notebook/notebook-actionBar.html
+++ b/zeppelin-web/src/app/notebook/notebook-actionBar.html
@@ -74,6 +74,7 @@ limitations under the License.
 
       <button type="button"
               class="btn btn-primary btn-xs"
+              ng-class="isNoteRunning() ? 'disabled' : ''"
               ng-if="ticket.principal && ticket.principal !== 'anonymous'"
               ng-hide="viewOnly || note.config.personalizedMode !== 'true'"
               ng-click="toggleNotePersonalizedMode()"
@@ -83,6 +84,7 @@ limitations under the License.
       </button>
       <button type="button"
               class="btn btn-default btn-xs"
+              ng-class="isNoteRunning() ? 'disabled' : ''"
               ng-if="ticket.principal && ticket.principal !== 'anonymous'"
               ng-hide="viewOnly || note.config.personalizedMode === 'true'"
               ng-click="toggleNotePersonalizedMode()"

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/a90004b2/zeppelin-web/src/app/notebook/notebook.controller.js
----------------------------------------------------------------------
diff --git a/zeppelin-web/src/app/notebook/notebook.controller.js b/zeppelin-web/src/app/notebook/notebook.controller.js
index 63a0130..9f27d7d 100644
--- a/zeppelin-web/src/app/notebook/notebook.controller.js
+++ b/zeppelin-web/src/app/notebook/notebook.controller.js
@@ -358,15 +358,14 @@ function NotebookCtrl($scope, $route, $routeParams, $location, $rootScope,
   };
 
   $scope.isNoteRunning = function() {
-    var running = false;
     if (!$scope.note) { return false; }
     for (var i = 0; i < $scope.note.paragraphs.length; i++) {
-      if ($scope.note.paragraphs[i].status === 'PENDING' || $scope.note.paragraphs[i].status === 'RUNNING') {
-        running = true;
-        break;
+      const status = $scope.note.paragraphs[i].status;
+      if (status === 'PENDING' || status === 'RUNNING') {
+        return true;
       }
     }
-    return running;
+    return false;
   };
 
   $scope.killSaveTimer = function() {
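The refactored isNoteRunning() drops the flag-and-break loop in favor of an early return on the first PENDING or RUNNING paragraph. The same scan as a standalone sketch, outside the Angular controller:

```javascript
// Sketch of the early-return status scan in isNoteRunning().
function isNoteRunning(note) {
  if (!note) { return false; }
  for (const paragraph of note.paragraphs) {
    if (paragraph.status === 'PENDING' || paragraph.status === 'RUNNING') {
      return true;
    }
  }
  return false;
}
```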