Posted to commits@zeppelin.apache.org by zj...@apache.org on 2017/09/15 01:03:30 UTC

zeppelin git commit: ZEPPELIN-2933. Code Refactoring of ZEPPELIN-1515 follow up

Repository: zeppelin
Updated Branches:
  refs/heads/master 3fb67f9cc -> 3f591c232


ZEPPELIN-2933. Code Refactoring of ZEPPELIN-1515 follow up

### What is this PR for?

This is a refactoring follow-up to ZEPPELIN-1515. Hadoop's FileSystem API works not only with HDFS but with any Hadoop-compatible file system, so this PR renames `HdfsNotebookRepo` to `FileSystemNotebookRepo`.
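
As a quick illustration of why the rename fits the API: `FileSystem.get()` resolves the concrete implementation from the URI scheme of the notebook directory, so the same repo code can target the local file system, HDFS, WASB, S3 and so on. The snippet below is a minimal sketch, not part of this commit; the paths are placeholder values.

```
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class FileSystemSchemeDemo {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    // file:// resolves to LocalFileSystem, no cluster needed
    FileSystem local = FileSystem.get(URI.create("file:///tmp/notebook"), conf);
    System.out.println(local.getClass().getName());
    // hdfs://, wasb://, s3a:// etc. resolve to their own implementations,
    // e.g. (requires a reachable cluster):
    // FileSystem hdfs = FileSystem.get(URI.create("hdfs://namenode:8020/notebook"), conf);
  }
}
```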

### What type of PR is it?
[Refactoring]

### Todos
* [ ] - Task

### What is the Jira issue?
* https://issues.apache.org/jira/browse/ZEPPELIN-2933

### Questions:
* Do the license files need to be updated? No
* Are there breaking changes for older versions? No
* Does this need documentation? No

Author: Jeff Zhang <zj...@apache.org>

Closes #2588 from zjffdu/ZEPPELIN-2933 and squashes the following commits:

45d1e9b [Jeff Zhang] ZEPPELIN-2933. Code Refactoring of ZEPPELIN-1515 follow up


Project: http://git-wip-us.apache.org/repos/asf/zeppelin/repo
Commit: http://git-wip-us.apache.org/repos/asf/zeppelin/commit/3f591c23
Tree: http://git-wip-us.apache.org/repos/asf/zeppelin/tree/3f591c23
Diff: http://git-wip-us.apache.org/repos/asf/zeppelin/diff/3f591c23

Branch: refs/heads/master
Commit: 3f591c2327532679159019b9d7486805ef6d0768
Parents: 3fb67f9
Author: Jeff Zhang <zj...@apache.org>
Authored: Thu Sep 14 13:23:33 2017 +0800
Committer: Jeff Zhang <zj...@apache.org>
Committed: Fri Sep 15 09:03:14 2017 +0800

----------------------------------------------------------------------
 conf/zeppelin-site.xml.template                 |   6 +-
 docs/setup/storage/storage.md                   |  10 +-
 .../remote/RemoteInterpreterServerTest.java     |   2 +-
 .../notebook/repo/FileSystemNotebookRepo.java   | 206 +++++++++++++++++++
 .../notebook/repo/HdfsNotebookRepo.java         | 200 ------------------
 .../zeppelin/notebook/repo/VFSNotebookRepo.java |   1 +
 .../repo/FileSystemNotebookRepoTest.java        | 101 +++++++++
 .../notebook/repo/HdfsNotebookRepoTest.java     | 101 ---------
 8 files changed, 317 insertions(+), 310 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/zeppelin/blob/3f591c23/conf/zeppelin-site.xml.template
----------------------------------------------------------------------
diff --git a/conf/zeppelin-site.xml.template b/conf/zeppelin-site.xml.template
index ce3ffaa..f1bfb61 100755
--- a/conf/zeppelin-site.xml.template
+++ b/conf/zeppelin-site.xml.template
@@ -173,11 +173,11 @@
 </property>
 -->
 
-<!-- Notebook storage layer using hdfs file system
+<!-- Notebook storage layer using hadoop compatible file system
 <property>
   <name>zeppelin.notebook.storage</name>
-  <value>org.apache.zeppelin.notebook.repo.HdfsNotebookRepo</value>
-  <description>hdfs notebook persistence layer implementation</description>
+  <value>org.apache.zeppelin.notebook.repo.FileSystemNotebookRepo</value>
+  <description>Hadoop compatible file system notebook persistence layer implementation, such as local file system, hdfs, azure wasb, s3 and etc.</description>
 </property>
 
 <property>

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/3f591c23/docs/setup/storage/storage.md
----------------------------------------------------------------------
diff --git a/docs/setup/storage/storage.md b/docs/setup/storage/storage.md
index d4db50a..0b65f91 100644
--- a/docs/setup/storage/storage.md
+++ b/docs/setup/storage/storage.md
@@ -30,7 +30,7 @@ There are few notebook storage systems available for a use out of the box:
 
   * (default) use local file system and version it using local Git repository - `GitNotebookRepo`
   * all notes are saved in the notebook folder in your local File System - `VFSNotebookRepo`
-  * all notes are saved in the notebook folder in hdfs - `HdfsNotebookRepo`
+  * all notes are saved in the notebook folder in hadoop compatible file system - `FileSystemNotebookRepo`
   * storage using Amazon S3 service - `S3NotebookRepo`
   * storage using Azure service - `AzureNotebookRepo`
   * storage using MongoDB - `MongoNotebookRepo`
@@ -54,16 +54,16 @@ To enable versioning for all your local notebooks though a standard Git reposito
 
 </br>
 
-## Notebook Storage in Hdfs repository <a name="Hdfs"></a>
+## Notebook Storage in hadoop compatible file system repository <a name="Hdfs"></a>
 
-Notes may be stored in hdfs, so that multiple Zeppelin instances can share the same notes. It supports all the versions of hadoop 2.x. If you use `HdfsNotebookRepo`, then `zeppelin.notebook.dir` is the path on hdfs. And you need to specify `HADOOP_CONF_DIR` in `zeppelin-env.sh` so that zeppelin can find the right hadoop configuration files.
+Notes may be stored in hadoop compatible file system such as hdfs, so that multiple Zeppelin instances can share the same notes. It supports all the versions of hadoop 2.x. If you use `FileSystemNotebookRepo`, then `zeppelin.notebook.dir` is the path on the hadoop compatible file system. And you need to specify `HADOOP_CONF_DIR` in `zeppelin-env.sh` so that zeppelin can find the right hadoop configuration files.
 If your hadoop cluster is kerberized, then you need to specify `zeppelin.hdfs.keytab` and `zeppelin.hdfs.principal`
 
 ```
 <property>
   <name>zeppelin.notebook.storage</name>
-  <value>org.apache.zeppelin.notebook.repo.HdfsNotebookRepo</value>
-  <description>hdfs notebook persistence layer implementation</description>
+  <value>org.apache.zeppelin.notebook.repo.FileSystemNotebookRepo</value>
+  <description>hadoop compatible file system notebook persistence layer implementation</description>
 </property>
 ```
 

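For reference, the documented settings above combine into a zeppelin-site.xml fragment along the lines of the following sketch; the HDFS path, keytab location, and principal are placeholder values, not taken from this commit. `HADOOP_CONF_DIR` still has to be set in `zeppelin-env.sh` so the right cluster configuration is picked up.

```
<property>
  <name>zeppelin.notebook.storage</name>
  <value>org.apache.zeppelin.notebook.repo.FileSystemNotebookRepo</value>
</property>
<property>
  <name>zeppelin.notebook.dir</name>
  <value>hdfs://namenode:8020/user/zeppelin/notebook</value>
</property>
<!-- only needed when the cluster is kerberized -->
<property>
  <name>zeppelin.hdfs.keytab</name>
  <value>/etc/security/keytabs/zeppelin.keytab</value>
</property>
<property>
  <name>zeppelin.hdfs.principal</name>
  <value>zeppelin@EXAMPLE.COM</value>
</property>
```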
http://git-wip-us.apache.org/repos/asf/zeppelin/blob/3f591c23/zeppelin-interpreter/src/test/java/org/apache/zeppelin/interpreter/remote/RemoteInterpreterServerTest.java
----------------------------------------------------------------------
diff --git a/zeppelin-interpreter/src/test/java/org/apache/zeppelin/interpreter/remote/RemoteInterpreterServerTest.java b/zeppelin-interpreter/src/test/java/org/apache/zeppelin/interpreter/remote/RemoteInterpreterServerTest.java
index 1a7c2a5..b2fcae1 100644
--- a/zeppelin-interpreter/src/test/java/org/apache/zeppelin/interpreter/remote/RemoteInterpreterServerTest.java
+++ b/zeppelin-interpreter/src/test/java/org/apache/zeppelin/interpreter/remote/RemoteInterpreterServerTest.java
@@ -43,7 +43,7 @@ public class RemoteInterpreterServerTest {
   @Test
   public void testStartStop() throws InterruptedException, IOException, TException {
     RemoteInterpreterServer server = new RemoteInterpreterServer("localhost",
-        RemoteInterpreterUtils.findRandomAvailablePortOnAllLocalInterfaces());
+        RemoteInterpreterUtils.findRandomAvailablePortOnAllLocalInterfaces(), true);
     assertEquals(false, server.isRunning());
 
     server.start();

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/3f591c23/zeppelin-zengine/src/main/java/org/apache/zeppelin/notebook/repo/FileSystemNotebookRepo.java
----------------------------------------------------------------------
diff --git a/zeppelin-zengine/src/main/java/org/apache/zeppelin/notebook/repo/FileSystemNotebookRepo.java b/zeppelin-zengine/src/main/java/org/apache/zeppelin/notebook/repo/FileSystemNotebookRepo.java
new file mode 100644
index 0000000..150ac26
--- /dev/null
+++ b/zeppelin-zengine/src/main/java/org/apache/zeppelin/notebook/repo/FileSystemNotebookRepo.java
@@ -0,0 +1,206 @@
+package org.apache.zeppelin.notebook.repo;
+
+import org.apache.commons.lang.StringUtils;
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.fs.FileStatus;
+import org.apache.hadoop.fs.FileSystem;
+import org.apache.hadoop.fs.Path;
+import org.apache.hadoop.io.IOUtils;
+import org.apache.hadoop.security.UserGroupInformation;
+import org.apache.zeppelin.conf.ZeppelinConfiguration;
+import org.apache.zeppelin.notebook.Note;
+import org.apache.zeppelin.notebook.NoteInfo;
+import org.apache.zeppelin.user.AuthenticationInfo;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.ByteArrayInputStream;
+import java.io.ByteArrayOutputStream;
+import java.io.IOException;
+import java.io.InputStream;
+import java.net.URI;
+import java.net.URISyntaxException;
+import java.security.PrivilegedAction;
+import java.security.PrivilegedExceptionAction;
+import java.util.ArrayList;
+import java.util.List;
+import java.util.Map;
+
+/**
+ * NotebookRepos for hdfs.
+ *
+ * Assume the notebook directory structure is as following
+ * - notebookdir
+ *              - noteId/note.json
+ *              - noteId/note.json
+ *              - noteId/note.json
+ */
+public class FileSystemNotebookRepo implements NotebookRepo {
+  private static final Logger LOGGER = LoggerFactory.getLogger(FileSystemNotebookRepo.class);
+
+  private Configuration hadoopConf;
+  private ZeppelinConfiguration zConf;
+  private boolean isSecurityEnabled = false;
+  private FileSystem fs;
+  private Path notebookDir;
+
+  public FileSystemNotebookRepo(ZeppelinConfiguration zConf) throws IOException {
+    this.zConf = zConf;
+    this.hadoopConf = new Configuration();
+
+    this.isSecurityEnabled = UserGroupInformation.isSecurityEnabled();
+    if (isSecurityEnabled) {
+      String keytab = zConf.getString(ZeppelinConfiguration.ConfVars.ZEPPELIN_HDFS_KEYTAB);
+      String principal = zConf.getString(ZeppelinConfiguration.ConfVars.ZEPPELIN_HDFS_PRINCIPAL);
+      if (StringUtils.isBlank(keytab) || StringUtils.isBlank(principal)) {
+        throw new IOException("keytab and principal can not be empty, keytab: " + keytab
+            + ", principal: " + principal);
+      }
+      UserGroupInformation.loginUserFromKeytab(principal, keytab);
+    }
+
+    try {
+      this.fs = FileSystem.get(new URI(zConf.getNotebookDir()), new Configuration());
+      LOGGER.info("Creating FileSystem: " + this.fs.getClass().getCanonicalName());
+      this.notebookDir = fs.makeQualified(new Path(zConf.getNotebookDir()));
+      LOGGER.info("Using folder {} to store notebook", notebookDir);
+    } catch (URISyntaxException e) {
+      throw new IOException(e);
+    }
+    if (!fs.exists(notebookDir)) {
+      fs.mkdirs(notebookDir);
+      LOGGER.info("Create notebook dir {} in hdfs", notebookDir.toString());
+    }
+    if (fs.isFile(notebookDir)) {
+      throw new IOException("notebookDir {} is file instead of directory, please remove it or " +
+          "specify another directory");
+    }
+  }
+
+  @Override
+  public List<NoteInfo> list(AuthenticationInfo subject) throws IOException {
+    return callHdfsOperation(new HdfsOperation<List<NoteInfo>>() {
+      @Override
+      public List<NoteInfo> call() throws IOException {
+        List<NoteInfo> noteInfos = new ArrayList<>();
+        for (FileStatus status : fs.globStatus(new Path(notebookDir, "*/note.json"))) {
+          NoteInfo noteInfo = new NoteInfo(status.getPath().getParent().getName(), "", null);
+          noteInfos.add(noteInfo);
+        }
+        return noteInfos;
+      }
+    });
+  }
+
+  @Override
+  public Note get(final String noteId, AuthenticationInfo subject) throws IOException {
+    return callHdfsOperation(new HdfsOperation<Note>() {
+      @Override
+      public Note call() throws IOException {
+        Path notePath = new Path(notebookDir.toString() + "/" + noteId + "/note.json");
+        LOGGER.debug("Read note from file: " + notePath);
+        ByteArrayOutputStream noteBytes = new ByteArrayOutputStream();
+        IOUtils.copyBytes(fs.open(notePath), noteBytes, hadoopConf);
+        return Note.fromJson(new String(noteBytes.toString(
+            zConf.getString(ZeppelinConfiguration.ConfVars.ZEPPELIN_ENCODING))));
+      }
+    });
+  }
+
+  @Override
+  public void save(final Note note, AuthenticationInfo subject) throws IOException {
+    callHdfsOperation(new HdfsOperation<Void>() {
+      @Override
+      public Void call() throws IOException {
+        Path notePath = new Path(notebookDir.toString() + "/" + note.getId() + "/note.json");
+        Path tmpNotePath = new Path(notebookDir.toString() + "/" + note.getId() + "/.note.json");
+        LOGGER.debug("Saving note to file: " + notePath);
+        if (fs.exists(tmpNotePath)) {
+          fs.delete(tmpNotePath, true);
+        }
+        InputStream in = new ByteArrayInputStream(note.toJson().getBytes(
+            zConf.getString(ZeppelinConfiguration.ConfVars.ZEPPELIN_ENCODING)));
+        IOUtils.copyBytes(in, fs.create(tmpNotePath), hadoopConf);
+        fs.delete(notePath, true);
+        fs.rename(tmpNotePath, notePath);
+        return null;
+      }
+    });
+  }
+
+  @Override
+  public void remove(final String noteId, AuthenticationInfo subject) throws IOException {
+    callHdfsOperation(new HdfsOperation<Void>() {
+      @Override
+      public Void call() throws IOException {
+        Path noteFolder = new Path(notebookDir.toString() + "/" + noteId);
+        fs.delete(noteFolder, true);
+        return null;
+      }
+    });
+  }
+
+  @Override
+  public void close() {
+    LOGGER.warn("close is not implemented for HdfsNotebookRepo");
+  }
+
+  @Override
+  public Revision checkpoint(String noteId, String checkpointMsg, AuthenticationInfo subject)
+      throws IOException {
+    LOGGER.warn("checkpoint is not implemented for HdfsNotebookRepo");
+    return null;
+  }
+
+  @Override
+  public Note get(String noteId, String revId, AuthenticationInfo subject) throws IOException {
+    LOGGER.warn("get revId is not implemented for HdfsNotebookRepo");
+    return null;
+  }
+
+  @Override
+  public List<Revision> revisionHistory(String noteId, AuthenticationInfo subject) {
+    LOGGER.warn("revisionHistory is not implemented for HdfsNotebookRepo");
+    return null;
+  }
+
+  @Override
+  public Note setNoteRevision(String noteId, String revId, AuthenticationInfo subject)
+      throws IOException {
+    LOGGER.warn("setNoteRevision is not implemented for HdfsNotebookRepo");
+    return null;
+  }
+
+  @Override
+  public List<NotebookRepoSettingsInfo> getSettings(AuthenticationInfo subject) {
+    LOGGER.warn("getSettings is not implemented for HdfsNotebookRepo");
+    return null;
+  }
+
+  @Override
+  public void updateSettings(Map<String, String> settings, AuthenticationInfo subject) {
+    LOGGER.warn("updateSettings is not implemented for HdfsNotebookRepo");
+  }
+
+  private interface HdfsOperation<T> {
+    T call() throws IOException;
+  }
+
+  public synchronized <T> T callHdfsOperation(final HdfsOperation<T> func) throws IOException {
+    if (isSecurityEnabled) {
+      UserGroupInformation.getLoginUser().reloginFromKeytab();
+      try {
+        return UserGroupInformation.getCurrentUser().doAs(new PrivilegedExceptionAction<T>() {
+          @Override
+          public T run() throws Exception {
+            return func.call();
+          }
+        });
+      } catch (InterruptedException e) {
+        throw new IOException(e);
+      }
+    } else {
+      return func.call();
+    }
+  }
+}
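
For a feel of how the new class is driven, here is a minimal usage sketch modelled on the unit test added further down (local directory, anonymous user); the temp path is a placeholder and error handling is omitted.

```
import org.apache.zeppelin.conf.ZeppelinConfiguration;
import org.apache.zeppelin.notebook.Note;
import org.apache.zeppelin.notebook.repo.FileSystemNotebookRepo;
import org.apache.zeppelin.user.AuthenticationInfo;

public class FileSystemNotebookRepoUsage {
  public static void main(String[] args) throws Exception {
    // Point the notebook dir at a local path; any Hadoop-compatible URI works the same way.
    System.setProperty(
        ZeppelinConfiguration.ConfVars.ZEPPELIN_NOTEBOOK_DIR.getVarName(), "/tmp/notebook");
    FileSystemNotebookRepo repo = new FileSystemNotebookRepo(new ZeppelinConfiguration());

    Note note = new Note();
    note.setName("demo");
    repo.save(note, AuthenticationInfo.ANONYMOUS);   // writes <notebookDir>/<noteId>/note.json
    System.out.println(repo.list(AuthenticationInfo.ANONYMOUS).size());  // expect 1
    repo.remove(note.getId(), AuthenticationInfo.ANONYMOUS);
  }
}
```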

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/3f591c23/zeppelin-zengine/src/main/java/org/apache/zeppelin/notebook/repo/HdfsNotebookRepo.java
----------------------------------------------------------------------
diff --git a/zeppelin-zengine/src/main/java/org/apache/zeppelin/notebook/repo/HdfsNotebookRepo.java b/zeppelin-zengine/src/main/java/org/apache/zeppelin/notebook/repo/HdfsNotebookRepo.java
deleted file mode 100644
index fdaaf04..0000000
--- a/zeppelin-zengine/src/main/java/org/apache/zeppelin/notebook/repo/HdfsNotebookRepo.java
+++ /dev/null
@@ -1,200 +0,0 @@
-package org.apache.zeppelin.notebook.repo;
-
-import org.apache.commons.lang.StringUtils;
-import org.apache.hadoop.conf.Configuration;
-import org.apache.hadoop.fs.FileStatus;
-import org.apache.hadoop.fs.FileSystem;
-import org.apache.hadoop.fs.Path;
-import org.apache.hadoop.io.IOUtils;
-import org.apache.hadoop.security.UserGroupInformation;
-import org.apache.zeppelin.conf.ZeppelinConfiguration;
-import org.apache.zeppelin.notebook.Note;
-import org.apache.zeppelin.notebook.NoteInfo;
-import org.apache.zeppelin.user.AuthenticationInfo;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
-import java.io.ByteArrayInputStream;
-import java.io.ByteArrayOutputStream;
-import java.io.IOException;
-import java.io.InputStream;
-import java.security.PrivilegedAction;
-import java.security.PrivilegedExceptionAction;
-import java.util.ArrayList;
-import java.util.List;
-import java.util.Map;
-
-/**
- * NotebookRepos for hdfs.
- *
- * Assume the notebook directory structure is as following
- * - notebookdir
- *              - noteId/note.json
- *              - noteId/note.json
- *              - noteId/note.json
- */
-public class HdfsNotebookRepo implements NotebookRepo {
-  private static final Logger LOGGER = LoggerFactory.getLogger(HdfsNotebookRepo.class);
-
-
-  private Configuration hadoopConf;
-  private ZeppelinConfiguration zConf;
-  private boolean isSecurityEnabled = false;
-  private FileSystem fs;
-  private Path notebookDir;
-
-  public HdfsNotebookRepo(ZeppelinConfiguration zConf) throws IOException {
-    this.zConf = zConf;
-    this.hadoopConf = new Configuration();
-    this.notebookDir = new Path(zConf.getNotebookDir());
-    LOGGER.info("Use hdfs directory {} to store notebook", notebookDir);
-    this.isSecurityEnabled = UserGroupInformation.isSecurityEnabled();
-    if (isSecurityEnabled) {
-      String keytab = zConf.getString(ZeppelinConfiguration.ConfVars.ZEPPELIN_HDFS_KEYTAB);
-      String principal = zConf.getString(ZeppelinConfiguration.ConfVars.ZEPPELIN_HDFS_PRINCIPAL);
-      if (StringUtils.isBlank(keytab) || StringUtils.isBlank(principal)) {
-        throw new IOException("keytab and principal can not be empty, keytab: " + keytab
-            + ", principal: " + principal);
-      }
-      UserGroupInformation.loginUserFromKeytab(principal, keytab);
-    }
-
-    this.fs = FileSystem.get(new Configuration());
-    if (!fs.exists(notebookDir)) {
-      fs.mkdirs(notebookDir);
-      LOGGER.info("Create notebook dir {} in hdfs", notebookDir.toString());
-    }
-    if (fs.isFile(notebookDir)) {
-      throw new IOException("notebookDir {} is file instead of directory, please remove it or " +
-          "specify another directory");
-    }
-
-  }
-
-  @Override
-  public List<NoteInfo> list(AuthenticationInfo subject) throws IOException {
-    return callHdfsOperation(new HdfsOperation<List<NoteInfo>>() {
-      @Override
-      public List<NoteInfo> call() throws IOException {
-        List<NoteInfo> noteInfos = new ArrayList<>();
-        for (FileStatus status : fs.globStatus(new Path(notebookDir, "*/note.json"))) {
-          NoteInfo noteInfo = new NoteInfo(status.getPath().getParent().getName(), "", null);
-          noteInfos.add(noteInfo);
-        }
-        return noteInfos;
-      }
-    });
-  }
-
-  @Override
-  public Note get(final String noteId, AuthenticationInfo subject) throws IOException {
-    return callHdfsOperation(new HdfsOperation<Note>() {
-      @Override
-      public Note call() throws IOException {
-        Path notePath = new Path(notebookDir.toString() + "/" + noteId + "/note.json");
-        LOGGER.debug("Read note from file: " + notePath);
-        ByteArrayOutputStream noteBytes = new ByteArrayOutputStream();
-        IOUtils.copyBytes(fs.open(notePath), noteBytes, hadoopConf);
-        return Note.fromJson(new String(noteBytes.toString(
-            zConf.getString(ZeppelinConfiguration.ConfVars.ZEPPELIN_ENCODING))));
-      }
-    });
-  }
-
-  @Override
-  public void save(final Note note, AuthenticationInfo subject) throws IOException {
-    callHdfsOperation(new HdfsOperation<Void>() {
-      @Override
-      public Void call() throws IOException {
-        Path notePath = new Path(notebookDir.toString() + "/" + note.getId() + "/note.json");
-        Path tmpNotePath = new Path(notebookDir.toString() + "/" + note.getId() + "/.note.json");
-        LOGGER.debug("Saving note to file: " + notePath);
-        if (fs.exists(tmpNotePath)) {
-          fs.delete(tmpNotePath, true);
-        }
-        InputStream in = new ByteArrayInputStream(note.toJson().getBytes(
-            zConf.getString(ZeppelinConfiguration.ConfVars.ZEPPELIN_ENCODING)));
-        IOUtils.copyBytes(in, fs.create(tmpNotePath), hadoopConf);
-        fs.delete(notePath, true);
-        fs.rename(tmpNotePath, notePath);
-        return null;
-      }
-    });
-  }
-
-  @Override
-  public void remove(final String noteId, AuthenticationInfo subject) throws IOException {
-    callHdfsOperation(new HdfsOperation<Void>() {
-      @Override
-      public Void call() throws IOException {
-        Path noteFolder = new Path(notebookDir.toString() + "/" + noteId);
-        fs.delete(noteFolder, true);
-        return null;
-      }
-    });
-  }
-
-  @Override
-  public void close() {
-    LOGGER.warn("close is not implemented for HdfsNotebookRepo");
-  }
-
-  @Override
-  public Revision checkpoint(String noteId, String checkpointMsg, AuthenticationInfo subject)
-      throws IOException {
-    LOGGER.warn("checkpoint is not implemented for HdfsNotebookRepo");
-    return null;
-  }
-
-  @Override
-  public Note get(String noteId, String revId, AuthenticationInfo subject) throws IOException {
-    LOGGER.warn("get revId is not implemented for HdfsNotebookRepo");
-    return null;
-  }
-
-  @Override
-  public List<Revision> revisionHistory(String noteId, AuthenticationInfo subject) {
-    LOGGER.warn("revisionHistory is not implemented for HdfsNotebookRepo");
-    return null;
-  }
-
-  @Override
-  public Note setNoteRevision(String noteId, String revId, AuthenticationInfo subject)
-      throws IOException {
-    LOGGER.warn("setNoteRevision is not implemented for HdfsNotebookRepo");
-    return null;
-  }
-
-  @Override
-  public List<NotebookRepoSettingsInfo> getSettings(AuthenticationInfo subject) {
-    LOGGER.warn("getSettings is not implemented for HdfsNotebookRepo");
-    return null;
-  }
-
-  @Override
-  public void updateSettings(Map<String, String> settings, AuthenticationInfo subject) {
-    LOGGER.warn("updateSettings is not implemented for HdfsNotebookRepo");
-  }
-
-  private interface HdfsOperation<T> {
-    T call() throws IOException;
-  }
-
-  public <T> T callHdfsOperation(final HdfsOperation<T> func) throws IOException {
-    if (isSecurityEnabled) {
-      UserGroupInformation.getLoginUser().reloginFromKeytab();
-      try {
-        return UserGroupInformation.getCurrentUser().doAs(new PrivilegedExceptionAction<T>() {
-          @Override
-          public T run() throws Exception {
-            return func.call();
-          }
-        });
-      } catch (InterruptedException e) {
-        throw new IOException(e);
-      }
-    } else {
-      return func.call();
-    }
-  }
-}

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/3f591c23/zeppelin-zengine/src/main/java/org/apache/zeppelin/notebook/repo/VFSNotebookRepo.java
----------------------------------------------------------------------
diff --git a/zeppelin-zengine/src/main/java/org/apache/zeppelin/notebook/repo/VFSNotebookRepo.java b/zeppelin-zengine/src/main/java/org/apache/zeppelin/notebook/repo/VFSNotebookRepo.java
index 4006d13..3fe5dab 100644
--- a/zeppelin-zengine/src/main/java/org/apache/zeppelin/notebook/repo/VFSNotebookRepo.java
+++ b/zeppelin-zengine/src/main/java/org/apache/zeppelin/notebook/repo/VFSNotebookRepo.java
@@ -218,6 +218,7 @@ public class VFSNotebookRepo implements NotebookRepo {
 
   @Override
   public synchronized void save(Note note, AuthenticationInfo subject) throws IOException {
+    LOG.info("Saving note:" + note.getId());
     String json = note.toJson();
 
     FileObject rootDir = getRootDir();

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/3f591c23/zeppelin-zengine/src/test/java/org/apache/zeppelin/notebook/repo/FileSystemNotebookRepoTest.java
----------------------------------------------------------------------
diff --git a/zeppelin-zengine/src/test/java/org/apache/zeppelin/notebook/repo/FileSystemNotebookRepoTest.java b/zeppelin-zengine/src/test/java/org/apache/zeppelin/notebook/repo/FileSystemNotebookRepoTest.java
new file mode 100644
index 0000000..79eb387
--- /dev/null
+++ b/zeppelin-zengine/src/test/java/org/apache/zeppelin/notebook/repo/FileSystemNotebookRepoTest.java
@@ -0,0 +1,101 @@
+package org.apache.zeppelin.notebook.repo;
+
+
+import org.apache.commons.io.FileUtils;
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.fs.FileSystem;
+import org.apache.hadoop.fs.Path;
+import org.apache.zeppelin.conf.ZeppelinConfiguration;
+import org.apache.zeppelin.notebook.Note;
+import org.apache.zeppelin.user.AuthenticationInfo;
+import org.junit.After;
+import org.junit.Before;
+import org.junit.Test;
+
+import java.io.File;
+import java.io.IOException;
+import java.io.OutputStream;
+import java.nio.file.Files;
+import java.util.HashMap;
+import java.util.Map;
+
+import static org.junit.Assert.assertEquals;
+
+public class FileSystemNotebookRepoTest {
+
+  private ZeppelinConfiguration zConf;
+  private Configuration hadoopConf;
+  private FileSystem fs;
+  private FileSystemNotebookRepo hdfsNotebookRepo;
+  private String notebookDir;
+  private AuthenticationInfo authInfo = AuthenticationInfo.ANONYMOUS;
+
+  @Before
+  public void setUp() throws IOException {
+    notebookDir = Files.createTempDirectory("FileSystemNotebookRepoTest").toFile().getAbsolutePath();
+    zConf = new ZeppelinConfiguration();
+    System.setProperty(ZeppelinConfiguration.ConfVars.ZEPPELIN_NOTEBOOK_DIR.getVarName(), notebookDir);
+    hadoopConf = new Configuration();
+    fs = FileSystem.get(hadoopConf);
+    hdfsNotebookRepo = new FileSystemNotebookRepo(zConf);
+  }
+
+  @After
+  public void tearDown() throws IOException {
+    FileUtils.deleteDirectory(new File(notebookDir));
+  }
+
+  @Test
+  public void testBasics() throws IOException {
+    assertEquals(0, hdfsNotebookRepo.list(authInfo).size());
+
+    // create a new note
+    Note note = new Note();
+    note.setName("title_1");
+
+    Map<String, Object> config = new HashMap<>();
+    config.put("config_1", "value_1");
+    note.setConfig(config);
+    hdfsNotebookRepo.save(note, authInfo);
+    assertEquals(1, hdfsNotebookRepo.list(authInfo).size());
+
+    // read this note from hdfs
+    Note note_copy = hdfsNotebookRepo.get(note.getId(), authInfo);
+    assertEquals(note.getName(), note_copy.getName());
+    assertEquals(note.getConfig(), note_copy.getConfig());
+
+    // update this note
+    note.setName("title_2");
+    hdfsNotebookRepo.save(note, authInfo);
+    assertEquals(1, hdfsNotebookRepo.list(authInfo).size());
+    note_copy = hdfsNotebookRepo.get(note.getId(), authInfo);
+    assertEquals(note.getName(), note_copy.getName());
+    assertEquals(note.getConfig(), note_copy.getConfig());
+
+    // delete this note
+    hdfsNotebookRepo.remove(note.getId(), authInfo);
+    assertEquals(0, hdfsNotebookRepo.list(authInfo).size());
+  }
+
+  @Test
+  public void testComplicatedScenarios() throws IOException {
+    // scenario_1: notebook_dir is not clean. There're some unrecognized dir and file under notebook_dir
+    fs.mkdirs(new Path(notebookDir, "1/2"));
+    OutputStream out = fs.create(new Path(notebookDir, "1/a.json"));
+    out.close();
+
+    assertEquals(0, hdfsNotebookRepo.list(authInfo).size());
+
+    // scenario_2: note_folder is existed.
+    // create a new note
+    Note note = new Note();
+    note.setName("title_1");
+    Map<String, Object> config = new HashMap<>();
+    config.put("config_1", "value_1");
+    note.setConfig(config);
+
+    fs.mkdirs(new Path(notebookDir, note.getId()));
+    hdfsNotebookRepo.save(note, authInfo);
+    assertEquals(1, hdfsNotebookRepo.list(authInfo).size());
+  }
+}

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/3f591c23/zeppelin-zengine/src/test/java/org/apache/zeppelin/notebook/repo/HdfsNotebookRepoTest.java
----------------------------------------------------------------------
diff --git a/zeppelin-zengine/src/test/java/org/apache/zeppelin/notebook/repo/HdfsNotebookRepoTest.java b/zeppelin-zengine/src/test/java/org/apache/zeppelin/notebook/repo/HdfsNotebookRepoTest.java
deleted file mode 100644
index 952d744..0000000
--- a/zeppelin-zengine/src/test/java/org/apache/zeppelin/notebook/repo/HdfsNotebookRepoTest.java
+++ /dev/null
@@ -1,101 +0,0 @@
-package org.apache.zeppelin.notebook.repo;
-
-
-import org.apache.commons.io.FileUtils;
-import org.apache.hadoop.conf.Configuration;
-import org.apache.hadoop.fs.FileSystem;
-import org.apache.hadoop.fs.Path;
-import org.apache.zeppelin.conf.ZeppelinConfiguration;
-import org.apache.zeppelin.notebook.Note;
-import org.apache.zeppelin.user.AuthenticationInfo;
-import org.junit.After;
-import org.junit.Before;
-import org.junit.Test;
-
-import java.io.File;
-import java.io.IOException;
-import java.io.OutputStream;
-import java.nio.file.Files;
-import java.util.HashMap;
-import java.util.Map;
-
-import static org.junit.Assert.assertEquals;
-
-public class HdfsNotebookRepoTest {
-
-  private ZeppelinConfiguration zConf;
-  private Configuration hadoopConf;
-  private FileSystem fs;
-  private HdfsNotebookRepo hdfsNotebookRepo;
-  private String notebookDir;
-  private AuthenticationInfo authInfo = AuthenticationInfo.ANONYMOUS;
-
-  @Before
-  public void setUp() throws IOException {
-    notebookDir = Files.createTempDirectory("HdfsNotebookRepoTest").toFile().getAbsolutePath();
-    zConf = new ZeppelinConfiguration();
-    System.setProperty(ZeppelinConfiguration.ConfVars.ZEPPELIN_NOTEBOOK_DIR.getVarName(), notebookDir);
-    hadoopConf = new Configuration();
-    fs = FileSystem.get(hadoopConf);
-    hdfsNotebookRepo = new HdfsNotebookRepo(zConf);
-  }
-
-  @After
-  public void tearDown() throws IOException {
-    FileUtils.deleteDirectory(new File(notebookDir));
-  }
-
-  @Test
-  public void testBasics() throws IOException {
-    assertEquals(0, hdfsNotebookRepo.list(authInfo).size());
-
-    // create a new note
-    Note note = new Note();
-    note.setName("title_1");
-
-    Map<String, Object> config = new HashMap<>();
-    config.put("config_1", "value_1");
-    note.setConfig(config);
-    hdfsNotebookRepo.save(note, authInfo);
-    assertEquals(1, hdfsNotebookRepo.list(authInfo).size());
-
-    // read this note from hdfs
-    Note note_copy = hdfsNotebookRepo.get(note.getId(), authInfo);
-    assertEquals(note.getName(), note_copy.getName());
-    assertEquals(note.getConfig(), note_copy.getConfig());
-
-    // update this note
-    note.setName("title_2");
-    hdfsNotebookRepo.save(note, authInfo);
-    assertEquals(1, hdfsNotebookRepo.list(authInfo).size());
-    note_copy = hdfsNotebookRepo.get(note.getId(), authInfo);
-    assertEquals(note.getName(), note_copy.getName());
-    assertEquals(note.getConfig(), note_copy.getConfig());
-
-    // delete this note
-    hdfsNotebookRepo.remove(note.getId(), authInfo);
-    assertEquals(0, hdfsNotebookRepo.list(authInfo).size());
-  }
-
-  @Test
-  public void testComplicatedScenarios() throws IOException {
-    // scenario_1: notebook_dir is not clean. There're some unrecognized dir and file under notebook_dir
-    fs.mkdirs(new Path(notebookDir, "1/2"));
-    OutputStream out = fs.create(new Path(notebookDir, "1/a.json"));
-    out.close();
-
-    assertEquals(0, hdfsNotebookRepo.list(authInfo).size());
-
-    // scenario_2: note_folder is existed.
-    // create a new note
-    Note note = new Note();
-    note.setName("title_1");
-    Map<String, Object> config = new HashMap<>();
-    config.put("config_1", "value_1");
-    note.setConfig(config);
-
-    fs.mkdirs(new Path(notebookDir, note.getId()));
-    hdfsNotebookRepo.save(note, authInfo);
-    assertEquals(1, hdfsNotebookRepo.list(authInfo).size());
-  }
-}