Posted to commits@zeppelin.apache.org by jo...@apache.org on 2016/05/31 02:50:18 UTC

incubator-zeppelin git commit: ZEPPELIN-804 Refactoring registration mechanism on Interpreters

Repository: incubator-zeppelin
Updated Branches:
  refs/heads/master f291b2619 -> f8e1f6c4f


ZEPPELIN-804 Refactoring registration mechanism on Interpreters

### What is this PR for?
This PR enables the Zeppelin server to register interpreters without any of their own dependencies. For instance, we currently have to build `spark` together with `spark-dependencies` even when we use our own Spark cluster, because the current initialisation mechanism needs all of the interpreter's dependencies on the classpath.

### What type of PR is it?
[Improvement]

### Todos
* [x] - Add RegisteredInterpreter from interpreter-setting.json in a jar or interpreter/{interpreter}/interpreter-setting.json
* [x] - Adjust it to Spark*Interpreter

### What is the Jira issue?
* https://issues.apache.org/jira/browse/ZEPPELIN-804

### How should this be tested?
1. Prepare your own Spark cluster (e.g. standalone, YARN, or Mesos)
1. rm -rf interpreter
1. mvn clean package -DskipTests -pl 'zeppelin-display,zeppelin-interpreter,zeppelin-server,zeppelin-web,zeppelin-zengine,angular,jdbc,spark'
1. bin/zeppelin-daemon.sh start
1. Check for errors in the log
1. Apply this patch
1. mvn clean package -DskipTests -pl 'zeppelin-display,zeppelin-interpreter,zeppelin-server,zeppelin-web,zeppelin-zengine,angular,jdbc,spark'
1. bin/zeppelin-daemon.sh start
1. Run a paragraph with a simple command such as `sc.version`

### Screenshots (if appropriate)

### Questions:
* Do the license files need updating? No
* Are there breaking changes for older versions? No
* Does this need documentation? No

### Description

#### This PR introduces three initialisation mechanisms, including the current one.
* {interpreter_dir}/{interpreter_group}/interpreter-setting.json
* interpreter-settings.json in your interpreter jar
* Current static initialization
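For a hypothetical interpreter group directory named `myintp` (an illustrative name, not one shipped with Zeppelin), the three mechanisms map onto the layout below:

```
{ZEPPELIN_INTERPRETER_DIR}/
└── myintp/
    ├── interpreter-setting.json   <- 1. settings file in the interpreter directory
    └── myintp-0.1.jar             <- 2. may bundle interpreter-setting.json as a resource
                                      3. otherwise the legacy static Interpreter.register() applies
```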

#### Initialization steps
1. Get {interpreter_dir} from the configuration
1. Get the list of group directories: {interpreter_dir}/[{interpreter_group1},{interpreter_group2}...]
1. Look for {interpreter_dir}/{interpreter_group1}/interpreter-setting.json
1. Look for interpreter-setting.json in the resources of {interpreter_dir}/{interpreter_group1}/**/*.jar
1. Apply the legacy static initialization (keys registered in the earlier steps are not overwritten)
1. Repeat steps 3-5 for {interpreter_group2}, and so on
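The lookup order above can be sketched as a minimal, stdlib-only scanner. This is not the actual `InterpreterFactory` code: the class name `InterpreterSettingScanner` and the returned strings are invented for illustration, and the real implementation parses the JSON it finds rather than merely locating it.

```java
import java.io.File;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import java.util.jar.JarFile;

public class InterpreterSettingScanner {

  /** Walk {interpreter_dir} and report which mechanism applies to each group. */
  public static List<String> scan(File interpreterDir) throws IOException {
    List<String> found = new ArrayList<>();
    File[] groups = interpreterDir.listFiles(File::isDirectory);
    if (groups == null) {
      return found;
    }
    for (File group : groups) {
      // 1. {interpreter_dir}/{group}/interpreter-setting.json
      File json = new File(group, "interpreter-setting.json");
      if (json.isFile()) {
        found.add(group.getName() + ": directory json");
        continue;
      }
      // 2. interpreter-setting.json packaged inside a jar under the group dir
      boolean inJar = false;
      File[] jars = group.listFiles((dir, name) -> name.endsWith(".jar"));
      if (jars != null) {
        for (File jar : jars) {
          try (JarFile jf = new JarFile(jar)) {
            if (jf.getEntry("interpreter-setting.json") != null) {
              found.add(group.getName() + ": jar json");
              inJar = true;
              break;
            }
          }
        }
      }
      // 3. Fall back to the legacy static Interpreter.register(...) mechanism
      if (!inJar) {
        found.add(group.getName() + ": static init (legacy)");
      }
    }
    return found;
  }
}
```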

Author: Jongyoul Lee <jo...@gmail.com>

Closes #835 from jongyoul/ZEPPELIN-804 and squashes the following commits:

823321e [Jongyoul Lee] ZEPPELIN-804 Refactoring registration mechanism on Interpreters - Added documentation
25bc501 [Jongyoul Lee] ZEPPELIN-804 Refactoring registration mechanism on Interpreters - Ignored while registering a new interpreter with existing interpreter key
312dd77 [Jongyoul Lee] ZEPPELIN-804 Refactoring registration mechanism on Interpreters - Reverted log4j setting
81ab361 [Jongyoul Lee] ZEPPELIN-804 Refactoring registration mechanism on Interpreters - Changed logger setting only for test. This will be reverted after test
e8f990f [Jongyoul Lee] ZEPPELIN-804 Refactoring registration mechanism on Interpreters - Fixed some unicode characters in interpreter-setting.json
3ad41bb [Jongyoul Lee] ZEPPELIN-804 Refactoring registration mechanism on Interpreters - Checked if path exists or not
1b3cd0c [Jongyoul Lee] ZEPPELIN-804 Refactoring registration mechanism on Interpreters - Checked if path exists or not
844dccb [Jongyoul Lee] ZEPPELIN-804 Refactoring registration mechanism on Interpreters - Fixed logic to check for supporting legacy mechanism
c5b7d54 [Jongyoul Lee] ZEPPELIN-804 Refactoring registration mechanism on Interpreters - Extracted new initialization logic into another methods
5d63d91 [Jongyoul Lee] ZEPPELIN-804 Refactoring registration mechanism on Interpreters - Removed static initialisation of Spark*Interpreter
0d720c0 [Jongyoul Lee] ZEPPELIN-804 Refactoring registration mechanism on Interpreters - Removed a interpreter which fails to initialize
d343636 [Jongyoul Lee] ZEPPELIN-804 Refactoring registration mechanism on Interpreters - Excluded SparkRInterpreter from interpreter-setting.json
519f057 [Jongyoul Lee] ZEPPELIN-804 Refactoring registration mechanism on Interpreters - Excluded interpreter-setting.json from rat check
00f55a8 [Jongyoul Lee] ZEPPELIN-804 Refactoring registration mechanism on Interpreters - Excluded interpreter-setting.json from rat check
1fa2e52 [Jongyoul Lee] ZEPPELIN-804 Refactoring registration mechanism on Interpreters - Fixed test environments
8a90fe4 [Jongyoul Lee] ZEPPELIN-804 Refactoring registration mechanism on Interpreters - Fixed test environments
d54f98e [Jongyoul Lee] ZEPPELIN-804 Refactoring registration mechanism on Interpreters - Changed Spark*Interpreter to use interpreter-setting.json
48ac41d [Jongyoul Lee] ZEPPELIN-804 Refactoring registration mechanism on Interpreters - Fixed the style
ca7b96c [Jongyoul Lee] ZEPPELIN-804 Refactoring registration mechanism on Interpreters - Added a new initialization mechanism to use interpreter-setting.json - Adjusted new mechanism to SparkInterpreter for verification


Project: http://git-wip-us.apache.org/repos/asf/incubator-zeppelin/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-zeppelin/commit/f8e1f6c4
Tree: http://git-wip-us.apache.org/repos/asf/incubator-zeppelin/tree/f8e1f6c4
Diff: http://git-wip-us.apache.org/repos/asf/incubator-zeppelin/diff/f8e1f6c4

Branch: refs/heads/master
Commit: f8e1f6c4fa751716e61b9aa8894f7e0599c1faf8
Parents: f291b26
Author: Jongyoul Lee <jo...@gmail.com>
Authored: Sun May 29 20:52:40 2016 +0900
Committer: Jongyoul Lee <jo...@apache.org>
Committed: Tue May 31 11:50:08 2016 +0900

----------------------------------------------------------------------
 docs/development/writingzeppelininterpreter.md  |  40 ++++-
 spark/pom.xml                                   |   1 +
 .../apache/zeppelin/spark/DepInterpreter.java   |  17 --
 .../zeppelin/spark/PySparkInterpreter.java      |  11 --
 .../apache/zeppelin/spark/SparkInterpreter.java |  32 ----
 .../zeppelin/spark/SparkRInterpreter.java       |  26 ---
 .../zeppelin/spark/SparkSqlInterpreter.java     |  21 ---
 .../src/main/resources/interpreter-setting.json | 146 +++++++++++++++++
 .../zeppelin/spark/DepInterpreterTest.java      |   9 +-
 .../zeppelin/spark/SparkInterpreterTest.java    |  16 +-
 .../zeppelin/spark/SparkSqlInterpreterTest.java |   4 +
 .../zeppelin/interpreter/Interpreter.java       |  57 ++++---
 .../interpreter/InterpreterProperty.java        |  53 ++++++-
 .../interpreter/remote/RemoteInterpreter.java   |  13 +-
 .../zeppelin/conf/ZeppelinConfiguration.java    |   5 +
 .../interpreter/InterpreterFactory.java         | 157 +++++++++++++++----
 16 files changed, 437 insertions(+), 171 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/incubator-zeppelin/blob/f8e1f6c4/docs/development/writingzeppelininterpreter.md
----------------------------------------------------------------------
diff --git a/docs/development/writingzeppelininterpreter.md b/docs/development/writingzeppelininterpreter.md
index 0842fe6..e7bf635 100644
--- a/docs/development/writingzeppelininterpreter.md
+++ b/docs/development/writingzeppelininterpreter.md
@@ -36,14 +36,48 @@ In 'Separate Interpreter for each note' mode, new Interpreter instance will be c
 ### Make your own Interpreter
 
 Creating a new interpreter is quite simple. Just extend [org.apache.zeppelin.interpreter](https://github.com/apache/incubator-zeppelin/blob/master/zeppelin-interpreter/src/main/java/org/apache/zeppelin/interpreter/Interpreter.java) abstract class and implement some methods.
-You can include `org.apache.zeppelin:zeppelin-interpreter:[VERSION]` artifact in your build system.
-Your interpreter name is derived from the static register method.
-
+You can include the `org.apache.zeppelin:zeppelin-interpreter:[VERSION]` artifact in your build system. You should then place your jars under your own interpreter directory, with a specific directory name. The Zeppelin server reads interpreter directories recursively and initializes the interpreters it finds, including your own.
+
+There are three locations where you can store your interpreter group, name and other information. The Zeppelin server first tries to find the location below. Next, Zeppelin tries to find `interpreter-setting.json` in your interpreter jar.
+```
+{ZEPPELIN_INTERPRETER_DIR}/{YOUR_OWN_INTERPRETER_DIR}/interpreter-setting.json
+```
+
+Here is an example of `interpreter-setting.json` for your own interpreter.
+```json
+[
+  {
+    "interpreterGroup": "your-group",
+    "interpreterName": "your-name",
+    "interpreterClassName": "your.own.interpreter.class",
+    "properties": {
+      "properties1": {
+        "envName": null,
+        "propertyName": "property.1.name",
+        "defaultValue": "propertyDefaultValue",
+        "description": "Property description"
+      },
+      "properties2": {
+        "envName": "PROPERTIES_2",
+        "propertyName": null,
+        "defaultValue": "property2DefaultValue",
+        "description": "Property 2 description"
+      }, ...
+    }
+  },
+  {
+    ...
+  } 
+]
+```
+
+Finally, Zeppelin uses static initialization with the following:
 ```
 static {
     Interpreter.register("MyInterpreterName", MyClassName.class.getName());
   }
 ```
+**Static initialization is deprecated and will only be supported until 0.6.0.**
 
 The name will appear later in the interpreter name option box during the interpreter configuration process.
 The name of the interpreter is what you later write to identify a paragraph which should be interpreted using this interpreter.

http://git-wip-us.apache.org/repos/asf/incubator-zeppelin/blob/f8e1f6c4/spark/pom.xml
----------------------------------------------------------------------
diff --git a/spark/pom.xml b/spark/pom.xml
index e3332d6..14bc9a7 100644
--- a/spark/pom.xml
+++ b/spark/pom.xml
@@ -313,6 +313,7 @@
             <exclude>**/metastore_db/</exclude>
             <exclude>**/README.md</exclude>
             <exclude>**/dependency-reduced-pom.xml</exclude>
+            <exclude>**/interpreter-setting.json</exclude>
           </excludes>
         </configuration>
       </plugin>

http://git-wip-us.apache.org/repos/asf/incubator-zeppelin/blob/f8e1f6c4/spark/src/main/java/org/apache/zeppelin/spark/DepInterpreter.java
----------------------------------------------------------------------
diff --git a/spark/src/main/java/org/apache/zeppelin/spark/DepInterpreter.java b/spark/src/main/java/org/apache/zeppelin/spark/DepInterpreter.java
index 2586955..f7c164c 100644
--- a/spark/src/main/java/org/apache/zeppelin/spark/DepInterpreter.java
+++ b/spark/src/main/java/org/apache/zeppelin/spark/DepInterpreter.java
@@ -62,23 +62,6 @@ import scala.tools.nsc.settings.MutableSettings.PathSetting;
  *
  */
 public class DepInterpreter extends Interpreter {
-
-  static {
-    Interpreter.register(
-        "dep",
-        "spark",
-        DepInterpreter.class.getName(),
-        new InterpreterPropertyBuilder()
-            .add("zeppelin.dep.localrepo",
-                getSystemDefault("ZEPPELIN_DEP_LOCALREPO", null, "local-repo"),
-                "local repository for dependency loader")
-            .add("zeppelin.dep.additionalRemoteRepository",
-                "spark-packages,http://dl.bintray.com/spark-packages/maven,false;",
-                "A list of 'id,remote-repository-URL,is-snapshot;' for each remote repository.")
-            .build());
-
-  }
-
   private SparkIMain intp;
   private ByteArrayOutputStream out;
   private SparkDependencyContext depc;

http://git-wip-us.apache.org/repos/asf/incubator-zeppelin/blob/f8e1f6c4/spark/src/main/java/org/apache/zeppelin/spark/PySparkInterpreter.java
----------------------------------------------------------------------
diff --git a/spark/src/main/java/org/apache/zeppelin/spark/PySparkInterpreter.java b/spark/src/main/java/org/apache/zeppelin/spark/PySparkInterpreter.java
index ea00541..1e2ef2e 100644
--- a/spark/src/main/java/org/apache/zeppelin/spark/PySparkInterpreter.java
+++ b/spark/src/main/java/org/apache/zeppelin/spark/PySparkInterpreter.java
@@ -77,17 +77,6 @@ public class PySparkInterpreter extends Interpreter implements ExecuteResultHand
   private String scriptPath;
   boolean pythonscriptRunning = false;
 
-  static {
-    Interpreter.register(
-        "pyspark",
-        "spark",
-        PySparkInterpreter.class.getName(),
-        new InterpreterPropertyBuilder()
-          .add("zeppelin.pyspark.python",
-               SparkInterpreter.getSystemDefault("PYSPARK_PYTHON", null, "python"),
-               "Python command to run pyspark with").build());
-  }
-
   public PySparkInterpreter(Properties property) {
     super(property);
 

http://git-wip-us.apache.org/repos/asf/incubator-zeppelin/blob/f8e1f6c4/spark/src/main/java/org/apache/zeppelin/spark/SparkInterpreter.java
----------------------------------------------------------------------
diff --git a/spark/src/main/java/org/apache/zeppelin/spark/SparkInterpreter.java b/spark/src/main/java/org/apache/zeppelin/spark/SparkInterpreter.java
index 60613d3..0127914 100644
--- a/spark/src/main/java/org/apache/zeppelin/spark/SparkInterpreter.java
+++ b/spark/src/main/java/org/apache/zeppelin/spark/SparkInterpreter.java
@@ -80,38 +80,6 @@ import scala.tools.nsc.settings.MutableSettings.PathSetting;
 public class SparkInterpreter extends Interpreter {
   public static Logger logger = LoggerFactory.getLogger(SparkInterpreter.class);
 
-  static {
-    Interpreter.register(
-      "spark",
-      "spark",
-      SparkInterpreter.class.getName(),
-      new InterpreterPropertyBuilder()
-        .add("spark.app.name",
-          getSystemDefault("SPARK_APP_NAME", "spark.app.name", "Zeppelin"),
-          "The name of spark application.")
-        .add("master",
-          getSystemDefault("MASTER", "spark.master", "local[*]"),
-          "Spark master uri. ex) spark://masterhost:7077")
-        .add("spark.executor.memory",
-          getSystemDefault(null, "spark.executor.memory", ""),
-          "Executor memory per worker instance. ex) 512m, 32g")
-        .add("spark.cores.max",
-          getSystemDefault(null, "spark.cores.max", ""),
-          "Total number of cores to use. Empty value uses all available core.")
-        .add("zeppelin.spark.useHiveContext",
-          getSystemDefault("ZEPPELIN_SPARK_USEHIVECONTEXT",
-            "zeppelin.spark.useHiveContext", "true"),
-          "Use HiveContext instead of SQLContext if it is true.")
-        .add("zeppelin.spark.maxResult",
-          getSystemDefault("ZEPPELIN_SPARK_MAXRESULT", "zeppelin.spark.maxResult", "1000"),
-          "Max number of SparkSQL result to display.")
-        .add("args", "", "spark commandline args")
-        .add("zeppelin.spark.printREPLOutput", "true",
-          "Print REPL output")
-        .build()
-    );
-  }
-
   private ZeppelinContext z;
   private SparkILoop interpreter;
   private SparkIMain intp;

http://git-wip-us.apache.org/repos/asf/incubator-zeppelin/blob/f8e1f6c4/spark/src/main/java/org/apache/zeppelin/spark/SparkRInterpreter.java
----------------------------------------------------------------------
diff --git a/spark/src/main/java/org/apache/zeppelin/spark/SparkRInterpreter.java b/spark/src/main/java/org/apache/zeppelin/spark/SparkRInterpreter.java
index e0ea766..021c95f 100644
--- a/spark/src/main/java/org/apache/zeppelin/spark/SparkRInterpreter.java
+++ b/spark/src/main/java/org/apache/zeppelin/spark/SparkRInterpreter.java
@@ -42,32 +42,6 @@ public class SparkRInterpreter extends Interpreter {
   private static String renderOptions;
   private ZeppelinR zeppelinR;
 
-  static {
-    Interpreter.register(
-      "r",
-      "spark",
-      SparkRInterpreter.class.getName(),
-      new InterpreterPropertyBuilder()
-          .add("zeppelin.R.cmd",
-              SparkInterpreter.getSystemDefault("ZEPPELIN_R_CMD", "zeppelin.R.cmd", "R"),
-              "R repl path")
-          .add("zeppelin.R.knitr",
-              SparkInterpreter.getSystemDefault("ZEPPELIN_R_KNITR", "zeppelin.R.knitr", "true"),
-              "whether use knitr or not")
-          .add("zeppelin.R.image.width",
-              SparkInterpreter.getSystemDefault("ZEPPELIN_R_IMAGE_WIDTH",
-                  "zeppelin.R.image.width", "100%"),
-              "")
-          .add("zeppelin.R.render.options",
-              SparkInterpreter.getSystemDefault("ZEPPELIN_R_RENDER_OPTIONS",
-                  "zeppelin.R.render.options",
-                  "out.format = 'html', comment = NA, "
-                      + "echo = FALSE, results = 'asis', message = F, warning = F"),
-              "")
-          .build());
-  }
-
-
   public SparkRInterpreter(Properties property) {
     super(property);
   }

http://git-wip-us.apache.org/repos/asf/incubator-zeppelin/blob/f8e1f6c4/spark/src/main/java/org/apache/zeppelin/spark/SparkSqlInterpreter.java
----------------------------------------------------------------------
diff --git a/spark/src/main/java/org/apache/zeppelin/spark/SparkSqlInterpreter.java b/spark/src/main/java/org/apache/zeppelin/spark/SparkSqlInterpreter.java
index 3b850b4..ed2e336 100644
--- a/spark/src/main/java/org/apache/zeppelin/spark/SparkSqlInterpreter.java
+++ b/spark/src/main/java/org/apache/zeppelin/spark/SparkSqlInterpreter.java
@@ -46,27 +46,6 @@ public class SparkSqlInterpreter extends Interpreter {
   Logger logger = LoggerFactory.getLogger(SparkSqlInterpreter.class);
   AtomicInteger num = new AtomicInteger(0);
 
-  static {
-    Interpreter.register(
-        "sql",
-        "spark",
-        SparkSqlInterpreter.class.getName(),
-        new InterpreterPropertyBuilder()
-            .add("zeppelin.spark.maxResult",
-                SparkInterpreter.getSystemDefault("ZEPPELIN_SPARK_MAXRESULT",
-                    "zeppelin.spark.maxResult", "1000"),
-                "Max number of SparkSQL result to display.")
-            .add("zeppelin.spark.concurrentSQL",
-                SparkInterpreter.getSystemDefault("ZEPPELIN_SPARK_CONCURRENTSQL",
-                    "zeppelin.spark.concurrentSQL", "false"),
-                "Execute multiple SQL concurrently if set true.")
-            .add("zeppelin.spark.sql.stacktrace",
-                SparkInterpreter.getSystemDefault("ZEPPELIN_SPARK_SQL_STACKTRACE",
-                    "zeppelin.spark.sql.stacktrace", "false"),
-                "Show full exception stacktrace for SQL queries if set to true.")
-            .build());
-  }
-
   private String getJobGroup(InterpreterContext context){
     return "zeppelin-" + context.getParagraphId();
   }

http://git-wip-us.apache.org/repos/asf/incubator-zeppelin/blob/f8e1f6c4/spark/src/main/resources/interpreter-setting.json
----------------------------------------------------------------------
diff --git a/spark/src/main/resources/interpreter-setting.json b/spark/src/main/resources/interpreter-setting.json
new file mode 100644
index 0000000..ee7a192
--- /dev/null
+++ b/spark/src/main/resources/interpreter-setting.json
@@ -0,0 +1,146 @@
+[
+  {
+    "interpreterGroup": "spark",
+    "interpreterName": "spark",
+    "interpreterClassName": "org.apache.zeppelin.spark.SparkInterpreter",
+    "properties": {
+      "spark.executor.memory": {
+        "envName": null,
+        "propertyName": "spark.executor.memory",
+        "defaultValue": "",
+        "description": "Executor memory per worker instance. ex) 512m, 32g"
+      },
+      "args": {
+        "envName": null,
+        "propertyName": null,
+        "defaultValue": "",
+        "description": "spark commandline args"
+      },
+      "zeppelin.spark.useHiveContext": {
+        "envName": "ZEPPELIN_SPARK_USEHIVECONTEXT",
+        "propertyName": "zeppelin.spark.useHiveContext",
+        "defaultValue": "true",
+        "description": "Use HiveContext instead of SQLContext if it is true."
+      },
+      "spark.app.name": {
+        "envName": "SPARK_APP_NAME",
+
+        "propertyName": "spark.app.name",
+        "defaultValue": "Zeppelin",
+        "description": "The name of spark application."
+      },
+      "zeppelin.spark.printREPLOutput": {
+        "envName": null,
+        "propertyName": null,
+        "defaultValue": "true",
+        "description": "Print REPL output"
+      },
+      "spark.cores.max": {
+        "envName": null,
+        "propertyName": "spark.cores.max",
+        "defaultValue": "",
+        "description": "Total number of cores to use. Empty value uses all available core."
+      },
+      "zeppelin.spark.maxResult": {
+        "envName": "ZEPPELIN_SPARK_MAXRESULT",
+        "propertyName": "zeppelin.spark.maxResult",
+        "defaultValue": "1000",
+        "description": "Max number of SparkSQL result to display."
+      },
+      "master": {
+        "envName": "MASTER",
+        "propertyName": "spark.master",
+        "defaultValue": "local[*]",
+        "description": "Spark master uri. ex) spark://masterhost:7077"
+      }
+    }
+  },
+  {
+    "interpreterGroup": "spark",
+    "interpreterName": "sql",
+    "interpreterClassName": "org.apache.zeppelin.spark.SparkSqlInterpreter",
+    "properties": {
+      "zeppelin.spark.concurrentSQL": {
+        "envName": "ZEPPELIN_SPARK_CONCURRENTSQL",
+        "propertyName": "zeppelin.spark.concurrentSQL",
+        "defaultValue": "false",
+        "description": "Execute multiple SQL concurrently if set true."
+      },
+      "zeppelin.spark.sql.stacktrace": {
+        "envName": "ZEPPELIN_SPARK_SQL_STACKTRACE",
+        "propertyName": "zeppelin.spark.sql.stacktrace",
+        "defaultValue": "false",
+        "description": "Show full exception stacktrace for SQL queries if set to true."
+      },
+      "zeppelin.spark.maxResult": {
+        "envName": "ZEPPELIN_SPARK_MAXRESULT",
+        "propertyName": "zeppelin.spark.maxResult",
+        "defaultValue": "1000",
+        "description": "Max number of SparkSQL result to display."
+      }
+    }
+  },
+  {
+    "interpreterGroup": "spark",
+    "interpreterName": "dep",
+    "interpreterClassName": "org.apache.zeppelin.spark.DepInterpreter",
+    "properties": {
+      "zeppelin.dep.localrepo": {
+        "envName": "ZEPPELIN_DEP_LOCALREPO",
+        "propertyName": null,
+        "defaultValue": "local-repo",
+        "description": "local repository for dependency loader"
+      },
+      "zeppelin.dep.additionalRemoteRepository": {
+        "envName": null,
+        "propertyName": null,
+        "defaultValue": "spark-packages,http://dl.bintray.com/spark-packages/maven,false;",
+        "description": "A list of 'id,remote-repository-URL,is-snapshot;' for each remote repository."
+      }
+    }
+  },
+  {
+    "interpreterGroup": "spark",
+    "interpreterName": "pyspark",
+    "interpreterClassName": "org.apache.zeppelin.spark.PySparkInterpreter",
+    "properties": {
+      "zeppelin.pyspark.python": {
+        "envName": "PYSPARK_PYTHON",
+        "propertyName": null,
+        "defaultValue": "python",
+        "description": "Python command to run pyspark with"
+      }
+    }
+  },
+  {
+    "interpreterGroup": "spark",
+    "interpreterName": "r",
+    "interpreterClassName": "org.apache.zeppelin.spark.SparkRInterpreter",
+    "properties": {
+      "zeppelin.R.knitr": {
+        "envName": "ZEPPELIN_R_KNITR",
+        "propertyName": "zeppelin.R.knitr",
+        "defaultValue": "true",
+        "description": "whether use knitr or not"
+      },
+      "zeppelin.R.cmd": {
+        "envName": "ZEPPELIN_R_CMD",
+        "propertyName": "zeppelin.R.cmd",
+        "defaultValue": "R",
+        "description": "R repl path"
+      },
+      "zeppelin.R.image.width": {
+        "envName": "ZEPPELIN_R_IMAGE_WIDTH",
+        "propertyName": "zeppelin.R.image.width",
+        "defaultValue": "100%",
+        "description": ""
+      },
+      "zeppelin.R.render.options": {
+        "envName": "ZEPPELIN_R_RENDER_OPTIONS",
+        "propertyName": "zeppelin.R.render.options",
+        "defaultValue": "out.format = 'html', comment = NA, echo = FALSE, results = 'asis', message = F, warning = F",
+        "description": ""
+      }
+    }
+  }
+]

http://git-wip-us.apache.org/repos/asf/incubator-zeppelin/blob/f8e1f6c4/spark/src/test/java/org/apache/zeppelin/spark/DepInterpreterTest.java
----------------------------------------------------------------------
diff --git a/spark/src/test/java/org/apache/zeppelin/spark/DepInterpreterTest.java b/spark/src/test/java/org/apache/zeppelin/spark/DepInterpreterTest.java
index dc8fd4c..03ecb9e 100644
--- a/spark/src/test/java/org/apache/zeppelin/spark/DepInterpreterTest.java
+++ b/spark/src/test/java/org/apache/zeppelin/spark/DepInterpreterTest.java
@@ -39,6 +39,13 @@ public class DepInterpreterTest {
   private File tmpDir;
   private SparkInterpreter repl;
 
+  private Properties getTestProperties() {
+    Properties p = new Properties();
+    p.setProperty("zeppelin.dep.localrepo", "local-repo");
+    p.setProperty("zeppelin.dep.additionalRemoteRepository", "spark-packages,http://dl.bintray.com/spark-packages/maven,false;");
+    return p;
+  }
+
   @Before
   public void setUp() throws Exception {
     tmpDir = new File(System.getProperty("java.io.tmpdir") + "/ZeppelinLTest_" + System.currentTimeMillis());
@@ -46,7 +53,7 @@ public class DepInterpreterTest {
 
     tmpDir.mkdirs();
 
-    Properties p = new Properties();
+    Properties p = getTestProperties();
 
     dep = new DepInterpreter(p);
     dep.open();

http://git-wip-us.apache.org/repos/asf/incubator-zeppelin/blob/f8e1f6c4/spark/src/test/java/org/apache/zeppelin/spark/SparkInterpreterTest.java
----------------------------------------------------------------------
diff --git a/spark/src/test/java/org/apache/zeppelin/spark/SparkInterpreterTest.java b/spark/src/test/java/org/apache/zeppelin/spark/SparkInterpreterTest.java
index 5b13277..409f938 100644
--- a/spark/src/test/java/org/apache/zeppelin/spark/SparkInterpreterTest.java
+++ b/spark/src/test/java/org/apache/zeppelin/spark/SparkInterpreterTest.java
@@ -63,6 +63,16 @@ public class SparkInterpreterTest {
     return version;
   }
 
+  public static Properties getSparkTestProperties() {
+    Properties p = new Properties();
+    p.setProperty("master", "local[*]");
+    p.setProperty("spark.app.name", "Zeppelin Test");
+    p.setProperty("zeppelin.spark.useHiveContext", "true");
+    p.setProperty("zeppelin.spark.maxResult", "1000");
+
+    return p;
+  }
+
   @Before
   public void setUp() throws Exception {
     tmpDir = new File(System.getProperty("java.io.tmpdir") + "/ZeppelinLTest_" + System.currentTimeMillis());
@@ -71,10 +81,9 @@ public class SparkInterpreterTest {
     tmpDir.mkdirs();
 
     if (repl == null) {
-      Properties p = new Properties();
       intpGroup = new InterpreterGroup();
       intpGroup.put("note", new LinkedList<Interpreter>());
-      repl = new SparkInterpreter(p);
+      repl = new SparkInterpreter(getSparkTestProperties());
       repl.setInterpreterGroup(intpGroup);
       intpGroup.get("note").add(repl);
       repl.open();
@@ -207,8 +216,7 @@ public class SparkInterpreterTest {
   @Test
   public void shareSingleSparkContext() throws InterruptedException {
     // create another SparkInterpreter
-    Properties p = new Properties();
-    SparkInterpreter repl2 = new SparkInterpreter(p);
+    SparkInterpreter repl2 = new SparkInterpreter(getSparkTestProperties());
     repl2.setInterpreterGroup(intpGroup);
     intpGroup.get("note").add(repl2);
     repl2.open();

http://git-wip-us.apache.org/repos/asf/incubator-zeppelin/blob/f8e1f6c4/spark/src/test/java/org/apache/zeppelin/spark/SparkSqlInterpreterTest.java
----------------------------------------------------------------------
diff --git a/spark/src/test/java/org/apache/zeppelin/spark/SparkSqlInterpreterTest.java b/spark/src/test/java/org/apache/zeppelin/spark/SparkSqlInterpreterTest.java
index c2cc1e6..3196cf5 100644
--- a/spark/src/test/java/org/apache/zeppelin/spark/SparkSqlInterpreterTest.java
+++ b/spark/src/test/java/org/apache/zeppelin/spark/SparkSqlInterpreterTest.java
@@ -46,6 +46,10 @@ public class SparkSqlInterpreterTest {
   @Before
   public void setUp() throws Exception {
     Properties p = new Properties();
+    p.putAll(SparkInterpreterTest.getSparkTestProperties());
+    p.setProperty("zeppelin.spark.maxResult", "1000");
+    p.setProperty("zeppelin.spark.concurrentSQL", "false");
+    p.setProperty("zeppelin.spark.sql.stacktrace", "false");
 
     if (repl == null) {
 

http://git-wip-us.apache.org/repos/asf/incubator-zeppelin/blob/f8e1f6c4/zeppelin-interpreter/src/main/java/org/apache/zeppelin/interpreter/Interpreter.java
----------------------------------------------------------------------
diff --git a/zeppelin-interpreter/src/main/java/org/apache/zeppelin/interpreter/Interpreter.java b/zeppelin-interpreter/src/main/java/org/apache/zeppelin/interpreter/Interpreter.java
index 6475cb7..5ad0980 100644
--- a/zeppelin-interpreter/src/main/java/org/apache/zeppelin/interpreter/Interpreter.java
+++ b/zeppelin-interpreter/src/main/java/org/apache/zeppelin/interpreter/Interpreter.java
@@ -25,6 +25,7 @@ import java.util.List;
 import java.util.Map;
 import java.util.Properties;
 
+import com.google.gson.annotations.SerializedName;
 import org.apache.zeppelin.scheduler.Scheduler;
 import org.apache.zeppelin.scheduler.SchedulerFactory;
 import org.slf4j.Logger;
@@ -129,6 +130,7 @@ public abstract class Interpreter {
   protected Properties property;
 
   public Interpreter(Properties property) {
+    logger.debug("Properties: {}", property);
     this.property = property;
   }
 
@@ -140,13 +142,16 @@ public abstract class Interpreter {
     Properties p = new Properties();
     p.putAll(property);
 
-    Map<String, InterpreterProperty> defaultProperties = Interpreter
-        .findRegisteredInterpreterByClassName(getClassName()).getProperties();
-    for (String k : defaultProperties.keySet()) {
-      if (!p.containsKey(k)) {
-        String value = defaultProperties.get(k).getDefaultValue();
-        if (value != null) {
-          p.put(k, defaultProperties.get(k).getDefaultValue());
+    RegisteredInterpreter registeredInterpreter = Interpreter.findRegisteredInterpreterByClassName(
+        getClassName());
+    if (null != registeredInterpreter) {
+      Map<String, InterpreterProperty> defaultProperties = registeredInterpreter.getProperties();
+      for (String k : defaultProperties.keySet()) {
+        if (!p.containsKey(k)) {
+          String value = defaultProperties.get(k).getValue();
+          if (value != null) {
+            p.put(k, defaultProperties.get(k).getValue());
+          }
         }
       }
     }
@@ -155,17 +160,9 @@ public abstract class Interpreter {
   }
 
   public String getProperty(String key) {
-    if (property.containsKey(key)) {
-      return property.getProperty(key);
-    }
-
-    Map<String, InterpreterProperty> defaultProperties = Interpreter
-        .findRegisteredInterpreterByClassName(getClassName()).getProperties();
-    if (defaultProperties.containsKey(key)) {
-      return defaultProperties.get(key).getDefaultValue();
-    }
+    logger.debug("key: {}, value: {}", key, getProperty().getProperty(key));
 
-    return null;
+    return getProperty().getProperty(key);
   }
 
 
@@ -228,8 +225,11 @@ public abstract class Interpreter {
    * Represent registered interpreter class
    */
   public static class RegisteredInterpreter {
-    private String name;
+    @SerializedName("interpreterGroup")
     private String group;
+    @SerializedName("interpreterName")
+    private String name;
+    @SerializedName("interpreterClassName")
     private String className;
     private Map<String, InterpreterProperty> properties;
     private String path;
@@ -267,6 +267,10 @@ public abstract class Interpreter {
       return path;
     }
 
+    public String getInterpreterKey() {
+      return getGroup() + "." + getName();
+    }
+
   }
 
   /**
@@ -287,10 +291,21 @@ public abstract class Interpreter {
     register(name, group, className, new HashMap<String, InterpreterProperty>());
   }
 
+  @Deprecated
   public static void register(String name, String group, String className,
-      Map<String, InterpreterProperty> properties) {
-    registeredInterpreters.put(group + "." + name, new RegisteredInterpreter(
-        name, group, className, properties));
+                              Map<String, InterpreterProperty> properties) {
+    logger.error("Static initialization is deprecated. Use interpreter-setting.json " +
+                     "in your jar or interpreter/{interpreter}/interpreter-setting.json " +
+                     "instead");
+    register(new RegisteredInterpreter(name, group, className, properties));
+  }
+
+  public static void register(RegisteredInterpreter registeredInterpreter) {
+    // TODO(jongyoul): Error should occur when two same interpreter key with different settings
+    String interpreterKey = registeredInterpreter.getInterpreterKey();
+    if (!registeredInterpreters.containsKey(interpreterKey)) {
+      registeredInterpreters.put(interpreterKey, registeredInterpreter);
+    }
   }
 
   public static RegisteredInterpreter findRegisteredInterpreterByClassName(String className) {

http://git-wip-us.apache.org/repos/asf/incubator-zeppelin/blob/f8e1f6c4/zeppelin-interpreter/src/main/java/org/apache/zeppelin/interpreter/InterpreterProperty.java
----------------------------------------------------------------------
diff --git a/zeppelin-interpreter/src/main/java/org/apache/zeppelin/interpreter/InterpreterProperty.java b/zeppelin-interpreter/src/main/java/org/apache/zeppelin/interpreter/InterpreterProperty.java
index cc13ace..488f2a1 100644
--- a/zeppelin-interpreter/src/main/java/org/apache/zeppelin/interpreter/InterpreterProperty.java
+++ b/zeppelin-interpreter/src/main/java/org/apache/zeppelin/interpreter/InterpreterProperty.java
@@ -21,16 +21,39 @@ package org.apache.zeppelin.interpreter;
  * Represent property of interpreter
  */
 public class InterpreterProperty {
+  String envName;
+  String propertyName;
   String defaultValue;
   String description;
 
-  public InterpreterProperty(String defaultValue,
-      String description) {
-    super();
+  public InterpreterProperty(String envName, String propertyName, String defaultValue,
+                                String description) {
+    this.envName = envName;
+    this.propertyName = propertyName;
     this.defaultValue = defaultValue;
     this.description = description;
   }
 
+  public InterpreterProperty(String defaultValue, String description) {
+    this(null, null, defaultValue, description);
+  }
+
+  public String getEnvName() {
+    return envName;
+  }
+
+  public void setEnvName(String envName) {
+    this.envName = envName;
+  }
+
+  public String getPropertyName() {
+    return propertyName;
+  }
+
+  public void setPropertyName(String propertyName) {
+    this.propertyName = propertyName;
+  }
+
   public String getDefaultValue() {
     return defaultValue;
   }
@@ -46,4 +69,28 @@ public class InterpreterProperty {
   public void setDescription(String description) {
     this.description = description;
   }
+
+  public String getValue() {
+    //TODO(jongyoul): Remove SparkInterpreter's getSystemDefault method
+    if (envName != null && !envName.isEmpty()) {
+      String envValue = System.getenv().get(envName);
+      if (envValue != null) {
+        return envValue;
+      }
+    }
+
+    if (propertyName != null && !propertyName.isEmpty()) {
+      String propValue = System.getProperty(propertyName);
+      if (propValue != null) {
+        return propValue;
+      }
+    }
+    return defaultValue;
+  }
+
+  @Override
+  public String toString() {
+    return String.format("{envName=%s, propertyName=%s, defaultValue=%s, description=%20s}",
+        envName, propertyName, defaultValue, description);
+  }
 }
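The resolution order implemented by `getValue()` above is: environment variable, then JVM system property, then the registered default. A minimal standalone sketch of that precedence (class and method names here are illustrative, not from the patch):

```java
public class PropertyResolution {

    // Same lookup order as InterpreterProperty.getValue():
    // 1. environment variable, 2. system property, 3. default value.
    static String resolve(String envName, String propertyName, String defaultValue) {
        if (envName != null && !envName.isEmpty()) {
            String envValue = System.getenv(envName);
            if (envValue != null) {
                return envValue;
            }
        }
        if (propertyName != null && !propertyName.isEmpty()) {
            String propValue = System.getProperty(propertyName);
            if (propValue != null) {
                return propValue;
            }
        }
        return defaultValue;
    }

    public static void main(String[] args) {
        // A system property wins over the default when no env var is set.
        System.setProperty("zeppelin.demo.master", "yarn-client");
        System.out.println(resolve("ZEPPELIN_DEMO_MASTER", "zeppelin.demo.master", "local[*]"));

        // With neither env var nor system property set, the default is used.
        System.out.println(resolve("ZEPPELIN_DEMO_UNSET", "zeppelin.demo.unset", "local[*]"));
    }
}
```

This is why the PR can drop the per-interpreter `getSystemDefault` helpers noted in the TODO: the precedence now lives in one place.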

http://git-wip-us.apache.org/repos/asf/incubator-zeppelin/blob/f8e1f6c4/zeppelin-interpreter/src/main/java/org/apache/zeppelin/interpreter/remote/RemoteInterpreter.java
----------------------------------------------------------------------
diff --git a/zeppelin-interpreter/src/main/java/org/apache/zeppelin/interpreter/remote/RemoteInterpreter.java b/zeppelin-interpreter/src/main/java/org/apache/zeppelin/interpreter/remote/RemoteInterpreter.java
index e4273c4..1829162 100644
--- a/zeppelin-interpreter/src/main/java/org/apache/zeppelin/interpreter/remote/RemoteInterpreter.java
+++ b/zeppelin-interpreter/src/main/java/org/apache/zeppelin/interpreter/remote/RemoteInterpreter.java
@@ -177,9 +177,10 @@ public class RemoteInterpreter extends Interpreter {
         }
 
       } catch (TException e) {
-        broken = true;
+        logger.error("Failed to create interpreter: {}", getClassName());
         throw new InterpreterException(e);
       } finally {
+        // TODO(jongyoul): Fix this when not all interpreters in the same interpreter group
+        // are broken
         interpreterProcess.releaseClient(client, broken);
       }
     }
@@ -195,12 +196,18 @@ public class RemoteInterpreter extends Interpreter {
     synchronized (interpreterGroup) {
       // initialize all interpreters in this interpreter group
       List<Interpreter> interpreters = interpreterGroup.get(noteId);
-      for (Interpreter intp : interpreters) {
+      for (Interpreter intp : new ArrayList<>(interpreters)) {
         Interpreter p = intp;
         while (p instanceof WrappedInterpreter) {
           p = ((WrappedInterpreter) p).getInnerInterpreter();
         }
-        ((RemoteInterpreter) p).init();
+        try {
+          ((RemoteInterpreter) p).init();
+        } catch (InterpreterException e) {
+          logger.error("Failed to initialize interpreter: {}. Removing it from interpreterGroup",
+              p.getClassName());
+          interpreters.remove(p);
+        }
       }
     }
   }

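The `new ArrayList<>(interpreters)` change above is what makes the `interpreters.remove(p)` inside the loop safe: iterating the live list while removing from it would throw `ConcurrentModificationException`. A minimal sketch of the pattern (names are illustrative):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class SafeRemoval {

    // Iterate over a snapshot copy so removals hit the live list
    // without invalidating the iterator.
    public static List<String> initAll(List<String> interpreters) {
        for (String intp : new ArrayList<>(interpreters)) {
            if (intp.startsWith("broken")) {
                interpreters.remove(intp);
            }
        }
        return interpreters;
    }

    public static void main(String[] args) {
        List<String> group = new ArrayList<>(Arrays.asList("spark", "brokenSql", "pyspark"));
        System.out.println(initAll(group)); // prints [spark, pyspark]
    }
}
```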
http://git-wip-us.apache.org/repos/asf/incubator-zeppelin/blob/f8e1f6c4/zeppelin-zengine/src/main/java/org/apache/zeppelin/conf/ZeppelinConfiguration.java
----------------------------------------------------------------------
diff --git a/zeppelin-zengine/src/main/java/org/apache/zeppelin/conf/ZeppelinConfiguration.java b/zeppelin-zengine/src/main/java/org/apache/zeppelin/conf/ZeppelinConfiguration.java
index 9142976..5f62a53 100644
--- a/zeppelin-zengine/src/main/java/org/apache/zeppelin/conf/ZeppelinConfiguration.java
+++ b/zeppelin-zengine/src/main/java/org/apache/zeppelin/conf/ZeppelinConfiguration.java
@@ -350,6 +350,10 @@ public class ZeppelinConfiguration extends XMLConfiguration {
     return getRelativeDir(ConfVars.ZEPPELIN_INTERPRETER_DIR);
   }
 
+  public String getInterpreterJson() {
+    return getString(ConfVars.ZEPPELIN_INTERPRETER_JSON);
+  }
+
   public String getInterpreterSettingPath() {
     return getRelativeDir(String.format("%s/interpreter.json", getConfDir()));
   }
@@ -492,6 +496,7 @@ public class ZeppelinConfiguration extends XMLConfiguration {
         + "org.apache.zeppelin.scalding.ScaldingInterpreter,"
         + "org.apache.zeppelin.jdbc.JDBCInterpreter,"
         + "org.apache.zeppelin.hbase.HbaseInterpreter"),
+    ZEPPELIN_INTERPRETER_JSON("zeppelin.interpreter.setting", "interpreter-setting.json"),
     ZEPPELIN_INTERPRETER_DIR("zeppelin.interpreter.dir", "interpreter"),
     ZEPPELIN_INTERPRETER_LOCALREPO("zeppelin.interpreter.localRepo", "local-repo"),
     ZEPPELIN_INTERPRETER_CONNECT_TIMEOUT("zeppelin.interpreter.connect.timeout", 30000),

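Since `ZeppelinConfiguration` extends `XMLConfiguration`, the new `zeppelin.interpreter.setting` key can be overridden in `conf/zeppelin-site.xml`. A hypothetical fragment (the default already points at `interpreter-setting.json`, so this is only needed to change the filename):

```xml
<property>
  <name>zeppelin.interpreter.setting</name>
  <value>interpreter-setting.json</value>
</property>
```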
http://git-wip-us.apache.org/repos/asf/incubator-zeppelin/blob/f8e1f6c4/zeppelin-zengine/src/main/java/org/apache/zeppelin/interpreter/InterpreterFactory.java
----------------------------------------------------------------------
diff --git a/zeppelin-zengine/src/main/java/org/apache/zeppelin/interpreter/InterpreterFactory.java b/zeppelin-zengine/src/main/java/org/apache/zeppelin/interpreter/InterpreterFactory.java
index 08e6465..1772840 100644
--- a/zeppelin-zengine/src/main/java/org/apache/zeppelin/interpreter/InterpreterFactory.java
+++ b/zeppelin-zengine/src/main/java/org/apache/zeppelin/interpreter/InterpreterFactory.java
@@ -20,6 +20,7 @@ package org.apache.zeppelin.interpreter;
 import com.google.gson.Gson;
 import com.google.gson.GsonBuilder;
 
+import com.google.gson.reflect.TypeToken;
 import org.apache.commons.io.FileUtils;
 import org.apache.commons.lang.ArrayUtils;
 import org.apache.commons.lang.NullArgumentException;
@@ -45,9 +46,14 @@ import org.sonatype.aether.repository.RemoteRepository;
 import java.io.*;
 import java.lang.reflect.Constructor;
 import java.lang.reflect.InvocationTargetException;
+import java.lang.reflect.Type;
 import java.net.MalformedURLException;
 import java.net.URL;
 import java.net.URLClassLoader;
+import java.nio.file.DirectoryStream;
+import java.nio.file.Files;
+import java.nio.file.Path;
+import java.nio.file.Paths;
 import java.util.*;
 
 /**
@@ -111,31 +117,42 @@ public class InterpreterFactory implements InterpreterGroupFactory {
   }
 
   private void init() throws InterpreterException, IOException, RepositoryException {
-    ClassLoader oldcl = Thread.currentThread().getContextClassLoader();
+    String interpreterJson = conf.getInterpreterJson();
+    ClassLoader cl = Thread.currentThread().getContextClassLoader();
+
+    Path interpretersDir = Paths.get(conf.getInterpreterDir());
+    if (Files.exists(interpretersDir)) {
+      for (Path interpreterDir : Files.newDirectoryStream(interpretersDir,
+          new DirectoryStream.Filter<Path>() {
+            @Override
+            public boolean accept(Path entry) throws IOException {
+              return Files.exists(entry) && Files.isDirectory(entry);
+            }
+          })) {
+        String interpreterDirString = interpreterDir.toString();
 
-    // Load classes
-    File[] interpreterDirs = new File(conf.getInterpreterDir()).listFiles();
-    if (interpreterDirs != null) {
-      for (File path : interpreterDirs) {
-        logger.info("Reading " + path.getAbsolutePath());
-        URL[] urls = null;
-        try {
-          urls = recursiveBuildLibList(path);
-        } catch (MalformedURLException e1) {
-          logger.error("Can't load jars ", e1);
-        }
-        URLClassLoader ccl = new URLClassLoader(urls, oldcl);
+        registerInterpreterFromPath(interpreterDirString, interpreterJson);
+
+        registerInterpreterFromResource(cl, interpreterDirString, interpreterJson);
 
+        /**
+         * TODO(jongyoul)
+         * - Remove the legacy registration code below
+         * - Support ThreadInterpreter
+         */
+        URLClassLoader ccl = new URLClassLoader(recursiveBuildLibList(interpreterDir.toFile()), cl);
         for (String className : interpreterClassList) {
           try {
+            // Load classes
             Class.forName(className, true, ccl);
-            Set<String> keys = Interpreter.registeredInterpreters.keySet();
-            for (String intName : keys) {
+            Set<String> interpreterKeys = Interpreter.registeredInterpreters.keySet();
+            for (String interpreterKey : interpreterKeys) {
               if (className.equals(
-                  Interpreter.registeredInterpreters.get(intName).getClassName())) {
-                Interpreter.registeredInterpreters.get(intName).setPath(path.getAbsolutePath());
-                logger.info("Interpreter " + intName + " found. class=" + className);
-                cleanCl.put(path.getAbsolutePath(), ccl);
+                  Interpreter.registeredInterpreters.get(interpreterKey).getClassName())) {
+                Interpreter.registeredInterpreters.get(interpreterKey).setPath(
+                    interpreterDirString);
+                logger.info("Interpreter " + interpreterKey + " found. class=" + className);
+                cleanCl.put(interpreterDirString, ccl);
               }
             }
           } catch (ClassNotFoundException e) {
@@ -145,13 +162,19 @@ public class InterpreterFactory implements InterpreterGroupFactory {
       }
     }
 
+    for (RegisteredInterpreter registeredInterpreter :
+        Interpreter.registeredInterpreters.values()) {
+      logger.debug("Registered: {} -> {}. Properties: {}",
+          registeredInterpreter.getInterpreterKey(), registeredInterpreter.getClassName(),
+          registeredInterpreter.getProperties());
+    }
+
     loadFromFile();
 
     // if no interpreter settings are loaded, create default set
     synchronized (interpreterSettings) {
       if (interpreterSettings.size() == 0) {
-        HashMap<String, List<RegisteredInterpreter>> groupClassNameMap =
-            new HashMap<String, List<RegisteredInterpreter>>();
+        HashMap<String, List<RegisteredInterpreter>> groupClassNameMap = new HashMap<>();
 
         for (String k : Interpreter.registeredInterpreters.keySet()) {
           RegisteredInterpreter info = Interpreter.registeredInterpreters.get(k);
@@ -175,17 +198,13 @@ public class InterpreterFactory implements InterpreterGroupFactory {
               }
 
               for (String k : info.getProperties().keySet()) {
-                p.put(k, info.getProperties().get(k).getDefaultValue());
+                p.put(k, info.getProperties().get(k).getValue());
               }
             }
 
             if (found) {
               // add all interpreters in group
-              add(groupName,
-                  groupName,
-                  new LinkedList<Dependency>(),
-                  defaultOption,
-                  p);
+              add(groupName, groupName, new LinkedList<Dependency>(), defaultOption, p);
               groupClassNameMap.remove(groupName);
               break;
             }
@@ -196,11 +215,70 @@ public class InterpreterFactory implements InterpreterGroupFactory {
 
     for (String settingId : interpreterSettings.keySet()) {
       InterpreterSetting setting = interpreterSettings.get(settingId);
-      logger.info("Interpreter setting group {} : id={}, name={}",
-          setting.getGroup(), settingId, setting.getName());
+      logger.info("Interpreter setting group {} : id={}, name={}", setting.getGroup(), settingId,
+          setting.getName());
+    }
+  }
+
+  private void registerInterpreterFromResource(ClassLoader cl, String interpreterDir,
+                                                  String interpreterJson)
+      throws MalformedURLException {
+    URL[] urls = recursiveBuildLibList(new File(interpreterDir));
+    ClassLoader tempClassLoader = new URLClassLoader(urls, cl);
+
+    InputStream inputStream = tempClassLoader.getResourceAsStream(interpreterJson);
+
+    if (null != inputStream) {
+      logger.debug("Reading {} from resources in {}", interpreterJson, interpreterDir);
+      List<RegisteredInterpreter> registeredInterpreterList = getInterpreterListFromJson(
+          inputStream);
+      registerInterpreters(registeredInterpreterList, interpreterDir);
     }
   }
 
+  private void registerInterpreterFromPath(String interpreterDir,
+                                              String interpreterJson) throws IOException {
+
+    Path interpreterJsonPath = Paths.get(interpreterDir, interpreterJson);
+    if (Files.exists(interpreterJsonPath)) {
+      logger.debug("Reading {}", interpreterJsonPath);
+      List<RegisteredInterpreter> registeredInterpreterList = getInterpreterListFromJson(
+          interpreterJsonPath);
+      registerInterpreters(registeredInterpreterList, interpreterDir);
+    }
+  }
+
+  private List<RegisteredInterpreter> getInterpreterListFromJson(Path filename)
+      throws FileNotFoundException {
+    return getInterpreterListFromJson(new FileInputStream(filename.toFile()));
+  }
+
+  private List<RegisteredInterpreter> getInterpreterListFromJson(InputStream stream) {
+    Type registeredInterpreterListType = new TypeToken<List<RegisteredInterpreter>>() {
+    }.getType();
+    return gson.fromJson(new InputStreamReader(stream), registeredInterpreterListType);
+  }
+
+  private void registerInterpreters(List<RegisteredInterpreter> registeredInterpreters,
+                                       String absolutePath) {
+    for (RegisteredInterpreter registeredInterpreter : registeredInterpreters) {
+      String className = registeredInterpreter.getClassName();
+      if (validateRegisterInterpreter(registeredInterpreter) &&
+              null == Interpreter.findRegisteredInterpreterByClassName(className)) {
+        registeredInterpreter.setPath(absolutePath);
+        Interpreter.register(registeredInterpreter);
+        logger.debug("Registered. key: {}, className: {}, path: {}",
+            registeredInterpreter.getInterpreterKey(), registeredInterpreter.getClassName(),
+            registeredInterpreter.getPath());
+      }
+    }
+  }
+
+  private boolean validateRegisterInterpreter(RegisteredInterpreter registeredInterpreter) {
+    return null != registeredInterpreter.getGroup() && null != registeredInterpreter.getName() &&
+               null != registeredInterpreter.getClassName();
+  }
+
   private void loadFromFile() throws IOException {
     GsonBuilder builder = new GsonBuilder();
     builder.setPrettyPrinting();
@@ -745,6 +823,8 @@ public class InterpreterFactory implements InterpreterGroupFactory {
       throws InterpreterException {
     logger.info("Create repl {} from {}", className, dirName);
 
+    updatePropertiesFromRegisteredInterpreter(property, className);
+
     ClassLoader oldcl = Thread.currentThread().getContextClassLoader();
     try {
 
@@ -806,6 +886,9 @@ public class InterpreterFactory implements InterpreterGroupFactory {
     int connectTimeout = conf.getInt(ConfVars.ZEPPELIN_INTERPRETER_CONNECT_TIMEOUT);
     String localRepoPath = conf.getInterpreterLocalRepoPath() + "/" + interpreterSettingId;
     int maxPoolSize = conf.getInt(ConfVars.ZEPPELIN_INTERPRETER_MAX_POOL_SIZE);
+
+    updatePropertiesFromRegisteredInterpreter(property, className);
+
     LazyOpenInterpreter intp = new LazyOpenInterpreter(new RemoteInterpreter(
         property, noteId, className, conf.getInterpreterRemoteRunnerPath(),
         interpreterPath, localRepoPath, connectTimeout,
@@ -813,6 +896,22 @@ public class InterpreterFactory implements InterpreterGroupFactory {
     return intp;
   }
 
+  private Properties updatePropertiesFromRegisteredInterpreter(Properties properties,
+                                                                  String className) {
+    RegisteredInterpreter registeredInterpreter = Interpreter.findRegisteredInterpreterByClassName(
+        className);
+    if (null != registeredInterpreter) {
+      Map<String, InterpreterProperty> defaultProperties = registeredInterpreter.getProperties();
+      for (String key : defaultProperties.keySet()) {
+        if (!properties.containsKey(key) && null != defaultProperties.get(key).getValue()) {
+          properties.setProperty(key, defaultProperties.get(key).getValue());
+        }
+      }
+    }
+
+    return properties;
+  }
+
 
   private URL[] recursiveBuildLibList(File path) throws MalformedURLException {
     URL[] urls = new URL[0];