Posted to commits@linkis.apache.org by ca...@apache.org on 2022/05/30 06:25:20 UTC

[incubator-linkis-website] branch dev updated: add 1.1.2 doc (#235)

This is an automated email from the ASF dual-hosted git repository.

casion pushed a commit to branch dev
in repository https://gitbox.apache.org/repos/asf/incubator-linkis-website.git


The following commit(s) were added to refs/heads/dev by this push:
     new 7e5e389a9 add 1.1.2 doc (#235)
7e5e389a9 is described below

commit 7e5e389a9929f2d32ea7566b1a35beaa70f3befb
Author: legendtkl <ta...@gmail.com>
AuthorDate: Mon May 30 14:25:15 2022 +0800

    add 1.1.2 doc (#235)
    
    * add 1.1.2 doc
    * update doc
    Co-authored-by: peacewong <pe...@apache.org>
---
 docs/deployment/deploy_linkis_without_hdfs.md      |  96 ++++++++
 .../deployment/deploy_linkis_without_hdfs.md       |  97 ++++++++
 .../version-1.1.0/api/http/jobhistory-api.md       | 251 +++++++++++++++++++++
 3 files changed, 444 insertions(+)

diff --git a/docs/deployment/deploy_linkis_without_hdfs.md b/docs/deployment/deploy_linkis_without_hdfs.md
new file mode 100644
index 000000000..c26c962c4
--- /dev/null
+++ b/docs/deployment/deploy_linkis_without_hdfs.md
@@ -0,0 +1,96 @@
+---
+title: Deploy Linkis without HDFS
+sidebar_position: 8
+---
+
+This article describes how to deploy Linkis services in an environment without HDFS, enabling more lightweight learning, trial use, and debugging.
+
+For the overall deployment process, refer to "Quick Deployment"; only the following changes are needed.
+
+## 1. Configuration modification
+Modify the `linkis-env.sh` file as follows:
+```bash
+export ENABLE_HDFS=false
+export ENABLE_HIVE=false
+export ENABLE_SPARK=false
+```
+
+After setting the above options to `false`, the HDFS/Hive/Spark environments no longer need to be configured separately.
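
As a sketch of this step, the three switches can be flipped with `sed`. The snippet below operates on a throwaway copy of the relevant lines so it is safe to run anywhere; in a real install, edit the `linkis-env.sh` that ships in your deploy package's config directory (location assumed to vary by install).

```bash
# Sketch only: work on a throwaway copy of the three relevant lines.
# In a real install, edit linkis-env.sh in your deploy package's config directory.
CONF="$(mktemp)"
printf 'export ENABLE_HDFS=true\nexport ENABLE_HIVE=true\nexport ENABLE_SPARK=true\n' > "${CONF}"
sed -i 's/=true$/=false/' "${CONF}"   # flip every switch to false
cat "${CONF}"
```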
+
+## 2. Copy the MySQL driver jar
+Because the mysql-connector-java driver is licensed under GPL-2.0, which is incompatible with the Apache Software Foundation's licensing policy, it cannot be bundled with the release and must be copied manually into the following two directories.
+```bash
+${LINKIS_HOME}/lib/linkis-commons/public-module/
+${LINKIS_HOME}/lib/linkis-spring-cloud-services/linkis-mg-gateway/
+```
+
+The driver can be downloaded directly from the Maven repository, e.g. https://mvnrepository.com/artifact/mysql/mysql-connector-java/5.1.49
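
The copy step can be sketched as below. The paths are assumptions (`LINKIS_HOME` defaults to a temporary directory here, and a `touch`ed file stands in for the real jar) so the snippet is runnable as-is; replace both with your actual install root and the jar downloaded from the Maven repository.

```bash
# Sketch with assumed paths: stage the driver into both required directories.
# In a real install, download the jar first, e.g.:
#   curl -L -O https://repo1.maven.org/maven2/mysql/mysql-connector-java/5.1.49/mysql-connector-java-5.1.49.jar
LINKIS_HOME="${LINKIS_HOME:-$(mktemp -d)}"   # assumption: your install root
JAR="mysql-connector-java-5.1.49.jar"
mkdir -p "${LINKIS_HOME}/lib/linkis-commons/public-module" \
         "${LINKIS_HOME}/lib/linkis-spring-cloud-services/linkis-mg-gateway"
touch "${JAR}"   # stand-in for the downloaded jar so the copies below run
cp "${JAR}" "${LINKIS_HOME}/lib/linkis-commons/public-module/"
cp "${JAR}" "${LINKIS_HOME}/lib/linkis-spring-cloud-services/linkis-mg-gateway/"
```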
+
+## 3. Deploy and start
+Referring to the "Quick Deployment" section, complete the deployment by executing `${LINKIS_HOME}/bin/install.sh`, then start the Linkis services with `${LINKIS_HOME}/sbin/linkis-start-all.sh`.
+
+
+## 4. Verification
+Currently, version 1.1.2 only supports running shell jobs in non-HDFS environments. Execute the following commands.
+
+```bash
+$ cd ./bin
+$ chmod +x linkis-cli
+$ ./linkis-cli -engineType shell-1 -codeType shell -code "echo \"hello\" "  -submitUser <submitUser> -proxyUser <proxyUser>
+```
+
+The following output is expected.
+```bash
+=====Java Start Command=====
+exec /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.312.b07-2.el8_5.x86_64/jre/bin/java -server -Xms32m -Xmx2048m -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/Linkis/linkis03/logs/linkis-cli -XX:ErrorFile=/Linkis/linkis03/logs/linkis-cli/ps_err_pid%p.log -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=80 -XX:+DisableExplicitGC    -classpath /Linkis/linkis03/conf/linkis-cli:/Linkis/linkis03/lib/linkis-computation-governance/linkis-client/linkis-cli/*:/Linkis/linkis03/lib/linkis-com [...]
+[INFO] LogFile path: /Linkis/linkis03/logs/linkis-cli/linkis-client.root.log.20220418221952287912946
+[INFO] User does not provide usr-configuration file. Will use default config
+[INFO] connecting to linkis gateway:http://127.0.0.1:9001
+JobId:6
+TaskId:6
+ExecId:exec_id018028linkis-cg-entranceiZbp19q51jb8p984yk2jxdZ:9104LINKISCLI_test_shell_1
+[INFO] Job is successfully submitted!
+
+2022-04-18 22:19:53.019 INFO Program is substituting variables for you
+2022-04-18 22:19:53.019 INFO Variables substitution ended successfully
+2022-04-18 22:19:53.019 WARN The code you submit will not be limited by the limit
+Job with jobId : LINKISCLI_test_shell_1 and execID : LINKISCLI_test_shell_1 submitted 
+2022-04-18 22:19:53.019 INFO You have submitted a new job, script code (after variable substitution) is
+************************************SCRIPT CODE************************************
+echo "hello"
+************************************SCRIPT CODE************************************
+2022-04-18 22:19:53.019 INFO Your job is accepted,  jobID is LINKISCLI_test_shell_1 and jobReqId is 6 in ServiceInstance(linkis-cg-entrance, iZbp19q51jb8p984yk2jxdZ:9104). Please wait it to be scheduled
+job is scheduled.
+2022-04-18 22:19:53.019 INFO Your job is Scheduled. Please wait it to run.
+Your job is being scheduled by orchestrator.
+2022-04-18 22:19:53.019 INFO job is running.
+2022-04-18 22:19:53.019 INFO Your job is Running now. Please wait it to complete.
+2022-04-18 22:19:53.019 INFO Job with jobGroupId : 6 and subJobId : 5 was submitted to Orchestrator.
+2022-04-18 22:19:53.019 INFO Background is starting a new engine for you,execId astJob_4_codeExec_4 mark id is mark_4, it may take several seconds, please wait
+2022-04-18 22:20:01.020 INFO Task submit to ec: ServiceInstance(linkis-cg-engineconn, iZbp19q51jb8p984yk2jxdZ:43213) get engineConnExecId is: 1
+2022-04-18 22:20:01.020 INFO EngineConn local log path: ServiceInstance(linkis-cg-engineconn, iZbp19q51jb8p984yk2jxdZ:43213) /appcom1/tmp/test/20220418/shell/cc21fbb5-3a33-471b-a565-8407ff8ebd80/logs
+iZbp19q51jb8p984yk2jxdZ:43213_0 >> echo "hello"
+2022-04-18 22:20:01.438 WARN  [Linkis-Default-Scheduler-Thread-1] org.apache.linkis.engineconn.computation.executor.hook.executor.ExecuteOnceHook 50 warn - execute once become effective, register lock listener
+hello
+2022-04-18 22:20:01.020 INFO Your subjob : 5 execue with state succeed, has 1 resultsets.
+2022-04-18 22:20:01.020 INFO Congratuaions! Your job : LINKISCLI_test_shell_1 executed with status succeed and 0 results.
+2022-04-18 22:20:01.020 INFO job is completed.
+2022-04-18 22:20:01.020 INFO Task creation time(任务创建时间): 2022-04-18 22:19:53, Task scheduling time(任务调度时间): 2022-04-18 22:19:53, Task start time(任务开始时间): 2022-04-18 22: Mission end time(任务结束时间): 2022-04-18 22:20:01
+2022-04-18 22:20:01.020 INFO Your mission(您的任务) 6 The total time spent is(总耗时时间为): 8.3 秒
+2022-04-18 22:20:01.020 INFO Congratulations. Your job completed with status Success.
+
+[INFO] Job execute successfully! Will try get execute result
+============Result:================
+TaskId:6
+ExecId: exec_id018028linkis-cg-entranceiZbp19q51jb8p984yk2jxdZ:9104LINKISCLI_test_shell_1
+User:test
+Current job status:SUCCEED
+extraMsg: 
+result: 
+
+[INFO] Retrieving result-set, may take time if result-set is large, please do not exit program.
+============ RESULT SET 1 ============
+hello   
+############Execute Success!!!########
+```
\ No newline at end of file
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/deployment/deploy_linkis_without_hdfs.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/deployment/deploy_linkis_without_hdfs.md
new file mode 100644
index 000000000..44e437302
--- /dev/null
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/deployment/deploy_linkis_without_hdfs.md
@@ -0,0 +1,97 @@
+---
+title: Deploy Linkis without HDFS
+sidebar_position: 8
+---
+
+This article describes how to deploy Linkis services in an environment without HDFS, enabling more lightweight learning, trial use, and debugging.
+
+For the overall deployment process, refer to "Quick Deployment"; only the following minor changes are needed.
+
+## 1. Configuration modification
+Modify the `linkis-env.sh` file as follows:
+```bash
+export ENABLE_HDFS=false
+export ENABLE_HIVE=false
+export ENABLE_SPARK=false
+```
+
+After setting the above options to `false`, the HDFS/Hive/Spark environments no longer need to be configured separately.
+
+## 2. Copy the MySQL driver jar
+Because the mysql-connector-java driver is licensed under GPL-2.0, which is incompatible with the Apache Software Foundation's licensing policy, it must be copied manually into the following two directories.
+```bash
+${LINKIS_HOME}/lib/linkis-commons/public-module/
+${LINKIS_HOME}/lib/linkis-spring-cloud-services/linkis-mg-gateway/
+```
+
+The driver can be downloaded directly from the Maven repository, e.g. https://mvnrepository.com/artifact/mysql/mysql-connector-java/5.1.49
+
+## 3. Deploy and start
+Referring to the "Quick Deployment" section, complete the deployment by executing `${LINKIS_HOME}/bin/install.sh`, then start the Linkis services with `${LINKIS_HOME}/sbin/linkis-start-all.sh`.
+
+
+## 4. Run a job to verify
+Currently, version 1.1.2 only supports running shell jobs in non-HDFS environments. Execute the following commands.
+
+```bash
+$ cd ./bin
+$ chmod +x linkis-cli
+$ ./linkis-cli -engineType shell-1 -codeType shell -code "echo \"hello\" "  -submitUser <submitUser> -proxyUser <proxyUser>
+```
+
+Output similar to the following is expected.
+```bash
+=====Java Start Command=====
+exec /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.312.b07-2.el8_5.x86_64/jre/bin/java -server -Xms32m -Xmx2048m -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/Linkis/linkis03/logs/linkis-cli -XX:ErrorFile=/Linkis/linkis03/logs/linkis-cli/ps_err_pid%p.log -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=80 -XX:+DisableExplicitGC    -classpath /Linkis/linkis03/conf/linkis-cli:/Linkis/linkis03/lib/linkis-computation-governance/linkis-client/linkis-cli/*:/Linkis/linkis03/lib/linkis-com [...]
+[INFO] LogFile path: /Linkis/linkis03/logs/linkis-cli/linkis-client.root.log.20220418221952287912946
+[INFO] User does not provide usr-configuration file. Will use default config
+[INFO] connecting to linkis gateway:http://127.0.0.1:9001
+JobId:6
+TaskId:6
+ExecId:exec_id018028linkis-cg-entranceiZbp19q51jb8p984yk2jxdZ:9104LINKISCLI_test_shell_1
+[INFO] Job is successfully submitted!
+
+2022-04-18 22:19:53.019 INFO Program is substituting variables for you
+2022-04-18 22:19:53.019 INFO Variables substitution ended successfully
+2022-04-18 22:19:53.019 WARN The code you submit will not be limited by the limit
+Job with jobId : LINKISCLI_test_shell_1 and execID : LINKISCLI_test_shell_1 submitted 
+2022-04-18 22:19:53.019 INFO You have submitted a new job, script code (after variable substitution) is
+************************************SCRIPT CODE************************************
+echo "hello"
+************************************SCRIPT CODE************************************
+2022-04-18 22:19:53.019 INFO Your job is accepted,  jobID is LINKISCLI_test_shell_1 and jobReqId is 6 in ServiceInstance(linkis-cg-entrance, iZbp19q51jb8p984yk2jxdZ:9104). Please wait it to be scheduled
+job is scheduled.
+2022-04-18 22:19:53.019 INFO Your job is Scheduled. Please wait it to run.
+Your job is being scheduled by orchestrator.
+2022-04-18 22:19:53.019 INFO job is running.
+2022-04-18 22:19:53.019 INFO Your job is Running now. Please wait it to complete.
+2022-04-18 22:19:53.019 INFO Job with jobGroupId : 6 and subJobId : 5 was submitted to Orchestrator.
+2022-04-18 22:19:53.019 INFO Background is starting a new engine for you,execId astJob_4_codeExec_4 mark id is mark_4, it may take several seconds, please wait
+2022-04-18 22:20:01.020 INFO Task submit to ec: ServiceInstance(linkis-cg-engineconn, iZbp19q51jb8p984yk2jxdZ:43213) get engineConnExecId is: 1
+2022-04-18 22:20:01.020 INFO EngineConn local log path: ServiceInstance(linkis-cg-engineconn, iZbp19q51jb8p984yk2jxdZ:43213) /appcom1/tmp/test/20220418/shell/cc21fbb5-3a33-471b-a565-8407ff8ebd80/logs
+iZbp19q51jb8p984yk2jxdZ:43213_0 >> echo "hello"
+2022-04-18 22:20:01.438 WARN  [Linkis-Default-Scheduler-Thread-1] org.apache.linkis.engineconn.computation.executor.hook.executor.ExecuteOnceHook 50 warn - execute once become effective, register lock listener
+hello
+2022-04-18 22:20:01.020 INFO Your subjob : 5 execue with state succeed, has 1 resultsets.
+2022-04-18 22:20:01.020 INFO Congratuaions! Your job : LINKISCLI_test_shell_1 executed with status succeed and 0 results.
+2022-04-18 22:20:01.020 INFO job is completed.
+2022-04-18 22:20:01.020 INFO Task creation time(任务创建时间): 2022-04-18 22:19:53, Task scheduling time(任务调度时间): 2022-04-18 22:19:53, Task start time(任务开始时间): 2022-04-18 22: Mission end time(任务结束时间): 2022-04-18 22:20:01
+2022-04-18 22:20:01.020 INFO Your mission(您的任务) 6 The total time spent is(总耗时时间为): 8.3 秒
+2022-04-18 22:20:01.020 INFO Congratulations. Your job completed with status Success.
+
+[INFO] Job execute successfully! Will try get execute result
+============Result:================
+TaskId:6
+ExecId: exec_id018028linkis-cg-entranceiZbp19q51jb8p984yk2jxdZ:9104LINKISCLI_test_shell_1
+User:test
+Current job status:SUCCEED
+extraMsg: 
+result: 
+
+[INFO] Retrieving result-set, may take time if result-set is large, please do not exit program.
+============ RESULT SET 1 ============
+hello   
+############Execute Success!!!########
+
+```
\ No newline at end of file
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0/api/http/jobhistory-api.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0/api/http/jobhistory-api.md
new file mode 100644
index 000000000..624b6832e
--- /dev/null
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.1.0/api/http/jobhistory-api.md
@@ -0,0 +1,251 @@
+---
+title: Job History API
+sidebar_position: 3
+---
+
+**QueryRestfulApi class**
+
+## governanceStationAdmin
+**API path**: `/api/rest_j/v1/jobhistory/governanceStationAdmin`
+
+**Request method**: `GET`
+
+**Request data type**: `application/x-www-form-urlencoded`
+
+**Response data type**: `application/json`
+
+**Description**: Determine whether the current user is an administrator.
+
+**Request parameters**:
+None
+
+**Response example**:
+
+```javascript
+{
+    "method": null,
+    "status": 0,
+    "message": "OK",
+    "data": {
+        "admin": true
+    }
+}
+```
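
A request can be sketched with `curl` as below. The gateway address is the one used elsewhere in these docs, and authentication (a login cookie) is omitted; both are assumptions to adjust for your environment.

```bash
# Assumed gateway address; authentication (a login cookie) is omitted here.
GATEWAY="http://127.0.0.1:9001"
URL="${GATEWAY}/api/rest_j/v1/jobhistory/governanceStationAdmin"
echo "GET ${URL}"
# curl --cookie "<your-login-cookie>" "${URL}"
```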
+
+
+## getHistoryTask
+**API path**: `/api/rest_j/v1/jobhistory/{id}/get`
+
+**Request method**: `GET`
+
+**Request data type**: `application/x-www-form-urlencoded`
+
+**Response data type**: `application/json`
+
+**Description**: Get the history task with the specified id.
+
+**Request parameters**:
+
+| Name | Description | Parameter type | Required | Type | schema |
+| -------- | -------- | ----- | -------- | -------- | ------ |
+|id|id|path|true|string||
+
+**Response example**:
+```javascript
+{
+    "method": null,
+    "status": 0,
+    "message": "OK",
+    "data": {
+        "task": {
+                "taskID": 1, 
+                "instance": "xxx",
+                "execId": "exec-id-xxx",
+                "umUser": "test",
+                "engineInstance": "xxx",
+                "progress": "10%",
+                "logPath": "hdfs://xxx/xxx/xxx",
+                "resultLocation": "hdfs://xxx/xxx/xxx",
+                "status": "FAILED",
+                "createdTime": "2019-01-01 00:00:00",
+                "updatedTime": "2019-01-01 01:00:00",
+                "engineType": "spark",
+                "errorCode": 100,
+                "errDesc": "Task Failed with error code 100",
+                "executeApplicationName": "hello world",
+                "requestApplicationName": "hello world",
+                "runType": "xxx",
+                "paramJson": "{\"xxx\":\"xxx\"}",
+                "costTime": 10000,
+                "strongerExecId": "execId-xxx",
+                "sourceJson": "{\"xxx\":\"xxx\"}"
+        }
+    }
+}
+```
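
Since `{id}` is a path parameter, a concrete task id is substituted into the URL rather than passed as a query string. A sketch (gateway address assumed, authentication omitted):

```bash
GATEWAY="http://127.0.0.1:9001"   # assumed gateway address
TASK_ID=1                          # the history task id to fetch
URL="${GATEWAY}/api/rest_j/v1/jobhistory/${TASK_ID}/get"
echo "GET ${URL}"
# curl --cookie "<your-login-cookie>" "${URL}"
```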
+
+## listHistoryTask
+**API path**: `/api/rest_j/v1/jobhistory/list`
+
+**Request method**: `GET`
+
+**Request data type**: `application/x-www-form-urlencoded`
+
+**Response data type**: `application/json`
+
+**Description**: List history tasks, filtered and paged by the parameters below.
+
+**Request parameters**:
+
+| Name | Description | Parameter type | Required | Type | schema |
+| -------- | -------- | ----- | -------- | -------- | ------ |
+|startDate|startDate|path|false|Long||
+|endDate|endDate|path|false|Long||
+|status|status|path|false|string||
+|pageNow|pageNow|path|false|Integer||
+|pageSize|pageSize|path|false|Integer||
+|taskID|taskID|path|false|Long||
+|executeApplicationName|executeApplicationName|path|false|string||
+|creator|creator|path|false|string||
+|proxyUser|proxyUser|path|false|string||
+|isAdminView|isAdminView|path|false|Boolean||
+
+**Response example**:
+```javascript
+{
+    "method": null,
+        "status": 0,
+        "message": "OK",
+        "data": {
+            "tasks": [{
+                "taskID": 1,
+                "instance": "xxx",
+                "execId": "exec-id-xxx",
+                "umUser": "test",
+                "engineInstance": "xxx",
+                "progress": "10%",
+                "logPath": "hdfs://xxx/xxx/xxx",
+                "resultLocation": "hdfs://xxx/xxx/xxx",
+                "status": "FAILED",
+                "createdTime": "2019-01-01 00:00:00",
+                "updatedTime": "2019-01-01 01:00:00",
+                "engineType": "spark",
+                "errorCode": 100,
+                "errDesc": "Task Failed with error code 100",
+                "executeApplicationName": "hello world",
+                "requestApplicationName": "hello world",
+                "runType": "xxx",
+                "paramJson": "{\"xxx\":\"xxx\"}",
+                "costTime": 10000,
+                "strongerExecId": "execId-xxx",
+                "sourceJson": "{\"xxx\":\"xxx\"}"
+            },
+            {
+                "taskID": 2,
+                "instance": "xxx",
+                "execId": "exec-id-xxx",
+                "umUser": "test",
+                "engineInstance": "xxx",
+                "progress": "10%",
+                "logPath": "hdfs://xxx/xxx/xxx",
+                "resultLocation": "hdfs://xxx/xxx/xxx",
+                "status": "FAILED",
+                "createdTime": "2019-01-01 00:00:00",
+                "updatedTime": "2019-01-01 01:00:00",
+                "engineType": "spark",
+                "errorCode": 100,
+                "errDesc": "Task Failed with error code 100",
+                "executeApplicationName": "hello world",
+                "requestApplicationName": "hello world",
+                "runType": "xxx",
+                "paramJson": "{\"xxx\":\"xxx\"}",
+                "costTime": 10000,
+                "strongerExecId": "execId-xxx",
+                "sourceJson": "{\"xxx\":\"xxx\"}"
+            }],
+            "totalPage": 1
+    }
+}
+```
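
As all parameters are optional, a typical call supplies only the paging and filter values it needs. The sketch below assumes they are appended as a URL query string (gateway address assumed, authentication omitted):

```bash
GATEWAY="http://127.0.0.1:9001"            # assumed gateway address
QUERY="pageNow=1&pageSize=10&creator=LINKISCLI"   # example filter values
URL="${GATEWAY}/api/rest_j/v1/jobhistory/list?${QUERY}"
echo "GET ${URL}"
# curl --cookie "<your-login-cookie>" "${URL}"
```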
+
+## listUndoneHistoryTask
+**API path**: `/api/rest_j/v1/jobhistory/listundone`
+
+**Request method**: `GET`
+
+**Request data type**: `application/x-www-form-urlencoded`
+
+**Response data type**: `application/json`
+
+**Description**: List history tasks that have not yet finished, filtered and paged by the parameters below.
+
+**Request parameters**:
+
+| Name | Description | Parameter type | Required | Type | schema |
+| -------- | -------- | ----- | -------- | -------- | ------ |
+|startDate|startDate|path|false|Long||
+|endDate|endDate|path|false|Long||
+|status|status|path|false|string||
+|pageNow|pageNow|path|false|Integer||
+|pageSize|pageSize|path|false|Integer||
+|startTaskID|startTaskID|path|false|Long||
+|engineType|engineType|path|false|string||
+|creator|creator|path|false|string||
+
+**Response example**:
+```javascript
+{
+    "method": null,
+        "status": 0,
+        "message": "OK",
+        "data": {
+            "tasks": [{
+                "taskID": 1,
+                "instance": "xxx",
+                "execId": "exec-id-xxx",
+                "umUser": "test",
+                "engineInstance": "xxx",
+                "progress": "10%",
+                "logPath": "hdfs://xxx/xxx/xxx",
+                "resultLocation": "hdfs://xxx/xxx/xxx",
+                "status": "Running",
+                "createdTime": "2019-01-01 00:00:00",
+                "updatedTime": "2019-01-01 01:00:00",
+                "engineType": "spark",
+                "errorCode": 100,
+                "errDesc": "Task Failed with error code 100",
+                "executeApplicationName": "hello world",
+                "requestApplicationName": "hello world",
+                "runType": "xxx",
+                "paramJson": "{\"xxx\":\"xxx\"}",
+                "costTime": 10000,
+                "strongerExecId": "execId-xxx",
+                "sourceJson": "{\"xxx\":\"xxx\"}"
+            },
+            {
+                "taskID": 2,
+                "instance": "xxx",
+                "execId": "exec-id-xxx",
+                "umUser": "test",
+                "engineInstance": "xxx",
+                "progress": "10%",
+                "logPath": "hdfs://xxx/xxx/xxx",
+                "resultLocation": "hdfs://xxx/xxx/xxx",
+                "status": "Running",
+                "createdTime": "2019-01-01 00:00:00",
+                "updatedTime": "2019-01-01 01:00:00",
+                "engineType": "spark",
+                "errorCode": 100,
+                "errDesc": "Task Failed with error code 100",
+                "executeApplicationName": "hello world",
+                "requestApplicationName": "hello world",
+                "runType": "xxx",
+                "paramJson": "{\"xxx\":\"xxx\"}",
+                "costTime": 10000,
+                "strongerExecId": "execId-xxx",
+                "sourceJson": "{\"xxx\":\"xxx\"}"
+            }],
+            "totalPage": 1
+    }
+}
+```


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@linkis.apache.org
For additional commands, e-mail: commits-help@linkis.apache.org