Posted to commits@kylin.apache.org by sh...@apache.org on 2018/08/10 01:44:43 UTC

[kylin] branch document updated (914cf30 -> 26e8e7d)

This is an automated email from the ASF dual-hosted git repository.

shaofengshi pushed a change to branch document
in repository https://gitbox.apache.org/repos/asf/kylin.git.


    from 914cf30  fix comma symbol to english version
     new bacbc58  KYLIN-3486 cube_streaming page command exception
     new 26e8e7d  update superset tutorial

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 website/_docs/tutorial/cube_spark.cn.md       |  2 +-
 website/_docs/tutorial/cube_streaming.cn.md   | 14 +++++++-------
 website/_docs/tutorial/cube_streaming.md      |  6 +++---
 website/_docs/tutorial/superset.cn.md         | 10 ----------
 website/_docs/tutorial/superset.md            | 10 ----------
 website/_docs16/tutorial/cube_streaming.md    |  6 +++---
 website/_docs20/tutorial/cube_streaming.md    |  6 +++---
 website/_docs21/tutorial/cube_streaming.md    |  6 +++---
 website/_docs23/tutorial/cube_spark.cn.md     |  2 +-
 website/_docs23/tutorial/cube_streaming.cn.md | 14 +++++++-------
 website/_docs23/tutorial/cube_streaming.md    |  6 +++---
 11 files changed, 31 insertions(+), 51 deletions(-)


[kylin] 02/02: update superset tutorial

Posted by sh...@apache.org.

shaofengshi pushed a commit to branch document
in repository https://gitbox.apache.org/repos/asf/kylin.git

commit 26e8e7de9766151da44720d8a79d583b0c9416f1
Author: shaofengshi <sh...@apache.org>
AuthorDate: Fri Aug 10 09:44:32 2018 +0800

    update superset tutorial
---
 website/_docs/tutorial/superset.cn.md | 10 ----------
 website/_docs/tutorial/superset.md    | 10 ----------
 2 files changed, 20 deletions(-)

diff --git a/website/_docs/tutorial/superset.cn.md b/website/_docs/tutorial/superset.cn.md
index 80adc7a..da75530 100644
--- a/website/_docs/tutorial/superset.cn.md
+++ b/website/_docs/tutorial/superset.cn.md
@@ -33,13 +33,3 @@ Apache Kylin 和 Apache Superset 都是以为其用户提供更快和可交互
 
 ##### 其它功能
 Apache Superset 也支持导出 CSV, 共享, 以及查看 SQL 查询。
-
-### Kyligence Insight for Superset
-定制版的 Superset:Kyligence Insight for Superset,使得 Kylin 的用户多了一种选择。具体的安装步骤请在 github 上查看 [这个项目](https://github.com/Kyligence/Insight-for-Superset)。
-
-##### 相比原生 Superset, 提供了如下增强功能:
-1. 统一用户管理,用户无需在 "Superset" 上额外创建用户和赋予权限,统一在 Kylin 后端管理用户访问权限,直接使用 Kylin 账户登录 Superset。
-2. 一键同步 Kylin Cube,无需在 Superset 端重新定义数据模型,直接查询 Cube.
-3. 支持多表连接模型,支持 inner join 和 outer join.
-4. Docker 容器化部署 Superset,一键启动,降低部署和升级门槛。
-5. 自动适配 Kylin 查询语法。
\ No newline at end of file
diff --git a/website/_docs/tutorial/superset.md b/website/_docs/tutorial/superset.md
index a1d0c9e..12eeed6 100644
--- a/website/_docs/tutorial/superset.md
+++ b/website/_docs/tutorial/superset.md
@@ -34,13 +34,3 @@ Please read [this article](http://kylin.apache.org/blog/2018/01/01/kylin-and-sup
 ##### Other functionalities
 Apache Superset also supports exporting to CSV, sharing, and viewing SQL query.
 
-
-### Kyligence Insight for Superset
-A customized version of Superset: Kyligence Insight for Superset gives Kylin users a choice. Please check [this project](https://github.com/Kyligence/Insight-for-Superset) on github for specific installation steps.
-
-##### Compared to the native Superset, it offers the following enhancements:
-1. Unified user management, users do not need to create additional users and permissions on "Superset", manage user access rights on the Kylin backend, and log in to Superset directly using Kylin account.
-2. One-click synchronization Kylin Cube, no need to redefine the data model on the Superset side, directly query Cube.
-3. Support multi-table join model, support inner join and outer join.
-4. Docker containerized deployment Superset, one-click startup, reducing deployment and upgrade thresholds.
-5. Automatically adapt Kylin query syntax.


[kylin] 01/02: KYLIN-3486 cube_streaming page command exception

Posted by sh...@apache.org.

shaofengshi pushed a commit to branch document
in repository https://gitbox.apache.org/repos/asf/kylin.git

commit bacbc584f114b33efba0c5a44c4144a10235bad7
Author: GinaZhai <na...@kyligence.io>
AuthorDate: Thu Aug 9 23:19:39 2018 +0800

    KYLIN-3486 cube_streaming page command exception
    
    Signed-off-by: shaofengshi <sh...@apache.org>
---
 website/_docs/tutorial/cube_spark.cn.md       |  2 +-
 website/_docs/tutorial/cube_streaming.cn.md   | 14 +++++++-------
 website/_docs/tutorial/cube_streaming.md      |  6 +++---
 website/_docs16/tutorial/cube_streaming.md    |  6 +++---
 website/_docs20/tutorial/cube_streaming.md    |  6 +++---
 website/_docs21/tutorial/cube_streaming.md    |  6 +++---
 website/_docs23/tutorial/cube_spark.cn.md     |  2 +-
 website/_docs23/tutorial/cube_streaming.cn.md | 14 +++++++-------
 website/_docs23/tutorial/cube_streaming.md    |  6 +++---
 9 files changed, 31 insertions(+), 31 deletions(-)

diff --git a/website/_docs/tutorial/cube_spark.cn.md b/website/_docs/tutorial/cube_spark.cn.md
index b1909cc..913be9f 100644
--- a/website/_docs/tutorial/cube_spark.cn.md
+++ b/website/_docs/tutorial/cube_spark.cn.md
@@ -138,7 +138,7 @@ Kylin 启动后,访问 Kylin 网站,在 "Advanced Setting" 页,编辑名
 当出现 error,您可以首先查看 "logs/kylin.log". 其中包含 Kylin 执行的所有 Spark 命令,例如:
 
 {% highlight Groff markup %}
-2017-03-06 14:44:38,574 INFO  [Job 2d5c1178-c6f6-4b50-8937-8e5e3b39227e-306] spark.SparkExecutable:121 : cmd:export HADOOP_CONF_DIR=/usr/local/apache-kylin-2.1.0-bin-hbase1x/hadoop-conf && /usr/local/apache-kylin-2.1.0-bin-hbase1x/spark/bin/spark-submit --class org.apache.kylin.common.util.SparkEntry  --conf spark.executor.instances=1  --conf spark.yarn.queue=default  --conf spark.yarn.am.extraJavaOptions=-Dhdp.version=current  --conf spark.history.fs.logDirectory=hdfs:///kylin/spark-his [...]
+2017-03-06 14:44:38,574 INFO  [Job 2d5c1178-c6f6-4b50-8937-8e5e3b39227e-306] spark.SparkExecutable:121 : cmd:export HADOOP_CONF_DIR=/usr/local/apache-kylin-2.1.0-bin-hbase1x/hadoop-conf && /usr/local/apache-kylin-2.1.0-bin-hbase1x/spark/bin/spark-submit --class org.apache.kylin.common.util.SparkEntry  --conf spark.executor.instances=1  --conf spark.yarn.queue=default  --conf spark.yarn.am.extraJavaOptions=-Dhdp.version=current  --conf spark.history.fs.logDirectory=hdfs:///kylin/spark-his [...]
 
 {% endhighlight %}
 
diff --git a/website/_docs/tutorial/cube_streaming.cn.md b/website/_docs/tutorial/cube_streaming.cn.md
index 00d5429..2433554 100644
--- a/website/_docs/tutorial/cube_streaming.cn.md
+++ b/website/_docs/tutorial/cube_streaming.cn.md
@@ -14,7 +14,7 @@ Kylin v1.6 发布了可扩展的 streaming cubing 功能,它利用 Hadoop 消
 ## 安装 Kafka 0.10.0.0 和 Kylin
 不要使用 HDP 2.2.4 自带的 Kafka,因为它太旧了,如果其运行着请先停掉。
 {% highlight Groff markup %}
-curl -s http://mirrors.tuna.tsinghua.edu.cn/apache/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz | tar -xz -C /usr/local/
+curl -s https://archive.apache.org/dist/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz | tar -xz -C /usr/local/
 
 cd /usr/local/kafka_2.10-0.10.0.0/
 
@@ -127,7 +127,7 @@ Streaming Cube 和普通的 cube 大致上一样. 有以下几点需要您注意
 您可以在 web GUI 触发 build,通过点击 "Actions" -> "Build",或用 'curl' 命令发送一个请求到 Kylin RESTful API:
 
 {% highlight Groff markup %}
-curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" -d '{ "sourceOffsetStart": 0,"sourceOffsetEnd": 9223372036854775807,"buildType": "BUILD"}' http://localhost:7070/kylin/api/cubes/{your_cube_name}/build2
+curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" -d '{ "sourceOffsetStart": 0, "sourceOffsetEnd": 9223372036854775807, "buildType": "BUILD"}' http://localhost:7070/kylin/api/cubes/{your_cube_name}/build2
 {% endhighlight %}
 
 请注意 API 终端和普通 cube 不一样 (这个 URL 以 "build2" 结尾)。
@@ -139,7 +139,7 @@ curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8"
 ## 点击 "Insight" 标签,编写 SQL 运行,例如:
 
  {% highlight Groff markup %}
-select minute_start,count(*),sum(amount),sum(qty) from streaming_sales_table group by minute_start order by minute_start
+select minute_start, count(*), sum(amount), sum(qty) from streaming_sales_table group by minute_start order by minute_start
  {% endhighlight %}
 
 结果如下。
@@ -152,7 +152,7 @@ select minute_start,count(*),sum(amount),sum(qty) from streaming_sales_tab
 
   {% highlight Groff markup %}
 crontab -e
-*/5 * * * * curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" -d '{ "sourceOffsetStart": 0,"sourceOffsetEnd": 9223372036854775807,"buildType": "BUILD"}' http://localhost:7070/kylin/api/cubes/{your_cube_name}/build2
+*/5 * * * * curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" -d '{ "sourceOffsetStart": 0, "sourceOffsetEnd": 9223372036854775807, "buildType": "BUILD"}' http://localhost:7070/kylin/api/cubes/{your_cube_name}/build2
  {% endhighlight %}
 
 现在您可以观看 cube 从 streaming 中自动 built。当 cube segments 累积到更大的时间范围,Kylin 将会自动的将其合并到一个更大的 segment 中。
@@ -202,18 +202,18 @@ Caused by: java.lang.ClassNotFoundException: org.apache.kafka.clients.producer.P
  * 如果 Kafka 里已经有一组历史 message 且您不想从最开始 build,您可以触发一个调用来将当前的结束位置设为 cube 的开始:
 
 {% highlight Groff markup %}
-curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" -d '{ "sourceOffsetStart": 0,"sourceOffsetEnd": 9223372036854775807,"buildType": "BUILD"}' http://localhost:7070/kylin/api/cubes/{your_cube_name}/init_start_offsets
+curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" -d '{ "sourceOffsetStart": 0, "sourceOffsetEnd": 9223372036854775807, "buildType": "BUILD"}' http://localhost:7070/kylin/api/cubes/{your_cube_name}/init_start_offsets
 {% endhighlight %}
 
  * 如果一些 build job 出错了并且您将其 discard,Cube 中就会留有一个洞(或称为空隙)。每一次 Kylin 都会从最后的位置 build,您不可期望通过正常的 builds 将洞填补。Kylin 提供了 API 检查和填补洞 
 
 检查洞:
  {% highlight Groff markup %}
-curl -X GET --user ADMINN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
+curl -X GET --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
 {% endhighlight %}
 
 如果查询结果是一个空的数组,意味着没有洞;否则,触发 Kylin 填补他们:
  {% highlight Groff markup %}
-curl -X PUT --user ADMINN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
+curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
 {% endhighlight %}
 
diff --git a/website/_docs/tutorial/cube_streaming.md b/website/_docs/tutorial/cube_streaming.md
index 65d99d8..0989b04 100644
--- a/website/_docs/tutorial/cube_streaming.md
+++ b/website/_docs/tutorial/cube_streaming.md
@@ -14,7 +14,7 @@ In this tutorial, we will use Hortonworks HDP 2.2.4 Sandbox VM + Kafka v0.10.0(S
 ## Install Kafka 0.10.0.0 and Kylin
 Don't use HDP 2.2.4's build-in Kafka as it is too old, stop it first if it is running.
 {% highlight Groff markup %}
-curl -s http://mirrors.tuna.tsinghua.edu.cn/apache/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz | tar -xz -C /usr/local/
+curl -s https://archive.apache.org/dist/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz | tar -xz -C /usr/local/
 
 cd /usr/local/kafka_2.10-0.10.0.0/
 
@@ -209,11 +209,11 @@ curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8"
 
 Check holes:
  {% highlight Groff markup %}
-curl -X GET --user ADMINN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
+curl -X GET --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
 {% endhighlight %}
 
 If the result is an empty arrary, means there is no hole; Otherwise, trigger Kylin to fill them:
  {% highlight Groff markup %}
-curl -X PUT --user ADMINN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
+curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
 {% endhighlight %}
 
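The hole-check and hole-fill calls corrected in the hunk above (ADMINN -> ADMIN) hit the same endpoint and differ only in HTTP verb: GET lists the gaps between cube segments, PUT asks Kylin to build them. A minimal sketch of constructing those requests, assuming the tutorial's default localhost:7070 gateway and ADMIN:KYLIN credentials; the cube name is a placeholder:

```python
import base64
import urllib.request

KYLIN_BASE = "http://localhost:7070/kylin/api"  # default from the tutorial

def holes_request(cube_name, fill=False):
    """Build the request for Kylin's cube-holes endpoint.

    GET lists the holes (gaps between segments); PUT triggers
    builds that fill them. Only the HTTP method differs.
    """
    url = "%s/cubes/%s/holes" % (KYLIN_BASE, cube_name)
    req = urllib.request.Request(url, method="PUT" if fill else "GET")
    # Basic auth, equivalent to `curl --user ADMIN:KYLIN`
    token = base64.b64encode(b"ADMIN:KYLIN").decode("ascii")
    req.add_header("Authorization", "Basic " + token)
    req.add_header("Content-Type", "application/json;charset=utf-8")
    return req

check = holes_request("my_streaming_cube")             # GET: list holes
fill = holes_request("my_streaming_cube", fill=True)   # PUT: fill holes
```

Sending `check` via `urllib.request.urlopen` would return an empty JSON array when there are no holes, matching the tutorial's description.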
diff --git a/website/_docs16/tutorial/cube_streaming.md b/website/_docs16/tutorial/cube_streaming.md
index 6909545..63d81eb 100644
--- a/website/_docs16/tutorial/cube_streaming.md
+++ b/website/_docs16/tutorial/cube_streaming.md
@@ -14,7 +14,7 @@ In this tutorial, we will use Hortonworks HDP 2.2.4 Sandbox VM + Kafka v0.10.0(S
 ## Install Kafka 0.10.0.0 and Kylin
 Don't use HDP 2.2.4's build-in Kafka as it is too old, stop it first if it is running.
 {% highlight Groff markup %}
-curl -s http://mirrors.tuna.tsinghua.edu.cn/apache/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz | tar -xz -C /usr/local/
+curl -s https://archive.apache.org/dist/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz | tar -xz -C /usr/local/
 
 cd /usr/local/kafka_2.10-0.10.0.0/
 
@@ -209,11 +209,11 @@ curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8"
 
 Check holes:
  {% highlight Groff markup %}
-curl -X GET --user ADMINN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
+curl -X GET --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
 {% endhighlight %}
 
 If the result is an empty arrary, means there is no hole; Otherwise, trigger Kylin to fill them:
  {% highlight Groff markup %}
-curl -X PUT --user ADMINN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
+curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
 {% endhighlight %}
 
diff --git a/website/_docs20/tutorial/cube_streaming.md b/website/_docs20/tutorial/cube_streaming.md
index 08e5bf9..115d91a 100644
--- a/website/_docs20/tutorial/cube_streaming.md
+++ b/website/_docs20/tutorial/cube_streaming.md
@@ -14,7 +14,7 @@ In this tutorial, we will use Hortonworks HDP 2.2.4 Sandbox VM + Kafka v0.10.0(S
 ## Install Kafka 0.10.0.0 and Kylin
 Don't use HDP 2.2.4's build-in Kafka as it is too old, stop it first if it is running.
 {% highlight Groff markup %}
-curl -s http://mirrors.tuna.tsinghua.edu.cn/apache/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz | tar -xz -C /usr/local/
+curl -s https://archive.apache.org/dist/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz | tar -xz -C /usr/local/
 
 cd /usr/local/kafka_2.10-0.10.0.0/
 
@@ -209,11 +209,11 @@ curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8"
 
 Check holes:
  {% highlight Groff markup %}
-curl -X GET --user ADMINN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
+curl -X GET --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
 {% endhighlight %}
 
 If the result is an empty arrary, means there is no hole; Otherwise, trigger Kylin to fill them:
  {% highlight Groff markup %}
-curl -X PUT --user ADMINN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
+curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
 {% endhighlight %}
 
diff --git a/website/_docs21/tutorial/cube_streaming.md b/website/_docs21/tutorial/cube_streaming.md
index fa96db5..dd5eba2 100644
--- a/website/_docs21/tutorial/cube_streaming.md
+++ b/website/_docs21/tutorial/cube_streaming.md
@@ -14,7 +14,7 @@ In this tutorial, we will use Hortonworks HDP 2.2.4 Sandbox VM + Kafka v0.10.0(S
 ## Install Kafka 0.10.0.0 and Kylin
 Don't use HDP 2.2.4's build-in Kafka as it is too old, stop it first if it is running.
 {% highlight Groff markup %}
-curl -s http://mirrors.tuna.tsinghua.edu.cn/apache/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz | tar -xz -C /usr/local/
+curl -s https://archive.apache.org/dist/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz | tar -xz -C /usr/local/
 
 cd /usr/local/kafka_2.10-0.10.0.0/
 
@@ -209,11 +209,11 @@ curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8"
 
 Check holes:
  {% highlight Groff markup %}
-curl -X GET --user ADMINN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
+curl -X GET --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
 {% endhighlight %}
 
 If the result is an empty arrary, means there is no hole; Otherwise, trigger Kylin to fill them:
  {% highlight Groff markup %}
-curl -X PUT --user ADMINN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
+curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
 {% endhighlight %}
 
diff --git a/website/_docs23/tutorial/cube_spark.cn.md b/website/_docs23/tutorial/cube_spark.cn.md
index ca0dd99..f456a89 100644
--- a/website/_docs23/tutorial/cube_spark.cn.md
+++ b/website/_docs23/tutorial/cube_spark.cn.md
@@ -142,7 +142,7 @@ Kylin 启动后,访问 Kylin 网站,在 "Advanced Setting" 页,编辑名
 当出现 error,您可以首先查看 "logs/kylin.log". 其中包含 Kylin 执行的所有 Spark 命令,例如:
 
 {% highlight Groff markup %}
-2017-03-06 14:44:38,574 INFO  [Job 2d5c1178-c6f6-4b50-8937-8e5e3b39227e-306] spark.SparkExecutable:121 : cmd:export HADOOP_CONF_DIR=/usr/local/apache-kylin-2.1.0-bin-hbase1x/hadoop-conf && /usr/local/apache-kylin-2.1.0-bin-hbase1x/spark/bin/spark-submit --class org.apache.kylin.common.util.SparkEntry  --conf spark.executor.instances=1  --conf spark.yarn.queue=default  --conf spark.yarn.am.extraJavaOptions=-Dhdp.version=current  --conf spark.history.fs.logDirectory=hdfs:///kylin/spark-his [...]
+2017-03-06 14:44:38,574 INFO  [Job 2d5c1178-c6f6-4b50-8937-8e5e3b39227e-306] spark.SparkExecutable:121 : cmd:export HADOOP_CONF_DIR=/usr/local/apache-kylin-2.1.0-bin-hbase1x/hadoop-conf && /usr/local/apache-kylin-2.1.0-bin-hbase1x/spark/bin/spark-submit --class org.apache.kylin.common.util.SparkEntry  --conf spark.executor.instances=1  --conf spark.yarn.queue=default  --conf spark.yarn.am.extraJavaOptions=-Dhdp.version=current  --conf spark.history.fs.logDirectory=hdfs:///kylin/spark-his [...]
 
 {% endhighlight %}
 
diff --git a/website/_docs23/tutorial/cube_streaming.cn.md b/website/_docs23/tutorial/cube_streaming.cn.md
index cf094e9..99bfd96 100644
--- a/website/_docs23/tutorial/cube_streaming.cn.md
+++ b/website/_docs23/tutorial/cube_streaming.cn.md
@@ -14,7 +14,7 @@ Kylin v1.6 发布了可扩展的 streaming cubing 功能,它利用 Hadoop 消
 ## 安装 Kafka 0.10.0.0 和 Kylin
 不要使用 HDP 2.2.4 自带的 Kafka,因为它太旧了,如果其运行着请先停掉。
 {% highlight Groff markup %}
-curl -s http://mirrors.tuna.tsinghua.edu.cn/apache/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz | tar -xz -C /usr/local/
+curl -s https://archive.apache.org/dist/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz | tar -xz -C /usr/local/
 
 cd /usr/local/kafka_2.10-0.10.0.0/
 
@@ -127,7 +127,7 @@ Streaming Cube 和普通的 cube 大致上一样. 有以下几点需要您注意
 您可以在 web GUI 触发 build,通过点击 "Actions" -> "Build",或用 'curl' 命令发送一个请求到 Kylin RESTful API:
 
 {% highlight Groff markup %}
-curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" -d '{ "sourceOffsetStart": 0,"sourceOffsetEnd": 9223372036854775807,"buildType": "BUILD"}' http://localhost:7070/kylin/api/cubes/{your_cube_name}/build2
+curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" -d '{ "sourceOffsetStart": 0, "sourceOffsetEnd": 9223372036854775807, "buildType": "BUILD"}' http://localhost:7070/kylin/api/cubes/{your_cube_name}/build2
 {% endhighlight %}
 
 请注意 API 终端和普通 cube 不一样 (这个 URL 以 "build2" 结尾)。
@@ -139,7 +139,7 @@ curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8"
 ## 点击 "Insight" 标签,编写 SQL 运行,例如:
 
  {% highlight Groff markup %}
-select minute_start,count(*),sum(amount),sum(qty) from streaming_sales_table group by minute_start order by minute_start
+select minute_start, count(*), sum(amount), sum(qty) from streaming_sales_table group by minute_start order by minute_start
  {% endhighlight %}
 
 结果如下。
@@ -152,7 +152,7 @@ select minute_start,count(*),sum(amount),sum(qty) from streaming_sales_tab
 
   {% highlight Groff markup %}
 crontab -e
-*/5 * * * * curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" -d '{ "sourceOffsetStart": 0,"sourceOffsetEnd": 9223372036854775807,"buildType": "BUILD"}' http://localhost:7070/kylin/api/cubes/{your_cube_name}/build2
+*/5 * * * * curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" -d '{ "sourceOffsetStart": 0, "sourceOffsetEnd": 9223372036854775807, "buildType": "BUILD"}' http://localhost:7070/kylin/api/cubes/{your_cube_name}/build2
  {% endhighlight %}
 
 现在您可以观看 cube 从 streaming 中自动 built。当 cube segments 累积到更大的时间范围,Kylin 将会自动的将其合并到一个更大的 segment 中。
@@ -202,18 +202,18 @@ Caused by: java.lang.ClassNotFoundException: org.apache.kafka.clients.producer.P
  * 如果 Kafka 里已经有一组历史 message 且您不想从最开始 build,您可以触发一个调用来将当前的结束位置设为 cube 的开始:
 
 {% highlight Groff markup %}
-curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" -d '{ "sourceOffsetStart": 0,"sourceOffsetEnd": 9223372036854775807,"buildType": "BUILD"}' http://localhost:7070/kylin/api/cubes/{your_cube_name}/init_start_offsets
+curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" -d '{ "sourceOffsetStart": 0, "sourceOffsetEnd": 9223372036854775807, "buildType": "BUILD"}' http://localhost:7070/kylin/api/cubes/{your_cube_name}/init_start_offsets
 {% endhighlight %}
 
  * 如果一些 build job 出错了并且您将其 discard,Cube 中就会留有一个洞(或称为空隙)。每一次 Kylin 都会从最后的位置 build,您不可期望通过正常的 builds 将洞填补。Kylin 提供了 API 检查和填补洞 
 
 检查洞:
  {% highlight Groff markup %}
-curl -X GET --user ADMINN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
+curl -X GET --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
 {% endhighlight %}
 
 如果查询结果是一个空的数组,意味着没有洞;否则,触发 Kylin 填补他们:
  {% highlight Groff markup %}
-curl -X PUT --user ADMINN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
+curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
 {% endhighlight %}
 
diff --git a/website/_docs23/tutorial/cube_streaming.md b/website/_docs23/tutorial/cube_streaming.md
index 6b42d0f..4661511 100644
--- a/website/_docs23/tutorial/cube_streaming.md
+++ b/website/_docs23/tutorial/cube_streaming.md
@@ -14,7 +14,7 @@ In this tutorial, we will use Hortonworks HDP 2.2.4 Sandbox VM + Kafka v0.10.0(S
 ## Install Kafka 0.10.0.0 and Kylin
 Don't use HDP 2.2.4's build-in Kafka as it is too old, stop it first if it is running.
 {% highlight Groff markup %}
-curl -s http://mirrors.tuna.tsinghua.edu.cn/apache/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz | tar -xz -C /usr/local/
+curl -s https://archive.apache.org/dist/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz | tar -xz -C /usr/local/
 
 cd /usr/local/kafka_2.10-0.10.0.0/
 
@@ -209,11 +209,11 @@ curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8"
 
 Check holes:
  {% highlight Groff markup %}
-curl -X GET --user ADMINN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
+curl -X GET --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
 {% endhighlight %}
 
 If the result is an empty arrary, means there is no hole; Otherwise, trigger Kylin to fill them:
  {% highlight Groff markup %}
-curl -X PUT --user ADMINN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
+curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
 {% endhighlight %}
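A recurring value in the build2 calls throughout these patches is sourceOffsetEnd 9223372036854775807, which is 2^63 - 1 (Java's Long.MAX_VALUE) and effectively means "consume up to the latest offset available in Kafka". A small sketch of assembling that request body; the function name is illustrative, not part of Kylin:

```python
import json

LONG_MAX = 2**63 - 1  # 9223372036854775807, Java's Long.MAX_VALUE

def build2_payload(start=0, end=LONG_MAX):
    """JSON body for PUT /kylin/api/cubes/{cube}/build2.

    end defaults to Long.MAX_VALUE, i.e. build from `start` up to
    whatever offset is currently the latest in the Kafka topic.
    """
    return json.dumps({
        "sourceOffsetStart": start,
        "sourceOffsetEnd": end,
        "buildType": "BUILD",
    })

payload = build2_payload()
```

The same body is reused by the crontab entry in the patches, which fires the build every five minutes so new Kafka messages are cubed incrementally.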