Posted to commits@kylin.apache.org by li...@apache.org on 2017/06/27 09:36:42 UTC

svn commit: r1800042 - in /kylin/site: cn/docs20/install/manual_install_guide.html docs20/howto/howto_update_coprocessor.html docs20/install/hadoop_env.html docs20/tutorial/cube_spark.html feed.xml

Author: lidong
Date: Tue Jun 27 09:36:42 2017
New Revision: 1800042

URL: http://svn.apache.org/viewvc?rev=1800042&view=rev
Log:
Misc updates on v2.0 documentation

Modified:
    kylin/site/cn/docs20/install/manual_install_guide.html
    kylin/site/docs20/howto/howto_update_coprocessor.html
    kylin/site/docs20/install/hadoop_env.html
    kylin/site/docs20/tutorial/cube_spark.html
    kylin/site/feed.xml

Modified: kylin/site/cn/docs20/install/manual_install_guide.html
URL: http://svn.apache.org/viewvc/kylin/site/cn/docs20/install/manual_install_guide.html?rev=1800042&r1=1800041&r2=1800042&view=diff
==============================================================================
--- kylin/site/cn/docs20/install/manual_install_guide.html (original)
+++ kylin/site/cn/docs20/install/manual_install_guide.html Tue Jun 27 09:36:42 2017
@@ -1286,37 +1286,18 @@
 
 <h2 id="section-1">Prerequisites</h2>
 <ul>
-  <li>Tomcat installed, with CATALINA_HOME exported.</li>
-  <li>Copy the Kylin binary package to local and extract it; afterwards refer to it via $KYLIN_HOME</li>
+  <li>Copy the Kylin binary package to local and extract it; afterwards refer to it via $KYLIN_HOME<br />
+<code class="highlighter-rouge">export KYLIN_HOME=/path/to/kylin</code><br />
+<code class="highlighter-rouge">cd $KYLIN_HOME</code></li>
 </ul>
 
-<h2 id="section-2">Steps</h2>
-
-<h3 id="jars">Prepare Jars</h3>
-
-<p>Kylin needs two jar files, both configured in the default kylin.properties:</p>
-
-<div class="highlighter-rouge"><pre class="highlight"><code>kylin.job.jar=/tmp/kylin/kylin-job-latest.jar
-
-</code></pre>
-</div>
-
-<p>This is the job jar Kylin uses for MR jobs. You need to copy $KYLIN_HOME/job/target/kylin-job-latest.jar to /tmp/kylin/</p>
-
-<div class="highlighter-rouge"><pre class="highlight"><code>kylin.coprocessor.local.jar=/tmp/kylin/kylin-coprocessor-latest.jar
-
-</code></pre>
-</div>
-
-<p>This is an HBase coprocessor jar that Kylin deploys to HBase. It is used to improve performance. You need to copy $KYLIN_HOME/storage/target/kylin-coprocessor-latest.jar to /tmp/kylin/</p>
-
 <h3 id="kylin">Start Kylin</h3>
 
-<p>Use <code class="highlighter-rouge">./kylin.sh start</code></p>
+<p>Use <code class="highlighter-rouge">./bin/kylin.sh start</code></p>
 
 <p>to start Kylin,</p>
 
-<p>and use <code class="highlighter-rouge">./Kylin.sh stop</code></p>
+<p>and use <code class="highlighter-rouge">./bin/kylin.sh stop</code></p>
 
 <p>to stop Kylin.</p>
 

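The start/stop commands in the hunk above can be wrapped in a small, defensive shell helper. This is an illustrative sketch, not part of the commit: `/path/to/kylin` is a placeholder, and port 7070 is an assumption (Kylin's usual default web port).

```shell
# Illustrative wrapper for the start/stop flow documented above.
# KYLIN_HOME's value and port 7070 are assumptions, not from this commit.
export KYLIN_HOME="${KYLIN_HOME:-/path/to/kylin}"

kylin_ctl() {
  # Invoke bin/kylin.sh only when the binary package is actually extracted.
  if [ -x "$KYLIN_HOME/bin/kylin.sh" ]; then
    "$KYLIN_HOME/bin/kylin.sh" "$1"
  else
    echo "kylin.sh not found under $KYLIN_HOME/bin"
  fi
}

kylin_ctl start   # on success the web UI is typically at http://<host>:7070/kylin
kylin_ctl stop
```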
Modified: kylin/site/docs20/howto/howto_update_coprocessor.html
URL: http://svn.apache.org/viewvc/kylin/site/docs20/howto/howto_update_coprocessor.html?rev=1800042&r1=1800041&r2=1800042&view=diff
==============================================================================
--- kylin/site/docs20/howto/howto_update_coprocessor.html (original)
+++ kylin/site/docs20/howto/howto_update_coprocessor.html Tue Jun 27 09:36:42 2017
@@ -2704,7 +2704,7 @@
 
 <p>There’s a CLI tool to update HBase Coprocessor:</p>
 
-<div class="highlight"><pre><code class="language-groff" data-lang="groff">$KYLIN_HOME/bin/kylin.sh org.apache.kylin.storage.hbase.util.DeployCoprocessorCLI $KYLIN_HOME/lib/kylin-coprocessor-*.jar all</code></pre></div>
+<div class="highlight"><pre><code class="language-groff" data-lang="groff">$KYLIN_HOME/bin/kylin.sh org.apache.kylin.storage.hbase.util.DeployCoprocessorCLI default all</code></pre></div>
 
 
 							</article>

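For context, before this change the CLI was invoked with an explicit jar path; the committed form passes the literal `default` in place of that path (per the diff, letting the CLI pick its default coprocessor jar). A hypothetical wrapper covering both forms — the function name and defaults here are illustrative, not part of the commit:

```shell
# Hypothetical helper around the DeployCoprocessorCLI call shown in the diff.
# First arg: "default" (as committed) or an explicit jar path such as
# "$KYLIN_HOME"/lib/kylin-coprocessor-*.jar; second arg: "all" or table names.
deploy_coprocessor() {
  jar_arg="${1:-default}"
  scope="${2:-all}"
  "$KYLIN_HOME/bin/kylin.sh" \
    org.apache.kylin.storage.hbase.util.DeployCoprocessorCLI "$jar_arg" "$scope"
}
```

Running it requires a live Kylin/HBase deployment, so it is defined here without being invoked.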
Modified: kylin/site/docs20/install/hadoop_env.html
URL: http://svn.apache.org/viewvc/kylin/site/docs20/install/hadoop_env.html?rev=1800042&r1=1800041&r2=1800042&view=diff
==============================================================================
--- kylin/site/docs20/install/hadoop_env.html (original)
+++ kylin/site/docs20/install/hadoop_env.html Tue Jun 27 09:36:42 2017
@@ -2700,7 +2700,7 @@
 							
 							
 							<article class="post-content" >	
-							<p>Kylin need run in a Hadoop node, to get better stability, we suggest you to deploy it a pure Hadoop client machine, on which it the command lines like <code class="highlighter-rouge">hive</code>, <code class="highlighter-rouge">hbase</code>, <code class="highlighter-rouge">hadoop</code>, <code class="highlighter-rouge">hdfs</code> already be installed and configured. The Linux account that running Kylin has got permission to the Hadoop cluster, including create/write hdfs, hive tables, hbase tables and submit MR jobs.</p>
+							<p>Kylin needs to run on a Hadoop node. For better stability, we suggest deploying it on a pure Hadoop client machine, on which command-line tools such as <code class="highlighter-rouge">hive</code>, <code class="highlighter-rouge">hbase</code>, <code class="highlighter-rouge">hadoop</code> and <code class="highlighter-rouge">hdfs</code> are already installed and configured. The Linux account running Kylin must have permission to the Hadoop cluster, including creating/writing HDFS files, Hive tables and HBase tables, and submitting MR jobs.</p>
 
 <h2 id="recommended-hadoop-versions">Recommended Hadoop Versions</h2>
 
@@ -2729,14 +2729,6 @@ ambari-server start
 
 <p>With both command successfully run you can go to ambari homepage at <a href="http://your_sandbox_ip:8080">http://your_sandbox_ip:8080</a> (user:admin,password:admin) to check everything’s status. <strong>By default hortonworks ambari disables Hbase, you need manually start the <code class="highlighter-rouge">Hbase</code> service at ambari homepage.</strong></p>
 
-<p><img src="https://raw.githubusercontent.com/KylinOLAP/kylinolap.github.io/master/docs/installation/starthbase.png" alt="start hbase in ambari" /></p>
-
-<p><strong>Additonal Info for setting up Hortonworks Sandbox on Virtual Box</strong></p>
-
-<div class="highlighter-rouge"><pre class="highlight"><code>Please make sure Hbase Master port [Default 60000] and Zookeeper [Default 2181] is forwarded to Host OS.
-</code></pre>
-</div>
-
 
 							</article>
 						</div>

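The client-machine requirement in the hunk above can be verified with a short script; the tool list comes from that paragraph, while the check itself is an illustrative assumption about how one might verify it:

```shell
# Illustrative check that this node is a configured Hadoop client,
# per the requirement above; the tool list comes from the paragraph.
missing=""
for cmd in hive hbase hadoop hdfs; do
  command -v "$cmd" >/dev/null 2>&1 || missing="$missing $cmd"
done
if [ -n "$missing" ]; then
  echo "missing Hadoop client tools:$missing"
else
  echo "all Hadoop client tools found"
fi
```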
Modified: kylin/site/docs20/tutorial/cube_spark.html
URL: http://svn.apache.org/viewvc/kylin/site/docs20/tutorial/cube_spark.html?rev=1800042&r1=1800041&r2=1800042&view=diff
==============================================================================
--- kylin/site/docs20/tutorial/cube_spark.html (original)
+++ kylin/site/docs20/tutorial/cube_spark.html Tue Jun 27 09:36:42 2017
@@ -2717,10 +2717,11 @@ export KYLIN_HOME=/usr/local/apache-kyli
 
 <h2 id="prepare-kylinenvhadoop-conf-dir">Prepare “kylin.env.hadoop-conf-dir”</h2>
 
-<p>To run Spark on Yarn, need specify <strong>HADOOP_CONF_DIR</strong> environment variable, which is the directory that contains the (client side) configuration files for Hadoop. In many Hadoop distributions the directory is “/etc/hadoop/conf”; But Kylin not only need access HDFS, Yarn and Hive, but also HBase, so the default directory might not have all necessary files. In this case, you need create a new directory and then copying or linking those client files (core-site.xml, yarn-site.xml, hive-site.xml and hbase-site.xml) there. In HDP 2.4, there is a conflict between hive-tez and Spark, so need change the default engine from “tez” to “mr” when copy for Kylin.</p>
+<p>To run Spark on Yarn, you need to specify the <strong>HADOOP_CONF_DIR</strong> environment variable, which is the directory that contains the (client-side) configuration files for Hadoop. In many Hadoop distributions this directory is “/etc/hadoop/conf”; but Kylin needs access not only to HDFS, Yarn and Hive, but also to HBase, so the default directory might not contain all the necessary files. In this case, you need to create a new directory and then copy or link the client files (core-site.xml, hdfs-site.xml, yarn-site.xml, hive-site.xml and hbase-site.xml) into it. In HDP 2.4 there is a conflict between hive-tez and Spark, so you need to change the default engine from “tez” to “mr” when copying the file for Kylin.</p>
 
 <div class="highlight"><pre><code class="language-groff" data-lang="groff">mkdir $KYLIN_HOME/hadoop-conf
 ln -s /etc/hadoop/conf/core-site.xml $KYLIN_HOME/hadoop-conf/core-site.xml 
+ln -s /etc/hadoop/conf/hdfs-site.xml $KYLIN_HOME/hadoop-conf/hdfs-site.xml 
 ln -s /etc/hadoop/conf/yarn-site.xml $KYLIN_HOME/hadoop-conf/yarn-site.xml 
 ln -s /etc/hbase/2.4.0.0-169/0/hbase-site.xml $KYLIN_HOME/hadoop-conf/hbase-site.xml 
 cp /etc/hive/2.4.0.0-169/0/hive-site.xml $KYLIN_HOME/hadoop-conf/hive-site.xml 
@@ -2782,7 +2783,7 @@ $KYLIN_HOME/bin/kylin.sh start</code></p
 
 <p><img src="/images/tutorial/2.0/Spark-Cubing-Tutorial/1_cube_engine.png" alt="" /></p>
 
-<p>Click “Next” to the “Configuration Overwrites” page, click “+Property” to add property “kylin.engine.spark.rdd-partition-cut-mb” with value “100” (reasons below):</p>
+<p>Click “Next” to the “Configuration Overwrites” page, click “+Property” to add property “kylin.engine.spark.rdd-partition-cut-mb” with value “500” (reasons below):</p>
 
 <p><img src="/images/tutorial/2.0/Spark-Cubing-Tutorial/2_overwrite_partition.png" alt="" /></p>
 

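The symlink/copy steps in the hunk above (which now include hdfs-site.xml) can be scripted defensively. This is a sketch under stated assumptions: `SRC_CONF` is a stand-in for the distribution's conf directory, and the existence guards are additions of this sketch (the original commands link fixed paths directly).

```shell
# Sketch of building Kylin's dedicated hadoop-conf directory, mirroring the
# diff above. SRC_CONF and the existence guards are illustrative assumptions.
KYLIN_HOME="${KYLIN_HOME:-$(mktemp -d)}"
SRC_CONF="${SRC_CONF:-/etc/hadoop/conf}"

mkdir -p "$KYLIN_HOME/hadoop-conf"
for f in core-site.xml hdfs-site.xml yarn-site.xml; do
  if [ -e "$SRC_CONF/$f" ]; then
    ln -sf "$SRC_CONF/$f" "$KYLIN_HOME/hadoop-conf/$f"
  fi
done
# hbase-site.xml (linked) and hive-site.xml (copied) come from their own,
# distribution-specific conf dirs; see the HDP 2.4 paths in the diff above.
echo "prepared $KYLIN_HOME/hadoop-conf"
```

Kylin would then be pointed at this directory via the “kylin.env.hadoop-conf-dir” property named in the section heading above.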
Modified: kylin/site/feed.xml
URL: http://svn.apache.org/viewvc/kylin/site/feed.xml?rev=1800042&r1=1800041&r2=1800042&view=diff
==============================================================================
--- kylin/site/feed.xml (original)
+++ kylin/site/feed.xml Tue Jun 27 09:36:42 2017
@@ -19,8 +19,8 @@
     <description>Apache Kylin Home</description>
     <link>http://kylin.apache.org/</link>
     <atom:link href="http://kylin.apache.org/feed.xml" rel="self" type="application/rss+xml"/>
-    <pubDate>Mon, 29 May 2017 06:59:17 -0700</pubDate>
-    <lastBuildDate>Mon, 29 May 2017 06:59:17 -0700</lastBuildDate>
+    <pubDate>Tue, 27 Jun 2017 02:35:13 -0700</pubDate>
+    <lastBuildDate>Tue, 27 Jun 2017 02:35:13 -0700</lastBuildDate>
     <generator>Jekyll v2.5.3</generator>
     
       <item>