Posted to commits@flink.apache.org by rm...@apache.org on 2014/08/25 11:14:52 UTC

[4/7] git commit: [Documentation] Fix broken links

[Documentation] Fix broken links


Project: http://git-wip-us.apache.org/repos/asf/incubator-flink/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-flink/commit/0d176222
Tree: http://git-wip-us.apache.org/repos/asf/incubator-flink/tree/0d176222
Diff: http://git-wip-us.apache.org/repos/asf/incubator-flink/diff/0d176222

Branch: refs/heads/release-0.6.1
Commit: 0d176222d30c5f6ae11ad24ca04bfa0ca3f3cac1
Parents: 2ecc175
Author: Robert Metzger <rm...@apache.org>
Authored: Wed Aug 20 18:13:15 2014 +0200
Committer: Robert Metzger <rm...@apache.org>
Committed: Mon Aug 25 11:10:52 2014 +0200

----------------------------------------------------------------------
 docs/_layouts/docs.html          |   1 +
 docs/img/plan_visualizer1.png    | Bin 0 -> 68108 bytes
 docs/img/plan_visualizer2.png    | Bin 0 -> 149860 bytes
 docs/internal_add_operator.md    |   2 +-
 docs/java_api_guide.md           |  20 +++++++++++---------
 docs/java_api_transformations.md |   2 +-
 docs/scala_api_examples.md       |   2 +-
 docs/scala_api_guide.md          |   4 ++--
 8 files changed, 17 insertions(+), 14 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/incubator-flink/blob/0d176222/docs/_layouts/docs.html
----------------------------------------------------------------------
diff --git a/docs/_layouts/docs.html b/docs/_layouts/docs.html
index a9b94be..88d081e 100644
--- a/docs/_layouts/docs.html
+++ b/docs/_layouts/docs.html
@@ -47,6 +47,7 @@
                     <li>Programming Guides
                         <ul>
                             <li><a href="java_api_guide.html">Java API</a></li>
+                            <li><a href="java_api_transformations.html">Java API Transformations</a></li>
                             <li><a href="scala_api_guide.html">Scala API</a></li>
                             <li><a href="hadoop_compatability.html">Hadoop Compatability</a></li>
                             <li><a href="iterations.html">Iterations</a></li>

http://git-wip-us.apache.org/repos/asf/incubator-flink/blob/0d176222/docs/img/plan_visualizer1.png
----------------------------------------------------------------------
diff --git a/docs/img/plan_visualizer1.png b/docs/img/plan_visualizer1.png
new file mode 100644
index 0000000..3fa45ea
Binary files /dev/null and b/docs/img/plan_visualizer1.png differ

http://git-wip-us.apache.org/repos/asf/incubator-flink/blob/0d176222/docs/img/plan_visualizer2.png
----------------------------------------------------------------------
diff --git a/docs/img/plan_visualizer2.png b/docs/img/plan_visualizer2.png
new file mode 100644
index 0000000..ef07f7e
Binary files /dev/null and b/docs/img/plan_visualizer2.png differ

http://git-wip-us.apache.org/repos/asf/incubator-flink/blob/0d176222/docs/internal_add_operator.md
----------------------------------------------------------------------
diff --git a/docs/internal_add_operator.md b/docs/internal_add_operator.md
index b24df54..40b0a69 100644
--- a/docs/internal_add_operator.md
+++ b/docs/internal_add_operator.md
@@ -74,7 +74,7 @@ void setInput(DataSet<IN> inputData);
 DataSet<OUT> createResult();
 ```
 
-The {% gh_link /flink-addons/spargel/src/main/java/org/apache/flink/spargel/java/VertexCentricIteration.java "VertexCentricIteration" %} operator is implemented that way. Below is an example of how to implement the *count()* operator that way.
+The {% gh_link /flink-addons/flink-spargel/src/main/java/org/apache/flink/spargel/java/VertexCentricIteration.java "VertexCentricIteration" %} operator is implemented that way. Below is an example of how to implement the *count()* operator that way.
 
 ``` java
 public class Counter<T> implements CustomUnaryOperation<T, Long> {
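
The hunk above is cut off right after the class declaration of the count example. Purely for orientation, an operator of that shape could be filled in roughly as follows (a sketch derived from the `setInput`/`createResult` contract quoted above; the package names assume a later Flink release, and this is not the actual implementation from the docs):

``` java
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.common.functions.ReduceFunction;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.operators.CustomUnaryOperation;

public class Counter<T> implements CustomUnaryOperation<T, Long> {

    private DataSet<T> input;

    @Override
    public void setInput(DataSet<T> inputData) {
        this.input = inputData;
    }

    @Override
    public DataSet<Long> createResult() {
        // Map every element to 1L, then add the ones up to obtain the count.
        return input
            .map(new MapFunction<T, Long>() {
                @Override
                public Long map(T value) {
                    return 1L;
                }
            })
            .reduce(new ReduceFunction<Long>() {
                @Override
                public Long reduce(Long a, Long b) {
                    return a + b;
                }
            });
    }
}
```

Such an operator would then typically be applied through the `DataSet#runOperation(...)` hook described in that guide.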

http://git-wip-us.apache.org/repos/asf/incubator-flink/blob/0d176222/docs/java_api_guide.md
----------------------------------------------------------------------
diff --git a/docs/java_api_guide.md b/docs/java_api_guide.md
index 76dd332..4af65f0 100644
--- a/docs/java_api_guide.md
+++ b/docs/java_api_guide.md
@@ -64,7 +64,7 @@ Linking with Flink
 
 To write programs with Flink, you need to include Flink's Java API library in your project.
 
-The simplest way to do this is to use the [quickstart scripts]({{site.baseurl}}/java_api_quickstart.html). They create a blank project from a template (a Maven Archetype), which sets up everything for you. To manually create the project, you can use the archetype and create a project by calling:
+The simplest way to do this is to use the [quickstart scripts](java_api_quickstart.html). They create a blank project from a template (a Maven Archetype), which sets up everything for you. To manually create the project, you can use the archetype and create a project by calling:
 
 ```bash
 mvn archetype:generate /
@@ -94,7 +94,7 @@ Please refer to the [downloads page]({{site.baseurl}}/downloads.html) for a list
 In order to link against the latest SNAPSHOT versions of the code, please follow [this guide]({{site.baseurl}}/downloads.html#nightly).
 
 The *flink-clients* dependency is only necessary to invoke the Flink program locally (for example to run it standalone for testing and debugging). 
-If you intend to only export the program as a JAR file and [run it on a cluster]({{site.baseurl}}/cluster_execution.html), you can skip that dependency.
+If you intend to only export the program as a JAR file and [run it on a cluster](cluster_execution.html), you can skip that dependency.
 
 [Back to top](#top)
 
@@ -131,8 +131,8 @@ Typically, you only need to use `getExecutionEnvironment()`, since this
 will do the right thing depending on the context: if you are executing
 your program inside an IDE or as a regular Java program it will create
 a local environment that will execute your program on your local machine. If
-you created a JAR file from your program, and invoke it through the [command line]({{site.baseurl}}/cli.html)
-or the [web interface]({{site.baseurl}}/web_client.html),
+you created a JAR file from your program, and invoke it through the [command line](cli.html)
+or the [web interface](web_client.html),
 the Flink cluster manager will
 execute your main method and `getExecutionEnvironment()` will return
 an execution environment for executing your program on a cluster.
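
As a minimal illustration of this behaviour, the following skeleton can be run unchanged in either setting (a sketch: the class name, the sample data, and the output path are placeholders, and the imports assume a later Flink release):

``` java
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;

public class EnvironmentSkeleton {

    public static void main(String[] args) throws Exception {
        // Inside an IDE this returns a local environment; when the packaged
        // JAR is submitted through the command line or the web client, it
        // returns an environment that runs the program on the cluster.
        final ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        DataSet<String> text = env.fromElements("to be", "or not to be");

        // Placeholder output location.
        text.writeAsText("file:///tmp/flink-output");

        env.execute("Environment skeleton");
    }
}
```
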
@@ -219,7 +219,7 @@ Transformations
 Data transformations transform one or more DataSets into a new DataSet. Programs can combine multiple transformations into
 sophisticated assemblies.
 
-This section gives a brief overview of the available transformations. The [transformations documentation]({{site.baseurl}}/java_api_transformations.html)
+This section gives a brief overview of the available transformations. The [transformations documentation](java_api_transformations.html)
 has a full description of all transformations with examples.
 
 <table class="table table-bordered">
@@ -321,7 +321,7 @@ DataSet<Tuple3<Integer, String, Double>> output = input.sum(0).andMin(2);
     </tr>
       <td><strong>Join</strong></td>
       <td>
-        Joins two data sets by creating all pairs of elements that are equal on their keys. Optionally uses a JoinFunction to turn the pair of elements into a single element, or a FlatJoinFunction to turn the pair of elements into arbitrarily many (including none) elements. See [keys](#keys) on how to define join keys.
+        Joins two data sets by creating all pairs of elements that are equal on their keys. Optionally uses a JoinFunction to turn the pair of elements into a single element, or a FlatJoinFunction to turn the pair of elements into arbitrarily many (including none) elements. See <a href="#keys">keys</a> on how to define join keys.
 {% highlight java %}
 result = input1.join(input2)
                .where(0)       // key of the first input (tuple field 0)
@@ -333,7 +333,7 @@ result = input1.join(input2)
     <tr>
       <td><strong>CoGroup</strong></td>
       <td>
-        <p>The two-dimensional variant of the reduce operation. Groups each input on one or more fields and then joins the groups. The transformation function is called per pair of groups. See [keys](#keys) on how to define coGroup keys.</p>
+        <p>The two-dimensional variant of the reduce operation. Groups each input on one or more fields and then joins the groups. The transformation function is called per pair of groups. See <a href="#keys">keys</a> on how to define coGroup keys.</p>
 {% highlight java %}
 data1.coGroup(data2)
      .where(0)
@@ -570,7 +570,7 @@ on iterations (see [Iterations](#iterations)).
 In particular for the `reduceGroup` transformation, using a rich
 function is the only way to define an optional `combine` function. See
 the
-[transformations documentation]({{site.baseurl}}/java_api_transformations.html)
+[transformations documentation](java_api_transformations.html)
 for a complete example.
 
 [Back to top](#top)
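
A rough sketch of a rich group-reduce function with a combine step is shown below. The signatures follow a later Flink Java API, in which the combiner is declared by additionally implementing `GroupCombineFunction`; the exact mechanism in the 0.6 docs touched by this patch differs, so treat the details as assumptions:

``` java
import org.apache.flink.api.common.functions.GroupCombineFunction;
import org.apache.flink.api.common.functions.RichGroupReduceFunction;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.util.Collector;

public class SumPerKey
        extends RichGroupReduceFunction<Tuple2<String, Integer>, Tuple2<String, Integer>>
        implements GroupCombineFunction<Tuple2<String, Integer>, Tuple2<String, Integer>> {

    @Override
    public void reduce(Iterable<Tuple2<String, Integer>> values,
                       Collector<Tuple2<String, Integer>> out) {
        String key = null;
        int sum = 0;
        for (Tuple2<String, Integer> value : values) {
            key = value.f0;
            sum += value.f1;
        }
        out.collect(new Tuple2<String, Integer>(key, sum));
    }

    @Override
    public void combine(Iterable<Tuple2<String, Integer>> values,
                        Collector<Tuple2<String, Integer>> out) {
        // Pre-aggregation: the same logic as reduce, applied to partial groups
        // before the data is shipped over the network.
        reduce(values, out);
    }
}

// Usage sketch: input.groupBy(0).reduceGroup(new SumPerKey());
```
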
@@ -1170,7 +1170,9 @@ To visualize the execution plan, do the following:
 
 After these steps, a detailed execution plan will be visualized.
 
-<img alt="A flink job execution graph." src="{{site.baseurl}}/img/blog/plan_visualizer2.png" width="80%">
+<img alt="A flink job execution graph." src="img/plan_visualizer2.png" width="80%">
+
+
 __Web Interface__
 
 Flink offers a web interface for submitting and executing jobs. If you choose to use this interface to submit your packaged program, you have the option to also see the plan visualization.
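
The plan JSON that the visualizer consumes can also be obtained programmatically rather than by submitting the job. A small sketch (placeholder data and output path; imports assume a later Flink release):

``` java
import org.apache.flink.api.common.functions.FilterFunction;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;

public class PlanDump {

    public static void main(String[] args) throws Exception {
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        DataSet<Long> numbers = env.generateSequence(1, 1000);

        DataSet<Long> even = numbers.filter(new FilterFunction<Long>() {
            @Override
            public boolean filter(Long value) {
                return value % 2 == 0;
            }
        });

        // A sink is required before a plan can be generated.
        even.writeAsText("file:///tmp/even-numbers");

        // Prints the JSON plan, which can be pasted into the plan visualizer
        // instead of running the program.
        System.out.println(env.getExecutionPlan());
    }
}
```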

http://git-wip-us.apache.org/repos/asf/incubator-flink/blob/0d176222/docs/java_api_transformations.md
----------------------------------------------------------------------
diff --git a/docs/java_api_transformations.md b/docs/java_api_transformations.md
index e4980de..6761c52 100644
--- a/docs/java_api_transformations.md
+++ b/docs/java_api_transformations.md
@@ -7,7 +7,7 @@ DataSet Transformations
 -----------------------
 
 This document gives a deep-dive into the available transformations on DataSets. For a general introduction to the
-Flink Java API, please refer to the [API guide]({{site.baseurl}}/java_api_guide.html)
+Flink Java API, please refer to the [API guide](java_api_guide.html)
 
 
 ### Map
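
A minimal example of the Map transformation covered there might look like this (a sketch: input values and output path are placeholders, and the imports assume a later Flink release):

``` java
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;

public class MapExample {

    public static void main(String[] args) throws Exception {
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        DataSet<Integer> input = env.fromElements(1, 2, 3, 4);

        // Map emits exactly one output element per input element.
        DataSet<Integer> doubled = input.map(new MapFunction<Integer, Integer>() {
            @Override
            public Integer map(Integer value) {
                return value * 2;
            }
        });

        doubled.writeAsText("file:///tmp/doubled");
        env.execute("Map example");
    }
}
```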

http://git-wip-us.apache.org/repos/asf/incubator-flink/blob/0d176222/docs/scala_api_examples.md
----------------------------------------------------------------------
diff --git a/docs/scala_api_examples.md b/docs/scala_api_examples.md
index fad1919..b6689f0 100644
--- a/docs/scala_api_examples.md
+++ b/docs/scala_api_examples.md
@@ -5,7 +5,7 @@ title:  "Scala API Examples"
 The following example programs showcase different applications of Flink from simple word counting to graph algorithms.
 The code samples illustrate the use of [Flink's Scala API](scala_api_guide.html). 
 
-The full source code of the following and more examples can be found in the [flink-scala-examples](https://github.com/apache/incubator-flink/tree/ca2b287a7a78328ebf43766b9fdf39b56fb5fd4f/flink-examples/flink-scala-examples) module.
+The full source code of the following and more examples can be found in the [flink-scala-examples](https://github.com/apache/incubator-flink/tree/master/flink-examples/flink-scala-examples) module.
 
 # Word Count
 

http://git-wip-us.apache.org/repos/asf/incubator-flink/blob/0d176222/docs/scala_api_guide.md
----------------------------------------------------------------------
diff --git a/docs/scala_api_guide.md b/docs/scala_api_guide.md
index d6d53a9..e46a898 100644
--- a/docs/scala_api_guide.md
+++ b/docs/scala_api_guide.md
@@ -105,7 +105,7 @@ following lines to your POM.
 ```
 
 To quickly get started you can use the Flink Scala quickstart available
-[here]({{site.baseurl}}/quickstart/scala.html). This will give you a
+[here](scala_api_quickstart.html). This will give you a
 complete Maven project with some working example code that you can use to explore
 the system or as a basis for your own projects.
 
@@ -379,7 +379,7 @@ reading the array and returns the element read from the binary data.
 Operations on DataSet
 ---------------------
 
-As explained in [Programming Model](pmodel.html#operators),
+As explained in [Java API](java_api_guide.html#transformations),
 a Flink job is a graph of operators that process data coming from
 sources that is finally written to sinks. When you use the Scala front end,
 these operators, as well as the graph itself, are created behind the scenes. For example,